PERF: optimise new/unread query
This is part 1 of the work; some internal cleanup needs to follow.

The new query seems to handle very large sets very efficiently.
SamSaffron committed Sep 29, 2015
1 parent 42925b4 commit bc8c6d1
Showing 1 changed file with 8 additions and 8 deletions.
app/models/topic_tracking_state.rb: 8 additions & 8 deletions (16 changes)
@@ -128,13 +128,9 @@ def self.report(user_id, topic_id = nil)
     # cycles from usual requests
     #
     #
-    sql = report_raw_sql(topic_id: topic_id)
-
-    sql = <<SQL
-    WITH x AS (
-      #{sql}
-    ) SELECT * FROM x LIMIT #{SiteSetting.max_tracked_new_unread.to_i}
-SQL
+    sql = report_raw_sql(topic_id: topic_id, skip_unread: true, skip_order: true)
+    sql << "\nUNION ALL\n\n"
+    sql << report_raw_sql(topic_id: topic_id, skip_new: true, skip_order: true)
 
     SqlBuilder.new(sql)
       .map_exec(TopicTrackingState, user_id: user_id, topic_id: topic_id)
@@ -198,7 +194,11 @@ def self.report_raw_sql(opts=nil)
       sql << " AND topics.id = :topic_id"
     end
 
-    sql << " ORDER BY topics.bumped_at DESC"
+    unless opts && opts[:skip_order]
+      sql << " ORDER BY topics.bumped_at DESC"
+    end
+
+    sql
   end
 
 end
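
For orientation, here is a rough sketch of how the shape of the generated SQL changes. The real column list, joins, and new/unread WHERE predicates come from report_raw_sql and are not part of this diff, so the SELECT bodies and the LIMIT value below are placeholders, not the actual Discourse query:

    -- Before: one combined new/unread query, wrapped in a CTE, sorted and capped
    WITH x AS (
      SELECT topics.id              -- stand-in for report_raw_sql's full column list and filters
      FROM topics
      ORDER BY topics.bumped_at DESC
    )
    SELECT * FROM x
    LIMIT 500;                      -- SiteSetting.max_tracked_new_unread (placeholder value)

    -- After: two narrower branches, neither sorted, combined with UNION ALL
    SELECT topics.id FROM topics    -- report_raw_sql(skip_unread: true, skip_order: true), the "new" branch
    UNION ALL
    SELECT topics.id FROM topics;   -- report_raw_sql(skip_new: true, skip_order: true), the "unread" branch

The old form sorted the combined result by bumped_at and truncated it to max_tracked_new_unread; the new form, at least in this hunk, emits two simpler SELECTs with no outer sort or limit, which is presumably what lets it handle very large new/unread sets so efficiently.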
