
lists the options for find_each and find_in_batches

commit 3f18fa008c352a2f3dbfcda6d5044225050b891c 1 parent 348b982
@thejamespinto authored
Showing with 37 additions and 17 deletions.
  1. +37 −17 activerecord/lib/active_record/relation/batches.rb
activerecord/lib/active_record/relation/batches.rb
@@ -19,8 +19,26 @@ module Batches
# person.party_all_night!
# end
#
- # You can also pass the +:start+ option to specify
- # an offset to control the starting point.
+ # ==== Options
+ # * <tt>:batch_size</tt> - Specifies the size of the batch. Defaults to 1000.
+ # * <tt>:start</tt> - Specifies the starting point for the batch processing.
+ # This is especially useful if you want multiple workers dealing with
+ # the same processing queue. You can make worker 1 handle all the records
+ # between id 0 and 10,000 and worker 2 handle from 10,000 and beyond
+ # (by setting the +:start+ option on that worker).
+ #
+ # # Let's process the records in batches of 2000, skipping the first 2000 rows
+ # Person.find_each(start: 2000, batch_size: 2000) do |person|
+ # person.party_all_night!
+ # end
+ #
+ # NOTE: It's not possible to set the order. That is automatically set to
+ # ascending on the primary key ("id ASC") to make the batch ordering
+ # work. This also means that this method only works with integer-based
+ # primary keys.
+ #
+ # NOTE: You can't set the limit either; that's used to control
+ # the batch sizes.
def find_each(options = {})
find_in_batches(options) do |records|
records.each { |record| yield record }
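
As a rough sketch of the multi-worker split described in the +:start+ option above: worker 1 could cap its range with an ordinary where clause while worker 2 starts at the next id. The two-worker split and the 10,000 boundary are illustrative only; the pieces taken from the documented API are find_each, where, and the start:/batch_size: options.

# Worker 1: records with ids up to 10,000, capped with a plain where clause
# (find_each itself offers no end-point option here).
Person.where("id <= 10000").find_each(batch_size: 1000) do |person|
  person.party_all_night!
end

# Worker 2: records from id 10,001 onward, via the :start option.
Person.find_each(start: 10_001, batch_size: 1000) do |person|
  person.party_all_night!
end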
@@ -28,31 +46,33 @@ def find_each(options = {})
end
# Yields each batch of records that was found by the find +options+ as
- # an array. The size of each batch is set by the +:batch_size+
- # option; the default is 1000.
- #
- # You can control the starting point for the batch processing by
- # supplying the +:start+ option. This is especially useful if you
- # want multiple workers dealing with the same processing queue. You can
- # make worker 1 handle all the records between id 0 and 10,000 and
- # worker 2 handle from 10,000 and beyond (by setting the +:start+
- # option on that worker).
- #
- # It's not possible to set the order. That is automatically set to
- # ascending on the primary key ("id ASC") to make the batch ordering
- # work. This also means that this method only works with integer-based
- # primary keys. You can't set the limit either, that's used to control
- # the batch sizes.
+ # an array.
#
# Person.where("age > 21").find_in_batches do |group|
# sleep(50) # Make sure it doesn't get too crowded in there!
# group.each { |person| person.party_all_night! }
# end
#
+ # ==== Options
+ # * <tt>:batch_size</tt> - Specifies the size of the batch. Defaults to 1000.
+ # * <tt>:start</tt> - Specifies the starting point for the batch processing.
+ # This is especially useful if you want multiple workers dealing with
+ # the same processing queue. You can make worker 1 handle all the records
+ # between id 0 and 10,000 and worker 2 handle from 10,000 and beyond
+ # (by setting the +:start+ option on that worker).
+ #
# # Let's process the next 2000 records
# Person.find_in_batches(start: 2000, batch_size: 2000) do |group|
# group.each { |person| person.party_all_night! }
# end
+ #
+ # NOTE: It's not possible to set the order. That is automatically set to
+ # ascending on the primary key ("id ASC") to make the batch ordering
+ # work. This also means that this method only works with integer-based
+ # primary keys.
+ #
+ # NOTE: You can't set the limit either; that's used to control
+ # the batch sizes.
def find_in_batches(options = {})
options.assert_valid_keys(:start, :batch_size)
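
A short sketch of how the two methods look from the caller's side, and what the assert_valid_keys line implies. The batch size, the age condition, and the :order key below are illustrative; assert_valid_keys is ActiveSupport's standard key check and raises ArgumentError on unknown keys.

# find_each yields one record at a time; find_in_batches yields whole arrays,
# which is useful when acting on a batch as a unit.
Person.where("age > 21").find_in_batches(batch_size: 500) do |group|
  puts "processing a batch of #{group.size} people"
  group.each { |person| person.party_all_night! }
end

# Because of assert_valid_keys(:start, :batch_size), any other key is rejected
# before a single record is loaded:
begin
  Person.find_each(order: "created_at DESC") { |person| person.party_all_night! }
rescue ArgumentError => e
  puts e.message # e.g. "Unknown key: order" -- ordering is forced to "id ASC", as the NOTEs explain
end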