chore: fix linters
baygeldin committed Jun 17, 2024
1 parent 5357e95 commit 5584681
Showing 3 changed files with 5 additions and 5 deletions.
2 changes: 1 addition & 1 deletion lib/tantiny/errors.rb
```diff
@@ -5,7 +5,7 @@ class TantivyError < StandardError; end
 
   class IndexWriterBusyError < StandardError
     def initialize
-      msg = "Failed to acquire an index writer. "\
+      msg = "Failed to acquire an index writer. " \
         "Is there an active index with an exclusive writer already?"
 
       super(msg)
```
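The only change in this file is linter-driven whitespace: a space before the line-continuation backslash. A small sketch showing that both layouts build the same string (the specific RuboCop cop behind this, likely `Layout/LineContinuationSpacing`, is an assumption; the commit message only says "fix linters"):

```ruby
# Adjacent string literals joined by a line-continuation backslash are
# concatenated at parse time, so adding a space before the backslash (as the
# diff does) is purely cosmetic and does not change the resulting message.
before = "Failed to acquire an index writer. "\
  "Is there an active index with an exclusive writer already?"
after = "Failed to acquire an index writer. " \
  "Is there an active index with an exclusive writer already?"

raise "layout changed the string!" unless before == after
```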
4 changes: 2 additions & 2 deletions lib/tantiny/index.rb
```diff
@@ -103,10 +103,10 @@ def search(query, limit: DEFAULT_LIMIT, **smart_query_options)
 
     private
 
-    def slice_document(document, fields, &block)
+    def slice_document(document, fields, &)
      fields.inject({}) do |hash, field|
        hash.tap { |h| h[field.to_s] = resolve(document, field) }
-      end.compact.transform_values(&block)
+      end.compact.transform_values(&)
     end
 
     def resolve(document, field)
```
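The `&block` to `&` change adopts Ruby 3.1's anonymous block forwarding: a bare `&` in the parameter list captures the caller's block without naming it, and a bare `&` at a call site forwards it onward. A minimal sketch of the pattern (`slice` below is an illustrative stand-in, not Tantiny's actual `slice_document`):

```ruby
# Anonymous block forwarding (Ruby 3.1+): the method never binds its block to
# a name, yet can still pass it along to transform_values.
def slice(hash, keys, &)
  keys.to_h { |k| [k, hash[k]] }.transform_values(&)
end

result = slice({ a: 1, b: 2, c: 3 }, [:a, :c]) { |v| v * 10 }
raise "unexpected result" unless result == { a: 10, c: 30 }
```

Avoiding the named `&block` parameter also skips materializing a `Proc` object on rubies where the block is forwarded straight through, which is why linters such as RuboCop's `Naming/BlockForwarding` (an assumption here) suggest it.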
4 changes: 2 additions & 2 deletions lib/tantiny/schema.rb
```diff
@@ -12,7 +12,7 @@ class Schema
       :facet_fields,
       :field_tokenizers
 
-    def initialize(tokenizer, &block)
+    def initialize(tokenizer, &)
       @default_tokenizer = tokenizer
       @id_field = :id
       @text_fields = []
@@ -23,7 +23,7 @@ def initialize(tokenizer, &block)
       @facet_fields = []
       @field_tokenizers = {}
 
-      instance_exec(&block)
+      instance_exec(&)
     end
 
     def tokenizer_for(field)
```
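For context, `Schema#initialize` accepts the block so the schema DSL can run inside the new instance via `instance_exec`, and anonymous forwarding works there too. A minimal sketch of that constructor-DSL pattern under Ruby 3.1+ (the `MiniSchema` and `field` names are illustrative, not Tantiny's API):

```ruby
# The block passed to new is evaluated in the instance's own context, so DSL
# methods like `field` resolve against the instance being built.
class MiniSchema
  attr_reader :fields

  def initialize(&)
    @fields = []
    instance_exec(&)
  end

  def field(name)
    @fields << name
  end
end

schema = MiniSchema.new do
  field :title
  field :body
end

raise "DSL did not run" unless schema.fields == [:title, :body]
```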
