[Proposal] Schema cache dump #5162

Merged
merged 5 commits into rails:master on Mar 8, 2012

Conversation

4 participants
@kennyj
Contributor

kennyj commented Feb 25, 2012

In my experience, when an app has many models (e.g. one hundred), Rails boots slowly.
According to the production log, it is Active Record's schema data loading in particular that is slow.

So I've implemented schema cache dumping. Please review it.
I expect this implementation still has many points to fix ;)

Usage:

$ edit config/environments/production.rb
config.use_schema_cache_dump = true
$ RAILS_ENV=production bundle exec rake db:schema:cache:dump
=> generates db/schema_cache.dump
$ RAILS_ENV=production rails s
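To make the flow concrete, here is a rough sketch (not the code in this pull request) of what the dump task could do: serialize the connection's schema cache to db/schema_cache.dump, so that a booting app can load the file instead of querying the database for schema information.

namespace :db do
  namespace :schema do
    namespace :cache do
      desc "Create a db/schema_cache.dump file"
      task :dump => :environment do
        filename = File.join(Rails.root, "db", "schema_cache.dump")
        # Serialize the current schema cache; the exact serialization used by
        # the PR may differ, Marshal is just one straightforward choice.
        File.open(filename, "wb") do |f|
          f.write(Marshal.dump(ActiveRecord::Base.connection.schema_cache))
        end
      end
    end
  end
end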
@tenderlove
Member

tenderlove commented Feb 27, 2012

I like this idea, but can we change a few things?

First, can we just implement marshal_dump and marshal_load on the SchemaCache object? Second, I'm not sure that loading every model is the best idea for the schema cache. What about asking for all the tables and populating the cache that way? For example:

schema_cache.connection.tables.each do |table|
  schema_cache.populate(table)
end

Maybe not a populate method, but something. I don't really like the idea of requiring every model in order to get the schema cache.

I have another idea that is related to this: can we enable schema caching by default? We can use the migration version to determine if the cache should be expired. Maybe add a version method to the schema cache object.

Anyway, I really like this feature.

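To illustrate the direction suggested above, here is a minimal sketch (not the PR's actual implementation; the instance variables and the use of the migrator's current version are assumptions) of a SchemaCache that serializes its internal hashes together with the migration version it was built against:

class SchemaCache
  attr_reader :version

  def initialize(connection)
    @connection = connection
    @tables  = {}
    @columns = {}
    # Record the current migration version so a loader can discard a stale dump.
    @version = ActiveRecord::Migrator.current_version
  end

  def marshal_dump
    [@version, @tables, @columns]
  end

  def marshal_load(array)
    @version, @tables, @columns = array
  end
end

A loaded cache would then be accepted only when its version matches the database's current migration version.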

@kennyj
Contributor

kennyj commented Feb 27, 2012

Thank you for the comment! I'll improve the implementation :)


@kennyj
Contributor

kennyj commented Feb 29, 2012

Hi @tenderlove

Done!
Please review the new commits.


@kennyj
Owner

kennyj commented Mar 2, 2012

A hash with a default_proc can't be dumped.

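For context, Marshal refuses to serialize a Hash that carries a default_proc, which is why the cache's internal hashes need converting before dumping. A small illustration (not code from this PR):

h = Hash.new { |hash, key| hash[key] = [] }  # hash with a default_proc
Marshal.dump(h)                              # raises TypeError (can't dump hash with default proc)
Marshal.dump(Hash[h.to_a])                   # works: a plain copy without the default_proc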

tenderlove added a commit that referenced this pull request Mar 8, 2012

@tenderlove merged commit 447ecb0 into rails:master Mar 8, 2012

@josevalim
Contributor

josevalim commented Aug 1, 2012

This configuration should not be here. It is specific to Active Record and therefore should be defined in the Active Record railtie.

@kennyj
Contributor

kennyj replied Aug 1, 2012

Certainly, I agree with you. Do you mean kennyj/rails@82bd05a?

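As a rough sketch of the kind of change being discussed (an illustration, not the referenced commit; placing the flag under config.active_record is an assumed naming), the setting could be read inside ActiveRecord::Railtie rather than as a top-level application config:

module ActiveRecord
  class Railtie < Rails::Railtie
    # In Rails this railtie already exists and defines config.active_record.
    config.active_record = ActiveSupport::OrderedOptions.new

    initializer "active_record.use_schema_cache_dump" do |app|
      if app.config.active_record.use_schema_cache_dump
        # Here the app would load db/schema_cache.dump (e.g. via Marshal.load)
        # instead of querying the database for schema information at boot.
      end
    end
  end
end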

@dhh
Member

dhh commented Sep 11, 2012

Can you provide some benchmarks for this optimization? How much does it actually speed things up?


@kennyj
Contributor

kennyj commented Sep 11, 2012

I'll provide them, but I have a lot of work this week. Please wait just a few days.


@kennyj
Contributor

kennyj commented Sep 16, 2012

Sorry for keeping you waiting for this reply.

I tested the performance, but the results were not what I expected.

- environment setup steps: https://gist.github.com/3730757
- test results: https://gist.github.com/3730759

In my experience with Oracle, queries against the data dictionary were very slow when there was a lot of data, and a similar approach solved that problem there.

I'll try to research a little more.


@kirs referenced this pull request Nov 14, 2016

Merged

Schema cache in YAML #27042

@metaskills referenced this pull request in customink/secondbase Dec 18, 2016

Closed

Support For Schema Cache File #37

@metaskills referenced this pull request in customink/secondbase Jan 24, 2017

Merged

Schema Cache Support #40
