# SearchableDataObject DB table has multiple keys/indexes #23
That's a problem I was discussing with @wernerkrauss on the IRC channel. The issue is that it isn't possible to hook into dev/build since SS v3.1.17, so I'm currently building my keys using a workaround. I could create a new branch that drops support for older installations and go ahead with the new version. What do you think about that?
Hi @g4b0, I am using SS v3.1.15, which I believe is the most up-to-date stable release at the current time. Do you mean that your module has already been updated to work with a newer version of SS that is still in beta, and that because of this you have started using this workaround?

I don't fully understand what the issue is or what causes it to occur (I understand the error message itself, but not why the code attempts to create multiple indexes which appear to be identical). It seems to happen after some random period of time when running /dev/build (or perhaps after a fixed number of /dev/builds, depending on my DB contents?).

I presume that simply removing the augmentDatabase() method altogether would mean the /dev/tasks/PopulateSearch task would need to be run any time I wanted newly added data objects to be added to the SearchableDataObjects table in the DB? I'm not really sure when the augmentDatabase() method is called.

Kind regards, HARVS1789UK
@g4b0 wasn't there another solution discussed, hooking into init() on the dev/build controller?
@wernerkrauss yep, I have to investigate it further...
I've run into this same issue and found a way to solve it. Change line 62 in searchable-dataobjects/code/SearchableDataObject.php: DB::query("ALTER TABLE SearchableDataObjects ADD FULLTEXT ( to:
The problem was that the index was created over and over every time we executed dev/build, until it ran past the limit.
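The guard described above boils down to checking whether the index already exists before issuing the DDL again. A minimal sketch of that pattern, using Python and SQLite purely for illustration (the real module is PHP on MySQL; the index name here is hypothetical):

```python
import sqlite3

def build(conn):
    """Simulates one dev/build: creates the index only if it is missing."""
    existing = conn.execute(
        "SELECT COUNT(*) FROM sqlite_master "
        "WHERE type = 'index' AND name = 'SearchableDataObjects_fulltext'"
    ).fetchone()[0]
    if existing == 0:
        conn.execute(
            "CREATE INDEX SearchableDataObjects_fulltext "
            "ON SearchableDataObjects (Title, Content)"
        )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE SearchableDataObjects (Title TEXT, Content TEXT)")

# Repeated builds stay idempotent: the index is created exactly once.
for _ in range(5):
    build(conn)

count = conn.execute(
    "SELECT COUNT(*) FROM sqlite_master WHERE type = 'index'"
).fetchone()[0]
print(count)  # 1, no matter how many builds ran
```

On MySQL the equivalent existence check can be made against `information_schema.STATISTICS` (or via `SHOW INDEX FROM SearchableDataObjects`) before running the `ALTER TABLE ... ADD FULLTEXT`.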
Nice workaround @diegopego86, it could be useful for maintaining compatibility with old installations. |
Included in version 2.1.3. In a future 3.* version I will hook into dev/build, losing compatibility with older SS versions.
I am getting the following error when attempting to /dev/build with the silverstripe-searchable-dataobjects module installed:
Based on the error, I am assuming the issue is in some way related to the following line in SearchableDataObject::augmentDatabase():
```php
DB::query("ALTER TABLE SearchableDataObjects ADD FULLTEXT (
    Title,
    Content)");
```
I am guessing this query is being run multiple times, and a new index/key is being created in the DB each time it runs.
As you can see from the following screenshots (taken in Navicat, a MySQL DB management tool), after initially installing the module and running /dev/build and /dev/tasks/PopulateSearch, I have 24 indexes. Some time later (possibly after changing the current locale for the site), I run another /dev/build and it attempts to increase this to more than 63 indexes, which is what triggers the SQL error.
After initial set up:
![screen shot 2015-10-06 at 19 11 28](https://cloud.githubusercontent.com/assets/5114645/10318050/679e9b6a-6c5f-11e5-8287-e86b1358abff.png)
After a subsequent /dev/build some time later (having switched locale in my config files):
![screen shot 2015-10-06 at 18 59 13](https://cloud.githubusercontent.com/assets/5114645/10318062/78b03bd4-6c5f-11e5-8cc4-b456ecb79093.png)
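To make the failure mode concrete: because the `ALTER TABLE ... ADD FULLTEXT` statement names no index, MySQL auto-generates a fresh key name each time it runs, so every unguarded dev/build adds another index until the per-table key limit (64 in MySQL) is hit. A rough simulation of that accumulation, again in Python/SQLite for illustration only, with explicit per-build index names standing in for MySQL's auto-generated ones:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE SearchableDataObjects (Title TEXT, Content TEXT)")

MAX_KEYS = 64  # MySQL's per-table index limit

def unguarded_build(conn, n):
    # Mimics MySQL auto-naming (Title, Title_2, Title_3, ...) for an
    # unnamed FULLTEXT key that is re-added on every build.
    count = conn.execute(
        "SELECT COUNT(*) FROM sqlite_master WHERE type = 'index'"
    ).fetchone()[0]
    if count >= MAX_KEYS:
        raise RuntimeError("Too many keys specified; max 64 keys allowed")
    conn.execute(
        f"CREATE INDEX Title_{n} ON SearchableDataObjects (Title, Content)"
    )

builds_before_failure = 0
try:
    for n in range(1, 100):
        unguarded_build(conn, n)
        builds_before_failure += 1
except RuntimeError as err:
    print(builds_before_failure, err)  # 64 Too many keys specified; ...
```

Each table already carrying other indexes (24 in the screenshots above) reaches the limit in correspondingly fewer builds.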
I am not familiar with the augmentDatabase() method, or where and when it is called; otherwise I would investigate further and propose a fix.
This article may be of some help; at the very least, it helped me track down the issue itself: https://www.safaribooksonline.com/blog/2012/11/06/avoiding-too-many-keys-in-mysql/
Any ideas?