-
@tridoxx The default importer parses the file into a datastore table row by row. If the file is large, the import may exceed the batch limit; once the limit is hit, the remaining rows are re-queued and wait for the next cron run to resume. If you are using a MySQL database and your db user has permission to use LOAD DATA LOCAL INFILE, you can enable the datastore_mysql_import module instead; it imports the file in a single step and is MUCH faster than the default option.
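A minimal sketch of the switch described above, assuming Drush is available on the DKAN site and that you can run ad-hoc queries as the Drupal db user. The MySQL checks use standard statements, but whether LOCAL INFILE is allowed ultimately depends on your host's server and client settings:

```shell
# Check that the MySQL server permits LOAD DATA LOCAL INFILE
# (the server-side switch is the local_infile system variable).
mysql -e "SHOW GLOBAL VARIABLES LIKE 'local_infile';"

# Inspect the grants held by the db user Drupal connects with.
mysql -e "SHOW GRANTS FOR CURRENT_USER();"

# Enable the faster importer module and rebuild caches.
drush pm:enable datastore_mysql_import -y
drush cache:rebuild

# Re-run the import queue; eligible files should now load in one step.
drush queue:run datastore_import
```

If `local_infile` is OFF or the grants are missing, the module cannot do the single-step load and you will need to ask your database administrator to enable it.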
-
@janette Thank you very much for your answer. So just by enabling the module (provided the db user has the permission), I run the harvest as usual and the import should be much faster?
-
Hello, I am importing datasets with the command drush queue:run datastore_import, and I see that some datasets show this. I would like to know why this happens. Thanks a lot.