Does Tablite support different datasets concurrently? #57
Hi @akash-goel - It could probably be done. Could you describe your use case in a little more detail?
Hi, we have a use case in which multiple users of a webapp are working on different files. When I checked Tablite, I could not see where it stores the data. Is everything stored in a single file or in separate files, and can we change the storage location? Please let me know whether this use case works with Tablite. Regards,
Data is stored in tmp/tablite.hdf5. This file - just like sqlite3 - can contain an infinite number of datasets (as long as you have disk space).
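To illustrate, here is a minimal sketch of two independent datasets coexisting in that shared storage file. The table contents are invented for illustration; the column-assignment style follows tablite's README:

```python
from tablite import Table

# Two independent datasets. Both are persisted in the same shared
# storage file (tmp/tablite.hdf5) rather than in separate files.
t1 = Table()
t1['user'] = ['alice', 'bob']

t2 = Table()
t2['order_id'] = [1001, 1002, 1003]

t1.show()  # each table remains a separate dataset in storage
t2.show()
```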
Thanks for your response. I have tried a small dataset and the functionality works fine, but when I try a big dataset I get the error below. Code Block
Error
Can you please advise what configuration we can use to load the file?
As noted in line 21 of the config, the single-processing limit is 1_000_000 rows. When the data exceeds this number of rows, tablite switches to multiprocessing. As you are using Windows, this means you need to make your module importable for the Windows subprocess. The easiest way to do this is to wrap your code block in a function, like this:

```python
from tablite import Table

def main():
    Table.reset_storage()
    t3 = Table.import_file('Data_test.csv')
    t3.show()

if __name__ == "__main__":
    main()
```
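For context, the `if __name__ == "__main__":` guard is what prevents the Windows subprocesses (which re-import the module) from re-running the import themselves. If you would rather keep a moderately sized import in a single process, the threshold may be adjustable; the sketch below assumes the limit is exposed as `Config.SINGLE_PROCESSING_LIMIT` in `tablite/config.py`, so check the attribute name in your installed version:

```python
from tablite import Table
from tablite.config import Config  # assumption: the config class lives here

# Assumption: raising this attribute makes datasets below the new
# threshold import in a single process, sidestepping the Windows
# multiprocessing import requirement for this file size.
Config.SINGLE_PROCESSING_LIMIT = 5_000_000

if __name__ == "__main__":
    t3 = Table.import_file('Data_test.csv')
    t3.show()
```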
Closing this issue as there has been no news since April 21st. |
Hi Team,
Can Tablite support different datasets, store them at different locations, and process them concurrently?
Regards,
Akash