dumping large pandas dataframe (~ >10k rows) times out #77

A check to see how large the array is and then some code that chunks it into <=10k sections would work. I wrote a function that does that and it is working fine.

Comments
Great - can you open a pull request, or do you want to paste the function here? The issue will not be DataFrame-specific, though; it will also apply to lists and numpy arrays (they are all transformed into lists before being transferred over).
Here's the function I wrote.
The function posted by EmilyBarbour is buggy and causes an empty row to appear between chunks. I modified the function and now it works fine. I posted it on StackOverflow, but I'm posting it here too for easy access.
For me, a chunk size of 100k rows is fine; you may change it depending on your needs.
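A minimal sketch of that approach with the current xlwings API (the helper name `write_df_in_chunks`, the 100k default, and the `top_left` parameter are illustrative assumptions, not Zenadix's exact code):

```python
import math

import pandas as pd
import xlwings as xw


def write_df_in_chunks(sheet, df, top_left="A1", chunksize=100_000):
    """Write df's values to `sheet` in blocks of `chunksize` rows so that no
    single COM transfer is big enough to time out."""
    anchor = sheet.range(top_left)
    n_chunks = math.ceil(len(df) / chunksize)
    for i in range(n_chunks):
        chunk = df.iloc[i * chunksize:(i + 1) * chunksize]
        # Offset by the number of rows already written so the chunks sit
        # directly under each other without a blank row in between.
        target = anchor.offset(row_offset=i * chunksize)
        target.options(index=False, header=False).value = chunk


if __name__ == "__main__":
    wb = xw.Book()  # new, blank workbook
    write_df_in_chunks(wb.sheets[0], pd.DataFrame({"a": range(300_000)}))
```

Passing the sheet in explicitly keeps the helper from depending on whichever workbook happens to be active.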
Great, thanks! Still hope it'll make it into one of the next releases, but as pointed out above, we'll need to implement this further down to cover numpy arrays and lists as well.
Thanks a lot for the function, Emily and Zenadix; it's really helpful for moving large datasets between Excel and Python. I wanted to share a slight modification of Zenadix's function that copies in the column headers as well. It seems to be working well for my purposes, so I thought it might help others too :)
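A hedged sketch of that variant: write the column names once as the first row, then reuse the `write_df_in_chunks` helper sketched above for the values one row further down (again an illustration, not the exact code from the comment):

```python
def write_df_with_header(sheet, df, top_left="A1", chunksize=100_000):
    """Write df.columns into the first row, then the values in chunks below it."""
    anchor = sheet.range(top_left)
    anchor.value = list(df.columns)  # a 1D list is written out as a single row
    write_df_in_chunks(sheet, df,
                       top_left=anchor.offset(row_offset=1).address,
                       chunksize=chunksize)
```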
The Excel cursor might need to be switched to "busy" to prevent editing cells while xlwings is still trying to write out data.
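xlwings doesn't seem to have a dedicated switch for this, but on Windows the raw COM Application object is reachable through `app.api`, so something along these lines should work (a sketch assuming Windows; `Interactive` and `Cursor` are Excel COM properties, not xlwings ones):

```python
import xlwings as xw

XL_WAIT, XL_DEFAULT = 2, -4143  # Excel's xlWait / xlDefault cursor constants

app = xw.App(visible=True)
try:
    app.api.Interactive = False  # block manual edits during the transfer
    app.api.Cursor = XL_WAIT     # show the busy cursor
    # ... chunked write goes here ...
finally:
    app.api.Cursor = XL_DEFAULT
    app.api.Interactive = True
```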
See also: ericremoreynolds/excelpython#60
For everyone who also wants to handle the header correctly and not rely on the Range, I modified the code a bit.
With 0.23.1 you can now do:
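Presumably this refers to the `chunksize` option that 0.23.1 introduced for transferring big ranges; roughly:

```python
import pandas as pd
import xlwings as xw

sheet = xw.Book().sheets[0]
df = pd.DataFrame({"a": range(500_000)})  # illustrative large frame

# Transfer in blocks of 10,000 rows so no single COM call times out
sheet["A1"].options(chunksize=10_000).value = df

# Reading back works the same way
df_roundtrip = sheet["A1"].options(pd.DataFrame, expand="table", chunksize=10_000).value
```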