There appears to be a bug in the parso or spark-sas7bdat package when reading a very wide file. Reading a sas7bdat with only 18 columns works perfectly, but reading my sas7bdat with ~13,000 columns breaks: for some reason, every value that isn't zero is replaced with null.