Describe the bug
I have a huge dataframe that I would like to iterate over and make predictions using contextual bandits. The dataframe fits in my machine's memory (32 GB RAM).
Python quickly crashes because RAM usage grows rapidly every time a call to predict or learn is made.
To Reproduce
Run this code:

import pandas as pd
import numpy as np
from vowpalwabbit import pyvw

vw = pyvw.vw("--cb_explore 2 --cover 5 --quiet")

n = 8000000
data = np.random.normal(size=[n, 8])
data = pd.DataFrame(data)
data["label"] = np.random.binomial(1, 0.5, size=n)

for f0, f1, f2, f3, f4, f5, f6, f7, label in data.itertuples(index=False):
    features = f"f0:{f0} f1:{f1} f2:{f2} f3:{f3} f4:{f4} f5:{f5} f6:{f6} f7:{f7}"
    vw.predict(features)

Expected behavior
The dataframe row iteration should run to completion without blowing up RAM usage.
Observed Behavior
RAM usage rapidly climbs to the maximum available while the for loop executes, and then the machine hangs.
Environment
What version of VW did you use? 8.7.0.post1
What OS or language did you use? Ubuntu 18.04.3 LTS with Python bindings
Additional context
Are there any alternatives for using VW to train a contextual bandit on very long log records?
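For very long logs, one memory-friendly pattern is to stream the rows out to a VW-format text file and train with the command-line tool, so the Python process never accumulates examples. This is only a sketch: the `to_vw_line` helper and the `data.vw` file name are illustrative (not part of any VW API), and the command-line step assumes the standalone `vw` binary is installed.

```python
# Sketch: write rows as VW text format and train outside Python.
# The helper name and output file name are illustrative only.
import random

def to_vw_line(features):
    """Format one row of numeric features as a VW example line.

    The "| " prefix starts the (unnamed) feature namespace."""
    return "| " + " ".join(f"f{i}:{v:.6f}" for i, v in enumerate(features))

random.seed(0)
with open("data.vw", "w") as fh:
    for _ in range(1000):  # stand-in for the 8M-row dataframe
        row = [random.gauss(0.0, 1.0) for _ in range(8)]
        fh.write(to_vw_line(row) + "\n")

# Then train from the shell, which streams the file line by line:
#   vw --cb_explore 2 --cover 5 -d data.vw
```

Because the command-line tool reads the file as a stream, peak memory stays roughly constant regardless of how many rows the log contains.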
@jackgerrits Is building from source the only way to use this fix now?
Yes, building from source is the only way to get this fix until we release 8.8.0.
Please see here for instructions on how to install the Python bindings from source: https://github.com/VowpalWabbit/vowpal_wabbit/wiki/Python
Is the link updated?
The Linux part says:
It looks to me like it's installing from PyPI.
Ah yes sorry about that. Follow the instructions here to get dependencies installed: https://github.com/VowpalWabbit/vowpal_wabbit/wiki/Dependencies#ubuntu
Then in the root of the repo run:
python setup.py install
Thanks. I can confirm that this issue is solved.