Batch functionality #528

Open
Jostarndt opened this issue Nov 4, 2022 · 0 comments
Labels: feature request

Comments


Jostarndt commented Nov 4, 2022

Is your feature request related to a problem? Please describe.

I would like to run ARIMA on batches of data, i.e. input of shape [batchsize, timesteps], and generate output of shape [batchsize, output_size].

Describe the solution you'd like

It would be great if auto_arima and its predict() function also accepted batched inputs.

Describe alternatives you've considered

Currently I am doing it with a very ugly for-loop:

import numpy as np
import pmdarima as pm

output_list = []
for i in range(batchsize):
    # Fit a separate model per series and forecast output_size steps ahead
    model = pm.auto_arima(data[i, :])
    output = model.predict(output_size)
    output_list.append(output)
output = np.array(output_list)

However, this is not very pythonic and quite slow. Maybe I have just missed an existing batch functionality?
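As a side note, since the per-series fits are independent, one interim workaround might be to parallelize the loop with joblib. This is only a sketch under the same assumptions as above (data, batchsize and output_size already defined; fit_and_forecast is a hypothetical helper, not part of pmdarima):

from joblib import Parallel, delayed
import numpy as np
import pmdarima as pm

def fit_and_forecast(series, n_periods):
    # Fit one ARIMA model per series and forecast n_periods steps ahead
    model = pm.auto_arima(series)
    return model.predict(n_periods)

# Run the independent per-series fits in parallel workers
forecasts = Parallel(n_jobs=-1)(
    delayed(fit_and_forecast)(data[i, :], output_size) for i in range(batchsize)
)
output = np.array(forecasts)  # shape: [batchsize, output_size]

This does not remove the loop, it only spreads it over cores, so a true batched API would still be nicer.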

Additional Context

I hope I haven't missed that this feature already exists.
Thanks a lot! :)

Jostarndt added the feature request label on Nov 4, 2022