[BUG] allotted memory #614

@bkawakami

Description

Environment details

  • OS type and version: Ubuntu 22.04
  • Python version: 3.9.13
  • pip version: 22.04
  • pandas-gbq version: 1.5.3

Steps to reproduce

  1. Download a really big dataset, such as this dataset
  2. Convert to a pandas.DataFrame
  3. Try upload using pandas_gbq

Code example

pandas_gbq.to_gbq(df, 'datasets.table', project_id='project_id', if_exists='replace', chunksize=100)

Stack trace

raise GenericGBQException("Reason: {0}".format(ex))
pandas_gbq.exceptions.GenericGBQException: Reason: 400 Resources exceeded during query execution: The query could not be executed in the allotted memory. Peak usage: 155% of limit.
Top memory consumer(s):
  input table/file scan: 100%

Considerations

Even though I am using the chunksize parameter with a low value, I always get the allotted-memory error.
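One possible workaround (not a confirmed fix) is to split the DataFrame manually and issue a separate load job per slice: replace the table with the first slice, then append the remaining ones, so no single job has to scan the whole input. This is a minimal sketch assuming pandas-gbq is installed and authenticated; `iter_chunks`, `upload_in_chunks`, and `rows_per_chunk` are hypothetical names introduced here for illustration:

```python
import pandas as pd


def iter_chunks(df, rows_per_chunk):
    """Yield consecutive row-wise slices of df, each at most rows_per_chunk rows."""
    for start in range(0, len(df), rows_per_chunk):
        yield df.iloc[start:start + rows_per_chunk]


def upload_in_chunks(df, destination, project_id, rows_per_chunk=10_000):
    """Upload df one slice at a time: the first slice replaces the table,
    the rest are appended, so each BigQuery load job stays small."""
    import pandas_gbq  # assumes credentials are already configured

    for i, piece in enumerate(iter_chunks(df, rows_per_chunk)):
        pandas_gbq.to_gbq(
            piece,
            destination,
            project_id=project_id,
            if_exists='replace' if i == 0 else 'append',
        )
```

Whether this avoids the 400 error depends on where the memory limit is actually hit server-side, but it at least guarantees each load job only ever receives `rows_per_chunk` rows.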

Metadata

Assignees

No one assigned

    Labels

  • api: bigquery: Issues related to the googleapis/python-bigquery-pandas API.
  • status: will not fix: Invalid (untrue/unsound/erroneous), inconsistent with product, not on roadmap.
