
pandas_gbq.to_gbq always replaces the destination table #607

@rmoralesd

Description


I'm trying to run a Cloud Function that inserts data into a BigQuery table. The table already exists, is partitioned monthly by a datetime column, and had about 4,000 rows. The Cloud Function should insert a new row, but instead it drops the table and I end up with a table containing only the new row.

  • Cloud Function
  • 1st gen
  • Python version: 3.10

My requirements.txt file:

google-cloud-bigquery
pandas
pandas_gbq
db-dtypes
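
Unpinned dependencies make behavior changes like this hard to trace to a release; pinning exact versions in requirements.txt helps reproduce the report later. The version numbers below are illustrative placeholders, not the ones actually deployed:

```
google-cloud-bigquery==3.11.0
pandas==1.5.3
pandas-gbq==0.19.2
db-dtypes==1.1.1
```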

Steps to reproduce

  1. Create a dataset in BigQuery
  2. Create a table
  3. Fill the table with some data
  4. Execute the Cloud Function to insert a new row.

Expected result: the new row is inserted into the table, keeping the old rows.
Actual result: the existing rows are deleted and the table only has one row.
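
The gap between the expected and actual behavior can be sketched locally with plain pandas (a toy stand-in for the table, not the real BigQuery call):

```python
import pandas as pd

# Toy stand-in for the existing table (really ~4,000 rows).
existing = pd.DataFrame(
    {"ticketid": ["1", "2", "3"], "ticket_num": [100, 101, 102]}
)

# The single new row the Cloud Function is meant to insert.
new_row = pd.DataFrame({"ticketid": ["777777777"], "ticket_num": [12345]})

# Expected: if_exists='append' keeps the old rows and adds the new one,
# like a concat on the server side.
appended = pd.concat([existing, new_row], ignore_index=True)
print(len(appended))  # 4 rows: 3 old + 1 new

# Actual: the observed behavior is equivalent to a replace,
# so only the new row survives.
replaced = new_row.copy()
print(len(replaced))  # 1 row: old data lost
```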

Code example

import pandas as pd
import pandas_gbq

df = pd.DataFrame(
    {
        "ticketid": "777777777",
        "ticket_num": 12345,
        "Placa": "ABC123",
        "Entrada": "Prueba",
        "FechaHoraIngreso": pd.Timestamp(year=2022, month=1, day=31, hour=20),
    },
    index=[0],
)

pandas_gbq.to_gbq(df, "dataset_name.table_name", "project_id", if_exists="append")
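
As a workaround while this is investigated, the append can be done with the google-cloud-bigquery client directly (already listed in requirements.txt), using WRITE_APPEND so existing rows are kept. This is a sketch: the table identifier is a placeholder and real GCP credentials are needed to actually run the load.

```python
import pandas as pd


def append_to_bigquery(df: pd.DataFrame, table: str) -> None:
    """Load df into an existing BigQuery table, appending rather than replacing."""
    # Imported lazily so this module can be loaded without GCP libraries installed.
    from google.cloud import bigquery

    client = bigquery.Client()  # uses the Cloud Function's default credentials
    job_config = bigquery.LoadJobConfig(
        # WRITE_APPEND adds rows to the existing table instead of overwriting it.
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    # Blocks until the load job finishes; raises on failure.
    client.load_table_from_dataframe(df, table, job_config=job_config).result()
```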

Stack trace

No stack trace: there are no errors during execution.

Thanks!

Metadata


Labels

  • api: bigquery (Issues related to the googleapis/python-bigquery-pandas API)
  • priority: p2 (Moderately-important priority. Fix may not be included in next release.)
  • type: bug (Error or flaw in code with unintended results or allowing sub-optimal usage patterns.)
