write_database() - Insert many rows with sql server using fast_executemany #16053
Labels
- A-io-database (Area: reading/writing to databases)
- accepted (Ready for implementation)
- enhancement (New feature or an improvement of an existing feature)
Description
I would like to be able to insert many rows quickly into a SQL Server table with `df.write_database()`.

I can currently achieve this with the `fast_executemany` option in pandas, but I can't find a way to do it with `polars.write_database()`, which accepts only a connection string and not a SQLAlchemy engine. With `fast_executemany` enabled, the pandas code is much faster than the polars equivalent.
Example code with pandas:

```python
from sqlalchemy import create_engine

# Placeholders such as {user} and {password} must be filled in by the caller.
engine = create_engine(
    "mssql+pyodbc://{user}:{password}@{server}/{db_name}?driver=ODBC+Driver+17+for+SQL+Server",
    fast_executemany=True,
)
ret = curve.to_pandas().to_sql(
    name="curve",
    con=engine,
    if_exists="append",
    index=False,
)
```
Example code with polars:

```python
curve.write_database(
    table_name="curve",
    connection="mssql+pyodbc://{user}:{password}@{server}/{db_name}?driver=ODBC+Driver+17+for+SQL+Server",
    if_table_exists="append",
    engine="sqlalchemy",
)
```