Fetch all on large tables #448

@thanuj11

Description


Hello,
I am trying to fetch 4 million records from an Oracle table and convert them into a pandas DataFrame. I tried the fetchall() method, but it is very slow and never returns a result.

    cursor.arraysize = 5000
    statement = f"""
        SELECT * from sales
    """
    print(statement)
    cursor.execute(statement)
    res = cursor.fetchall()

The table has around 4.5 million rows; I want to fetch all of them and convert them to a DataFrame.
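
For reference, the conversion step is roughly the following (a minimal sketch; the column names are taken from cursor.description and the rows from the fetch above):

    import pandas as pd

    # Minimal sketch: build the DataFrame from the fetched rows,
    # taking the column names from cursor.description.
    columns = [col[0] for col in cursor.description]
    df = pd.DataFrame(res, columns=columns)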

When I run a similar query on a table with fewer than 200k rows, it returns the result in 2-5 seconds, but when I run the query against tables with large amounts of data I never get any results back.

@cjbj, are there other ways I can try to fetch millions of records?
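
For example, would fetching in batches with fetchmany() and concatenating the chunks be a reasonable approach? A rough, untested sketch of what I mean (the batch size of 100000 is arbitrary):

    import pandas as pd

    # Rough sketch: pull rows in chunks with fetchmany() and build the
    # DataFrame at the end, instead of one huge fetchall().
    cursor.arraysize = 5000
    cursor.execute("SELECT * from sales")
    columns = [col[0] for col in cursor.description]

    frames = []
    while True:
        rows = cursor.fetchmany(100000)
        if not rows:
            break
        frames.append(pd.DataFrame(rows, columns=columns))

    df = pd.concat(frames, ignore_index=True)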

Additional info
Python Version: 3.6
cx_Oracle version: 7.3.0
