Fetching rows from a million-row table (optimization)

by Krishna Kumar S, Last Updated June 12, 2019 08:06 AM

I have a Greenplum database table with millions of rows, of which I need to fetch around 45k and store them in a Python list.

It's taking more than 2 hours to fetch the data. How can I reduce the fetch time? This is roughly what I'm doing now:

import psycopg2  # Greenplum is PostgreSQL-compatible, so psycopg2 is assumed here

conn = psycopg2.connect("dbname=mydb")  # connection details omitted
resultList = []
with conn.cursor() as cur:
    for item in items:  # ~45k lookup values -> ~45k round trips
        cur.execute("SELECT column_1, ... column_n FROM TABLE WHERE column = %s", (item,))
        resultList.append(cur.fetchall())
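
Most of the time here is almost certainly the ~45k network round trips, one query per item. A common fix is to send the whole lookup list to the server in a few set-based queries instead. Below is a minimal sketch, assuming psycopg2; the table and column names (my_table, lookup_col) and the connection string are illustrative placeholders, not from the question:

import psycopg2

conn = psycopg2.connect("dbname=mydb")  # hypothetical connection string

def fetch_batched(items, batch_size=10000):
    """Fetch all matching rows in a few set-based queries instead of one per item."""
    rows = []
    with conn.cursor() as cur:
        for start in range(0, len(items), batch_size):
            batch = items[start:start + batch_size]
            # psycopg2 adapts a Python list to a SQL array, so the whole
            # batch travels to the server as a single ANY(...) parameter.
            cur.execute(
                "SELECT column_1, column_n FROM my_table WHERE lookup_col = ANY(%s)",
                (batch,),
            )
            rows.extend(cur.fetchall())
    return rows

resultList = fetch_batched(items)

With a batch size of 10,000, the 45k lookups become five queries, so per-query latency is paid a handful of times instead of 45,000. For very large key lists, another common pattern on Greenplum/PostgreSQL is to COPY the keys into a temporary table and join against it.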

