Memory Issues in Python

I have a huge CSV file and it does not fit into memory in Python. I don't want to use pandas or any other library because I have run into some speed issues. How could I handle this?

Hi,

I think this StackOverflow question might have some relevant information.
https://stackoverflow.com/questions/25508510/fastest-way-to-parse-large-csv-files-in-pandas
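
Since you mention wanting to avoid pandas, a minimal sketch using only the standard-library csv module might also help; it streams the file one row at a time, so memory use stays flat no matter how big the file is (the filename and the column being summed are placeholders):

```python
import csv

# Stream the file one row at a time instead of loading it all into memory.
# "data.csv" and the summed column index are placeholders for illustration.
total = 0.0
with open("data.csv", newline="") as f:
    reader = csv.reader(f)
    header = next(reader)        # capture (or skip) the header row
    for row in reader:
        total += float(row[2])   # e.g. aggregate the third column

print(total)
```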

Best,
D

It would help to understand what you need to do with the CSV in order to make recommendations. Off the top of my head, you could look at:

  • Using a database (see the sqlite3 sketch below)
  • Using command line tools
  • Using Dask, which has a pandas-like interface but allows for distributed operation (sketch at the end of this reply)

Which of these is best (or whether another option entirely is preferable) really depends on what you need to do with the data.
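
For the database route, a minimal sqlite3 sketch might look like this (assuming a two-column file with a header row; the file, table and column names are all placeholders):

```python
import csv
import sqlite3

# Load the CSV into SQLite once, then do the heavy lifting in SQL.
# File, table and column names are placeholders for illustration.
conn = sqlite3.connect("data.db")
conn.execute("CREATE TABLE IF NOT EXISTS rows (key TEXT, value REAL)")

with open("data.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row
    conn.executemany("INSERT INTO rows VALUES (?, ?)", reader)
conn.commit()

# Queries run against the on-disk database, so the data never
# has to fit in RAM all at once.
for key, avg in conn.execute("SELECT key, AVG(value) FROM rows GROUP BY key"):
    print(key, avg)

conn.close()
```

Once the data is in SQLite you can also index the columns you filter on, which usually beats re-scanning the CSV for repeated queries.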

I'd also be interested to know exactly how big the CSV is, since that's a key piece of information as well.
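
As for the Dask option, here is a minimal sketch (again, the file and column names are placeholders):

```python
import dask.dataframe as dd

# Dask splits the CSV into partitions and processes them lazily,
# so the whole file never sits in memory at once.
# "data.csv", "key" and "value" are placeholder names.
df = dd.read_csv("data.csv")                          # lazy: nothing is read yet
result = df.groupby("key")["value"].mean().compute()  # runs partition by partition
print(result)
```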
