Working on a guided project outside Dataquest

Please, how can I work with the datasets for the Dataquest projects locally on my personal computer? I have downloaded the dataset files onto my system, but they won't open in my Jupyter Notebook. I get an error each time I try to read any of the datasets into pandas. I tried to convert the data to CSV by editing the file name, but I still get an error reading it into pandas. How can I resolve this? I want to analyze the data locally on my PC. Thanks.

You will have to share the errors you face and the corresponding code for others to be able to help you properly.

But make sure you are providing the correct path to your data file. If your notebook is in the same folder as the data file, then you shouldn’t face any issues loading the data. If they are not in the same folder, then you will have to provide the path to that file to be able to load it.
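For example, here is a minimal sketch of loading a file by its full path rather than by a bare name (the folder and file name here are made up to keep the example self-contained; substitute your own dataset's path):

```python
import os
import tempfile
import pandas as pd

# Stand-in for a downloaded dataset: a tiny CSV in a separate folder.
folder = tempfile.mkdtemp()
path = os.path.join(folder, "my_dataset.csv")
with open(path, "w") as f:
    f.write("name,score\nA,1\nB,2\n")

# With the full path, pandas finds the file regardless of where the
# notebook itself is running from.
df = pd.read_csv(path)
print(df.shape)  # (2, 2)
```

On Windows, either use forward slashes or a raw string (e.g. `r"C:\Users\you\Downloads\file.csv"`) so backslashes aren't treated as escape characters.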

This is the error I am getting:
ParserError: Error tokenizing data. C error: Expected 1 fields in line 4, saw 2.

I made sure I opened the notebook in the folder where the file is located, i.e. Downloads.
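One quick way to confirm this is to check the notebook's working directory, since that is where bare file names get resolved:

```python
import os

# The folder that pd.read_csv("name.csv") will look in:
print(os.getcwd())

# The files actually visible from there -- if your dataset doesn't
# appear in this listing, pandas can't find it by name alone.
print(os.listdir("."))
```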

Not really sure why you would get this error, but this should help -

Check out the solutions in the above link and one of them should help you. The second one might be more relevant to understand why this error would happen.

In most cases, it might be an issue with:

  • the delimiters in your data, or
  • the parser getting confused by the headers/columns of the file.

To solve pandas.parser.CParserError: Error tokenizing data, try specifying the sep and/or header arguments when calling read_csv:

pandas.read_csv(file_name, sep='your_delimiter', header=None)

Also, the Error tokenizing data error may arise when you’re using a separator (e.g. a comma ‘,’) as a delimiter and a row has more separators than expected (more fields in the error row than defined by the header). You then need to either remove the additional field or remove the extra separator if it’s there by mistake. The better solution is to investigate the offending file and fix it manually so you don’t need to skip the error lines.
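Here is a small sketch that reproduces the exact error from above with made-up data, and then scans the raw lines to locate the row with the unexpected field count:

```python
import io
import pandas as pd

# Line 4 has two comma-separated fields while the header implies one,
# which triggers "Expected 1 fields in line 4, saw 2".
raw = "header\nvalue1\nvalue2\nbad,extra\n"

try:
    pd.read_csv(io.StringIO(raw))
except pd.errors.ParserError as e:
    print("ParserError:", e)

# Scan the raw text to find the offending row(s):
for i, line in enumerate(raw.splitlines(), start=1):
    n_fields = line.count(",") + 1
    if n_fields != 1:
        print(f"line {i} has {n_fields} fields: {line!r}")
```

Once you know which lines are malformed, you can fix them in the file itself, or (as a last resort) drop them with read_csv's on_bad_lines='skip' option.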