I am trying to import a rather small CSV file (217 rows, 87 columns, ~15 kB) for analysis in Python using pandas. The file is rather poorly structured, but I would still like to import it as-is, since it is the raw data and I do not want to manipulate it manually outside Python (e.g. with Excel). Unfortunately, the import always leads to a crash: "The kernel appears to have died. It will restart automatically."
https://www.wakari.io/sharing/bundle/uniquely/ReadCSV
I did some research, which turned up reports of crashes with read_csv, but always for really large files, so I do not understand the problem here. The crash happens both with my local installation (Anaconda 64-bit, IPython Notebook, Python 2.7) and on Wakari.
Can anybody help me? Would be really appreciated. Thanks a lot!
Code:
# I have a somewhat ugly, illustrative csv file, but it is not too big: 217 rows, 87 columns.
# File can be downloaded at http://www.win2day.at/download/lo_1986.csv
# In[1]:
file_csv = 'lo_1986.csv'
# Read the raw file line by line and print each line with its index,
# just to inspect what the file actually looks like
f = open(file_csv, mode="r")
x = 0
for line in f:
    print x, ": ", line
    x = x + 1
f.close()
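# A small sketch of my own (assuming ';' really is the field separator) to count
# how many fields each row has, since a varying field count seems a likely source of trouble:
import csv
with open(file_csv, mode="r") as f:
    for i, row in enumerate(csv.reader(f, delimiter=';')):
        print i, ":", len(row), "fields"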
# Now I'd like to import this CSV into Python using pandas - but this always leads to a crash:
# "The kernel appears to have died. It will restart automatically."
# In[ ]:
import pandas as pd
pd.read_csv(file_csv, delimiter=';')
# What am I doing wrong?
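For reference, here is a sketch of one alternative I could try; the engine='python' keyword is an assumption on my part (it forces the pure-Python parser instead of the default C parser, and may not exist in older pandas versions), and I have not confirmed that it avoids the crash:

import pandas as pd
# Untested idea: use the pure-Python parser instead of the default C parser
df = pd.read_csv(file_csv, delimiter=';', engine='python')
df.head()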
