
After storing the data frame in an HDF5 (.h5) file, I tried to access the file from another Jupyter notebook, but when loading the .h5 file in read mode with pandas I encountered an error.

I read the file like this:

data_frames = pd.HDFStore('data_frames.h5', mode='r')

Error:

HDF5ExtError                              Traceback (most recent call last)
~/.conda/envs/be_project/lib/python3.6/site-packages/pandas/io/pytables.py in open(self, mode, **kwargs)
    586         try:
--> 587             self._handle = tables.open_file(self._path, self._mode, **kwargs)
    588         except (IOError) as e:  # pragma: no cover

~/.conda/envs/be_project/lib/python3.6/site-packages/tables/file.py in open_file(filename, mode, title, root_uep, filters, **kwargs)
    319     # Finally, create the File instance, and return it
--> 320     return File(filename, mode, title, root_uep, filters, **kwargs)
    321 

~/.conda/envs/be_project/lib/python3.6/site-packages/tables/file.py in __init__(self, filename, mode, title, root_uep, filters, **kwargs)
    783         # Now, it is time to initialize the File extension
--> 784         self._g_new(filename, mode, **params)
    785 

tables/hdf5extension.pyx in tables.hdf5extension.File._g_new (tables/hdf5extension.c:5940)()

HDF5ExtError: HDF5 error back trace

  File "H5F.c", line 604, in H5Fopen
    unable to open file
  File "H5Fint.c", line 1087, in H5F_open
    unable to read superblock
  File "H5Fsuper.c", line 277, in H5F_super_read
    file signature not found

End of HDF5 error back trace

Unable to open/create file 'data_frames.h5'

During handling of the above exception, another exception occurred:

OSError                                   Traceback (most recent call last)
<ipython-input-2-aa601ba65d4f> in <module>()
----> 1 data_frames = pd.HDFStore('data_frames.h5', mode='r')

~/.conda/envs/be_project/lib/python3.6/site-packages/pandas/io/pytables.py in __init__(self, path, mode, complevel, complib, fletcher32, **kwargs)
    446         self._fletcher32 = fletcher32
    447         self._filters = None
--> 448         self.open(mode=mode, **kwargs)
    449 
    450     @property

~/.conda/envs/be_project/lib/python3.6/site-packages/pandas/io/pytables.py in open(self, mode, **kwargs)
    617             # is not part of IOError, make it one
    618             if self._mode == 'r' and 'Unable to open/create file' in str(e):
--> 619                 raise IOError(str(e))
    620             raise
    621 

OSError: HDF5 error back trace

  File "H5F.c", line 604, in H5Fopen
    unable to open file
  File "H5Fint.c", line 1087, in H5F_open
    unable to read superblock
  File "H5Fsuper.c", line 277, in H5F_super_read
    file signature not found

End of HDF5 error back trace

Unable to open/create file 'data_frames.h5'

Please help if possible.

  • The error message seems to be suggesting that the data_frames.h5 file is in an invalid format -- "unable to read superblock" and "file signature not found". It would be helpful to see the code which creates data_frames.h5. A runnable example which recreates the problem would be even better. Commented Sep 23, 2017 at 19:29
  • Yeah, I just saw that I was trying to store a data frame with a column possessing a pandas Series as a value. Commented Sep 23, 2017 at 19:39
  • @unutbu How can I store a data frame in a file where a column holds a pandas Series / list per row, e.g. column A row 1 -> [1,2,3], column A row 2 -> [2,6,9,8], and so on? Commented Sep 23, 2017 at 19:40
  • Pandas HDF5 file support relies on PyTables, and PyTables supports only certain types of data: bool, int, uint, float, complex, string, time, enum. In particular, it does not support Python lists. So to store your data in an HDF5 file compatible with Pandas, I would flatten the lists into multiple rows so that there is only one value per row (see the sketch after these comments). Commented Sep 23, 2017 at 19:59
  • How does storing the data frame as a pickle file compare with flattening the lists and storing them in an .h5 file? Commented Sep 23, 2017 at 20:02
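
A minimal sketch of the flattening approach suggested above, assuming a hypothetical frame with a list-valued column 'A' (the column name, store key, and file name are placeholders, not taken from the original code):

import pandas as pd

# Hypothetical frame whose column 'A' holds Python lists (not supported by PyTables)
df = pd.DataFrame({'A': [[1, 2, 3], [2, 6, 9, 8]]})

# Flatten so that each row holds a single scalar value; the original row label
# is kept in the index, so the lists can be regrouped later if needed
flat = (df['A']
        .apply(pd.Series)                 # spread each list across columns
        .stack()                          # stack into one value per row
        .reset_index(level=1, drop=True)  # drop the within-list position
        .rename('A')
        .to_frame())

# The frame now contains only scalars, so it can be written and read back
with pd.HDFStore('data_frames.h5', mode='w') as store:
    store.put('flat', flat)

with pd.HDFStore('data_frames.h5', mode='r') as store:
    restored = store['flat']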
