I have a dataset written to an H5 file, and I want to convert it to a tff.simulation.datasets.ClientData; that is, after pre-processing it becomes a <tensorflow_federated.python.simulation.datasets.client_data.PreprocessClientData at 0x7f00947f6f50>. A week ago, I was able to read it with:
train_path='FederatedClients/dataTrain.h5'
train_data = tff.simulation.HDF5ClientData(train_path)
but now the same statement raises the following error:
----------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-6-f4c34c62ca50> in <module>
1 train_path='FederatedClients/dataTrain.h5'
2 #train_data=pd.read_hdf(train_path)
----> 3 train_data = tff.simulation.HDF5ClientData(train_path)
4 test_path='FederatedClients/dataTest.h5'
5 test_data=pd.read_hdf(test_path)
AttributeError: module 'tensorflow_federated.python.simulation' has no attribute 'HDF5ClientData'
I no longer know what to do. Even reading the H5 file with plain pandas fails:
train_data=pd.read_hdf(train_path)
----------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-7-151fe098668c> in <module>
1 train_path='FederatedClients/dataTrain.h5'
----> 2 train_data=pd.read_hdf(train_path)
3 train_data = tff.simulation.HDF5ClientData(train_path)
4 test_path='FederatedClients/dataTest.h5'
5 test_data=pd.read_hdf(test_path)
~/anaconda3/envs/tff/lib/python3.7/site-packages/pandas/io/pytables.py in read_hdf(path_or_buf, key, mode, errors, where, start, stop, columns, iterator, chunksize, **kwargs)
437 if len(groups) == 0:
438 raise ValueError(
--> 439 "Dataset(s) incompatible with Pandas data types, "
440 "not table, or no datasets found in HDF5 file."
441 )
ValueError: Dataset(s) incompatible with Pandas data types, not table, or no datasets found in HDF5 file.
I would be grateful for any help.