I have time series data from three completely different sensor sources as CSV files and want to combine them into one big CSV file. I've managed to read them into NumPy arrays using numpy.genfromtxt, but I'm not sure what to do from here.
Basically, what I have is something like this:
Table 1:
timestamp val_a val_b val_c
Table 2:
timestamp val_d val_e val_f val_g
Table 3:
timestamp val_h val_i
All timestamps are UNIX millisecond timestamps as numpy.uint64.
And what I want is:
timestamp val_a val_b val_c val_d val_e val_f val_g val_h val_i
...where all data is combined and ordered by timestamp. Each of the three tables is already ordered by timestamp. Since the data comes from different sources, there is no guarantee that a timestamp from table 1 will also be in table 2 or 3 and vice versa. In that case, the missing values should be marked as N/A.
So far, I have tried using pandas to convert the data like so:
df_sensor1 = pd.DataFrame(numpy_arr_sens1)
df_sensor2 = pd.DataFrame(numpy_arr_sens2)
df_sensor3 = pd.DataFrame(numpy_arr_sens3)
and then tried using pandas.DataFrame.merge, but I'm pretty sure that won't work for what I'm trying to do now. Can anyone point me in the right direction?

merge should work, for instance: if you did merged = pd.merge(df_sensor1, df_sensor2, on='timestamp') and then repeated that for df_sensor3; or, if you set the index to timestamp on all the dfs, you could just do pd.concat([df_sensor1, df_sensor2, df_sensor3]).

I tried merge exactly like you wrote, but that apparently does an inner join, so only data points that have timestamps in all tables end up in the merged table. I've tried an outer join as well, which does include all the data but doesn't get the ordering right. I did just try concat, though: I ran merged = pd.concat([df_sensor1, df_sensor2, df_sensor3], axis=1) followed by merged.to_csv('out.csv', sep=';', header=True, index=True, na_rep='N/A'), and that seems to have done the job. I'll have to verify it tomorrow.
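For reference, here is a minimal sketch of the set-the-index-then-concat approach described above. It assumes the structured arrays from genfromtxt carry a 'timestamp' field alongside the value columns, and the output file name is just a placeholder:

import pandas as pd

# numpy_arr_sens1/2/3 are the structured arrays read with genfromtxt;
# assumed to have a 'timestamp' field plus the value columns
df_sensor1 = pd.DataFrame(numpy_arr_sens1).set_index('timestamp')
df_sensor2 = pd.DataFrame(numpy_arr_sens2).set_index('timestamp')
df_sensor3 = pd.DataFrame(numpy_arr_sens3).set_index('timestamp')

# axis=1 aligns rows on the index and keeps the union of all timestamps
# (an outer join), leaving NaN where a sensor has no reading
merged = pd.concat([df_sensor1, df_sensor2, df_sensor3], axis=1)

# make the chronological ordering explicit instead of relying on how
# concat builds the union index
merged = merged.sort_index()

# missing readings are written out as N/A
merged.to_csv('out.csv', sep=';', header=True, index=True, na_rep='N/A')

The merge route should give the same result if each pd.merge call uses on='timestamp' and how='outer' and you finish with sort_values('timestamp'), since an outer merge by itself doesn't guarantee chronological order.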