I have about 1000 sensors and want to store their values. The sensors will mostly sample at around 10 Hz.
The main queries will serve a live-view application: users select which sensors to plot (around 1-10 at a time) and a time range (15 min, 30 min, ...). The plot refreshes at an interval of roughly 10 seconds and may be viewed by multiple users at once.
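To make that concrete, a typical live-view query might look roughly like the sketch below (this assumes the single-table layout from concept 1 further down, with a hypertable called `readings`; at 10 Hz I would probably downsample with `time_bucket`, since 15 minutes of raw data is already 9,000 points per sensor):

```sql
-- Hypothetical live-view query: last 15 minutes for three sensors,
-- downsampled to 1-second averages (table name and IDs are placeholders).
SELECT time_bucket('1 second', ts) AS bucket,
       sensor_id,
       avg(value) AS value
FROM readings
WHERE sensor_id IN (17, 42, 99)
  AND ts > now() - INTERVAL '15 minutes'
GROUP BY bucket, sensor_id
ORDER BY bucket;
```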
In the future, sensor values will also be queried over longer ranges (a whole day, week, or month), but those queries are uncommon and don't need to be particularly fast.
The sensors generate about 36,000,000 readings per hour (1000 sensors × 10 Hz), i.e. roughly 6 billion rows per week. The idea is to keep about one week's worth of readings in "hot storage"; older readings will be compressed and archived.
The environment is PostgreSQL with the TimescaleDB extension.
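For the hot/cold split I assume I could use TimescaleDB's built-in compression and retention policies, roughly like this (again assuming the single-table hypertable `readings` from concept 1; the intervals are placeholders):

```sql
-- Compress chunks older than 7 days; segmenting by sensor_id stores each
-- sensor's values contiguously, which should help per-sensor range scans.
ALTER TABLE readings SET (
    timescaledb.compress,
    timescaledb.compress_segmentby = 'sensor_id'
);
SELECT add_compression_policy('readings', INTERVAL '7 days');

-- Drop chunks once their data has been archived elsewhere, e.g. after 30 days.
SELECT add_retention_policy('readings', INTERVAL '30 days');
```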
I currently have two concepts that differ vastly from one another: a single table for all sensors, or a separate table per sensor.
Single table: all 1000 sensors write into one table with timestamp, value, and two foreign keys identifying the sensor. TimescaleDB is great at retrieving a time range, but within that range I still have to filter for the wanted sensor IDs.
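Roughly what I have in mind for this variant (table and column names are placeholders; `sensors` and `devices` stand in for the hypothetical parent tables behind the two foreign keys):

```sql
CREATE TABLE readings (
    ts        TIMESTAMPTZ      NOT NULL,
    sensor_id INTEGER          NOT NULL REFERENCES sensors (id),
    device_id INTEGER          NOT NULL REFERENCES devices (id),
    value     DOUBLE PRECISION NOT NULL
);

-- Partition by time via TimescaleDB.
SELECT create_hypertable('readings', 'ts');

-- So a per-sensor time-range scan doesn't have to wade through the
-- other 999 sensors' rows in the same range.
CREATE INDEX ON readings (sensor_id, ts DESC);
```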
Multi-table: one table per sensor. This way I don't need the foreign keys (just timestamp and value), and when I query a subset of sensors I only have to scan the wanted time range in each sensor's dedicated table.
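This variant would be something like the following, repeated 1000 times (whether each table should itself be a hypertable is part of what I'm unsure about):

```sql
-- One table per sensor; the sensor identity is encoded in the table
-- name (sensor_0042 is a placeholder), so rows are just (ts, value).
CREATE TABLE sensor_0042 (
    ts    TIMESTAMPTZ      NOT NULL,
    value DOUBLE PRECISION NOT NULL
);
SELECT create_hypertable('sensor_0042', 'ts');
```

I'd generate the 1000 statements from the sensor list, e.g. with a PL/pgSQL `DO` block.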
I ran some performance tests on both ideas, but the results were too close to draw a conclusion. Inserting, for example, was faster with the multi-table approach, while the select benchmarks didn't produce a definitive winner.
However, I don't have enough database experience to decide which of the two designs holds up better in the long run.
Thank you in advance for any help.