I wrote a configurable script that subscribes to The Things Network via MQTT and writes payload data to a PostgreSQL database (with the option of using TimescaleDB’s “hypertables”).
It supports multiple devices from a single application and creates a table per device. The devices, columns and data types are configurable via the config file and initialized for you on first run.
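To make the flow concrete, here is a minimal sketch (not the actual script) of the flattening step such a tool performs: taking a v3 uplink message as received over MQTT and pulling out the device ID, timestamp, and decoded fields that become a database row. The field paths follow the standard v3 uplink JSON schema; the sample values are made up.

```python
import json

def uplink_to_row(message_json: str):
    """Flatten a TTS v3 uplink message into (device_id, received_at, fields)."""
    msg = json.loads(message_json)
    device_id = msg["end_device_ids"]["device_id"]
    received_at = msg["uplink_message"]["received_at"]
    # decoded_payload is only present if a payload formatter is configured
    fields = msg["uplink_message"].get("decoded_payload", {})
    return device_id, received_at, fields

# Example v3 uplink, trimmed to the fields used above:
sample = json.dumps({
    "end_device_ids": {"device_id": "sensor-01"},
    "uplink_message": {
        "received_at": "2021-06-01T10:00:00.123456789Z",
        "decoded_payload": {"temperature": 21.5, "humidity": 48},
    },
})
print(uplink_to_row(sample))
```

Each returned tuple maps naturally onto one row of the per-device table.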
I’ve tried to make it as simple as possible to configure and use, and I’ve spent some time writing documentation.
Here it is:
I hope it helps someone get their app up and running quicker.
Let me know your thoughts, comments and suggestions.
Why does it say ttn2 if it is written for V3? This is really useful; you could definitely add it to the TTN Labs section!
Welcome to the multi-lingual forum with all of its nuances.
TTN to PostgreSQL = TTN2 PostgreSQL
If there were to be an observation on the naming, it should be TTS2PostgreSQL:
TTN = Community = This forum
TTS = The Things Stack
TTS CE = The Things Stack Community Edition aka TTN v3.
Very interesting; we do something similar, and I am using it to reconstruct our time-series data from the earliest messages I can find in our backups.
Your TODO on line 97 about using the message timestamp is spot on. We made a similar mistake, which is why I’m working on rebuilding our time series. It’s trivial to do, but if you want to use Python’s built-in datetime parser you need to chop off the last 3 fractional-second digits from the message timestamp.
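The trimming described above can be sketched like this: TTS timestamps carry nanosecond precision (nine fractional digits), while `datetime` parses at most microseconds (six), so the last three digits are dropped before parsing. The function name is my own; this assumes the timestamp always has a fractional part and a trailing `Z`.

```python
from datetime import datetime, timezone

def parse_tts_timestamp(ts: str) -> datetime:
    """Parse a TTS 'received_at' string such as '2021-06-01T10:00:00.123456789Z'.

    datetime's %f accepts at most 6 fractional digits, so truncate
    nanoseconds to microseconds before parsing.
    """
    head, frac = ts.rstrip("Z").split(".")
    return datetime.strptime(f"{head}.{frac[:6]}", "%Y-%m-%dT%H:%M:%S.%f").replace(
        tzinfo=timezone.utc
    )

print(parse_tts_timestamp("2021-06-01T10:00:00.123456789Z"))
```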
I’d consider adding a jsonb column to hold the message as received over the MQTT connection. Then you can go back and analyse it further if you find you want more info from it.
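As a rough schema fragment of what that might look like (the table and column names here are hypothetical, not from the actual script): the decoded fields get typed columns as before, plus a `raw` jsonb column holding the full uplink, which you can query later with PostgreSQL's JSON operators for fields you didn't originally extract.

```sql
-- Hypothetical per-device table with a raw jsonb column alongside decoded fields
CREATE TABLE sensor_01 (
    received_at timestamptz NOT NULL,
    temperature double precision,
    humidity    double precision,
    raw         jsonb          -- full uplink as received over MQTT
);

-- Later, recover a field you didn't decode into its own column,
-- e.g. the RSSI reported by the first gateway in rx_metadata:
SELECT received_at,
       raw -> 'uplink_message' -> 'rx_metadata' -> 0 ->> 'rssi' AS rssi
FROM sensor_01;
```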
I’ve just been calling almost everything The Things Network, so thanks for the insight.
@DavidTaylor_DPI, for my purposes using the now() function in PostgreSQL was sufficient, but I can understand the need for the accuracy to be closer to the sensor level. I’ll likely implement a CLI option to change the datetime source.
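A possible shape for such an option (the flag name and helper are my own sketch, not an existing feature of the script) is an argparse choice that selects between the database's `now()` and the uplink's `received_at`:

```python
import argparse

parser = argparse.ArgumentParser()
# Hypothetical flag: where the row timestamp comes from
parser.add_argument(
    "--time-source",
    choices=["server", "message"],
    default="server",
    help="server: PostgreSQL now(); message: received_at from the uplink",
)
args = parser.parse_args(["--time-source", "message"])

def timestamp_expr(source: str, received_at: str) -> str:
    """Return the SQL expression used for the row's time column.

    now() stamps arrival at the database; received_at keeps the time
    reported by the network. In real code, pass received_at as a bound
    parameter rather than interpolating it into the SQL string.
    """
    return "now()" if source == "server" else f"'{received_at}'::timestamptz"

print(timestamp_expr(args.time_source, "2021-06-01T10:00:00.123456Z"))
```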