Payload Formatter versus Webhook calculation

I am considering simplifying my payload formatters and data capture from my nodes so that “raw” data is passed to my webhooks. The reason is to have more control over the conversion (e.g. °F -> °C) in the webhook code, where I can easily change it once the nodes are deployed. Has anyone thought about this?
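
For illustration, a minimal sketch of the idea, assuming the TTN v3 uplink JSON shape (raw bytes base64-encoded in `uplink_message.frm_payload`), a Flask receiver, and a made-up payload layout of two big-endian bytes holding tenths of a °F:

```python
# Sketch of a webhook that decodes the raw bytes itself. Assumes the
# TTN v3 uplink JSON shape and a made-up payload layout: bytes 0-1 are
# a big-endian unsigned temperature in tenths of a degree Fahrenheit.
import base64

from flask import Flask, request

app = Flask(__name__)

@app.route("/uplink", methods=["POST"])
def uplink():
    body = request.get_json()
    raw = base64.b64decode(body["uplink_message"]["frm_payload"])
    temp_f = int.from_bytes(raw[0:2], byteorder="big") / 10.0
    # The F -> C conversion lives here, in server code that can be
    # redeployed at will, not in the nodes or the payload formatter.
    temp_c = (temp_f - 32.0) * 5.0 / 9.0
    # ... store temp_c (and ideally the raw bytes too) ...
    return "", 200
```

The point is that the conversion line lives in code you can redeploy in seconds, rather than in node firmware.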

You can always store the uplink message as raw data (un-decoded) with a timestamp; when you retrieve the data, you decode it and display it.

But this poses other issues, such as retrieving just the highest value (°C) for the last week: you have to fetch all the values, decode every one of them, and only then find the highest.
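
To make that concrete, here is a rough sketch (table and column names are invented) of the same “highest value for the last week” question against decoded rows versus raw rows:

```python
# Rough sketch of the "highest value for the last week" problem.
# Table and column names (readings, ts, temp_c, raw_payload) are invented.
import base64
import sqlite3

db = sqlite3.connect("sensors.db")

# Decoded storage: one indexed query, the DB does all the work.
(max_c,) = db.execute(
    "SELECT MAX(temp_c) FROM readings"
    " WHERE ts >= datetime('now', '-7 days')"
).fetchone()

# Raw storage: every row must come back to the application and be
# decoded before the maximum can even be computed.
def decode_temp_c(raw_b64: str) -> float:
    raw = base64.b64decode(raw_b64)
    temp_f = int.from_bytes(raw[0:2], byteorder="big") / 10.0
    return (temp_f - 32.0) * 5.0 / 9.0

rows = db.execute(
    "SELECT raw_payload FROM readings"
    " WHERE ts >= datetime('now', '-7 days')"
).fetchall()
max_c_raw = max(decode_temp_c(raw) for (raw,) in rows)
```

With decoded storage the database can answer from an index; with raw storage every row has to cross the wire and be decoded first.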

I have no idea if my approach is worthwhile, but I decode everything I receive in my webhook and save it decoded.

The other question you can ask is: if it’s stored un-decoded, does it take up less storage space in a DB? That becomes worth considering once you have billions of data points.

But then you trade off DB CPU against DB size.

So: what takes up less storage space in a DB - decimal 2345643 or hex 23CAAB?

That comes as standard anyway - the payload formatter is a convenience function, but it sometimes times out if the application servers are under load, and it’s one more place to alter things when payloads change.

As for storing, hard disks are stupidly cheap, as is web space for webhooks. I store everything so I can always go back to re-parse and re-decode the data if the decoder has got out of sync with the node firmware’s payload format.
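
One way to keep that re-decode option open (a sketch only; the schema and the decoder-version column are invented for illustration) is to store the raw payload next to the decoded values:

```python
# Sketch of keeping the raw payload next to the decoded values, so
# everything can be re-decoded later. Schema and the decoder-version
# column are invented for illustration.
import sqlite3

db = sqlite3.connect("sensors.db")
db.execute("""
    CREATE TABLE IF NOT EXISTS readings (
        ts          TEXT NOT NULL,
        device_id   TEXT NOT NULL,
        raw_payload TEXT NOT NULL,    -- base64, exactly as received
        decoder_ver INTEGER NOT NULL, -- which decoder produced temp_c
        temp_c      REAL              -- NULL if decoding failed
    )
""")

def redecode(decode, new_ver: int) -> None:
    """Re-run a (hypothetical) decode function over out-of-date rows."""
    stale = db.execute(
        "SELECT rowid, raw_payload FROM readings WHERE decoder_ver < ?",
        (new_ver,),
    ).fetchall()
    for rowid, raw in stale:
        db.execute(
            "UPDATE readings SET temp_c = ?, decoder_ver = ? WHERE rowid = ?",
            (decode(raw), new_ver, rowid),
        )
    db.commit()
```

Then a decoder fix is just a database pass, with no node involved.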

Neither of them - or is that both of them - they both fit into a 32-bit number, and they are both the same number …
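
A quick check that the two spellings really are the same value - and that only storing them as text makes them differ:

```python
# Decimal 2345643 and hex 23CAAB are the same number; stored as an
# integer column they occupy identical space.
assert int("23CAAB", 16) == 2345643
assert 0x23CAAB == 2345643
# Only storing them as *text* makes them differ: 7 characters vs 6.
print(len("2345643"), len("23CAAB"))  # -> 7 6
```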

I have a webhook, as I describe in my conference videos, that takes no configuration, saves minimal data, and that I point anything & everything at. The hosting is free with the annual domain rental of £6/year. I get 10GB of space AND a 1GB MySQL database AND, and this is really pushing it, 100 email accounts with 10GB of storage each, so if I’m feeling really insane, I can bundle up older data sets and email them to an account.

So for £6/year, I can have 10GB of raw data, 1GB of active data in a MySQL database, plus, in theory, 1TB of other storage.

The office’s random database of IoT data, from various devices scattered around the office & outside for trying stuff out, runs to 858,758 records, which is 78MB of data exported, averaging 4 data points per line. In its database, with a full set of decimation records at hour, day & week level to speed up reporting, and indexes on both those & the raw data, it takes up ~900MB.
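
For reference, the hour-level decimation mentioned above can be as simple as a periodic roll-up query (a sketch; table and column names are invented, and the day & week levels would be built the same way with a coarser time bucket):

```python
# Sketch of an hour-level decimation roll-up like the one described
# above; table and column names are invented. Day & week tables would
# be built the same way with a coarser strftime() bucket.
import sqlite3

db = sqlite3.connect("sensors.db")
db.executescript("""
    CREATE TABLE IF NOT EXISTS readings_hourly (
        hour      TEXT,
        device_id TEXT,
        min_c     REAL,
        max_c     REAL,
        avg_c     REAL,
        n         INTEGER
    );
    DELETE FROM readings_hourly;
    INSERT INTO readings_hourly
    SELECT strftime('%Y-%m-%d %H:00:00', ts) AS hour,
           device_id,
           MIN(temp_c), MAX(temp_c), AVG(temp_c), COUNT(*)
    FROM readings
    GROUP BY hour, device_id;
""")
db.commit()
```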