Does it scale?


#1

I realise this will be an ‘it depends’ answer.

But based on your Amsterdam deployment:

If you had 100 simultaneous users checking the water level in their boats, feeding cats, or whatever, would there be network congestion? What about 1,000 users? 10,000? What’s a ballpark guesstimate for how many devices the network could handle?

Presumably, you could mitigate this by just increasing the density of the internet-connected nodes?

Thanks

Sam


(Thomas Telkamp) #2

It depends :wink:

Scalability is achieved by:

  • using 8 frequencies
  • 6 spreading factors (SF)
  • limiting the duty cycle per node (fair access policy; see the airtime sketch after this list)
  • Adaptive Data Rate (ADR)
  • multiple gateways
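
To put numbers on the duty-cycle point, here is a minimal Python sketch of the LoRa time-on-air formula from the Semtech SX127x datasheet, assuming EU868-style parameters (125 kHz bandwidth, coding rate 4/5, 8-symbol preamble, explicit header, CRC on). The 10-byte payload and 1% duty cycle are illustrative assumptions, not TTN policy:

```python
import math

def lora_airtime_ms(payload_bytes, sf, bw_hz=125_000, cr=1, preamble=8,
                    explicit_header=True, crc=True):
    """Time-on-air of one LoRa packet, per the Semtech SX127x datasheet."""
    de = 1 if (bw_hz == 125_000 and sf >= 11) else 0   # low-data-rate optimization
    t_sym = (2 ** sf) / bw_hz                          # symbol duration (s)
    ih = 0 if explicit_header else 1
    n_payload = 8 + max(
        math.ceil((8 * payload_bytes - 4 * sf + 28 + 16 * int(crc) - 20 * ih)
                  / (4 * (sf - 2 * de))) * (cr + 4),
        0)
    return (preamble + 4.25 + n_payload) * t_sym * 1000

# Illustrative: packets/day for a 10-byte payload under a 1% duty cycle
for sf in range(7, 13):
    ms = lora_airtime_ms(10, sf)
    per_day = 0.01 * 24 * 3600 * 1000 / ms
    print(f"SF{sf}: {ms:6.1f} ms on air -> ~{per_day:5.0f} packets/day at 1% duty cycle")
```

This runs from roughly 41 ms per packet at SF7 to roughly 991 ms at SF12, which is why the spreading factor matters so much for capacity.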

If two nodes are transmitting at the same frequency and SF, a gateway will still decode the one with the strongest signal (with some probability). Because both nodes and gateways are geographically distributed, there is a fair chance both transmissions will be picked up this way.
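
A rough way to quantify that: the Monte Carlo sketch below counts an uplink as lost only when another node transmits in the same window on the same channel and SF and arrives with more power (the capture effect). Treating SFs as fully orthogonal and received powers as i.i.d. are simplifications, and every parameter here is illustrative rather than a measured TTN figure:

```python
import numpy as np

rng = np.random.default_rng(42)

def loss_probability(n_nodes, tx_prob=0.001, trials=200_000):
    """Fraction of uplinks lost to same-channel/same-SF collisions,
    with the capture effect letting the strongest packet through."""
    # An interferer clashes only if it transmits during our packet (tx_prob,
    # assumed) on the same channel (1/8) and the same SF (1/6).
    p_clash = tx_prob / (8 * 6)
    k = rng.binomial(n_nodes - 1, p_clash, size=trials)  # interferers per uplink
    # With i.i.d. received powers, our packet is the strongest of the k+1
    # colliding packets with probability 1/(k+1).
    survive = rng.random(trials) < 1.0 / (k + 1)
    return 1.0 - survive.mean()

for n in (100, 1_000, 10_000):
    print(f"{n:6,} nodes: ~{loss_probability(n):.2%} of packets lost")
```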

If there is too much packet loss, the network can be scaled up by adding gateways. Doubling the number of gateways will increase the network capacity by a factor of 6 or so, because nodes adapt their data rates (and transmit power) to the nearest gateway.
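
As a back-of-envelope check on that factor, reusing the lora_airtime_ms() helper from the sketch above, and assuming ADR lets a typical node step down from SF12 to SF10 once a gateway is closer:

```python
# Assumption: with a nearer gateway, ADR drops a typical node from SF12 to SF10.
before = lora_airtime_ms(10, 12)   # 10-byte payload
after = lora_airtime_ms(10, 10)
print(f"SF12 -> SF10: {before:.0f} ms -> {after:.0f} ms per packet "
      f"({before / after:.1f}x less airtime)")
print(f"with 2x gateways for spatial reuse: ~{2 * before / after:.0f}x capacity")
```

Roughly 3.4x less airtime per packet, times 2x spatial reuse, lands in the neighbourhood of the “factor 6 or so” above.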

TTN will publish guidelines/policies that ensure good connectivity for at least 1000 nodes per gateway.
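
For a sense of scale: if the eventual policy allowed, say, 30 seconds of uplink airtime per node per day (an assumed figure, since the guidelines are not published yet), 1000 nodes would use only a small fraction of a gateway’s 8 parallel channels:

```python
# Assumed policy: 30 s of uplink airtime per node per day (illustrative figure)
channels, seconds_per_day = 8, 24 * 3600
budget = channels * seconds_per_day   # 691,200 channel-seconds per gateway per day
demand = 1000 * 30                    # 1000 nodes x 30 s each
print(f"channel utilisation: {demand / budget:.1%}")   # ~4.3%, so collisions stay rare
```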


#3

I only understood about half of that :slight_smile: But it sounds reassuring! I’ll digest it more and come back with any questions…

Is there any kind of network monitoring in existence yet? Could you deploy Nagios/Smokeping, or would it have to be written from scratch?

Thanks

Sam