People who know more than me on this may well chip in here, but it's my understanding that ADR is a 'dynamic' scheme where the network & nodes adjust over time to improve overall reception. In the case of TTN I believe the last ~20 uplinks are monitored to establish an appropriate TX power & SF, plus a 'safety' margin to improve the probability of the signal getting through. If a node is seen to be missing signals it is assumed to be in a weaker/more interfered area, and over time its SF will be increased (data rate backed off) along with its TX power, with nodes 'shuffling around' to help balance and 'densify' (*F. Hede et al) the network. Yes, increased SF or TX power increases power consumption, but at least the signal should get through... that is an inherent trade-off in any RF network, not just a LoRa phenomenon, and users have to decide what matters more: getting signals through, conserving power, or finding a balance.
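To make the idea concrete, here is a rough sketch of the kind of decision a network server makes under ADR, loosely modelled on Semtech's reference algorithm (which TTN's implementation is derived from). All the constants (installation margin, power limits, function name) are illustrative assumptions, not TTN's actual values, and real implementations handle missed uplinks separately via ADRACKReq:

```python
# Illustrative ADR sketch: spend surplus SNR margin on a faster SF first,
# then on lower TX power; a negative margin pushes the other way.
# REQUIRED_SNR values are the commonly quoted LoRa demodulation floors.
REQUIRED_SNR = {7: -7.5, 8: -10.0, 9: -12.5, 10: -15.0, 11: -17.5, 12: -20.0}
INSTALLATION_MARGIN = 10.0        # the 'safety' margin in dB (assumed value)
STEP_DB = 3                       # dB per adjustment step
MIN_SF, MAX_SF = 7, 12
MIN_TXPOWER, MAX_TXPOWER = 2, 14  # dBm, EU868-style limits (assumption)

def adr_adjust(last_snrs, sf, txpower):
    """Given SNRs of the last ~20 uplinks, return a new (SF, TX power)."""
    margin = max(last_snrs) - REQUIRED_SNR[sf] - INSTALLATION_MARGIN
    steps = int(margin // STEP_DB)
    # Surplus margin: lower SF first (less airtime), then TX power (less energy).
    while steps > 0 and sf > MIN_SF:
        sf -= 1
        steps -= 1
    while steps > 0 and txpower - STEP_DB >= MIN_TXPOWER:
        txpower -= STEP_DB
        steps -= 1
    # Deficit: raise TX power first, then back off to a slower SF.
    while steps < 0 and txpower + STEP_DB <= MAX_TXPOWER:
        txpower += STEP_DB
        steps += 1
    while steps < 0 and sf < MAX_SF:
        sf += 1
        steps += 1
    return sf, txpower
```

So a node on SF12 whose uplinks arrive with plenty of SNR headroom gets walked down towards SF7 over successive ADR passes, while a node in a weak spot gets pushed the other way; e.g. `adr_adjust([-2.0, -4.5, -1.0], 12, 14)` has 9 dB of surplus and steps the node from SF12 to SF9.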
This mechanism is one of the advantages of LoRa-based deployments: you can start off with a given number & array of GW's and, as you see the network starting to get congested, drop in additional intermediate GW's to further 'densify' it, with the NS recognising the availability of this additional resource and starting to re-balance node behaviour under ADR. That is partly why it is recommended/good practice to enable ADR on a static node when deploying, even if not expected to be used at first, as you never know how the local RF environment will evolve in the future or what new GW's may become available (or indeed may go offline/get de-commissioned... a big risk with a hobbyist/hacker-led network like TTN!)
If the main concern is that every packet MUST get through then don't use any RF solution... use a wire!
@arjanvanb as you point out there are 'rules' which guide behaviour and performance, but note I said 'can' co-exist... not 'will' ;-)... there are limits, with foreign/alien interferers also having different impacts.