I’m currently working with a MultiTech mDot module.
By default, Adaptive Data Rate is off and the Tx Data Rate is set to SF10 (Spreading Factor 10) with a 125 kHz bandwidth. At SF10 the maximum payload size is 12 bytes.
With this setting, I can successfully send up to 12 bytes using an AT command such as ‘at+send=123456789012’ on my mDot module. But I get an error when I try to send 13 bytes, e.g. ‘at+send=1234567890123’.
So I turned on Adaptive Data Rate on my mDot module to see if I could send more than 12 bytes of payload. Under this setting, simply trying to send 13 bytes with ‘at+send=1234567890123’ still fails.
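To reproduce the failure boundary programmatically, one could guard the payload length before issuing the send command. This is only a sketch: the 12-byte limit is the value I observed at SF10/125 kHz, and the helper name is my own, not part of the mDot AT command set.

```python
# Hypothetical pre-send check mirroring the behavior observed above:
# at SF10/125 kHz the mDot accepts 12 bytes and rejects 13.
def can_send(payload: str, max_bytes: int = 12) -> bool:
    """Return True if `payload` fits within the current uplink limit
    (12 bytes is the observed SF10 limit, not a spec constant)."""
    return len(payload.encode()) <= max_bytes

can_send("123456789012")    # 12 bytes, accepted by the module
can_send("1234567890123")   # 13 bytes, rejected with an error
```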
How can I observe and verify that ADR (Adaptive Data Rate) is actually working on my end device (a MultiTech mDot module)? In other words, how can I verify that the TTN network server has sent a command to my mDot module telling it to change its data rate?
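One way I imagine verifying this is to poll the module’s reported data rate over the AT interface and watch for it to change after ADR downlinks. This is a sketch under assumptions: AT+TXDR exists in the MultiTech DOT command set, but the exact query syntax and response format used here are guesses, and the serial loop assumes an open pyserial port on a joined module.

```python
# Sketch: watch for ADR-driven data-rate changes by polling the module.
# The AT+TXDR query form and the response layout ('SF10BW125\r\nOK')
# are assumptions; check the MultiTech DOT AT command reference.
import time

def parse_txdr(response: str):
    """Extract the data-rate token (e.g. 'SF10BW125') from an assumed
    AT+TXDR response; return None if no token is present."""
    for line in response.splitlines():
        line = line.strip()
        if line and line != "OK":
            return line
    return None

def poll_data_rate(ser, interval_s: float = 30.0):
    """Hypothetical monitoring loop: query AT+TXDR on an open pyserial
    port `ser` and print whenever the reported data rate changes."""
    last = None
    while True:
        ser.write(b"AT+TXDR?\r\n")
        time.sleep(0.2)
        current = parse_txdr(ser.read(ser.in_waiting).decode(errors="ignore"))
        if current and current != last:
            print(f"data rate changed: {last} -> {current}")
            last = current
        time.sleep(interval_s)
```

A data rate that moves away from SF10 after several uplinks would be visible evidence that the server’s ADR commands are reaching the module; the TTN console’s downlink traffic view would be another place to look.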
For example, let’s say my end-device application always sends 100 bytes to the TTN server, and I have turned on Adaptive Data Rate on my mDot module. Does my application simply hand the 100-byte payload to the LoRa chip, which then divides it into several smaller frames and sends them to the TTN server based on the current data rate set by the ADR scheme?
Or does my application have to detect the current data rate set on the LoRa module itself, divide the 100-byte payload into smaller frames, and send them to the TTN server in sequence?
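For reference, here is a sketch of what I imagine the second, application-side approach would look like. The size table is an assumption loosely modeled on the US915 LoRaWAN regional parameters (e.g. SF10 → 11 application bytes); the real limits depend on region, dwell-time settings, and MAC overhead, and the helper names are mine.

```python
# Hypothetical data-rate -> max application payload map, loosely based
# on US915 regional parameters; actual limits vary by region/settings.
MAX_PAYLOAD = {
    "SF10BW125": 11,
    "SF9BW125": 53,
    "SF8BW125": 125,
    "SF7BW125": 242,
}

def fragment(payload: bytes, data_rate: str):
    """Split `payload` into chunks no larger than the current data
    rate's limit, to be sent as separate uplinks in sequence."""
    limit = MAX_PAYLOAD[data_rate]
    return [payload[i:i + limit] for i in range(0, len(payload), limit)]

# A 100-byte payload at SF10 would need 10 separate uplinks here.
frames = fragment(b"x" * 100, "SF10BW125")
```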
Is there any good documentation that explains in detail how Adaptive Data Rate works (ideally with an example)?