If we force an ADRACKReq, even though the device is at SF12 (it should not need to do this according to the LoRaWAN specification), then ADR works as expected.
I can't see why TTN uses the ADRACKReq from the device as the trigger for when the ADR adjustment should be sent via downlink.
And why should the device need to set the bit at all when it is already at SF12? At SF12 the NS should be aggressive and try to push the device to a better SF.
Waiting like this is also very costly in battery etc., since the device takes a long time to move to a more appropriate SF.
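For reference, here is a minimal sketch (Python, hypothetical names, not code from TTN or any real stack) of the device-side counter the LoRaWAN 1.0.x spec describes; the defaults ADR_ACK_LIMIT = 64 and ADR_ACK_DELAY = 32 come from the Regional Parameters for EU868. Note that at DR0 the backoff branch has nothing left to do, which is exactly why gating ADR on ADRACKReq looks pointless there:

```python
# Hypothetical names throughout; this is not TTN's or any real stack's code.

ADR_ACK_LIMIT = 64  # EU868 default: uplinks without any downlink before ADRACKReq
ADR_ACK_DELAY = 32  # EU868 default: further silent uplinks before the device backs off

class EndDevice:
    def __init__(self, data_rate: int):
        self.data_rate = data_rate  # EU868: DR0 = SF12 ... DR5 = SF7
        self.adr_ack_cnt = 0        # uplinks since the last downlink

    def build_uplink(self) -> dict:
        self.adr_ack_cnt += 1
        # After ADR_ACK_LIMIT uplinks with no downlink at all, the device
        # sets ADRACKReq to ask the network to confirm it can still be heard.
        adr_ack_req = self.adr_ack_cnt >= ADR_ACK_LIMIT
        # If the network stays silent for ADR_ACK_DELAY more uplinks, the
        # device lowers its data rate one step (raises SF) to regain range,
        # and repeats that every further ADR_ACK_DELAY uplinks. At DR0/SF12
        # there is no lower step, so this branch never helps.
        if self.adr_ack_cnt >= ADR_ACK_LIMIT + ADR_ACK_DELAY and self.data_rate > 0:
            self.data_rate -= 1
            self.adr_ack_cnt = ADR_ACK_LIMIT
        return {"ADRACKReq": adr_ack_req, "DR": self.data_rate}

    def on_downlink(self) -> None:
        self.adr_ack_cnt = 0  # any downlink resets the counter
```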
It seems to take 64 uplinks of complete silence on the downlink, which matches the default ADR_ACK_LIMIT of 64 in the sketch above. Only after those 64 uplinks does the NS send a request to raise the data rate.
Why is this? As mentioned earlier, shouldn't this be exactly the state where the NS is aggressive, sending downlinks to raise the data rate and minimize airtime?
This seems to happen when DR0 is the default value. But I can't see any benefit in using SF11 or e.g. SF7 as the default either.
Example: an installer mounts a LoRa device in a basement, behind concrete and a thick door. While the installer is there the thick door is open, and the device starts sending at SF7.
Once the door is closed, it takes a long time before the device works its own way up to SF12. In the "real world" it should start at SF12 and work its way down.
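To illustrate what "aggressive" could look like, here is a hedged sketch of a margin-based NS-side ADR step, loosely modelled on the margin algorithm TTN documents. The required-SNR table is the usual SF7..SF12 demodulation floor; DEVICE_MARGIN_DB, the ~3 dB-per-step rule, and all names are illustrative assumptions, not TTN's actual code:

```python
# Illustrative only; thresholds and names are assumptions, not TTN's code.

# Approximate demodulation floor (dB SNR) per EU868 data rate: DR0=SF12 ... DR5=SF7.
REQUIRED_SNR = {0: -20.0, 1: -17.5, 2: -15.0, 3: -12.5, 4: -10.0, 5: -7.5}
DEVICE_MARGIN_DB = 10.0  # safety margin so a closed door doesn't kill the link
MAX_DR = 5

def next_data_rate(current_dr: int, recent_snrs: list[float]) -> int:
    """Pick the next data rate from the best SNR seen in recent uplinks."""
    margin = max(recent_snrs) - REQUIRED_SNR[current_dr] - DEVICE_MARGIN_DB
    steps = int(margin // 3)  # assume ~3 dB of SNR headroom per DR step
    return max(0, min(MAX_DR, current_dr + steps))

# Start pessimistic at DR0/SF12: with the basement door open the link is
# strong, so the NS can step the device up right away...
print(next_data_rate(0, [4.0, 6.5, 5.2]))   # -> 5 (SF7)
# ...and when the door closes and the SNR drops, the margin goes negative
# and the same rule steps the device back down, without waiting 64 uplinks.
print(next_data_rate(5, [-7.0, -6.0]))      # -> 2 (SF10)
```

With something like this, the installer scenario resolves itself: the device defaults to SF12, gets pushed to SF7 within a few uplinks while the door is open, and gets pushed back toward SF12 as soon as the link degrades.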