TTN Mapper: Data present but heatmap is not shown

I just found something that keeps me on SF7 without going up to SF12.

"Use adaptive data rate (ADR)" - check this in the console.

This enables ADR, but now the server only requests SF7, and a status request comes along with the ADR request when ADR is enabled in the console.

The server now accepts the SF7 confirmation MAC commands and doesn't keep repeating the same ADR request, as it did when ADR was disabled.

So for my GPS node the problem is solved at near-range locations. (But this lets the server change the data rate when the signal is weak, so I could still be forced onto higher airtime.)

I can confirm this behaviour. I had similar problems with my TTGO T-beam (LMIC 4.1.1, ABP) for TTN Mapper. After activating "ADR" in the console my T-beam seems to have stopped switching from SF7 to SF12.

BTW: if my T-beam is not heard by a gateway when I switch it on, it uses only the 3 basic frequencies. If a gateway is reachable, the other frequencies are sent by TTS and all frequencies are used.

Excellent stuff, we can now enter a scientific realm as we have two observations to form a working hypothesis from which we can perform a test.

I’ll work on the basis that LMIC & the NS are getting in a muddle (OK, maybe not so scientific, but I’m not accusing either party of being at fault). To evaluate that I’ll spin up my ATmega4808 with RFM95, which has enough space to allow me to use an untouched LMIC (mine has things turned on & off) but with printf and full debugging.

As a reference / point of comparison, I’ll use the same device on a STM32 B-L072Z aka Murata ABZ board using LoRaMAC-node.

I can decrypt/decode stuff using the Runkit site, but does anyone know of a tool to decode the FOpts?

This is normal unless you have explicitly turned on the channels in LMIC.

It’s normal for the NS to send a channel list unless you have them configured for the device, in which case the NS regards this as configuration the device already has and doesn’t send it.
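
To put numbers on that, here's a small Python sketch of the EU868 plan in question and how one of the extra channels gets pushed in a NewChannelReq MAC command. The frequencies are the standard TTN EU868 set and the layout follows LoRaWAN 1.0.x; the bytes are illustrative, not a capture from TTS.

```python
# Sketch: EU868 channels and how a NewChannelReq (CID 0x07) encodes an extra
# frequency. Layout per LoRaWAN 1.0.x:
#   ChIndex (1 byte) | Freq (3 bytes, little-endian, units of 100 Hz) | DrRange (1 byte)
# Illustrative only - not captured from TTS.

DEFAULT_JOIN_CHANNELS_MHZ = [868.1, 868.3, 868.5]   # the "3 basic frequencies"
TTN_EXTRA_CHANNELS_MHZ    = [867.1, 867.3, 867.5, 867.7, 867.9]

def new_channel_req(ch_index: int, freq_mhz: float, min_dr: int = 0, max_dr: int = 5) -> bytes:
    """Build a NewChannelReq (CID + payload) for one extra channel."""
    freq_field = round(freq_mhz * 1e6 / 100)      # frequency in units of 100 Hz
    dr_range = (max_dr << 4) | min_dr             # MaxDR in the high nibble, MinDR in the low
    return bytes([0x07, ch_index]) + freq_field.to_bytes(3, "little") + bytes([dr_range])

if __name__ == "__main__":
    for idx, mhz in enumerate(TTN_EXTRA_CHANNELS_MHZ, start=3):
        print(f"channel {idx} @ {mhz} MHz -> {new_channel_req(idx, mhz).hex()}")
```

If the device registration already has all eight channels (or the firmware sets them up itself), there's nothing left for the NS to push.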

Because of the early excitement of the transition to TTS and at the bidding of our overlords, all my ABP devices have a full configuration on the server and I haven’t really used ABP since, if only because after a bun-fight with nonces on OTAA on devices with LoRaMAC-node, I’ve got the tools to reset the heck out of things - although there is a button on the console for that now.

I’ll try an ABP device with various sets of settings to see what occurs.

I have just tried to make a decoder. Right now it works for the status request and the data rate/power request; all other requests will only be named.

[Link to Google Drive .exe removed by moderator]

Example entry:

06030500FF01

Which says: 06 - status request, and 03 - ADR request with power settings.

You can try it if that helps, as only those two should be required.

Perhaps you could share the source on GitHub - if you look around you’ll find that’s where almost everyone else puts their code. Lots of Python for semi-command line stuff, PHP / Go / node.js / Python for web stuff, C for embedded …

Also, I’m not a habitual Windows user, so I’d have to power up a machine, sandbox it, update the anti-virus, etc etc.

And what do the ADR bytes say?

When you have time you can give it a try. It doesn't promise much, but it might save going to the LoRaWAN specification PDF for each ADR command, and it could be useful since the current experiment is about ADR.

06030500FF01 - FOpts field
06 - requesting status from the node
030500ff01 - ADR command, which says power index 0 (max power), data rate index 5 (SF7), the lower 8 channels usable, no masking of frequencies and so on - if that's what you asked.
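
In case it helps, here is roughly the same logic as a Python sketch, so the source could live on GitHub as suggested above. It only names downlink MAC commands by CID and unpacks LinkADRReq; note it follows the LoRaWAN 1.0.x byte/nibble order (DataRate in the upper nibble, TXPower in the lower, ChMask little-endian), so if a tool reads them the other way round, the output for the pasted hex will differ.

```python
# Sketch of a downlink FOpts decoder (LoRaWAN 1.0.x). Not exhaustive - commands
# other than LinkADRReq are only named, like the tool above.

DOWNLINK_CMDS = {            # CID: (name, payload length in bytes)
    0x02: ("LinkCheckAns",     2),
    0x03: ("LinkADRReq",       4),
    0x04: ("DutyCycleReq",     1),
    0x05: ("RXParamSetupReq",  4),
    0x06: ("DevStatusReq",     0),
    0x07: ("NewChannelReq",    5),
    0x08: ("RXTimingSetupReq", 1),
    0x09: ("TxParamSetupReq",  1),
    0x0A: ("DlChannelReq",     4),
}

def decode_link_adr_req(p: bytes) -> str:
    dr           = p[0] >> 4                         # data-rate index (DR5 = SF7BW125 in EU868)
    txpow        = p[0] & 0x0F                       # TX power index (0 = max)
    ch_mask      = int.from_bytes(p[1:3], "little")  # bit n set = channel n usable
    ch_mask_cntl = (p[3] >> 4) & 0x07
    nb_trans     = p[3] & 0x0F
    return (f"DR{dr}, TXPower index {txpow}, ChMask 0x{ch_mask:04X}, "
            f"ChMaskCntl {ch_mask_cntl}, NbTrans {nb_trans}")

def decode_fopts(hex_str: str) -> None:
    data = bytes.fromhex(hex_str)
    i = 0
    while i < len(data):
        cid = data[i]
        if cid not in DOWNLINK_CMDS:
            print(f"Unknown CID 0x{cid:02X} - stopping (length unknown)")
            break
        name, length = DOWNLINK_CMDS[cid]
        payload = data[i + 1 : i + 1 + length]
        detail = decode_link_adr_req(payload) if cid == 0x03 else payload.hex()
        print(f"{name}: {detail or '(no payload)'}")
        i += 1 + length

decode_fopts("06030500FF01")   # the example from the post above
# With the spec's ordering this string reads as DR0, TXPower 5, ChMask 0xFF00;
# an "SF7, max power, lower 8 channels" command would be 0350FF0001, so the
# transcription is worth cross-checking against the raw downlink in the console.
```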

Why not extend the excellent lorawan packet decoder so we have one solution that does it all?

I had two days free, so I thought I'd be of some help with this problem. The lorawan packet decoder is very professional; it might take me months to get a merge into that code. But it's an excellent suggestion - I will definitely try the repository :innocent:

Just dug out the ATmega4808 + RFM95 breadboard for the first test. Debug set to level 2 with printf so I can see the details.

Using LMIC 4.1.1 with a pre-existing device setup on the console that has ADR turned on, the uplink is at SF7 from the start and quickly receives some MAC commands, but none that change the SF.

I’ll create a new device later to see if that makes a difference, but at present both LMIC & TTS are working together nicely.

The question here should also be:

Radio spectrum - what are the local conditions like?
SNR values?

Node Settings - ADR margin setting?

Just started to read about it; there is a bit to understand here.

There is no way a device that can be heard at SF7 should be commanded to jump to SF12.
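
To put rough numbers on why: the Semtech reference ADR algorithm (which network server implementations are generally derived from - TTS differs in the detail) steps the data rate from an SNR margin, roughly as in the sketch below. The 15 dB device margin and the per-SF demodulation floors are the commonly quoted defaults, so treat them as assumptions rather than TTS's actual values.

```python
# Rough sketch of the Semtech reference ADR data-rate step (EU868, 125 kHz).
# Not the actual TTS code - the 15 dB margin and demodulation floors are the
# commonly quoted defaults and are assumptions here.

import math

REQUIRED_SNR_DB = {   # demodulation floor per data rate
    0: -20.0,   # SF12
    1: -17.5,   # SF11
    2: -15.0,   # SF10
    3: -12.5,   # SF9
    4: -10.0,   # SF8
    5: -7.5,    # SF7
}

def adr_step(current_dr: int, max_snr_db: float, device_margin_db: float = 15.0) -> int:
    """Return the data rate the reference algorithm would steer towards."""
    snr_margin = max_snr_db - REQUIRED_SNR_DB[current_dr] - device_margin_db
    n_step = math.floor(snr_margin / 3)       # one step per 3 dB of spare margin
    dr = current_dr
    while n_step > 0 and dr < 5:              # positive steps raise the DR (shorter airtime)
        dr += 1
        n_step -= 1
    return dr                                 # negative steps reduce TX power / DR instead

print(adr_step(current_dr=5, max_snr_db=7.0))   # 5: a device heard at SF7 with SNR +7 stays on SF7
print(adr_step(current_dr=0, max_snr_db=7.0))   # 4: a device stuck at SF12 with that SNR gets stepped up
```

With an SNR like the +7 dB reported further down the thread, there is no margin-based reason to leave SF7.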

Looking at the LMIC source it occurs to me I can use the MAC processing function to print what’s going on for debugging - which will be educational - and abstract it as a standalone. And do the same with the LoRaMAC-node. And ask for the TDD source because surely the unit tests will be very informative …

The SF12 request was seen when ADR was disabled. When I enable ADR from the console, the server never asks to go to SF12; the requests are for SF7 only, as mentioned in post 36.

And this makes 50.

Where - console or device or both?

Just observed the same problem: being forced to switch from SF7 to SF12 after join. Taking the hint and switching ADR on in the console (The Things Stack), it did not force me to SF12. I now have ADR enabled on both sides. The next step is to disable ADR on the device and see if the network honours it by not sending an ADRReq with SF12.
This cost me some airtime; it is now down from about 1100 ms to 46 ms, as the payload is only 4 bytes.
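
For reference, that tallies with the standard LoRa time-on-air formula - a quick sketch below. The 13 bytes of LoRaWAN overhead on top of the 4-byte payload is my assumption, so the exact milliseconds won't match the console to the decimal, but the ratio does.

```python
# LoRa time-on-air per the SX127x datasheet formula. Sketch only: the assumed
# overhead on top of the application payload is 13 bytes
# (MHDR + DevAddr + FCtrl + FCnt + FPort + MIC), which may differ slightly from
# what the console reports.

import math

def time_on_air_ms(payload_bytes: int, sf: int, bw_hz: int = 125_000,
                   cr: int = 1, preamble_syms: int = 8,
                   explicit_header: bool = True) -> float:
    t_sym = (2 ** sf) / bw_hz * 1000.0                   # symbol duration, ms
    de = 1 if (sf >= 11 and bw_hz == 125_000) else 0     # low data rate optimisation
    h = 0 if explicit_header else 1
    n_payload = 8 + max(
        math.ceil((8 * payload_bytes - 4 * sf + 28 + 16 - 20 * h)
                  / (4 * (sf - 2 * de))) * (cr + 4), 0)
    return (preamble_syms + 4.25) * t_sym + n_payload * t_sym

phy_payload = 4 + 13   # 4-byte application payload + assumed LoRaWAN overhead
for sf in (7, 12):
    print(f"SF{sf}: {time_on_air_ms(phy_payload, sf):.1f} ms")   # ~51 ms vs ~1319 ms
```

Either way it's roughly a 25x difference between SF7 and SF12 for a payload this small.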

The NS won’t know it’s turned off on the device so it’s not got anything to honour as such.

As this is the third, possibly the fourth, report, I'll complete my tests this morning so we can collectively raise an issue if need be.

Hmmm, I can see that upstream packets have ADR = true/false in the header, so isn't that the way to tell the network that the device does not want ADR - for example when it starts to move in a vehicle after being stationary?
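
For reference, that flag is bit 7 of the FCtrl octet in the uplink frame header (LoRaWAN 1.0.x). A minimal sketch, with made-up example bytes:

```python
# Uplink FCtrl octet, LoRaWAN 1.0.x: ADR | ADRACKReq | ACK | ClassB | FOptsLen(4 bits).
# A device clears the ADR bit to tell the network not to steer its data rate/power,
# e.g. because it has started moving. Example bytes are illustrative only.

def parse_uplink_fctrl(fctrl: int) -> dict:
    return {
        "ADR":       bool(fctrl & 0x80),   # device accepts ADR steering
        "ADRACKReq": bool(fctrl & 0x40),
        "ACK":       bool(fctrl & 0x20),
        "ClassB":    bool(fctrl & 0x10),
        "FOptsLen":  fctrl & 0x0F,
    }

print(parse_uplink_fctrl(0x80))   # ADR on, no FOpts
print(parse_uplink_fctrl(0x00))   # ADR off - the NS should then not steer the data rate
```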

My fresh-out-of-the-box LMIC 4.1.1, with ADR turned off in the firmware and on the console, starting out at SF10, is commanded to SF12.

But I don't know whether this is a bug in LMIC, as I haven't yet decoded the downlink.

If I start at SF10 with ADR on it gets set to SF7.

What I see on my GW packet monitor is an ADRReq coming from the NS requesting SF12. LMIC responds with an ADRAns and ACKs the data rate. When LMIC has ADR off and the NS has ADR off (my starting point), the NS sends an ADRReq with SF12 after join and LMIC duly acts accordingly. If I then set the DR back to SF7 on LMIC manually and an SF7 packet goes upstream, an ADRReq with SF12 comes back immediately - so for me it seems to be triggered by the upstream SF7 packet.
Since this is a local setup I have RSSI -70 and SNR +7.

1 Like