Downlink Frame

Hi everyone,

I need to do some energy-consumption calculations.

For that I need to know how much time it takes to transmit/receive a downlink message.

My question in this case is: does a downlink frame use the same structure as an uplink frame?

See the LoRaWAN specification. It is indeed very similar, but don’t overlook that the bandwidth may be different.
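For the time-on-air itself, the same formula applies in both directions: the one from the Semtech SX127x datasheet. The LoRaWAN-specific detail is that downlinks are transmitted without the PHY-layer CRC, which shaves a couple of symbols off the payload. A minimal sketch (the default parameters are assumptions; check the data rates for your region):

```python
import math

def lora_airtime_ms(payload_len, sf, bw_hz=125_000, preamble_syms=8,
                    coding_rate=1, explicit_header=True, crc_on=True):
    """Time-on-air of one LoRa frame in ms (SX127x datasheet formula).

    crc_on=True matches uplinks; LoRaWAN downlinks disable the PHY CRC,
    so pass crc_on=False for those.  coding_rate=1 means 4/5.
    """
    t_sym = (2 ** sf) / bw_hz * 1000          # symbol duration in ms
    de = 1 if t_sym > 16 else 0               # low-data-rate optimization
    ih = 0 if explicit_header else 1
    crc = 1 if crc_on else 0
    n_payload = 8 + max(
        math.ceil((8 * payload_len - 4 * sf + 28 + 16 * crc - 20 * ih)
                  / (4 * (sf - 2 * de))) * (coding_rate + 4), 0)
    return (preamble_syms + 4.25 + n_payload) * t_sym
```

For example, a 13-byte frame (an empty LoRaWAN payload) at SF7/125 kHz comes out to about 46.3 ms. Remember that a downlink in the RX2 window may use a completely different data rate than the uplink did.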

Remember you have to consider not only the actual downlinks but the length of the receive windows opened for possible downlinks.
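Putting that together, a per-cycle energy budget is just E = V · I · t summed over the radio states, including both receive windows even when nothing arrives. A rough sketch, with the current figures being assumptions in the ballpark of SX127x-class radios (substitute your own measurements):

```python
def cycle_energy_mj(tx_ms, rx1_ms, rx2_ms, sleep_ms,
                    v=3.3, i_tx_ma=120.0, i_rx_ma=11.5, i_sleep_ma=0.001):
    """Energy for one uplink cycle in millijoules.

    All current figures are illustrative assumptions, not datasheet
    guarantees: i_tx_ma depends heavily on TX power setting.
    """
    mj = lambda i_ma, t_ms: v * i_ma * t_ms / 1000.0   # mA * ms * V -> mJ
    return (mj(i_tx_ma, tx_ms)
            + mj(i_rx_ma, rx1_ms + rx2_ms)              # empty windows still cost
            + mj(i_sleep_ma, sleep_ms))
```

Even with these rough numbers you can see that a pair of short empty receive windows costs far less than the transmission, but it is not zero and scales with how long the firmware keeps the receiver on.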

Presumably you are looking only at node power consumption; a gateway's transmit power consumption isn't notably different from its always-on receive processing. It may even be slightly lower, since those parallel receivers typically cannot do anything while the transmit signal path is active.

So does a node consume the same amount of power when it is just "listening" without receiving a downlink? In other words, it doesn't matter whether a downlink is actually sent; for the RX power consumption it only matters that an RX window is opened?

That depends on how well tuned the node firmware is.

Ideally the receiver would run only long enough to detect a packet preamble, plus a timing error allowance on each side. If nothing is heard, then the radio would shut off and the node would go back to sleep until the 2nd receive window, and then again until it’s time to transmit (or do some purely local task) again.
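That minimum listening time can be estimated from the symbol duration: a handful of symbols to detect the preamble, plus a timing-error allowance on each side. A small sketch (the symbol count and allowance are assumptions; real firmware tunes these against its clock accuracy):

```python
def rx_window_ms(sf, bw_hz=125_000, detect_syms=5, timing_err_ms=1.0):
    """Shortest useful receive window: enough symbols to detect a
    preamble, plus a timing-error margin on each side.

    detect_syms and timing_err_ms are illustrative assumptions.
    """
    t_sym = (2 ** sf) / bw_hz * 1000   # symbol duration in ms
    return detect_syms * t_sym + 2 * timing_err_ms
```

Note how strongly this depends on spreading factor: at SF7 the window is a few milliseconds, while at SF12 (a common RX2 default) it is well over 150 ms, so an SF12 RX2 window can dominate the receive-side energy budget.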

Also, in theory a node's attention could be captured by something else (including radio noise) falsely triggering the preamble detector, tricking the radio into receiving for a full packet length, with the garbage only discarded when the hardware checksum or the LoRaWAN cryptographic check fails to verify. This kind of false reception happens fairly frequently with gateways; presumably it happens with nodes as well, though nodes aren't listening much to begin with, so it isn't often seen.