Spreading Factor, Coding Rate and Bandwidth

Hello, I'm new to the LoRa community.

I have already done a lot of research because I'm writing a paper about LoRa.
I'm especially interested in the spreading factor in combination with coding rate and bandwidth.
I prepared two examples which I will include in my research, so maybe the community can also give me their opinion on them. Both examples use a payload of 64 bytes (51 bytes + 13 bytes header).

  1. Example: Can changing the coding rate outperform another spreading factor?

Using SF7 with the normal coding rate of 4/5 results in an airtime of 118 ms. To get a more reliable signal you could now change the spreading factor to SF8 with coding rate 4/5, which results in 216 ms of airtime. My idea: instead of changing the spreading factor, increase the coding rate to 4/8, which still has a lower airtime at 187 ms, and the signal could be even more reliable than with SF8.
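
To make these numbers reproducible, here is a minimal sketch of the time-on-air formula from the Semtech SX127x datasheet, assuming an 8-symbol preamble, explicit header and CRC on. (The figures above came from an online calculator, which seems to make slightly different assumptions, so the datasheet formula lands close to but not always exactly on my numbers.)

```python
import math

def lora_airtime_ms(pl_bytes, sf, bw_hz=125_000, cr=1,
                    preamble_syms=8, explicit_header=True, crc_on=True):
    """Time-on-air of one LoRa packet in ms, per the SX127x datasheet.
    pl_bytes: PHY payload length; cr: 1..4 for coding rates 4/5..4/8."""
    t_sym = (2 ** sf) / bw_hz * 1000                  # symbol duration, ms
    de = 1 if (bw_hz == 125_000 and sf >= 11) else 0  # low data rate optimisation
    ih = 0 if explicit_header else 1
    crc = 1 if crc_on else 0
    n_payload = 8 + max(
        math.ceil((8 * pl_bytes - 4 * sf + 28 + 16 * crc - 20 * ih)
                  / (4 * (sf - 2 * de))) * (cr + 4), 0)
    return (preamble_syms + 4.25 + n_payload) * t_sym

# Example 1 (64-byte PHY payload, 125 kHz):
print(lora_airtime_ms(64, sf=7, cr=1))   # SF7 4/5 -> ~118.0 ms
print(lora_airtime_ms(64, sf=7, cr=4))   # SF7 4/8 -> ~176.4 ms
print(lora_airtime_ms(64, sf=8, cr=1))   # SF8 4/5 -> ~215.6 ms
```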

  2. Example: Same airtime but more range with higher bandwidth

Using SF7, coding rate 4/7 and 125 kHz bandwidth results in 164 ms of airtime.
Using SF10, coding rate 4/5 and 500 kHz bandwidth also results in 164 ms of airtime.
Using SF10, coding rate 4/5 and 125 kHz bandwidth results in 698 ms of airtime.
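
The same sketch covers this bandwidth comparison too (again, the datasheet formula matches the 698 ms case exactly but lands a bit off my 164 ms figures, so the calculator I used presumably makes different header or preamble assumptions):

```python
# Example 2, reusing lora_airtime_ms() from the sketch above:
print(lora_airtime_ms(64, sf=7,  bw_hz=125_000, cr=3))  # SF7  4/7 @ 125 kHz -> ~157 ms
print(lora_airtime_ms(64, sf=10, bw_hz=500_000, cr=1))  # SF10 4/5 @ 500 kHz -> ~175 ms
print(lora_airtime_ms(64, sf=10, bw_hz=125_000, cr=1))  # SF10 4/5 @ 125 kHz -> ~698 ms
```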

As far as I understand, increasing the bandwidth reduces the range, but does it also increase the power consumption? And if so, could the total consumption still be lower than when sending with 125 kHz, because the airtime is a lot shorter?

Sadly, I cannot test the second example myself on the TTN network, because in my region the maximum is limited to 250 kHz bandwidth at SF7.
So my last question is why the bandwidth in LoRaWAN is limited to 125 kHz for most spreading factors.

What would be the downside of having all nodes transmit at 500 kHz, which would result in lower airtime?

Not to pick nits, but TTN is LoRaWAN. For plain LoRa you need to go elsewhere.

Power consumption is (assuming maximum transmission power and no reduction because of ADR) a function of the power required for that transmission power and the time the transmission lasts. In cases where the duration is the same, the consumed energy will be the same.

Shorter airtime equals less consumed energy. The only exception would again be an ADR-driven reduction in transmission power, which could happen at SF7 with good radio conditions.

You will not be able to test any coding rates apart from the ones specified in the LoRaWAN regional parameters when using TTN, because that would make your node non-compliant. And TTN likes nodes to be standards compliant (especially V3).

You might find some insight into that in the application notes on the Semtech website. My guess is that it is a happy medium between range, energy consumption and use of radio spectrum (the number of possible channels in the available spectrum).
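
As a rough back-of-the-envelope illustration (my own arithmetic, not from Semtech): the EU 863-870 MHz band is 7 MHz wide, which fits 56 non-overlapping 125 kHz channels but only 14 channels at 500 kHz, so narrower channels leave far more room for networks and gateways to coexist.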

The power consumption during transmission depends only on the output power of the transmitter. If you look into the Semtech datasheet you will see the relationship between output power and current consumption. The consumed (needed) power can be calculated by multiplying this current by the supply voltage.

When you transmit a data packet, the transmission needs power for a certain amount of time; power multiplied by time is energy. The transmission of a LoRa packet therefore consumes energy (J, Wh), and the battery is discharged by that energy.
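
To make that concrete, here is a rough sketch of the energy per transmission. The current and voltage are illustrative placeholder values of the kind listed in the SX1276 datasheet; substitute the figures for your own radio and output power:

```python
# Rough energy per transmission: E = V * I * t_air
SUPPLY_V   = 3.3      # supply voltage in volts (assumed)
TX_CURRENT = 0.120    # TX supply current in amps at high output power (illustrative)

def tx_energy_mj(airtime_ms, i_a=TX_CURRENT, v=SUPPLY_V):
    """Energy in millijoules consumed by one transmission."""
    return v * i_a * airtime_ms   # W * ms = mJ

print(tx_energy_mj(118))   # SF7 4/5 @ 125 kHz -> ~46.7 mJ
print(tx_energy_mj(216))   # SF8 4/5 @ 125 kHz -> ~85.5 mJ
```

Because voltage and current are fixed for a given output power, the energy ratio is exactly the airtime ratio: shorter airtime directly means less energy per message.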


Well, if you're writing a paper, you need to prove that your suppositions are correct in reality, and you can only possibly do that if you carry out the practical tests yourself, plain and simple.

You cannot go fiddling with this within TTN; the parameters there are fixed, apart from the spreading factor, so the questions are really to do with point-to-point LoRa.

Such practical testing is fairly trivial, but outside the scope of TTN, so I would get testing.

Based on some years of practical LoRa link testing (since 2014), I would suggest the range you get is directly proportional to the airtime; LoRa seems well behaved in this respect. There is no ‘magic bullet’ combination of settings that I have come across.


If you haven’t already, do a search for energy on this forum. You’ll find threads like Modeling the Energy Performance of LoRaWAN and LoRaWAN Node Energy Calculator.

Thanks for all the feedback.

When I finish my paper I will update this post, and maybe that will help someone in the future.


I have a few more questions that I couldn't find answers to and want to clarify:

  1. Is it correct that a timestamp like 2021-06-16T13:11:58.144892384Z has nanosecond accuracy?

  2. I can find four different timestamps in every uplink message:

    • “gs.receive”, which has the latest timestamp
    • “settings” and “rx_metadata”, which are equal and have the oldest timestamp
    • “received_at”, which is newer than “settings” but older than “gs.receive”

As far as I understand, “settings” and “rx_metadata” show the time of sending from the end node,
“received_at” when the message was received by the gateway, and “gs.receive” when the gateway server has processed the message.

Is this correct, or am I missing something?

  3. What is the difference between RSSI and channel RSSI? So far every message I sent has had the same value for both. What could be a reason for RSSI and channel RSSI to differ from each other?

  4. When I'm sending SF7 with 250 kHz I can find this value in “rx_metadata”:

"channel_index": 8

What does it mean, and why does it only show up with SF7 at 250 kHz?

The precision might well be at the ns level, but the accuracy in relation to real time is unlikely to be.

Why are you asking?


I was just curious why the time is displayed in ns, because for real-time use ms should be enough.
The difference between sending and receiving for me is about 2-3 ms. Showing something like that would be much more useful than

"time": "2021-06-18T11:48:04.577204Z",

and

received_at": “2021-06-18T11:48:04.604929071Z”