How to measure packet delay or latency?

hi! my project is a LoRaWAN-based wireless sensor network applied to precision agriculture and i want to determine the delay or latency of the packets along with RSSI and SNR.

do you guys know how to calculate packet delay?

What exactly do you mean by ‘packet delay’ and why do you want to know what it is ?

And what do you mean by ‘precision agriculture’ ?

i want to determine the elapsed time from when payload transmission is initiated up to when the payload is sent.

“Precision agriculture is a farming management concept based on observing, measuring and responding to inter- and intra-field variability in crops.”

Serial log from device?

That timing is internal to the node and is therefore determined by the specific instantiation of the LoRaWAN stack, the time to prepare/pass the payload and the specifics of the hardware (e.g. MCU type, clock frequency etc.), and will obviously vary by type on a ‘how long is a piece of string’ basis…
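
If you do go the serial-log route mentioned above, a rough way to get that node-internal figure is to print a marker line just before the payload is handed to the stack and another when the stack reports TX complete, then timestamp those lines on the host. A minimal sketch, assuming pyserial and made-up marker strings ("TX_QUEUED" / "TX_DONE") that your firmware would have to print:

```python
# Rough node-internal TX timing taken from the device's serial log.
# Assumptions (adjust to your setup): the firmware prints "TX_QUEUED" when the
# payload is handed to the stack and "TX_DONE" when the stack reports TX complete.
import time
import serial  # pip install pyserial

PORT = "/dev/ttyUSB0"   # change to your serial port
BAUD = 115200

with serial.Serial(PORT, BAUD, timeout=1) as ser:
    t_queued = None
    while True:
        line = ser.readline().decode(errors="ignore").strip()
        if not line:
            continue
        now = time.perf_counter()
        if "TX_QUEUED" in line:
            t_queued = now
        elif "TX_DONE" in line and t_queued is not None:
            print(f"queue -> TX complete: {(now - t_queued) * 1000:.1f} ms")
            t_queued = None
```

Bear in mind the host-side timestamping adds its own serial/USB jitter, so treat the result as indicative rather than precise.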

Actual over-the-air payload delivery/transmission time (inc. LW overhead) is shown in the message metadata… usually to millisecond precision (if not accuracy).

Per forum policy, no OP shall share blindingly obvious information until at least post #10 - so no device, firmware, version, stack, mode etc.

Not clear to me exactly what you mean by ‘payload transmission is initiated’, but it sounds like you are talking about the air time of the packet, which is small, in the tens of ms normally, and of little consequence.

There is a circumstance whereby the sending of a packet can be delayed in order to keep to legal duty cycle limits.
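
For a sense of scale, here is a rough calculation of that enforced gap, assuming the common 1% duty-cycle sub-band and an example airtime figure:

```python
# Minimum off-time a node must observe after a transmission to respect a
# duty-cycle limit: off_time = airtime * (1/duty_cycle - 1).
airtime_s = 0.029      # example: ~29 ms packet
duty_cycle = 0.01      # 1% sub-band limit (EU868 example)

off_time_s = airtime_s * (1 / duty_cycle - 1)
print(f"After a {airtime_s*1000:.0f} ms packet: wait at least {off_time_s:.2f} s")
# -> roughly 2.9 s of enforced silence for a 29 ms packet at 1%
```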

i want to know how much time it takes for a data packet to travel from one designated point to another.

so basically here the delay is only 1 sec?

sorry i’m new to this where do i exactly find this?

(inc. LW overhead) is shown in the message metadata…

Depends, the two devices showing clocks/times might not be synchronised.

Why do you want to know ?

it is part of my experiment, i have to record latency along with rssi and snr

Indeed but I know how you like playing 20 questions :wink:

So in the absence of a specific detail let’s consider, say, wind speed measurement…

The transit time for an anemometer to measure on one side of a field c/w an anemometer on the other side of the field, or indeed in the neighbouring field, can easily be calculated of course, but as this will normally be measured in seconds the message sending time is almost inconsequential on that scale.

If interested in the variability of, say, temp, this is normally done over many minutes/hours let alone seconds, so again Tx time for the payload is not of consequence. And if there is a need for comparing temp (or humidity or soil moisture or anything else of consequence) at different points, this can be done by precision scheduling of Tx, but in practice you REALLY want to dither the measurements as you want to minimise the risk of packet collisions and spectrum congestion, so as Nick hints we need real details to understand your concerns and what is of interest (and possibly why?)

Ok so not quite in a vacuum, but divide the distance between points by c and it’s a good approximation! And at anything on the scale of LoRaWAN range, the transit time of the payload as RF is meaninglessly small in the scheme of things!
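
To put a number on that, a quick back-of-envelope calculation, assuming a (generous) 10 km node-to-gateway distance:

```python
# RF propagation delay over a LoRaWAN-scale link: distance / c.
c = 299_792_458          # speed of light, m/s
distance_m = 10_000      # assumed 10 km gateway-to-node distance

delay_us = distance_m / c * 1e6
print(f"{distance_m/1000:.0f} km of free space ≈ {delay_us:.1f} µs")   # ~33 µs
```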

No, depending on clock sources the delay here is an offset in time and will be determined by how well in sync the clocks are and how long each end takes to process and recognise the fact of Tx and Rx…

Where, and relative to what? And are the source and destination synchronised w.r.t. their clocks?

Again, coming back to the earlier post: what is it that is changing/varying that you want to measure, such that 1 sec (or fractions thereof) becomes significant (context)? Then we can perhaps better understand or advise…
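
To illustrate why the clock question matters: if you compute latency as (gateway receive time) minus (device send time) taken from two unsynchronised clocks, the unknown offset ends up inside your “latency”. A toy sketch with invented numbers:

```python
# Apparent one-way latency = (rx timestamp) - (tx timestamp), but if the two
# clocks are not synchronised, the unknown offset is folded into the result.
true_latency_s = 0.029          # e.g. ~29 ms of airtime + processing (assumed)
clock_offset_s = 0.800          # device clock runs 800 ms ahead of gateway (assumed)

device_tx_time = 1000.000 + clock_offset_s   # device's notion of "now" at send
gateway_rx_time = 1000.000 + true_latency_s  # gateway's notion of "now" at reception

apparent_latency = gateway_rx_time - device_tx_time
print(f"apparent latency: {apparent_latency*1000:.0f} ms (true: {true_latency_s*1000:.0f} ms)")
# -> -771 ms: the clock offset completely swamps the real figure
```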

It is shown in the metadata for each message received by a Gateway - click on the individual message in the TTN console and it will expand to show the full set of metadata :slight_smile:
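
If you would rather log it than click through the console, the same metadata is also available over the TTN MQTT integration; a minimal sketch, assuming paho-mqtt and The Things Stack v3 uplink JSON layout (broker region, application ID and API key below are placeholders):

```python
# Log RSSI, SNR and airtime per uplink from the TTN (v3) MQTT integration.
# Written against the paho-mqtt 1.x client API; values below are placeholders.
import json
import paho.mqtt.client as mqtt   # pip install paho-mqtt

BROKER = "eu1.cloud.thethings.network"
APP_ID = "my-app@ttn"             # TTN username: <application-id>@ttn
API_KEY = "NNSXS.XXXXXXXX"        # TTN API key with read rights

def on_message(client, userdata, msg):
    data = json.loads(msg.payload)
    up = data["uplink_message"]
    rx = up["rx_metadata"][0]     # first gateway entry for this uplink
    print("received_at:", data.get("received_at"),
          "rssi:", rx.get("rssi"),
          "snr:", rx.get("snr"),
          "airtime:", up.get("consumed_airtime"))

client = mqtt.Client()
client.username_pw_set(APP_ID, API_KEY)
client.on_message = on_message
client.connect(BROKER, 1883, 60)
client.subscribe("v3/+/devices/+/up")
client.loop_forever()
```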

Also if you know the details of your payload message you can use online estimators to predict what the time would likely be e.g.
https://avbentem.github.io/airtime-calculator/ttn/eu868/10
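
That calculator implements the standard LoRa time-on-air formula from the Semtech SX127x datasheet; if you want the figure inside your own scripts, a straightforward Python version, assuming typical EU868 uplink settings (125 kHz bandwidth, coding rate 4/5, explicit header, CRC on, 8-symbol preamble):

```python
# LoRa airtime (time on air) per the Semtech SX127x datasheet formula.
from math import ceil

def lora_airtime_ms(payload_bytes, sf, bw_hz=125_000, cr=1,
                    preamble_len=8, explicit_header=True, crc=True,
                    low_dr_optimize=None):
    """Return time on air in milliseconds for one LoRa packet.

    payload_bytes is the full PHY payload (app payload + ~13 B LoRaWAN overhead).
    cr is the coding-rate index: 1 => 4/5 ... 4 => 4/8.
    """
    t_sym = (2 ** sf) / bw_hz                      # symbol duration, seconds
    if low_dr_optimize is None:                    # mandated for slow symbols
        low_dr_optimize = t_sym > 0.016
    de = 1 if low_dr_optimize else 0
    ih = 0 if explicit_header else 1
    crc_bits = 16 if crc else 0

    n_payload = 8 + max(
        ceil((8 * payload_bytes - 4 * sf + 28 + crc_bits - 20 * ih)
             / (4 * (sf - 2 * de))) * (cr + 4),
        0,
    )
    t_preamble = (preamble_len + 4.25) * t_sym
    return (t_preamble + n_payload * t_sym) * 1000

# Example: 1-byte application payload (+13 B LoRaWAN overhead) at SF7/125 kHz
print(f"{lora_airtime_ms(14, sf=7):.1f} ms")       # ≈ 46.3 ms
```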

And actually for this one the values (given lack of resolution) suggest it may only be 27 ms!

28.973 vs 29?!

i want to compare the transmission time of raw packets of data versus processed ones, or see if there’s a noticeable difference in the time it takes for transmission between the two.
my project involves edge computing, so i want to determine whether data processing at the end devices impacts the overall transmission latency.

Then that is a different piece of string you are trying to measure and is not related to or impacted by LoRaWAN! What you want to know is how long that data processing takes as part of your edge computing process (perhaps detecting thresholds, tracking averages, sending an alarm vs data, running a machine learning algorithm, or whatever)… only you can determine that, no one here on the Forum… once you have a result and are sending the result then it goes over to being a LoRaWAN question, influenced by distance and speed of light, payload/overhead size etc…
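
To separate those two pieces of string, time the edge-processing step on its own, independent of the radio; a minimal sketch where process() is a hypothetical stand-in for whatever your edge-computing stage actually does:

```python
# Time the edge-processing step separately from transmission: the radio airtime
# depends only on the *size* of what you finally send, not on how it was produced.
import time

def process(raw_samples):
    """Hypothetical placeholder for the edge-computing stage
    (thresholding, averaging, feature extraction, ...)."""
    return sum(raw_samples) / len(raw_samples)

raw = [21.4, 21.6, 21.5, 21.9, 22.0]   # dummy sensor readings

t0 = time.perf_counter()
result = process(raw)
processing_ms = (time.perf_counter() - t0) * 1000

print(f"edge processing took {processing_ms:.3f} ms, payload value = {result:.2f}")
# Then compare the byte count of the raw readings vs the processed result to see
# how the airtime changes, e.g. with the airtime function shown earlier.
```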

okay, thanks for the clarification. but to clarify, my objective is to measure the time it takes for raw data to be received, not just transmitted. so i want to test if there’s a significant difference in the latency between raw and processed data when using LoRaWAN.