I am studying the network performance of my sensor network. The dataset I downloaded from it contains a field called `uplink_message.packet_error_rate`.
The API docs say: "Packet error rate of the recent uplinks in the current session. Calculated by the Network Server."
Could someone clarify:
- Does it count only missing uplinks (based on frame counter gaps) or also frames received with bad MIC/CRC?
- What is the exact time window or number of messages used for the calculation?
- Is there public documentation or source code explaining the formula?
- I assume packet loss rate (PLR) and packet error rate (PER) are two different things, but in my dataset the downloaded PER values are very close to the PLR I calculated myself from frame counter gaps. Is that expected?
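For reference, this is what I mean by my calculated PLR. It is a minimal sketch of my own calculation (the `packet_loss_rate` helper is mine, not from the API), assuming frame counters are strictly increasing within one session, i.e. no counter rollover or reset:

```python
def packet_loss_rate(f_cnts):
    """Estimate packet loss rate from a sorted list of received frame counters.

    Missing uplinks are inferred from gaps in the f_cnt sequence.
    Assumes no counter rollover or session reset within the list.
    """
    if len(f_cnts) < 2:
        return 0.0
    # Frames the device should have sent between the first and last we saw
    expected = f_cnts[-1] - f_cnts[0] + 1
    # Frames we actually received
    received = len(f_cnts)
    return (expected - received) / expected

# Example: counters 0..9 with 2 and 5 missing -> 2 lost out of 10 expected
print(packet_loss_rate([0, 1, 3, 4, 6, 7, 8, 9]))  # 0.2
```

Note this only sees missing frames; it cannot count frames that arrived but failed MIC/CRC checks, which is exactly why I am asking whether PER includes those.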
Any insight would help me interpret this metric correctly.
Thanks!