I’m trying to combine some information found in other posts, as a reference for a future FAQ or Wiki page.
It seems:

Moving nodes should switch off Adaptive Data Rate (or should they?) and always send at the slowest data rate, SF12 (or should they, as a fixed SF11 or SF12 is not allowed?), and hence are bound to a maximum application payload of 51 bytes?

On SF12 a maximum application payload of 51 bytes (so a 64 byte frame, including the 13 bytes of LoRaWAN overhead) takes almost 2.8 seconds to transmit?

On SF12, when all 51 bytes are needed, each uplink takes almost 2.8 seconds, so TTN’s fair access policy of 30 seconds of air time per day would only allow for (on average) sending one location every two to three hours. Even sending just 6 bytes for two 24 bit coordinates needs 1.3 seconds of air time, allowing, on average, fewer than one GPS coordinate per hour on The Things Network. And even an empty “alive” packet still carries the 13 bytes of LoRaWAN overhead, taking 1.16 seconds of air time on SF12.
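These air times follow from the LoRa time-on-air formula in Semtech’s SX1272/SX1276 datasheets. A minimal sketch in Python (the defaults are my assumptions for EU868-style SF12BW125 with coding rate 4/5; note the PHY payload must include the 13 bytes of LoRaWAN overhead):

```python
import math

def lora_airtime_ms(phy_payload, sf=12, bw=125_000, cr=1, n_preamble=8):
    """Approximate LoRa time on air in milliseconds (SX1276 datasheet formula).

    phy_payload: total PHY payload in bytes, i.e. application payload
    plus the 13 bytes of LoRaWAN overhead. cr=1 means coding rate 4/5.
    """
    t_sym = (2 ** sf) / bw * 1000                   # symbol duration in ms
    de = 1 if (sf >= 11 and bw == 125_000) else 0   # low data rate optimization
    h = 0                                           # explicit PHY header
    n_payload = 8 + max(
        math.ceil((8 * phy_payload - 4 * sf + 28 + 16 - 20 * h)
                  / (4 * (sf - 2 * de))) * (cr + 4),
        0)
    return (n_preamble + 4.25) * t_sym + n_payload * t_sym

# Empty packet (13 bytes overhead only), 6 byte and 51 byte application payloads:
for size in (13, 6 + 13, 51 + 13):
    print(size, round(lora_airtime_ms(size)))  # 13 → 1155, 19 → 1319, 64 → 2793
```

This reproduces the numbers above: about 1.16 s for an empty packet, 1.32 s for 6 bytes, and 2.79 s for a full 51 byte application payload.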

If a GPS unit gives you decimal degrees, then its actual precision might only be 4 decimals, even if it gives you some 32 bit value. The question “How to measure the accuracy of latitude and longitude?” claims:
> The sign tells us whether we are north or south, east or west on the globe.
>
> […]
>
> The tens digit gives a position to about 1,000 kilometers. It gives us useful information about what continent or ocean we are on.
>
> The units digit (one decimal degree) gives a position up to 111 kilometers (60 nautical miles, about 69 miles). It can tell us roughly what large state or country we are in.
>
> The first decimal place is worth up to 11.1 km: it can distinguish the position of one large city from a neighboring large city.
>
> The second decimal place is worth up to 1.1 km: it can separate one village from the next.
>
> The third decimal place is worth up to 110 m: it can identify a large agricultural field or institutional campus.
>
> The fourth decimal place is worth up to 11 m: it can identify a parcel of land. It is comparable to the typical accuracy of an uncorrected GPS unit with no interference.
>
> The fifth decimal place is worth up to 1.1 m: it can distinguish trees from each other. Accuracy to this level with commercial GPS units can only be achieved with differential correction.
>
> The sixth decimal place is worth up to 0.11 m: […] This can be achieved by taking painstaking measures with GPS, such as differentially corrected GPS.
>
> […]

When only sending a few 24 bit numbers, using Google’s Protocol Buffers encoding (see the “Encoding” page of the Protocol Buffers documentation) does not decrease the data size (while it does require more computing power).
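To see why, here is a rough sketch of protobuf’s varint wire format (7 payload bits per byte, plus ZigZag mapping for signed `sint32` values); the scaling to 4 decimals is my own illustration:

```python
def zigzag(n):
    """Map a signed 32 bit integer to an unsigned one, as protobuf's sint32 does."""
    return (n << 1) ^ (n >> 31)

def varint(n):
    """Encode a non-negative integer as a protobuf base-128 varint."""
    out = bytearray()
    while True:
        byte = n & 0x7F
        n >>= 7
        if n:
            out.append(byte | 0x80)  # high bit set: more bytes follow
        else:
            out.append(byte)
            return bytes(out)

# A worst-case longitude, scaled to 4 decimals:
lon = round(-180.0 * 10_000)        # -1,800,000
print(len(varint(zigzag(lon))))     # 4 value bytes, plus a field tag byte
```

So a value that fits in a raw 24 bit (3 byte) integer costs up to 4 varint bytes plus a 1 byte field tag on the protobuf wire: no saving at all for this use case.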
So, I guess best practices are about limiting the packet size, and about not needing more than a few coordinates a day:

Like with all use cases: never send JSON, nor even ASCII text, but encode your data into plain bytes.

In decimal degrees, a longitude with 4 decimals, −180.0000…+180.0000, might need 9 bytes when sent as plain characters (or 8 when leaving out the decimal dot), and probably another byte for some separator. But it also nicely fits in 3 bytes (−8,388,608…+8,388,607 as a 24 bit signed integer, if you first multiply by 10,000). When one needs more decimals: a standard 32 bit float, with its roughly 7 significant digits, only yields about 5 decimals for a longitude, but multiplying by 10,000,000 and sending as a standard 32 bit signed integer gives a full 7 decimals (as ±180 × 10⁷ = ±1,800,000,000 still fits in ±2,147,483,647).
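A minimal sketch of that 3 byte encoding in Python (the scale factor and byte order are arbitrary choices here; the node and the application just need to agree):

```python
def encode_coord(deg, scale=10_000):
    """Pack a coordinate in decimal degrees into 3 bytes (24 bit signed, big-endian)."""
    return round(deg * scale).to_bytes(3, "big", signed=True)

def decode_coord(data, scale=10_000):
    """Unpack 3 bytes back into decimal degrees."""
    return int.from_bytes(data, "big", signed=True) / scale

payload = encode_coord(52.3702) + encode_coord(4.8952)   # 6 bytes for lat + lon
print(decode_coord(payload[:3]), decode_coord(payload[3:]))
```

On a typical microcontroller node the same thing is just a multiply and three byte writes; the decoding would live in the application (or a payload decoder function).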

With a custom binary encoding one is not limited to an exact multiple of 8 bits.
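For example, latitude at 4 decimals needs ±900,000 (21 bits, signed) and longitude ±1,800,000 (22 bits), so both fit in 43 bits, i.e. 6 bytes with room to spare for a flag bit or two. A sketch of such bit packing (my own helpers, not any standard):

```python
def pack_fields(fields):
    """Pack (value, bit_width) pairs, MSB first, into the fewest whole bytes."""
    acc = nbits = 0
    for value, width in fields:
        acc = (acc << width) | (value & ((1 << width) - 1))  # two's complement
        nbits += width
    pad = -nbits % 8
    return (acc << pad).to_bytes((nbits + pad) // 8, "big")

def unpack_fields(data, widths):
    """Inverse of pack_fields, sign-extending each field."""
    acc, pos = int.from_bytes(data, "big"), len(data) * 8
    values = []
    for width in widths:
        pos -= width
        raw = (acc >> pos) & ((1 << width) - 1)
        if raw >= 1 << (width - 1):          # negative in two's complement
            raw -= 1 << width
        values.append(raw)
    return values

lat, lon = round(52.3702 * 10_000), round(-4.8904 * 10_000)
data = pack_fields([(lat, 21), (lon, 22)])   # 43 bits -> 6 bytes
print(len(data), unpack_fields(data, [21, 22]))
```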


Do not send altitude for nodes that are not airborne. (For nodes following roads, your application should be able to determine the altitude above sea level given the coordinates; unless you’re afraid someone might take the node inside a tall building, but then you might not get a GPS fix to start with.)

Include position error information if needed. (Good thinking, darrenoc.)

Do not send a precision that is higher than your use case needs, or the GPS offers.

For a regular GPS module, 24 bits might be good enough to match its actual accuracy, so just 6 bytes are needed to send both latitude and longitude. (As an alternative calculation: with the equatorial circumference of the earth being 40,075,017 meters, the accuracy of a 24 bit integer would be 40,075,017 / 2^24, about 2.39 meters.)

For some use cases, like a pet finder, one can assume the target is within 50 km of some known place, like the known location of a mobile phone. An application can then figure out the missing details if the node only sends the four decimals, without any sign. Those 4 decimals fit in the 2 bytes of an unsigned integer (0…65,535; see discussion below), so only 4 bytes are needed to determine both latitude and longitude. Alternatively, a node could send the full coordinates only, say, every 10th time.
If the location of the gateway is known (true for The Things Network), then given the limited range of gateways and nodes, the above even applies to most, if not all, use cases.
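A sketch of how an application might reconstruct the full value. This assumes my own convention that the node sends frac = round(degrees × 10,000) mod 10,000 (a non-negative remainder, fitting in 2 bytes), and that the reference location is within roughly half a degree:

```python
def send_fraction(deg):
    """What the node transmits: the four decimals only, 0…9999 (fits in 2 bytes)."""
    return round(deg * 10_000) % 10_000

def recover(frac, reference_deg):
    """Pick the coordinate with those four decimals that is closest to a known reference."""
    ref = round(reference_deg * 10_000)
    base = (ref // 10_000) * 10_000
    candidates = [base + k * 10_000 + frac for k in (-1, 0, 1)]
    return min(candidates, key=lambda n: abs(n - ref)) / 10_000

print(recover(send_fraction(52.3702), 52.3650))   # 52.3702
print(recover(send_fraction(-4.8904), -4.8700))   # -4.8904
```

Using the remainder rather than the absolute fraction keeps negative coordinates unambiguous, at the cost of the values not being human-readable without the reference.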


If, on the lowest data rate, sending 6 bytes for two 24 bit coordinates takes 1.3 seconds of air time, and hence is limited to only 23 coordinates per day: be smart about when to send the coordinates.
 If it’s about finding lost things (rather than continuously tracking something), assuming a LoRaWAN Class A device (where a device can receive some data right after sending some), and assuming the node will not get out of range:
make the node send a small “alive” packet quite often, and only if the application responds in some specific way, power up the GPS (and some light and a buzzer?) and start sending coordinates more frequently (it seems even “alive” packets are very limited). This saves both bandwidth and battery.
Any other thoughts, or errors in my assumptions?