What is the difference between 'bit rate' and 'throughput'?

According to AN1200.22, LoRa Modulation Basics, the bit rate is calculated as SF * BW / 2^SF, which can reach a few kbps.
However, when I divide the amount of data received by the total running time of the program to obtain the average throughput, the value is far below that magnitude.
Am I confusing two different concepts?

Yes!

Bit rate is the rate whilst actually transmitting/receiving - the physical layer (LoRa) bit rate. You are then spreading that burst over the loop duration of your programme - effectively a self-imposed programme throughput. That in turn will be moderated by any applicable legal limits - e.g. duty cycle or dwell time, LBT constraints, etc - inherent in the regulations of wherever you are in the world. And when using TTN it should further be limited by the TTN FUP 'message' throughput limits inherent in the time-on-air constraints, number of permitted downlinks, etc.

LoRaWAN IS NOT a constantly running scheme, so the over-the-air bit rate, derived from symbols/second for any given SF, is not equal to throughput, and vice versa.
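To make the distinction concrete, here is a minimal Python sketch comparing the two numbers - the payload size and loop interval below are assumptions for illustration, not anything from your actual setup:

```python
# Contrast the LoRa physical-layer bit rate with the average throughput
# of a program that transmits in periodic bursts.
SF = 12                 # spreading factor
BW = 125_000            # bandwidth in Hz
CR = 4 / 5              # coding rate 4/5

payload_bytes = 32      # assumed payload per uplink
loop_interval_s = 60.0  # assumed time between transmissions in your program

# Coded bit rate from AN1200.22: Rb = SF * (BW / 2**SF) * CR
bit_rate_bps = SF * (BW / 2**SF) * CR

# Time spent actually on air for the payload bits alone (ignores preamble,
# header and CRC, so the real on-air time is longer).
on_air_s = payload_bytes * 8 / bit_rate_bps

# Average throughput as measured in the question:
# bits received divided by total running time of the program.
throughput_bps = payload_bytes * 8 / loop_interval_s

print(f"bit rate   : {bit_rate_bps:6.1f} bps (only while transmitting)")
print(f"on air     : {on_air_s * 1000:6.0f} ms per {payload_bytes}-byte burst")
print(f"throughput : {throughput_bps:6.2f} bps averaged over the loop")
```

With these assumed numbers the radio runs at ~293 bps for under a second, then sits idle, so the averaged throughput collapses to ~4 bps - exactly the gap you observed.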

You mean that the bit rate is actually a fixed theoretical value, independent of the packet length? It seems to have no practical physical significance, because it cannot be measured through experiments.
It's a simple idea that the device's 'bit rate' must be higher when it transmits longer data packets in the same length of time, isn't it?

Bit rate is fixed, inherent within the physical layer, SF dependent, and independent of message length. The higher the SF, the lower the bit rate: SF12 has a bit rate of approx 300 bps (~293 bps at BW 125 kHz, CR 4/5). As in all communications systems, throughput is effectively the message rate, and that will vary with both message length and bit rate.
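For reference, a quick sketch of how that fixed bit rate falls as SF rises, assuming BW 125 kHz and the 4/5 coding-rate factor from the AN1200.22 formula:

```python
# Coded bit rate per spreading factor: Rb = SF * (BW / 2**SF) * CR
BW = 125_000  # Hz
CR = 4 / 5    # coding rate 4/5

for SF in range(7, 13):
    rb = SF * (BW / 2**SF) * CR
    print(f"SF{SF:2d}: {rb:7.1f} bps")
```

This prints ~5469 bps at SF7 down to ~293 bps at SF12; each SF step roughly halves the rate.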

It's certainly possible to set up a LoRa device to transmit a packet of, say, 32 bytes and time how long the transmission takes, so you can measure the actual on-air data rate.

But perhaps give us some context and explain why you are asking the question?

Don't pay too much attention to the bit rates quoted in the LoRa calculators.

The quoted bit rate for SF12 at BW 125000 is 293 bps, but a real-world 32-byte packet has an on-air time of 1581 ms, so the achieved data rate is only 161 bps, due to preamble, headers, CRC etc.
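If you want to reproduce numbers like that yourself, here is a sketch of the time-on-air calculation from the Semtech SX127x datasheet. The radio settings chosen as defaults below (6-symbol programmed preamble, explicit header, CRC on, low-data-rate optimisation off) are assumptions that happen to match the figures quoted above; many stacks use an 8-symbol preamble and enable low-data-rate optimisation at SF11/12, which lengthens the packet:

```python
import math

# Time on air per the SX127x datasheet formula.
def time_on_air(payload_bytes, sf=12, bw=125_000, cr=1,
                preamble=6, explicit_header=True, crc=True, ldro=False):
    ih = 0 if explicit_header else 1          # implicit header flag
    de = 1 if ldro else 0                     # low data rate optimise flag
    t_sym = 2**sf / bw                        # symbol duration, seconds
    t_preamble = (preamble + 4.25) * t_sym    # preamble incl. sync overhead
    num = 8 * payload_bytes - 4 * sf + 28 + 16 * int(crc) - 20 * ih
    n_payload = 8 + max(math.ceil(num / (4 * (sf - 2 * de))) * (cr + 4), 0)
    return t_preamble + n_payload * t_sym

toa = time_on_air(32)
print(f"time on air : {toa * 1000:.0f} ms")   # -> 1581 ms
print(f"achieved    : {32 * 8 / toa:.1f} bps")  # -> 161.9 bps
```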

Bit rate is irrelevant in the world of IoT - we pass data messages a few times an hour/day/week.

Bit rate is relevant if you are trying to move a file - even 1024 bytes - which would realistically need at least 5 uplinks, at which point the duty cycle restrictions and, if using TTN, the Fair Use Policy, would make the effective rate pitiful - 5 uplinks spaced 3 minutes apart = 12 minutes = about 1.4 bytes (11 bits) per second.
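The back-of-envelope version, with the 3-minute spacing as an assumption standing in for whatever the duty cycle / FUP wait works out to:

```python
# 1024 bytes split across 5 uplinks, spaced ~3 minutes apart.
file_bytes = 1024
uplinks = 5
spacing_s = 3 * 60                        # assumed wait between uplinks
elapsed_s = (uplinks - 1) * spacing_s     # 4 gaps between 5 uplinks = 12 min

print(f"elapsed    : {elapsed_s / 60:.0f} minutes")
print(f"throughput : {file_bytes / elapsed_s:.1f} bytes/s "
      f"({file_bytes * 8 / elapsed_s:.0f} bps)")
```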

As @LoRaTracker says, context is everything. If this is a student question then it's a good discussion; if it's in a business context, we'll need to know what your use case is.

It's just that in some academic papers I see the concept of 'energy efficiency', calculated as (bit rate * packet delivery ratio) / transmission power, with the unit bit/MJ. So I am wondering how to obtain this value in a real experiment, given the confusing data rate.

Like I mentioned, it's fairly easy to measure the actual transmit time of a packet, and the LoRa calculators will tell you what the transmit time should be anyway.
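Once you have that transmit time, the energy-efficiency figure from those papers falls out in a few lines. Every value below is a placeholder assumption to replace with your own measurements, and note the ratio (bps * PDR) / watts comes out in bits per joule - scale it to whichever energy unit the paper uses:

```python
# Energy efficiency = (achieved bit rate * packet delivery ratio) / TX power.
payload_bits = 32 * 8    # bits per packet
toa_s = 1.581            # measured (or calculated) time on air, seconds
pdr = 0.95               # measured packet delivery ratio, assumed here
tx_power_w = 0.10        # assumed radio power draw while transmitting, watts
                         # (measure the DC draw, not just the RF output)

achieved_bps = payload_bits / toa_s
energy_per_packet_j = tx_power_w * toa_s
efficiency_bits_per_j = achieved_bps * pdr / tx_power_w

print(f"achieved data rate : {achieved_bps:.1f} bps")
print(f"energy per packet  : {energy_per_packet_j * 1000:.0f} mJ")
print(f"efficiency         : {efficiency_bits_per_j:.0f} bits/J")
```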

But data rate is not really an issue as such for TTN; the amount of data being moved around is very small.

There is a LoRa library that has examples of moving large blocks of data (files/images) around with LoRa, but discussions around that are out of scope for this TTN forum.