You cannot change that. But why would you? Just decode it, and you’ll get the binary data, exactly as the node has sent it.
Like, for example, the 32 bits binary data 00001001 10100110 00001100 11010010₂:

- cannot be printed on the screen as plain text, as in the ASCII standard binary 00001001₂ is the tab character, 00001100₂ is a form feed, and 10100110₂ and 11010010₂ are too large and not defined (but in ISO/IEC 8859-1 could be ¦ and Ò, or in ISO/IEC 8859-2 would be Ś and Ň, and in ISO/IEC 8859-5 be І and в; even for “plain” text, one would need to agree on the exact encoding that is used)
- could be two unsigned 16 bits integer numbers 2470 and 3282, when assuming MSB (or 2470₁₀, or 2470dec, to make it very explicit it’s decimal)
- could be four unsigned 8 bits integer numbers 9, 166, 12 and 210
- could be four signed 8 bits integer numbers 9, -90, 12 and -46
- could be a single 32 bits integer number 161877202 for MSB, or either 3524044297 or -770922999 for LSB
- could be a 32 bits floating point value 3.997510e-33, or 0.0000000000000000000000000000000039975
- could be any combination of the above, like one 16 bits signed integer, an 8 bits unsigned integer, and a signed 8 bits integer, and in any order
- could be written as hexadecimal 09A60CD2₁₆, 09a60cd2₁₆, 09a60cd2hex, or 0x09A60CD2, and so on (much shorter and easier to read than ones and zeroes, isn’t it?)
- could also be written in Base64 as CaYM0g==, and that is what TTN uses
- …and could be written in many other ways in many more encodings (a few of these interpretations are verified in the sketch below)
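Just to see that these are indeed all readings of the very same four bytes, one could run something like the following Python 3 sketch; this is only an illustration using Python’s standard struct and base64 modules, any language will do:

import base64
import struct

# The four example bytes 0x09 0xA6 0x0C 0xD2
data = bytes([0b00001001, 0b10100110, 0b00001100, 0b11010010])

print(struct.unpack(">HH", data))    # (2470, 3282): two unsigned 16 bits integers, MSB first
print(struct.unpack(">BBBB", data))  # (9, 166, 12, 210): four unsigned 8 bits integers
print(struct.unpack(">bbbb", data))  # (9, -90, 12, -46): four signed 8 bits integers
print(struct.unpack(">I", data))     # (161877202,): one unsigned 32 bits integer, MSB first
print(struct.unpack("<I", data))     # (3524044297,): one unsigned 32 bits integer, LSB first
print(struct.unpack("<i", data))     # (-770922999,): one signed 32 bits integer, LSB first
print(struct.unpack(">f", data))     # (approx. 3.99751e-33,): one 32 bits float, MSB first
print(data.hex())                    # 09a60cd2
print(base64.b64encode(data))        # b'CaYM0g=='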
But no matter how you print it on the screen, when being transmitted and in your computer’s memory it’s still the binary ones and zeroes 00001001 10100110 00001100 11010010₂.
Decoding the Base64 value will give you the binary data again, which you can then decode into, for example, the two integer numbers 2470 and 3282 (which could be temperature and humidity).
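To see that the notations really are just different spellings of the same data, here is a tiny Python 3 check (again only an illustration) comparing the hexadecimal, decimal and Base64 notations from the list above:

import base64

# Three different "spellings" of the very same four bytes
from_hex    = bytes.fromhex("09a60cd2")
from_int    = (161877202).to_bytes(4, "big")   # MSB first
from_base64 = base64.b64decode("CaYM0g==")

print(from_hex == from_int == from_base64)     # True
print(from_base64)                             # b'\t\xa6\x0c\xd2', \t being the tab character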
Like, to see the hexadecimal representation of the Base64 encoded binary data, on a Mac one could type:
echo -n CaYM0g== | base64 --decode | xxd
0000000: 09a6 0cd2 ....
or:
echo -n CaYM0g== | base64 --decode | hexdump -C
00000000 09 a6 0c d2 |....|
00000004
…or to see it in bits:
echo -n CaYM0g== | base64 --decode | xxd -b
0000000: 00001001 10100110 00001100 11010010 ....
The 4 dots .... indicate that the Mac cannot print the given bytes as plain text. That would be different if your node were to send plain text (which it should never do):
echo -n MjQ3MDszMjgz | base64 --decode | xxd
0000000: 3234 3730 3b33 3238 33 2470;3283
or:
echo -n MjQ3MDszMjgz | base64 --decode | hexdump -C
00000000 32 34 37 30 3b 33 32 38 33 |2470;3283|
00000009
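In an application, rather than on the command line, such a plain-text payload could be handled like this; a Python 3 sketch just for illustration, where the names temperature and humidity are made up:

import base64

# Decode the Base64-encoded plain-text payload and split it on the semicolon
text = base64.b64decode("MjQ3MDszMjgz").decode("ascii")    # '2470;3283'
temperature, humidity = (int(value) for value in text.split(";"))
print(temperature, humidity)                               # 2470 3283

Note that the text takes 9 bytes, where two 16 bits integers only take 4; that alone is a good reason not to send plain text.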
How the Base64 decoding is done in your application depends on the programming language you’re using; Google Search is your friend for that.
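For example, a minimal Python 3 sketch, assuming the node sends the two unsigned 16 bits MSB integers from the examples above (the function name decode_payload is made up):

import base64
import struct

def decode_payload(b64_payload):
    # Base64 to the raw 4 bytes, then to two unsigned 16 bits integers, MSB first
    raw = base64.b64decode(b64_payload)
    temperature, humidity = struct.unpack(">HH", raw)
    return temperature, humidity

print(decode_payload("CaYM0g=="))   # (2470, 3282)

What the two numbers actually mean, and whether they need any scaling or offset, is defined by the node’s firmware; decoding only gives you back the bytes the node has sent.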