You cannot change that. But why would you? Just decode it, and you'll get the binary data, exactly as the node has sent it.
For example, the 32-bit binary data
00001001 10100110 00001100 11010010₂:
cannot be printed on the screen as plain text, as binary
00001001₂ is the tab character, and bytes such as 10100110₂ (166) are too large for the 7-bit ASCII standard
could be two 16-bit decimal numbers, 2470 and 3282
(or 2470dec and 3282dec to make it very explicit they're decimal)
could be a single 32-bit value, 161877202
could be written as hexadecimal
0x09A60CD2 and so on (much shorter and easier to read than ones and zeroes, isn't it?)
could also be written in Base64 as
CaYM0g==, and that is what TTN uses
...and could be written in many other ways in many more encodings
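In code, one could render the same four bytes in several of these notations; a minimal Python sketch:

```python
import base64

# The four example bytes, written out in binary for clarity
data = bytes([0b00001001, 0b10100110, 0b00001100, 0b11010010])

print(data.hex())                        # hexadecimal: 09a60cd2
print(int.from_bytes(data, "big"))       # one 32-bit value: 161877202
print(base64.b64encode(data).decode())   # Base64: CaYM0g==
```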
But no matter how you print it on the screen, when being transmitted and in your computer's memory it's still the binary ones and zeroes
00001001 10100110 00001100 11010010₂.
Decoding the Base64 value will give you the binary data again, which you can then decode into, for example, the two decimal numbers 2470 and 3282 (which could be temperature and humidity).
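In Python, for example, one could decode the Base64 string and unpack the two big-endian 16-bit numbers (assuming that is indeed the payload format your node uses):

```python
import base64
import struct

payload = base64.b64decode("CaYM0g==")   # gives b'\x09\xa6\x0c\xd2'

# ">HH" = two big-endian unsigned 16-bit integers
temperature, humidity = struct.unpack(">HH", payload)
print(temperature, humidity)             # 2470 3282
```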
For example, to see the hexadecimal representation of the Base64-encoded binary data, on a Mac one could type:
echo -n CaYM0g== | base64 --decode | xxd
0000000: 09a6 0cd2 ....
echo -n CaYM0g== | base64 --decode | hexdump -C
00000000 09 a6 0c d2 |....|
...or to see it in bits:
echo -n CaYM0g== | base64 --decode | xxd -b
0000000: 00001001 10100110 00001100 11010010 ....
The four dots .... indicate that the Mac cannot print the given bytes as plain text. That would be different if your node were to send plain text (which it should never do):
echo -n MjQ3MDszMjgy | base64 --decode | xxd
0000000: 3234 3730 3b33 3238 32 2470;3282
echo -n MjQ3MDszMjgy | base64 --decode | hexdump -C
00000000 32 34 37 30 3b 33 32 38 32 |2470;3282|
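One reason a node should not send plain text: it wastes airtime. A quick Python comparison of the same two readings packed as binary versus sent as text:

```python
import struct

# The same two readings, once packed as binary, once as plain text
binary = struct.pack(">HH", 2470, 3282)  # two big-endian uint16: 4 bytes
text = "2470;3282".encode()              # the same values as text: 9 bytes

print(len(binary), len(text))            # 4 9
```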
How the Base64 decoding is done in your application depends on the programming language you're using; Google Search is your friend for that.