I am currently examining an Android application that uses AudioRecord to record audio in 16-bit PCM format:
byte[] buffer = new byte[1600];
audioRecord.read(buffer, 0, 1600);
It stores the recorded audio into buffer. The documentation of this read overload says it should only be used with 8-bit PCM. However, the application uses it with 16-bit PCM, and it seems to work without issues; another overloaded read variant that also takes a byte array mentions that using 16-bit PCM with it is possible, but deprecated.
Now I am unsure whether each sample (consisting of 2 bytes) is stored in little-endian or big-endian format. The documentation section about audio encoding says that using a ByteBuffer instead of a byte array results in native endianness (instead of big endian).
I suspect that each short is stored in big-endian format, but I cannot find evidence for this.
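One way to check this empirically would be to decode the captured buffer under the little-endian assumption and listen to (or inspect) the result. A minimal decoding sketch, with a made-up class name and sample data rather than the app's actual code:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class PcmDecode {
    // Interpret a 16-bit PCM byte buffer as shorts, assuming little-endian
    // byte order (low byte first). Class and method names are made up.
    static short[] toShorts(byte[] pcm) {
        short[] out = new short[pcm.length / 2];
        ByteBuffer.wrap(pcm)
                  .order(ByteOrder.LITTLE_ENDIAN)
                  .asShortBuffer()
                  .get(out);
        return out;
    }

    public static void main(String[] args) {
        // Sample 0x1234 stored little-endian: 0x34 comes first.
        byte[] pcm = { (byte) 0x34, (byte) 0x12 };
        System.out.println(toShorts(pcm)[0]); // prints 4660 (0x1234)
    }
}
```

If the decoded samples only make sense with ByteOrder.LITTLE_ENDIAN, that would settle the question for the device at hand.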
I have the same situation in my recent project.
Method
public int read (byte[] audioData,
int offsetInBytes,
int sizeInBytes)
actually calls
public int read (byte[] audioData,
int offsetInBytes,
int sizeInBytes,
int readMode)
with AudioRecord.READ_BLOCKING as the fourth parameter. Note that it is only the use of 16-bit PCM with this byte[] overload that is deprecated, not 16-bit PCM itself, so I think it's still OK to use this method, just not recommended.
The read method ultimately calls the native method native_read_in_byte_array to fill the audio buffer. Android runs on little-endian hardware (ARM, x86), so native_read_in_byte_array stores the audio data in little-endian order at the C/C++ layer and then sends it back to the Java layer through JNI.
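This is also consistent with what the JDK itself reports: a fresh ByteBuffer in Java always defaults to big-endian, while ByteOrder.nativeOrder() reflects the hardware, which is little-endian on the devices Android runs on. A quick check:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class OrderCheck {
    public static void main(String[] args) {
        // Java's ByteBuffer always defaults to big-endian, no matter
        // what the underlying hardware uses.
        System.out.println(ByteBuffer.allocate(2).order()); // prints BIG_ENDIAN
        // The platform's native order; LITTLE_ENDIAN on the ARM and x86
        // hardware Android runs on.
        System.out.println(ByteOrder.nativeOrder());
    }
}
```

That is why the documentation's remark about ByteBuffer matters: a ByteBuffer view lets you set the order explicitly, whereas raw byte[] data carries whatever order the native layer wrote.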
I did a quick test and found that a byte buffer passed through JNI does not change the order in which its bytes are stored: you get your Java byte[] in the same order as the C/C++ jbyteArray. So a short written by native code is still stored in little-endian order when it reaches the Java layer.
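So to reassemble the samples on the Java side, each pair of bytes should be combined low byte first. A minimal sketch, assuming the little-endian layout described above (sample values are made up):

```java
public class ManualDecode {
    // Assemble one 16-bit sample from two consecutive bytes, low byte
    // first, matching the little-endian layout described above.
    static short sampleAt(byte[] buf, int i) {
        return (short) ((buf[i] & 0xFF) | (buf[i + 1] << 8));
    }

    public static void main(String[] args) {
        byte[] buf = { (byte) 0xFF, (byte) 0x7F }; // 0x7FFF
        System.out.println(sampleAt(buf, 0)); // prints 32767
    }
}
```

The `& 0xFF` mask matters: without it, a negative low byte would sign-extend and corrupt the high byte during the OR.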
That's all I can reason from my test. Hope it helps and let me know if something is wrong in there.