
How to use h264 live stream with websocket?

Tags:

websocket

Most WebSocket streaming examples I have seen use either MP4 or WebM container data. Here is some sample JavaScript client code:

var ms = new MediaSource();
...
var buf = ms.addSourceBuffer('video/mp4; codecs="avc1.64001E"');

In my case, my server sends raw H.264 data (video only, no audio). As there is no MP4/AVC container around my data, I am wondering what the proper way is to define the parameter for addSourceBuffer(). Do I simply omit the video/mp4 tag, as follows? Regards.

var buf = ms.addSourceBuffer('codecs="avc1.64001E"');
Asked Oct 15 '25 by Peter
1 Answer

I worked on an H.264 player based on MediaSource several months ago. I didn't expect to get upvotes so long after posting the original answer, and I think I should edit this post to be more helpful. BTW I'm not a pro; this post is just based on my experience of using the MediaSource API. Comments are welcome to correct me. Thanks!

var buf = ms.addSourceBuffer('video/mp4; codecs="avc1.64001E"');

After buf is created, I think buf expects a fragmented MP4 (fMP4) data chunk each time SourceBuffer.appendBuffer is called.

However, you passed raw H.264 data to it, which I think should make the browser throw an exception.
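Even with correct fMP4 chunks, appendBuffer has a timing pitfall worth noting: it throws if called while the SourceBuffer is still processing a previous append (while its updating property is true), so incoming WebSocket messages usually need to be queued. A minimal sketch, assuming one complete fragment per message (makeAppender is a hypothetical helper name, not part of the MediaSource API):

```javascript
// Sketch: queue incoming fMP4 chunks and append them one at a time.
// sourceBuffer is a SourceBuffer returned by MediaSource.addSourceBuffer().
function makeAppender(sourceBuffer) {
  const queue = [];
  function flush() {
    // appendBuffer throws if called while a previous append is pending,
    // so only append when the buffer is idle and data is waiting.
    if (!sourceBuffer.updating && queue.length > 0) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  }
  // "updateend" fires when a previous append has finished.
  sourceBuffer.addEventListener('updateend', flush);
  return function (chunk) {
    queue.push(chunk);
    flush();
  };
}
```

Your WebSocket onmessage handler then just calls the returned function with the received binary data.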

In my case, I used ffmpeg to read from an RTSP stream, convert the data to fMP4 format (without re-encoding), and send the output to stdout, then let another application send the data to the browser. (I used a WebSocket, in fact.)

Here are the ffmpeg parameters:

ffmpeg -i rtsp://example.com/ -an -c:v copy -f mp4 \
       -movflags +frag_keyframe+empty_moov+default_base_moof pipe:1

There's one more thing I want to share. I'm not sure about ffmpeg's internal behavior, but it doesn't output a complete fragment each time I read from stdout, so in my backend program I buffered the data first. Here's pseudocode in Java:

byte[] oldbuf = new byte[0];
byte[] buffer = readDataFromFfmpegStdout();
// A fragment starts with a "moof" box; its 4-byte type follows
// the 4-byte box size, i.e. at offsets 4..7 of the chunk.
if (buffer.length >= 8
        && buffer[4] == 'm' && buffer[5] == 'o'
        && buffer[6] == 'o' && buffer[7] == 'f') {
    send(oldbuf);    // the old buffer is a complete fragment now
    oldbuf = buffer; // start collecting the next fragment
} else {
    oldbuf = concat(oldbuf, buffer); // append the new data to the old buffer
}
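The byte check above assumes every chunk read from stdout begins exactly at a box boundary, which ffmpeg does not guarantee. A more robust approach is to walk the MP4 box headers, since each box starts with a 4-byte big-endian size followed by a 4-byte type, and split the stream at each moof. A sketch in JavaScript for a Node backend (splitFragments is a hypothetical helper; 64-bit extended box sizes are not handled):

```javascript
// Sketch: split a Buffer of fMP4 boxes into fragments, each starting at
// a "moof" box. Returns the complete fragments plus any trailing data
// that should be carried over and prepended to the next read.
function splitFragments(data) {
  const fragments = [];
  let fragStart = 0;
  let offset = 0;
  // Each MP4 box header: 4-byte big-endian size, then 4-byte ASCII type.
  while (offset + 8 <= data.length) {
    const size = data.readUInt32BE(offset);
    const type = data.toString('ascii', offset + 4, offset + 8);
    if (size < 8 || offset + size > data.length) break; // incomplete box
    if (type === 'moof' && offset > fragStart) {
      fragments.push(data.subarray(fragStart, offset));
      fragStart = offset;
    }
    offset += size;
  }
  return { fragments, rest: data.subarray(fragStart) };
}
```

The data before the first moof (the ftyp and moov boxes) comes out as its own chunk, which is convenient because the browser needs that initialization segment appended first anyway.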

[ORIGINAL ANSWER]

You may check out the project 131/h264-live-player on GitHub, which is based on mbebenita/Broadway, a JavaScript H.264 decoder.

The server-static.js example streams a raw H.264 video over WebSocket, and the client code renders it in a canvas. Clone that repo, follow the installation instructions, put your H.264 file in the samples folder, point video_path at your video file in server-static.js#L28, run node server-static.js, and you will see the video played in your browser.

Please be aware that Broadway only works with the H.264 baseline profile.

Answered Oct 17 '25 by IronBlood