Spydroid-ipcamera is an open-source Android app that streams the phone's camera and microphone to a browser or to VLC. Its website is here: http://code.google.com/p/spydroid-ipcamera/
It is a typical example cited on Stack Overflow when someone asks about streaming the camera of an Android phone.
Looking into its code:
(1) It builds two sockets as a pair: mSender, mReceiver
private LocalServerSocket mLss = null;
mLss = new LocalServerSocket("net.majorkernelpanic.librtp-" + mSocketId);
mReceiver = new LocalSocket();
mReceiver.connect( new LocalSocketAddress("net.majorkernelpanic.librtp-" + mSocketId ) );
mReceiver.setReceiveBufferSize(500000);
mSender = mLss.accept();
mSender.setSendBufferSize(500000);
(2) The output of the camera is written to mSender. In streaming/MediaStream.java:
// We write the output of the camera to a local socket instead of a file!
setOutputFile(mSender.getFileDescriptor());
(3) It builds another object, mPacketizer, which contains an RtpSocket member. mPacketizer takes its input stream from mReceiver, encapsulates the camera stream into RTP packets, and then sends the packets over the network.
// the packetizer encapsulates this stream in an RTP stream and sends it over the network
mPacketizer.setInputStream(mReceiver.getInputStream());
So the whole camera stream flow is:
camera stream --> mSender --> mReceiver --> mPacketizer, RtpSocket --> sent out over the network.
My question is: why does it need two sockets, mSender and mReceiver? Isn't one socket sufficient to mediate between the camera stream and the RtpSocket?
MediaRecorder lets you write the encoded camera stream to a file or to a file descriptor, which can belong to a socket. When you need a live feed for streaming, you hand MediaRecorder the file descriptor of one end of a local socket pair (mSender), so it writes the stream "to yourself", and you read the same data back from the other end (mReceiver) for further processing. A socket connection always has two endpoints, which is why both sockets are needed: mSender is the writing end given to MediaRecorder, and mReceiver is the reading end given to the packetizer.
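To make that concrete, here is a minimal sketch of the pattern, not the Spydroid code itself: the socket name "my.example.pipe", the class name and the encoder settings are made up, and camera permissions, preview surface and error handling are omitted.

import java.io.IOException;
import java.io.InputStream;

import android.media.MediaRecorder;
import android.net.LocalServerSocket;
import android.net.LocalSocket;
import android.net.LocalSocketAddress;

public class LocalSocketPipeSketch {

    public void stream() throws IOException {
        // 1. Build the socket pair: the receiver connects to the server socket,
        //    and accept() returns the other end of the connection, the sender.
        LocalServerSocket lss = new LocalServerSocket("my.example.pipe");
        LocalSocket receiver = new LocalSocket();
        receiver.connect(new LocalSocketAddress("my.example.pipe"));
        receiver.setReceiveBufferSize(500000);
        LocalSocket sender = lss.accept();
        sender.setSendBufferSize(500000);

        // 2. Point MediaRecorder at the sender end instead of a file.
        MediaRecorder recorder = new MediaRecorder();
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setOutputFile(sender.getFileDescriptor());
        recorder.prepare();
        recorder.start();

        // 3. Read the encoded stream back from the receiver end; a real
        //    packetizer would parse it and send RTP packets from here.
        InputStream in = receiver.getInputStream();
        byte[] buffer = new byte[4096];
        int n;
        while ((n = in.read(buffer)) != -1) {
            // hand buffer[0..n) to the packetizer / RtpSocket here
        }
    }
}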