I can capture with ffmpeg from a device, I can transcode the audio/video, I can stream it to ffserver.
How can I capture and stream with ffmpeg while showing locally what is captured?
Up to now I've been using VLC to capture and stream to localhost, then ffmpeg to get that stream, transcode it again, and stream to ffserver.
I would like to do this using ffmpeg only.
Option A: Use ffmpeg with multiple outputs and a separate player:
output 1: raw video piped to a local player (ffplay here)
output 2: transcode and send to the server
Example using ffplay:
ffmpeg -f x11grab [grab parameters] -i :0.0 \
[transcode parameters] -f [transcode output] \
-f rawvideo - | ffplay -f rawvideo [grab parameters] -i -
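As a concrete sketch of the placeholders above (the grab size, framerate, pixel format and UDP destination are example values I chose, not part of the answer; requires a running X server):

```shell
# Multi-output: one transcoded stream to the network,
# one raw stream piped to ffplay for local monitoring.
# Placeholder values: 1280x720 grab on display :0.0, local UDP sink.
ffmpeg -f x11grab -video_size 1280x720 -framerate 25 -i :0.0 \
  -c:v libx264 -preset ultrafast -f mpegts udp://127.0.0.1:1234 \
  -f rawvideo -pix_fmt yuv420p pipe:1 \
  | ffplay -f rawvideo -pixel_format yuv420p \
      -video_size 1280x720 -framerate 25 -i -
```

Note that the raw-video parameters given to ffplay must match the ones ffmpeg writes, since a raw stream carries no header describing itself.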
Option B: ffmpeg only, using the OpenGL output device in an SDL window (requires SDL and a build configured with --enable-opengl):
ffmpeg -f x11grab [grab parameters] -i :0.0 \
[transcode parameters] -f [transcode output] \
-f opengl "Window title"
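Filling in the placeholders with example values (again, the grab size, encoder settings and destination are assumptions, not from the original answer; needs an --enable-opengl build):

```shell
# Single ffmpeg process: transcode to the network and render a
# second output locally through the OpenGL output device.
# Placeholder values: 1280x720 grab, local UDP sink.
ffmpeg -f x11grab -video_size 1280x720 -framerate 25 -i :0.0 \
  -c:v libx264 -preset ultrafast -f mpegts udp://127.0.0.1:1234 \
  -f opengl "Local preview"
```

The advantage over Option A is that no second process or pipe is involved; the drawback is the extra build requirement.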
You can also use the shell's tee separately, which is more error-prone, but it's what worked for me (I couldn't get aergistal's solution to work):
cat file | tee >(program_1) [...] >(program_n) | destination
In this case:
ffmpeg -i rtsp://url -codec:a aac -b:a 192k -codec:v copy -f mpegts - | \
tee >(ffplay -f mpegts -i -) | \
ffmpeg -y -f mpegts -i - -c copy /path/to/file.mp4
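The same shell plumbing can be verified with ordinary commands before wiring in ffmpeg; here printf stands in for the capture, a line count for ffplay, and a plain file for the recording (all stand-in names are mine, and bash is assumed for the process substitution):

```shell
#!/usr/bin/env bash
# tee duplicates stdin: one copy goes to the process substitution
# (the ffplay stand-in), the other continues down the pipe
# (the second-ffmpeg stand-in).
printf 'frame1\nframe2\n' \
  | tee >(wc -l > /tmp/branch_count) \
  | cat > /tmp/main_copy
sleep 1  # process substitutions run asynchronously; let the branch flush
```

Once both "outputs" receive the data, the real commands can be substituted back in one at a time, which makes failures easier to localize.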
(Tested with ffmpeg 3.2.11, current in Debian stable.)