2. Live Streaming in Android
Ahmet Oğuz Mermerkaya
Software Developer
Aselsan
@mekya84
ahmetmermerkaya@gmail.com
3. Ahmet Oğuz Mermerkaya
● Author of an Android application programming book in
Turkish, Merhaba Android
● Member of GDG Ankara
● Software Developer at Aselsan, a defense industry
company in Türkiye
4. Outline
● How live streaming works
● Building FFmpeg (with x264 and fdk-aac)
● Using Vitamio MediaPlayer
● Implementing an RTSP Server
● Sending Previews and Audio From RTSP Server
● Receiving and Playing Audio On Client
5. How live streaming works
● RTSP (Real Time Streaming Protocol)
● RTP (Real-time Transport Protocol)
6. How live streaming works
● RTSP (Real Time Streaming Protocol)
● RTP (Real-time Transport Protocol)
(This is where FFmpeg comes into play.)
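For a sense of the control flow: the client negotiates the session over RTSP, and after PLAY the media itself travels over RTP. An illustrative, simplified exchange per RFC 2326 (exact headers vary by server):

C->S: DESCRIBE rtsp://server/live.ts RTSP/1.0
      CSeq: 1
S->C: RTSP/1.0 200 OK   (SDP describing the streams)
C->S: SETUP rtsp://server/live.ts RTSP/1.0
      CSeq: 2
      Transport: RTP/AVP;unicast;client_port=5000-5001
S->C: RTSP/1.0 200 OK   Session: 12345
C->S: PLAY rtsp://server/live.ts RTSP/1.0
      CSeq: 3
      Session: 12345
(RTP media packets now flow to client_port 5000)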
7. What is FFmpeg?
● Open-source, cross-platform multimedia framework
● Supports almost all codecs and formats (H.264, H.263,
AAC, AMR, MP4, MP3, AVI)
● Streams Audio and Video
ffmpeg.org
8. Building FFmpeg
● Download
o FFmpeg (http://ffmpeg.org)
o fdk-aac for the AAC encoder
(http://sourceforge.net/projects/opencore-amr/files/fdk-aac/)
o libx264 source code for the H.264 encoder
(http://www.videolan.org/developers/x264.html)
o Android NDK for cross-compiling (http://developer.android.com)
● Configure & Make
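A sketch of the configure step for a typical ARM build, assuming an NDK standalone toolchain; $TOOLCHAIN, $SYSROOT, and $PREFIX are placeholder paths, and x264/fdk-aac must already be cross-compiled and installed under $PREFIX:

./configure --target-os=linux --arch=arm --enable-cross-compile \
  --cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
  --sysroot=$SYSROOT \
  --enable-gpl --enable-nonfree \
  --enable-libx264 --enable-libfdk-aac \
  --extra-cflags="-I$PREFIX/include" --extra-ldflags="-L$PREFIX/lib"
make && make install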
10. Using Vitamio Player
● Get it from vitamio.org and extract
VideoView videoView = (VideoView) findViewById(R.id.videoView);
videoView.setVideoPath("rtsp://IP_OF_ANDROID_STREAM_SERVER:PORT/live.ts");
● To start Vitamio playback with a partial buffer, follow the instructions at http://vitamio.org/topics/104?locale=en
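A fuller sketch of the client side, assuming Vitamio's VideoView mirrors the framework android.widget.VideoView API (the io.vov.vitamio package names and the activity layout are assumptions):

import io.vov.vitamio.widget.MediaController;
import io.vov.vitamio.widget.VideoView;

public class PlayerActivity extends android.app.Activity {
    @Override
    protected void onCreate(android.os.Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_player); // layout containing the VideoView

        VideoView videoView = (VideoView) findViewById(R.id.videoView);
        videoView.setVideoPath("rtsp://IP_OF_ANDROID_STREAM_SERVER:PORT/live.ts");
        videoView.setMediaController(new MediaController(this));
        videoView.requestFocus();
        videoView.start(); // begins buffering, then playback
    }
}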
Then we need an RTSP server.
12. Sending Previews From RTSP Server
public void startVideo(String address, int port) throws IOException {
    // Raw NV21 preview frames are piped into FFmpeg's stdin; FFmpeg
    // re-scales, encodes with libx264, and streams the result over RTP.
    String videoCommand = "path/to/ffmpeg -analyzeduration 0 -pix_fmt nv21"
            + " -s 480x360 -vcodec rawvideo -f image2pipe -i - -s 320x240"
            + " -crf 18 -preset ultrafast -vcodec libx264 -f rtp"
            + " rtp://" + address + ":" + port;
    Process ffmpegVideoProcess = Runtime.getRuntime().exec(videoCommand);
    final OutputStream ostream = ffmpegVideoProcess.getOutputStream();

    getCamera().setPreviewCallback(new Camera.PreviewCallback() {
        public void onPreviewFrame(byte[] buffer, Camera cam) {
            try {
                ostream.write(buffer); // one raw NV21 frame per callback
                ostream.flush();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    });
}
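The slide assumes getCamera() hands back a camera that already produces the 480x360 NV21 frames declared to FFmpeg; a minimal sketch of that setup (getCamera() itself is not shown in the original, so this helper is an assumption):

private Camera camera;

private Camera getCamera() {
    if (camera == null) {
        camera = Camera.open();
        Camera.Parameters params = camera.getParameters();
        // Match the -pix_fmt nv21 -s 480x360 input declared to FFmpeg.
        params.setPreviewSize(480, 360);
        params.setPreviewFormat(ImageFormat.NV21); // default preview format on most devices
        camera.setParameters(params);
        // Most devices also require a visible preview surface:
        // camera.setPreviewDisplay(surfaceHolder);
        camera.startPreview();
    }
    return camera;
}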
13. Sending Audio From RTSP Server
public void startAudio(String address, int port) throws IOException {
    // Raw PCM from AudioRecord is piped into FFmpeg's stdin; FFmpeg
    // encodes it with libfdk_aac and sends ADTS packets over UDP.
    String audioCommand = "path/to/ffmpeg -analyzeduration 0 -f s16le"
            + " -ar 44100 -ac 1 -i - -ac 1 -acodec libfdk_aac -f adts"
            + " -vbr 3 udp://" + address + ":" + port + "/";
    Process ffmpegAudioProcess = Runtime.getRuntime().exec(audioCommand);
    final OutputStream ostream = ffmpegAudioProcess.getOutputStream();
    prepareAudioRecord();
    new Thread() {
        public void run() {
            try {
                while (true) {
                    int len = audioRecord.read(audioBuffer, 0, audioBuffer.length);
                    ostream.write(audioBuffer, 0, len);
                    ostream.flush();
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }.start();
}
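prepareAudioRecord() is not shown on the slide; a minimal sketch, assuming the 44.1 kHz mono 16-bit PCM format the FFmpeg command declares (the audioRecord and audioBuffer fields are taken from the loop above):

private AudioRecord audioRecord;
private byte[] audioBuffer;

private void prepareAudioRecord() {
    int minBufferSize = AudioRecord.getMinBufferSize(44100,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
    audioBuffer = new byte[minBufferSize];
    audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, 44100,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT,
            minBufferSize);
    audioRecord.startRecording(); // requires the RECORD_AUDIO permission
}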
14. Receiving Audio On Client
public void receiveAudio() throws IOException {
    // AAC packets arriving over UDP are piped into FFmpeg's stdin;
    // FFmpeg decodes them to raw PCM on its stdout.
    String audioCommand = "path/to/ffmpeg -analyzeduration 0 -f aac"
            + " -strict -2 -acodec aac -b:a 120k -ac 1 -i -"
            + " -ac 1 -acodec pcm_s16le -ar 44100 -f s16le -";
    Process ffmpegAudioProcess = Runtime.getRuntime().exec(audioCommand);
    final OutputStream ostream = ffmpegAudioProcess.getOutputStream();
    final DatagramSocket udpsocket = new DatagramSocket(PORT);
    final DatagramPacket packet = new DatagramPacket(new byte[2048], 2048);
    new Thread() {
        public void run() {
            try {
                while (true) {
                    udpsocket.receive(packet);
                    ostream.write(packet.getData(), 0, packet.getLength());
                    ostream.flush();
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }.start();
}
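The decoded PCM comes out on the FFmpeg process's standard output, which is what playAudio() on the next slide consumes; a plausible hook-up at the end of receiveAudio():

// Feed FFmpeg's decoded PCM output to the player (on a worker thread).
playAudio(ffmpegAudioProcess.getInputStream());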
15. Playing Audio On Client
public void playAudio(final InputStream istream) throws IOException {
    byte[] buffer = new byte[2048];
    int bufferSize = AudioTrack.getMinBufferSize(44100,
            AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
    AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
            AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
            bufferSize, AudioTrack.MODE_STREAM);
    audioTrack.play();
    int len;
    while ((len = istream.read(buffer, 0, buffer.length)) > 0) {
        audioTrack.write(buffer, 0, len); // blocks until the PCM is queued
    }
    audioTrack.stop();
    audioTrack.release();
}
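Note that playAudio() blocks on istream.read(), so like the receive loop it belongs on a background thread rather than the UI thread.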
17. Android Developer Days
● ~1500 participants expected
● Partner of Droidcon.com
● 15 co-organizers from 7 countries
● Free of charge
● This year: more inspiration, more networking, and more fun
● ADD 2012 web site -> www.androiddeveloperdays.com/2012
Date: June 14-15, 2013
Venue: METU, Ankara, Türkiye
www.androiddeveloperdays.com
18. Thank you for listening
Live Streaming in Android
Ahmet Oğuz Mermerkaya
@mekya84
ahmetmermerkaya@gmail.com