Broadcasting video with Android - without writing to local files
2010-10-15 09:09
One of the weaker points of the Android platform is the Media API.
Compared to the J2ME API, one important feature is missing: the ability to record to a stream and to play back from a stream.
Why is this important? There are a number of use cases.
For recording:
- post-processing audio/video data before writing it out to the file system
- broadcasting audio/video without first writing the data to the file system, which would also limit the broadcast length to the free space available on the device
For playback:
- pre-processing the audio/video data before playing it
- streaming with protocols that the built-in media player does not support
In this blog entry we will show a method to broadcast video (and
audio) from an Android phone to a network server, without writing to the
file system.
There is one promising method in the MediaRecorder class: setOutputFile(FileDescriptor).
We know that in Linux, network sockets also have file descriptors. But how can we access the file descriptor of a regular java.net.Socket?
Luckily, ParcelFileDescriptor comes to the rescue: its static fromSocket(Socket) method creates a ParcelFileDescriptor instance from a Socket object, and from that instance we can grab the badly needed FileDescriptor.
It all boils down to these few lines (error handling and the usual MediaRecorder setup omitted):

String hostname = "your.host.name";
int port = 1234;

// Connect to the receiving server
Socket socket = new Socket(InetAddress.getByName(hostname), port);

// Wrap the socket to obtain a FileDescriptor for it
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);

MediaRecorder recorder = new MediaRecorder();
// Additional MediaRecorder setup (sources, output format, encoders, etc.) omitted
recorder.setOutputFile(pfd.getFileDescriptor());
recorder.prepare();
recorder.start();
Using this concept we created a small proof of concept application
(together with an even simpler server), which is able to broadcast
videos not limited in length by the available space on the SD Card.
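The receiving side really can be simple. Here is a minimal sketch of such a server (the class and method names are our own, and a real server would of course handle multiple clients and errors): it accepts one connection and dumps everything the phone sends into a file for later processing.

```java
import java.io.FileOutputStream;
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;

// Hypothetical minimal receiving server: accepts a single connection and
// writes the raw incoming stream to a file, returning the byte count.
public class DumpServer {
    public static long dump(ServerSocket server, String outFile) throws Exception {
        long total = 0;
        try (Socket client = server.accept();
             InputStream in = client.getInputStream();
             FileOutputStream out = new FileOutputStream(outFile)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
                total += n;
            }
        }
        return total;
    }

    public static void main(String[] args) throws Exception {
        // Port 1234 matches the client snippet above
        try (ServerSocket server = new ServerSocket(1234)) {
            System.out.println("Received " + dump(server, "capture.3gp") + " bytes");
        }
    }
}
```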
There are a few gotchas if you want to try this out yourself:
The MediaRecorder records either in 3GPP or in MP4 format. These file formats consist of atoms, where each atom starts with its size. There are different kinds of atoms in a file; mdat atoms store the actual raw frames of the encoded video and audio. In the Cupcake version, Android starts writing out an mdat atom with the encoded frames, but it has to leave the size field of the atom empty, since the final size is not known until the recording ends. When writing to a seekable file descriptor, it can simply go back and fill in the size after the recording, but socket file descriptors are not seekable. So the received stream either has to be fixed up after the recording is finished, or the raw video/audio frames have to be processed by the server.
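To fix up the stream afterwards, the server has to walk the atom structure of the received bytes. A minimal sketch of reading one atom header (assuming the basic 32-bit size form; the class name is ours):

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Sketch of an MP4/3GPP atom header reader. Each atom starts with a
// 4-byte big-endian size followed by a 4-byte ASCII type code. A size
// of 0 in a streamed mdat atom is the blank the server must patch once
// the total length of the atom is known.
public class AtomHeader {
    public final long size;
    public final String type;

    public AtomHeader(long size, String type) {
        this.size = size;
        this.type = type;
    }

    public static AtomHeader read(byte[] data, int offset) {
        ByteBuffer buf = ByteBuffer.wrap(data, offset, 8);
        long size = buf.getInt() & 0xFFFFFFFFL;   // read as unsigned 32-bit
        byte[] typeBytes = new byte[4];
        buf.get(typeBytes);
        return new AtomHeader(size, new String(typeBytes, StandardCharsets.US_ASCII));
    }
}
```

Skipping `size` bytes from each atom's start walks the file atom by atom, which is how the server can locate the mdat atom whose size needs filling in.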
For some reason, the MediaRecorder also leaves the header of the file blank, which likewise has to be handled on the server.
High-latency connections will cause the video to be choppy, so some buffering is necessary. One method is to run a local mini server on the phone which receives the stream, buffers it, and sends it on to the remote server as fast as the network allows. However, if using native code is an option, we can simply create a pipe to receive the data from the MediaRecorder. We will show this method in a future blog entry.
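The local mini-server buffering idea can be sketched roughly like this (a simplified in-process version with a bounded in-memory queue; the class and its names are ours, and a production version would need to bound memory and handle errors):

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Sketch of the buffering relay: one thread reads chunks from the
// MediaRecorder's socket into a bounded queue, another drains the queue
// to the remote server as fast as the network allows.
public class BufferingRelay {
    private final BlockingQueue<byte[]> queue = new ArrayBlockingQueue<>(256);

    public void pump(InputStream from) throws Exception {
        byte[] buf = new byte[4096];
        int n;
        while ((n = from.read(buf)) != -1) {
            byte[] chunk = new byte[n];
            System.arraycopy(buf, 0, chunk, 0, n);
            queue.put(chunk);          // blocks when the buffer is full
        }
        queue.put(new byte[0]);        // empty chunk signals end of stream
    }

    public long drain(OutputStream to) throws Exception {
        long total = 0;
        while (true) {
            byte[] chunk = queue.take();
            if (chunk.length == 0) break;
            to.write(chunk);
            total += chunk.length;
        }
        to.flush();
        return total;
    }
}
```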