
Broadcasting video with Android - without writing to local files

2010-10-15 09:09
One of the weaker points of the Android platform is the Media API. When compared to the J2ME API, one important feature is missing: the ability to record to a stream and to play back from a stream.

Why is this important? There are a number of use cases.

For recording:

post-processing audio / video data before writing it out to the file system

broadcasting audio / video without first writing the data to the file system (writing to a file would also limit the broadcast length to the free space available on the device)

For playback:

pre-processing the audio / video data before playing

streaming using protocols that are not supported by the built-in media player

In this blog entry we will show a method to broadcast video (and
audio) from an Android phone to a network server, without writing to the
file system.

There is one promising method in the MediaRecorder class: setOutputFile(FileDescriptor).

We know that on Linux, network sockets also have file descriptors. But how can we access the file descriptor of a regular java.net.Socket?

Luckily, ParcelFileDescriptor comes to the rescue: we can use its static fromSocket(Socket) method to create a ParcelFileDescriptor instance from a Socket object. From this instance, we can grab the badly needed FileDescriptor.

It all boils down to these few lines (error handling omitted):

import java.net.InetAddress;
import java.net.Socket;

import android.media.MediaRecorder;
import android.os.ParcelFileDescriptor;

String hostname = "your.host.name";
int port = 1234;

// Open a TCP connection to the receiving server
Socket socket = new Socket(InetAddress.getByName(hostname), port);

// Wrap the socket to get access to its underlying FileDescriptor
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);

MediaRecorder recorder = new MediaRecorder();

// Additional MediaRecorder setup (output format ... etc.) omitted

// Hand the socket's file descriptor to the recorder instead of a file path
recorder.setOutputFile(pfd.getFileDescriptor());

recorder.prepare();
recorder.start();

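For reference, the MediaRecorder setup that the comment in the snippet above omits could look roughly like the following sketch. The chosen sources, output format, encoders and the previewHolder variable are only assumptions for illustration; use whatever your use case and device support.

recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);  // or MPEG_4
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263);
recorder.setPreviewDisplay(previewHolder.getSurface());          // previewHolder: a hypothetical SurfaceHolder for the camera preview

Note that the ordering matters: the sources have to be set before the output format, the encoders after it, and setOutputFile() has to be called after setOutputFormat() but before prepare().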

Using this concept we created a small proof of concept application (together with an even simpler server), which is able to broadcast video whose length is not limited by the free space available on the SD card.

There are a few gotchas if you want to try this out yourself:

The MediaRecorder records either in 3GPP or in MP4 format. Both formats consist of atoms, where each atom starts with its size. There are different kinds of atoms in a file; mdat atoms store the actual raw frames of the encoded video and audio. In the Cupcake version, Android starts writing out an mdat atom with the encoded frames, but it has to leave the size of the atom empty for obvious reasons. When writing to a seekable file descriptor, it can simply fill in the blanks after the recording, but socket file descriptors are of course not seekable. So the received stream has to be fixed up after the recording is finished, or the raw video / audio frames have to be processed by the server.

For some reason, the MediaRecorder also leaves the header of the file blank; this too has to be handled on the server.
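To make the atom structure concrete, here is a minimal, illustrative sketch (not part of the proof of concept) that walks the top-level atoms of a received stream that has been dumped to a file. An mdat atom whose size field is still zero is the one the phone could not seek back to patch.

import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

public class AtomWalker {
    public static void main(String[] args) throws IOException {
        DataInputStream in = new DataInputStream(new FileInputStream(args[0]));
        long offset = 0;
        while (in.available() >= 8) {
            long size = in.readInt() & 0xFFFFFFFFL;   // 32-bit big-endian atom size
            byte[] type = new byte[4];
            in.readFully(type);                       // four-character atom type, e.g. "ftyp", "moov", "mdat"
            System.out.println(offset + ": " + new String(type, "US-ASCII") + " size=" + size);
            if (size < 8) {
                break;                                // size 0 (unpatched) or 1 (64-bit size) needs special handling
            }
            long toSkip = size - 8;                   // jump over the atom body to the next header
            while (toSkip > 0) {
                int skipped = in.skipBytes((int) Math.min(toSkip, Integer.MAX_VALUE));
                if (skipped <= 0) break;
                toSkip -= skipped;
            }
            offset += size;
        }
        in.close();
    }
}

Pointed at a dump of the received stream, it prints each atom's offset, type and size, which makes the unpatched mdat easy to spot.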

High latency connections will cause the video to be choppy. Obviously some buffering is necessary. One method is to use a local mini server on the phone which receives the stream, buffers it, and sends it to the remote server as fast as the network allows (a sketch of this follows below). However, if using native code is an option, we can simply create a pipe to receive the data from the MediaRecorder. We will show this method in a future blog entry.
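As an illustration of the local mini server idea (our own sketch, not the code from the proof of concept), a background thread on the phone could accept the MediaRecorder's local connection, queue the incoming chunks in memory, and forward them to the remote server. The host name, ports, chunk size and the unbounded queue are placeholder choices.

import java.io.InputStream;
import java.io.OutputStream;
import java.net.InetAddress;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Sketch of a buffering relay: the MediaRecorder's socket connects to
// localhost:LOCAL_PORT instead of the remote server; a reader loop queues the
// incoming chunks in memory, and a writer thread drains the queue to the
// remote server as fast as the network allows.
public class BufferingRelay implements Runnable {

    static final int LOCAL_PORT = 8888;              // placeholder port on the phone
    static final String REMOTE_HOST = "your.host.name";
    static final int REMOTE_PORT = 1234;

    public void run() {
        final BlockingQueue<byte[]> queue = new LinkedBlockingQueue<byte[]>();
        try {
            ServerSocket listener =
                    new ServerSocket(LOCAL_PORT, 0, InetAddress.getByName("127.0.0.1"));
            final Socket local = listener.accept();  // connection coming from the MediaRecorder
            final Socket remote =
                    new Socket(InetAddress.getByName(REMOTE_HOST), REMOTE_PORT);

            // Writer thread: drain the queue to the remote server.
            Thread writer = new Thread() {
                public void run() {
                    try {
                        OutputStream out = remote.getOutputStream();
                        while (true) {
                            byte[] chunk = queue.take();
                            if (chunk.length == 0) break;   // empty chunk marks end of stream
                            out.write(chunk);
                        }
                        remote.close();
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            };
            writer.start();

            // Reader loop: pull chunks from the recorder as fast as it produces them.
            InputStream in = local.getInputStream();
            byte[] buf = new byte[8 * 1024];
            int n;
            while ((n = in.read(buf)) != -1) {
                byte[] chunk = new byte[n];
                System.arraycopy(buf, 0, chunk, 0, n);
                queue.put(chunk);
            }
            queue.put(new byte[0]);                  // signal end of stream to the writer
            local.close();
            listener.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

With something like this in place, the Socket in the earlier snippet would be opened to 127.0.0.1:8888 instead of the remote server, after starting the relay on its own Thread.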