

Real time H.264 Encoding and Decoding

Using FFmpeg and x264

Introduction

This documentation serves as a quick overview of the first development phase of our wireless video compression and transmission testbed, i.e., real-time H.264 encoding, transmission, and decoding using two open source projects, FFmpeg and x264.

FFmpeg is a comprehensive multimedia encoding and decoding library that supports numerous audio, video, and container formats. For H.264 video encoding, FFmpeg uses the external open source library x264. In our testbed, FFmpeg works as the encoding and decoding front-end, while x264 is the encoding engine. At the encoder side, frames are captured by a camera and encoded into an H.264 bitstream in real time. Encoded packets are transmitted over UDP to the receiver. At the receiver side, packets are received, decoded, and displayed in real time as well. We have also pruned FFmpeg so that the source code of unrelated codecs and formats has been removed from the package, which will make our future development easier.

In the following sections, we give a brief introduction to FFmpeg and x264, as well as the necessary external libraries, in Section 2. We also describe how to use our software to perform real-time H.264 encoding, transmission, and decoding in Section 3. This documentation is not intended to be comprehensive. Instead, we hope it provides enough information for interested readers to quickly set up and try out the software, or even to do their own development.

Source Packages and Libraries

x264

x264 is a free library for encoding H.264/AVC video streams. It is used by many other projects. Its official website is listed below.

Website: http://www.videolan.org/developers/x264.html
Download: git clone git://git.videolan.org/x264.git

The following commands are used to build the debug and release versions of x264. After a successful configuration, use "make" and "make install" to compile and install x264.

debug version: ./configure --disable-asm --enable-debug --enable-pic --enable-shared --disable-mp4-output

release version: ./configure --enable-pic --enable-shared --disable-mp4-output
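For reference, a complete sequence for fetching, building, and installing the release version might look like the following sketch (the install prefix and the use of sudo are assumptions that depend on your system):

git clone git://git.videolan.org/x264.git   # fetch the latest x264 source
cd x264
./configure --enable-pic --enable-shared --disable-mp4-output   # release configuration from above
make
sudo make install   # installs the headers and shared library system-wide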

FFmpeg

FFmpeg is a complete open source solution to record, convert, and stream audio and video. It includes libavcodec, the leading audio/video codec library used by several other projects, and libavformat, an audio/video container muxing and demuxing library. FFmpeg is developed under Linux, but it can be compiled on most operating systems, including Windows. For detailed information and downloads, please visit the following website.

Website: http://ffmpeg.mplayerhq.hu/
Download: git clone git://git.mplayerhq.hu/ffmpeg/

cd ffmpeg

git clone git://git.mplayerhq.hu/libswscale/

The following commands are used to build the debug and release versions of FFmpeg. Note that we must configure FFmpeg to use the x264 library. After a successful configuration, use "make" and "make install" to compile and install FFmpeg.

debug version: ./configure --enable-gpl --enable-libx264 --enable-swscale --enable-debug --disable-optimizations --disable-stripping

release version: ./configure --enable-gpl --enable-libx264 --enable-swscale
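As with x264, a complete release build might look like the following sketch; it assumes x264 is already installed where the compiler and linker can find it, and that sudo is available:

git clone git://git.mplayerhq.hu/ffmpeg/   # fetch FFmpeg
cd ffmpeg
git clone git://git.mplayerhq.hu/libswscale/   # libswscale is checked out inside the FFmpeg tree
./configure --enable-gpl --enable-libx264 --enable-swscale   # release configuration from above
make
sudo make install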

Git

Git is an open source version control system designed to handle very large projects with speed and efficiency, but it is just as well suited for small personal repositories. We need Git to download the FFmpeg and x264 source code.

Website: http://git.or.cz/
Download: http://www.kernel.org/pub/software/scm/git/
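Git itself is normally installed from the distribution's package manager rather than built from source; for example, on a Debian or Ubuntu system (an assumed platform):

sudo apt-get install git   # the package may be named git-core on older releases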

Subversion (SVN)

Subversion is an open source version control system. It is used to maintain current and historical versions of files such as source code, web pages, and documentation. Subversion is well known in the open source community and is used on many open source projects. We need Subversion to download Yasm and SDL.

Website: http://subversion.tigris.org/
Download: http://subversion.tigris.org/getting.html#source-release

Yasm

Yasm is a complete rewrite of the NASM assembler under the "new" BSD License (some portions are under other licenses; see COPYING for details). Both FFmpeg and x264 use Yasm.

Website: http://www.tortall.net/projects/yasm/
Download: svn co http://www.tortall.net/svn/yasm/trunk/yasm yasm
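A Yasm checkout from Subversion is built with the GNU autotools; a possible sequence is sketched below, assuming autoconf, automake, and related tools are already installed:

svn co http://www.tortall.net/svn/yasm/trunk/yasm yasm
cd yasm
./autogen.sh   # regenerates the configure script from the checkout
./configure
make
sudo make install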

Simple DirectMedia Layer (SDL)

Simple DirectMedia Layer is a cross-platform multimedia library designed to provide low-level access to audio, keyboard, mouse, joystick, 3D hardware via OpenGL, and the 2D video framebuffer. It is used by MPEG playback software, emulators, and many popular games. FFmpeg uses SDL for displaying decoded video content.

Website: http://www.libsdl.org/
Download: svn checkout http://svn.libsdl.org/branches/SDL-1.2
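Building the SDL 1.2 branch from the Subversion checkout follows the same autotools pattern (again a sketch; prerequisites vary by system):

svn checkout http://svn.libsdl.org/branches/SDL-1.2
cd SDL-1.2
./autogen.sh   # regenerates the configure script
./configure
make
sudo make install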

To Use FFmpeg and FFplay

Encoding

Encoding is done by the ffmpeg binary in our trimmed FFmpeg package. It calls the x264 library to encode frames captured by the camera into H.264 bitstreams. We have modified ffmpeg so that encoded bitstreams can be sent to the receiver over UDP packets, saved locally, or both. A common command line is shown below. In this example, the input to the encoder is camera-captured frames at CIF resolution and a frame rate of 10 fps. The output is an H.264 bitstream with a bitrate of 50 kbps. The intra frame interval is set to 100. The encoded bitstream is both sent to localhost on port number 12349 and saved as a local file out.264 in the ../bs directory.

./ffmpeg -g 100 -f video4linux -b 50k -s cif -r 10 -i /dev/video0 -vcodec libx264 -y -dest_ip 127.0.0.1 -dest_port 12349 -f h264 ../bs/out.264
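If the capture device on your machine is not /dev/video0, listing the available Video4Linux device nodes shows what to pass to the -i option:

ls /dev/video*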

The command below only sends the encoded bitstream over UDP packets:

./ffmpeg -g 100 -f video4linux -b 50k -s cif -r 10 -i /dev/video0 -vcodec libx264 -y -dest_ip 127.0.0.1 -dest_port 12349 -f h264

The command below only saves the encoded bitstream to a local file:

./ffmpeg -g 100 -f video4linux -b 50k -s cif -r 10 -i /dev/video0 -vcodec libx264 -y -f h264 ../bs/out.264

In Table 1 we list some frequently used options of ffmpeg.
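For quick reference, the options appearing in the ffmpeg examples above have the following meanings (descriptions follow standard ffmpeg usage plus the UDP-related options added by our modification):

-g: intra frame (GOP) interval
-f: force input or output format (video4linux for capture, h264 for the raw bitstream)
-b: target bitrate
-s: frame size (e.g. cif)
-r: frame rate in frames per second
-i: input device or file
-vcodec: video codec to use (libx264)
-y: overwrite output files without asking
-dest_ip: destination IP address for UDP transmission
-dest_port: destination UDP port number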

Decoding

Decoding is done by the ffplay binary in our trimmed FFmpeg package. It decodes the H.264 bitstream and displays it in real time. We have modified ffplay so that it can decode either a local H.264 file or UDP packets containing an H.264 bitstream, but not both simultaneously. The command below decodes the H.264 bitstream in UDP packets, assuming port number 12349 is used. Combined with the first ffmpeg command line example, this performs real-time H.264 encoding and decoding on the same PC.

./ffplay -dest_port 12349 -f h264
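As a worked example of that loopback setup, the receiver and sender can be run in two terminals on the same PC; starting ffplay first ensures no initial UDP packets are lost:

# terminal 1 (receiver)
./ffplay -dest_port 12349 -f h264

# terminal 2 (sender)
./ffmpeg -g 100 -f video4linux -b 50k -s cif -r 10 -i /dev/video0 -vcodec libx264 -y -dest_ip 127.0.0.1 -dest_port 12349 -f h264 ../bs/out.264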

The command below decodes a local H.264 bitstream, assuming the out.264 file is located in the ../bs directory.

./ffplay ../bs/out.264

Below we list some frequently used options and hot keys of ffplay.

Options:

-h: show help
-f: force format
-s: set frame size (WxH or abbreviation)
-dest_port: specify destination port number
-fs: force full screen
-x: force displayed width
-y: force displayed height

Hot keys:

q/ESC: quit
f: toggle full screen
p/SPC: pause

Last updated on Mar 9, 2009 by Qin Chen. Contact:
