0. Overview
MediaCodec is the hardware encode/decode API that Android exposes from API level 16 onward; the official English documentation, and a Chinese translation of it, can be consulted for reference. This article describes how to use MediaCodec to encode and decode video, ending with a complete example that encodes Camera preview data to H.264, then decodes that H.264 and displays it in a SurfaceView. Audio encoding and decoding are not covered.
1. Encoding Video with MediaCodec
The steps to encode video with MediaCodec are as follows:
1. Initialize MediaCodec. A codec can be created in two ways, by name or by type, with these factory methods:
MediaCodec createByCodecName(String name);
MediaCodec createEncoderByType(String type);
See the documentation for the available names and types (the decoder counterpart is createDecoderByType(String type)). Here we use the type-based method to create a video encoder.
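For example, an H.264 encoder can be created like this (MIME_TYPE is "video/avc", matching the sample code later in this article):
mMC = MediaCodec.createEncoderByType("video/avc");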
2. Configure MediaCodec. This step builds the MediaFormat, which carries the bit rate, frame rate, key-frame interval and so on. If the bit rate is too low, the picture will show mosaic-like blocking artifacts.
mMF = MediaFormat.createVideoFormat(MIME_TYPE, width, height);
mMF.setInteger(MediaFormat.KEY_BIT_RATE, bitrate);
mMF.setInteger(MediaFormat.KEY_FRAME_RATE, framerate);
if (mPrimeColorFormat != 0){
mMF.setInteger(MediaFormat.KEY_COLOR_FORMAT, mPrimeColorFormat);
}
mMF.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1); // key-frame interval, in seconds
mMC.configure(mMF, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
Here mPrimeColorFormat is the color format supported by the device, typically YUV420P or YUV420SP, while the Camera preview format is usually YV12 or NV21, so the frames must be converted before encoding; see the sample code at the end of this article. Code is the best teacher, after all.
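How mPrimeColorFormat is obtained is not shown above; a minimal sketch (using the API-16-era MediaCodecList/MediaCodecInfo API, and preferring planar YUV420 since that is what the conversion code below produces) might look like this:
private int selectColorFormat(String mimeType) {
    for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
        MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
        if (!info.isEncoder()) continue;
        for (String type : info.getSupportedTypes()) {
            if (!type.equalsIgnoreCase(mimeType)) continue;
            MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(mimeType);
            for (int colorFormat : caps.colorFormats) {
                if (colorFormat == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar) {
                    return colorFormat; // found an encoder that accepts I420
                }
            }
        }
    }
    return 0; // not found; the caller falls back to the codec default
}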
3. Start the encoder and obtain its input and output buffers
mMC.start();
mInputBuffers = mMC.getInputBuffers();
mOutputBuffers = mMC.getOutputBuffers();
4. Feed in data. This breaks down into the following sub-steps:
1) Dequeue an available input buffer and obtain its index
int inputbufferindex = mMC.dequeueInputBuffer(BUFFER_TIMEOUT);
If a buffer is available, this method returns its index; otherwise it returns -1. The parameter is a timeout in microseconds (not milliseconds): 0 returns immediately, a negative value waits indefinitely until a buffer becomes available, and a positive value waits for at most that many microseconds.
2) Put in the raw data
ByteBuffer inputBuffer = mInputBuffers[inputbufferindex];
inputBuffer.clear(); // clear the old contents to make room for the new data
inputBuffer.put(bytes, 0, len); // len is the length of the valid incoming data
mMC.queueInputBuffer(inputbufferindex, 0, len, timestamp, 0); // timestamp is in microseconds
Once queued, a buffer can only be used again after dequeueInputBuffer returns its index once more.
5. Retrieve the output. Feeding input and draining output are best done asynchronously, because queuing one frame does not mean the encoder immediately emits the corresponding encoded frame; several frames may go in before one comes out. Draining mirrors the input steps (a combined sketch follows this list):
1) Dequeue an available output buffer
The first parameter is a BufferInfo instance and the second is the timeout; a negative value waits indefinitely (one more reason not to do this on the main thread).
2) Read the output data
3) Release the buffer
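A minimal combined sketch of these three sub-steps (the fields mMC, mOutputBuffers, mBI and BUFFER_TIMEOUT follow the Encoder class shown later in this article):
int outputbufferindex = mMC.dequeueOutputBuffer(mBI, BUFFER_TIMEOUT); // 1) dequeue an output buffer
if (outputbufferindex >= 0) {
    ByteBuffer outputBuffer = mOutputBuffers[outputbufferindex];
    outputBuffer.position(mBI.offset); // 2) read only the valid region
    outputBuffer.limit(mBI.offset + mBI.size);
    byte[] encoded = new byte[mBI.size];
    outputBuffer.get(encoded);
    mMC.releaseOutputBuffer(outputbufferindex, false); // 3) return the buffer to the codec
} else if (outputbufferindex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
    mOutputBuffers = mMC.getOutputBuffers(); // the output buffer set changed; re-fetch it
}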
2. Decoding Video with MediaCodec
Decoding follows much the same steps as encoding; only the configuration differs:
1. Instantiate the decoder
2. Configure the decoder. Here a Surface for displaying the frames is passed to configure(), and the MediaFormat must carry the video's SPS and PPS (both contained in the first buffer the encoder outputs):
int[] width = new int[1];
int[] height = new int[1];
AvcUtils.parseSPS(sps, width, height); // parse the video width/height out of the SPS
mMF = MediaFormat.createVideoFormat(MIME_TYPE, width[0], height[0]);
mMF.setByteBuffer("csd-0", ByteBuffer.wrap(sps));
mMF.setByteBuffer("csd-1", ByteBuffer.wrap(pps));
mMF.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, width[0] * height[0]);
mMC.configure(mMF, surface, null, 0);
3. Start the decoder and obtain its input and output buffers
mMC.start();
mInputBuffers = mMC.getInputBuffers();
mOutputBuffers = mMC.getOutputBuffers();
4. Feed in data
1) Dequeue an available input buffer; the return value is its index:
int inputbufferindex = mMC.dequeueInputBuffer(BUFFER_TIMEOUT);
ByteBuffer inputBuffer = mInputBuffers[inputbufferindex];
inputBuffer.clear();
2) Then put in the data:
inputBuffer.put(bytes, 0, len);
mMC.queueInputBuffer(inputbufferindex, 0, len, timestamp, 0);
5. Retrieve the output. As with step 4, this should be done asynchronously. The procedure is essentially the same as for encoding above, except that when releasing the output buffer the second parameter must be set to true, which renders the decoded frame onto the Surface.
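A one-line sketch of that release call, reusing the index variable from the drain loop above:
mMC.releaseOutputBuffer(outputbufferindex, true); // true = render this buffer to the Surface passed to configure()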
3. An Encode/Decode Example
Below is a complete MediaCodec example: Camera preview frames (YV12) are encoded to H.264, and the encoded H.264 is then decoded and displayed in a SurfaceView.
3.1 The Layout
The layout is very simple: two SurfaceViews display the encoded and decoded images, two buttons start and stop the process, and a TextView shows the capture frame rate. The layout XML is not shown here; a sketch follows.
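For reference, a minimal sketch of such a layout. The view IDs match the ButterKnife bindings in MainActivity below; the vertical arrangement and button texts are assumptions:
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical">
    <SurfaceView
        android:id="@+id/surfaceView_encode"
        android:layout_width="match_parent"
        android:layout_height="0dp"
        android:layout_weight="1" />
    <SurfaceView
        android:id="@+id/surfaceView_decode"
        android:layout_width="match_parent"
        android:layout_height="0dp"
        android:layout_weight="1" />
    <Button
        android:id="@+id/btnStart"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Start" />
    <Button
        android:id="@+id/btnStop"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Stop" />
    <TextView
        android:id="@+id/capture"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content" />
</LinearLayout>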

3.2 The Encoder Class
package com.example.mediacodecpro;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.util.Log;
import java.io.IOException;
import java.nio.ByteBuffer;
/**
* Created by chuibai on 2017/3/10.<br />
*/
public class Encoder {
public static final int TRY_AGAIN_LATER = -1;
public static final int BUFFER_OK = 0;
public static final int BUFFER_TOO_SMALL = 1;
public static final int OUTPUT_UPDATE = 2;
private int format = 0;
private final String MIME_TYPE = "video/avc";
private MediaCodec mMC = null;
private MediaFormat mMF;
private ByteBuffer[] inputBuffers;
private ByteBuffer[] outputBuffers;
private long BUFFER_TIMEOUT = 0;
private MediaCodec.BufferInfo mBI;
/**
* Initialize the encoder
* @throws IOException thrown if creating the encoder fails
*/
public void init() throws IOException {
mMC = MediaCodec.createEncoderByType(MIME_TYPE);
format = MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar;
mBI = new MediaCodec.BufferInfo();
}
/**
* Configure the encoder: color format, frame rate, bit rate and video size
* @param width video width
* @param height video height
* @param bitrate video bit rate
* @param framerate video frame rate
*/
public void configure(int width,int height,int bitrate,int framerate){
if(mMF == null){
mMF = MediaFormat.createVideoFormat(MIME_TYPE, width, height);
mMF.setInteger(MediaFormat.KEY_BIT_RATE, bitrate);
mMF.setInteger(MediaFormat.KEY_FRAME_RATE, framerate);
if (format != 0){
mMF.setInteger(MediaFormat.KEY_COLOR_FORMAT, format);
}
mMF.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1); // key-frame interval, in seconds
}
mMC.configure(mMF,null,null,MediaCodec.CONFIGURE_FLAG_ENCODE);
}
/**
* Start the encoder and obtain the input/output buffers
*/
public void start(){
mMC.start();
inputBuffers = mMC.getInputBuffers();
outputBuffers = mMC.getOutputBuffers();
}
/**
* Feed data into the encoder; YUV420P (I420) data is expected here
* @param data the YUV data
* @param len length of the valid data
* @param timestamp presentation timestamp
* @return {@link #BUFFER_OK}, {@link #BUFFER_TOO_SMALL}, or the negative value returned by dequeueInputBuffer
*/
public int input(byte[] data,int len,long timestamp){
int index = mMC.dequeueInputBuffer(BUFFER_TIMEOUT);
Log.e("...","" + index);
if(index >= 0){
ByteBuffer inputBuffer = inputBuffers[index];
inputBuffer.clear();
if(inputBuffer.capacity() < len){
mMC.queueInputBuffer(index, 0, 0, timestamp, 0);
return BUFFER_TOO_SMALL;
}
inputBuffer.put(data,0,len);
mMC.queueInputBuffer(index,0,len,timestamp,0);
}else{
return index;
}
return BUFFER_OK;
}
/**
* Retrieve encoded output data
* @param data out: receives the encoded bytes
* @param len out: len[0] is set to the valid data length
* @param ts out: ts[0] is set to the presentation timestamp
* @return {@link #BUFFER_OK}, {@link #BUFFER_TOO_SMALL}, {@link #OUTPUT_UPDATE} or {@link #TRY_AGAIN_LATER}
*/
public int output(/*out*/byte[] data,/* out */int[] len,/* out */long[] ts){
int i = mMC.dequeueOutputBuffer(mBI, BUFFER_TIMEOUT);
if(i >= 0){
if(mBI.size > data.length) return BUFFER_TOO_SMALL;
outputBuffers[i].position(mBI.offset);
outputBuffers[i].limit(mBI.offset + mBI.size);
outputBuffers[i].get(data, 0, mBI.size);
len[0] = mBI.size ;
ts[0] = mBI.presentationTimeUs;
mMC.releaseOutputBuffer(i, false);
} else if (i == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
outputBuffers = mMC.getOutputBuffers();
return OUTPUT_UPDATE;
} else if (i == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
mMF = mMC.getOutputFormat();
return OUTPUT_UPDATE;
} else if (i == MediaCodec.INFO_TRY_AGAIN_LATER) {
return TRY_AGAIN_LATER;
}
return BUFFER_OK;
}
public void release(){
mMC.stop();
mMC.release();
mMC = null;
outputBuffers = null;
inputBuffers = null;
}
public void flush() {
mMC.flush();
}
}
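A minimal usage sketch of this class (yuvData is assumed to be one converted I420 frame; error handling is omitted, and init() can throw IOException):
Encoder encoder = new Encoder();
encoder.init(); // throws IOException on failure
encoder.configure(352, 288, 1500000, 30);
encoder.start();
// for each converted preview frame:
encoder.input(yuvData, yuvData.length, System.currentTimeMillis());
// drain everything that is ready:
byte[] outBuf = new byte[1024 * 1024];
int[] outLen = new int[1];
long[] outTs = new long[1];
while (encoder.output(outBuf, outLen, outTs) == Encoder.BUFFER_OK) {
    // outBuf[0 .. outLen[0]) holds one H.264 buffer (the first one contains SPS/PPS)
}
// when finished:
encoder.release();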
3.3 The Decoder Class
package com.example.mediacodecpro;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;
import java.nio.ByteBuffer;
/**
* Created by chuibai on 2017/3/10.<br />
*/
public class Decoder {
public static final int TRY_AGAIN_LATER = -1;
public static final int BUFFER_OK = 0;
public static final int BUFFER_TOO_SMALL = 1;
public static final int OUTPUT_UPDATE = 2;
private final String MIME_TYPE = "video/avc";
private MediaCodec mMC = null;
private MediaFormat mMF;
private long BUFFER_TIMEOUT = 0;
private MediaCodec.BufferInfo mBI;
private ByteBuffer[] mInputBuffers;
private ByteBuffer[] mOutputBuffers;
/**
* Initialize the decoder
* @throws IOException thrown if creating the decoder fails
*/
public void init() throws IOException {
mMC = MediaCodec.createDecoderByType(MIME_TYPE);
mBI = new MediaCodec.BufferInfo();
}
/**
* Configure the decoder
* @param sps the SPS NAL unit used for configuration (csd-0)
* @param pps the PPS NAL unit used for configuration (csd-1)
* @param surface the Surface on which decoded frames are rendered
*/
public void configure(byte[] sps, byte[] pps, Surface surface){
int[] width = new int[1];
int[] height = new int[1];
AvcUtils.parseSPS(sps, width, height); // parse the video width/height out of the SPS
mMF = MediaFormat.createVideoFormat(MIME_TYPE, width[0], height[0]);
mMF.setByteBuffer("csd-0", ByteBuffer.wrap(sps));
mMF.setByteBuffer("csd-1", ByteBuffer.wrap(pps));
mMF.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, width[0] * height[0]);
mMC.configure(mMF, surface, null, 0);
}
/**
* Start the decoder and obtain the input/output buffers
*/
public void start(){
mMC.start();
mInputBuffers = mMC.getInputBuffers();
mOutputBuffers = mMC.getOutputBuffers();
}
/**
* Feed input data
* @param data the data to decode
* @param len length of the valid data
* @param timestamp presentation timestamp
* @return {@link #BUFFER_OK} on success, otherwise {@link #TRY_AGAIN_LATER}
*/
public int input(byte[] data,int len,long timestamp){
int i = mMC.dequeueInputBuffer(BUFFER_TIMEOUT);
if(i >= 0){
ByteBuffer inputBuffer = mInputBuffers[i];
inputBuffer.clear();
inputBuffer.put(data, 0, len);
mMC.queueInputBuffer(i, 0, len, timestamp, 0);
}else {
return TRY_AGAIN_LATER;
}
return BUFFER_OK;
}
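/**
 * Drain one output buffer; the decoded frame is rendered to the configured Surface,
 * and if data is non-null the decoded bytes are also copied into it
 * @param data out: optionally receives the decoded bytes
 * @param len out: len[0] is set to the valid data length
 * @param ts out: ts[0] is set to the presentation timestamp
 * @return {@link #BUFFER_OK} on success, otherwise {@link #TRY_AGAIN_LATER}
 */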
public int output(byte[] data,int[] len,long[] ts){
int i = mMC.dequeueOutputBuffer(mBI, BUFFER_TIMEOUT);
if(i >= 0){
if (mOutputBuffers[i] != null)
{
mOutputBuffers[i].position(mBI.offset);
mOutputBuffers[i].limit(mBI.offset + mBI.size);
if (data != null)
mOutputBuffers[i].get(data, 0, mBI.size);
len[0] = mBI.size;
ts[0] = mBI.presentationTimeUs;
}
mMC.releaseOutputBuffer(i, true);
}else{
return TRY_AGAIN_LATER;
}
return BUFFER_OK;
}
public void flush(){
mMC.flush();
}
public void release() {
flush();
mMC.stop();
mMC.release();
mMC = null;
mInputBuffers = null;
mOutputBuffers = null;
}
}
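A matching usage sketch for this class (sps, pps and the encoded buffers are assumed to come from the Encoder above, as MainActivity below demonstrates):
Decoder decoder = new Decoder();
decoder.init(); // throws IOException on failure
decoder.configure(sps, pps, surfaceViewDecode.getHolder().getSurface());
decoder.start();
// for each encoded H.264 buffer:
decoder.input(h264Buf, h264Len, timestamp);
int[] len = new int[1];
long[] ts = new long[1];
while (decoder.output(null, len, ts) == Decoder.BUFFER_OK) {
    // each successful call renders one decoded frame to the Surface
}
// when finished:
decoder.release();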
3.4 MainActivity
package com.example.mediacodecpro;
import android.content.pm.ActivityInfo;
import android.graphics.ImageFormat;
import android.hardware.Camera;
import android.os.Bundle;
import android.os.Handler;
import android.os.Looper;
import android.os.Message;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.SurfaceView;
import android.view.View;
import android.widget.Button;
import android.widget.TextView;
import java.io.IOException;
import java.io.OutputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.Socket;
import java.nio.ByteBuffer;
import java.util.Iterator;
import java.util.LinkedList;
import java.util.Queue;
import butterknife.BindView;
import butterknife.ButterKnife;
public class MainActivity extends AppCompatActivity implements View.OnClickListener, Camera.PreviewCallback {
@BindView(R.id.surfaceView_encode)
SurfaceView surfaceViewEncode;
@BindView(R.id.surfaceView_decode)
SurfaceView surfaceViewDecode;
@BindView(R.id.btnStart)
Button btnStart;
@BindView(R.id.btnStop)
Button btnStop;
@BindView(R.id.capture)
TextView capture;
private int width;
private int height;
private int bitrate;
private int framerate;
private int captureFrame;
private Camera mCamera;
private Queue<PreviewBufferInfo> mPreviewBuffers_clean;
private Queue<PreviewBufferInfo> mPreviewBuffers_dirty;
private Queue<PreviewBufferInfo> mDecodeBuffers_clean;
private Queue<PreviewBufferInfo> mDecodeBuffers_dirty;
private int PREVIEW_POOL_CAPACITY = 5;
private int format;
private int DECODE_UNI_SIZE = 1024 * 1024;
private byte[] mAvcBuf = new byte[1024 * 1024];
private final int MSG_ENCODE = 0;
private final int MSG_DECODE = 1;
private String TAG = "MainActivity";
private long mLastTestTick = 0;
private Object mAvcEncLock;
private Object mDecEncLock;
private Decoder mDecoder;
private Handler codecHandler;
private byte[] mRawData;
private Encoder mEncoder;
private CodecThread codecThread;
private DatagramSocket socket;
private DatagramPacket packet;
private byte[] sps_pps;
private byte[] mPacketBuf = new byte[1024 * 1024];
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
ButterKnife.bind(this);
// initialize parameters
initParams();
// set click listeners
btnStart.setOnClickListener(this);
btnStop.setOnClickListener(this);
}
/**
* Initialize parameters: frame rate, color format, bit rate, video size, etc.
*/
private void initParams() {
width = 352;
height = 288;
bitrate = 1500000;
framerate = 30;
captureFrame = 0;
format = ImageFormat.YV12;
mAvcEncLock = new Object();
mDecEncLock = new Object();
}
@Override
protected void onResume() {
if (getRequestedOrientation() != ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE) {
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
}
super.onResume();
}
@Override
public void onClick(View v) {
switch (v.getId()){
case R.id.btnStart:
mCamera = Camera.open(0);
initQueues();
initEncoder();
initCodecThread();
startPreview();
break;
case R.id.btnStop:
releaseCodecThread();
releaseEncoderAndDecoder();
releaseCamera();
releaseQueue();
break;
}
}
/**
* Release the buffer queues
*/
private void releaseQueue() {
if (mPreviewBuffers_clean != null){
mPreviewBuffers_clean.clear();
mPreviewBuffers_clean = null;
}
if (mPreviewBuffers_dirty != null){
mPreviewBuffers_dirty.clear();
mPreviewBuffers_dirty = null;
}
if (mDecodeBuffers_clean != null){
mDecodeBuffers_clean.clear();
mDecodeBuffers_clean = null;
}
if (mDecodeBuffers_dirty != null){
mDecodeBuffers_dirty.clear();
mDecodeBuffers_dirty = null;
}
}
/**
* Release the camera
*/
private void releaseCamera() {
if(mCamera != null){
mCamera.setPreviewCallbackWithBuffer(null);
mCamera.stopPreview();
mCamera.release();
mCamera = null;
}
}
private void releaseEncoderAndDecoder() {
if(mEncoder != null){
mEncoder.flush();
mEncoder.release();
mEncoder = null;
}
if(mDecoder != null){
mDecoder.release();
mDecoder = null;
}
}
private void releaseCodecThread() {
if (codecHandler == null) return; // guard against Stop being pressed before Start
codecHandler.getLooper().quit();
codecHandler = null;
codecThread = null;
}
private void initCodecThread() {
codecThread = new CodecThread();
codecThread.start();
}
/**
* Start the Camera preview
*/
private void startPreview() {
Camera.Parameters parameters = mCamera.getParameters();
parameters.setPreviewFormat(format);
parameters.setPreviewFrameRate(framerate);
parameters.setPreviewSize(width,height);
mCamera.setParameters(parameters);
try {
mCamera.setPreviewDisplay(surfaceViewEncode.getHolder());
} catch (IOException e) {
e.printStackTrace();
}
mCamera.setPreviewCallbackWithBuffer(this);
mCamera.startPreview();
}
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
// the preview data may be null
if(data == null) {
Log.e(TAG, "preview data is null");
return;
}
long curTick = System.currentTimeMillis();
if (mLastTestTick == 0) {
mLastTestTick = curTick;
}
if (curTick > mLastTestTick + 1000) {
setCaptureFPSTextView(captureFrame);
captureFrame = 0;
mLastTestTick = curTick;
} else
captureFrame++;
synchronized(mAvcEncLock) {
PreviewBufferInfo info = mPreviewBuffers_clean.poll(); //remove the head of queue
info.buffer = data;
info.size = getPreviewBufferSize(width, height, format);
info.timestamp = System.currentTimeMillis();
mPreviewBuffers_dirty.add(info);
if(mDecoder == null){
codecHandler.sendEmptyMessage(MSG_ENCODE);
}
}
}
private void setCaptureFPSTextView(int captureFrame) {
capture.setText("Current FPS: " + captureFrame);
}
private void initEncoder() {
mEncoder = new Encoder();
try {
mEncoder.init();
mEncoder.configure(width,height,bitrate,framerate);
mEncoder.start();
} catch (IOException e) {
e.printStackTrace();
}
}
/**
* Initialize the buffer queues
*/
private void initQueues() {
if (mPreviewBuffers_clean == null)
mPreviewBuffers_clean = new LinkedList<>();
if (mPreviewBuffers_dirty == null)
mPreviewBuffers_dirty = new LinkedList<>();
int size = getPreviewBufferSize(width, height, format);
for (int i = 0; i < PREVIEW_POOL_CAPACITY; i++) {
byte[] mem = new byte[size];
mCamera.addCallbackBuffer(mem); //ByteBuffer.array is a reference, not a copy
PreviewBufferInfo info = new PreviewBufferInfo();
info.buffer = null;
info.size = 0;
info.timestamp = 0;
mPreviewBuffers_clean.add(info);
}
if (mDecodeBuffers_clean == null)
mDecodeBuffers_clean = new LinkedList<>();
if (mDecodeBuffers_dirty == null)
mDecodeBuffers_dirty = new LinkedList<>();
for (int i = 0; i < PREVIEW_POOL_CAPACITY; i++) {
PreviewBufferInfo info = new PreviewBufferInfo();
info.buffer = new byte[DECODE_UNI_SIZE];
info.size = 0;
info.timestamp = 0;
mDecodeBuffers_clean.add(info);
}
}
/**
* Compute the size of a preview buffer
* @param width preview width
* @param height preview height
* @param format preview color format
* @return the size of one preview buffer in bytes
*/
private int getPreviewBufferSize(int width, int height, int format) {
int size = 0;
switch (format) {
case ImageFormat.YV12: {
int yStride = (int) Math.ceil(width / 16.0) * 16;
int uvStride = (int) Math.ceil((yStride / 2) / 16.0) * 16;
int ySize = yStride * height;
int uvSize = uvStride * height / 2;
size = ySize + uvSize * 2;
}
break;
case ImageFormat.NV21: {
float bytesPerPix = (float) ImageFormat.getBitsPerPixel(format) / 8;
size = (int) (width * height * bytesPerPix);
}
break;
}
return size;
}
private void swapYV12toI420(byte[] yv12bytes, byte[] i420bytes, int width, int height) {
// YV12 is Y + V + U while I420 (YUV420P) is Y + U + V, so copy the Y plane and swap the chroma planes
System.arraycopy(yv12bytes, 0, i420bytes, 0, width * height); // Y plane
System.arraycopy(yv12bytes, width * height + width * height / 4, i420bytes, width * height, width * height / 4); // U plane
System.arraycopy(yv12bytes, width * height, i420bytes, width * height + width * height / 4, width * height / 4); // V plane
}
private class PreviewBufferInfo {
public byte[] buffer;
public int size;
public long timestamp;
}
private class CodecThread extends Thread {
@Override
public void run() {
Looper.prepare();
codecHandler = new Handler() {
@Override
public void handleMessage(Message msg) {
switch (msg.what) {
case MSG_ENCODE:
int res = Encoder.BUFFER_OK;
synchronized (mAvcEncLock) {
if (mPreviewBuffers_dirty != null && mPreviewBuffers_clean != null) {
Iterator<PreviewBufferInfo> ite = mPreviewBuffers_dirty.iterator();
while (ite.hasNext()) {
PreviewBufferInfo info = ite.next();
byte[] data = info.buffer;
int data_size = info.size;
if (format == ImageFormat.YV12) {
if (mRawData == null || mRawData.length < data_size) {
mRawData = new byte[data_size];
}
swapYV12toI420(data, mRawData, width, height);
} else {
Log.e(TAG, "preview size MUST be YV12, cur is " + format);
mRawData = data;
}
res = mEncoder.input(mRawData, data_size, info.timestamp);
if (res != Encoder.BUFFER_OK) {
// Log.e(TAG, "mEncoder.input, maybe wrong:" + res);
break; // the remaining buffers shouldn't go into the encoder if the previous one hit a problem
} else {
ite.remove();
mPreviewBuffers_clean.add(info);
if (mCamera != null) {
mCamera.addCallbackBuffer(data);
}
}
}
}
}
while (res == Encoder.BUFFER_OK) {
int[] len = new int[1];
long[] ts = new long[1];
synchronized (mAvcEncLock) {
res = mEncoder.output(mAvcBuf, len, ts);
}
if (res == Encoder.BUFFER_OK) {
// send the H.264 data over UDP
if(sps_pps != null){
send(len[0]);
}
if (mDecodeBuffers_clean != null && mDecodeBuffers_dirty != null) {
synchronized (mAvcEncLock) {
Iterator<PreviewBufferInfo> ite = mDecodeBuffers_clean.iterator();
if (ite.hasNext()) {
PreviewBufferInfo bufferInfo = ite.next();
if (bufferInfo.buffer.length >= len[0]) {
bufferInfo.timestamp = ts[0];
bufferInfo.size = len[0];
System.arraycopy(mAvcBuf, 0, bufferInfo.buffer, 0, len[0]);
ite.remove();
mDecodeBuffers_dirty.add(bufferInfo);
} else {
Log.e(TAG, "decoder uni buffer too small, need " + len[0] + " but has " + bufferInfo.buffer.length);
}
}
}
initDecoder(len);
}
}
}
codecHandler.sendEmptyMessageDelayed(MSG_ENCODE, 30);
break;
case MSG_DECODE:
synchronized (mDecEncLock) {
int result = Decoder.BUFFER_OK;
//STEP 1: handle input buffer
if (mDecodeBuffers_dirty != null && mDecodeBuffers_clean != null) {
Iterator<PreviewBufferInfo> ite = mDecodeBuffers_dirty.iterator();
while (ite.hasNext()) {
PreviewBufferInfo info = ite.next();
result = mDecoder.input(info.buffer, info.size, info.timestamp);
if (result != Decoder.BUFFER_OK) {
break; // the remaining buffers shouldn't go into the decoder if the previous one hit a problem
} else {
ite.remove();
mDecodeBuffers_clean.add(info);
}
}
}
int[] len = new int[1];
long[] ts = new long[1];
while (result == Decoder.BUFFER_OK) {
result = mDecoder.output(null, len, ts);
}
}
codecHandler.sendEmptyMessageDelayed(MSG_DECODE, 30);
break;
}
}
};
Looper.loop();
}
}
private void send(int len) {
try {
if(socket == null) socket = new DatagramSocket();
if(packet == null){
packet = new DatagramPacket(mPacketBuf,0,sps_pps.length + len);
packet.setAddress(InetAddress.getByName("192.168.43.1"));
packet.setPort(5006);
}
if(mAvcBuf[4] == 0x65){ // IDR frame (assuming a 4-byte start code): prepend the cached SPS/PPS
System.arraycopy(sps_pps,0,mPacketBuf,0,sps_pps.length);
System.arraycopy(mAvcBuf,0,mPacketBuf,sps_pps.length,len);
len += sps_pps.length;
}else{
System.arraycopy(mAvcBuf,0,mPacketBuf,0,len);
}
packet.setLength(len);
socket.send(packet);
} catch (IOException e) {
e.printStackTrace();
}
}
private void initDecoder(int[] len) {
if(sps_pps == null){ // cache the first output buffer (SPS + PPS) so it can be prepended to IDR frames
sps_pps = new byte[len[0]];
System.arraycopy(mAvcBuf,0,sps_pps,0,len[0]);
}
if(mDecoder == null){
mDecoder = new Decoder();
try {
mDecoder.init();
} catch (IOException e) {
e.printStackTrace();
}
byte[] sps_nal = null;
int sps_len = 0;
byte[] pps_nal = null;
int pps_len = 0;
ByteBuffer byteb = ByteBuffer.wrap(mAvcBuf, 0, len[0]);
//SPS
if (true == AvcUtils.goToPrefix(byteb)) {
int sps_position = 0;
int pps_position = 0;
int nal_type = AvcUtils.getNalType(byteb);
if (AvcUtils.NAL_TYPE_SPS == nal_type) {
Log.d(TAG, "OutputAvcBuffer, AVC NAL type: SPS");
sps_position = byteb.position() - AvcUtils.START_PREFIX_LENGTH - AvcUtils.NAL_UNIT_HEADER_LENGTH;
//PPS
if (true == AvcUtils.goToPrefix(byteb)) {
nal_type = AvcUtils.getNalType(byteb);
if (AvcUtils.NAL_TYPE_PPS == nal_type) {
pps_position = byteb.position() - AvcUtils.START_PREFIX_LENGTH - AvcUtils.NAL_UNIT_HEADER_LENGTH;
sps_len = pps_position - sps_position;
sps_nal = new byte[sps_len];
int cur_pos = byteb.position();
byteb.position(sps_position);
byteb.get(sps_nal, 0, sps_len);
byteb.position(cur_pos);
//slice
if (true == AvcUtils.goToPrefix(byteb)) {
nal_type = AvcUtils.getNalType(byteb);
int pps_end_position = byteb.position() - AvcUtils.START_PREFIX_LENGTH - AvcUtils.NAL_UNIT_HEADER_LENGTH;
pps_len = pps_end_position - pps_position;
} else {
pps_len = byteb.position() - pps_position;
//pps_len = byteb.limit() - pps_position + 1;
}
if (pps_len > 0) {
pps_nal = new byte[pps_len];
cur_pos = byteb.position();
byteb.position(pps_position);
byteb.get(pps_nal, 0, pps_len);
byteb.position(cur_pos);
}
} else {
//Log.d(log_tag, "OutputAvcBuffer, AVC NAL type: "+nal_type);
throw new UnsupportedOperationException("SPS is not followed by PPS, nal type :" + nal_type);
}
}
} else {
//Log.d(log_tag, "OutputAvcBuffer, AVC NAL type: "+nal_type);
}
//2. configure AVC decoder with SPS/PPS
if (sps_nal != null && pps_nal != null) {
int[] width = new int[1];
int[] height = new int[1];
AvcUtils.parseSPS(sps_nal, width, height);
mDecoder.configure(sps_nal, pps_nal,surfaceViewDecode.getHolder().getSurface());
mDecoder.start();
if (codecHandler != null) {
codecHandler.sendEmptyMessage(MSG_DECODE);
}
}
}
}
}
}
The send() method above streams the video captured and encoded on the phone to a PC, where it can be played with ffplay; when using ffplay, remember to add the parameter -analyzeduration 200000 to reduce latency.
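On the PC side, a command along the following lines should work (the exact invocation is an assumption: the port matches the 5006 that send() targets, and -f h264 tells ffplay the stream is raw H.264):
ffplay -analyzeduration 200000 -f h264 udp://0.0.0.0:5006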
4. Key Points
To sum up my experience, the main things to pay attention to when using MediaCodec are:
- Feed input and drain output asynchronously; do not block waiting for output data.
- When encoding Camera preview data, use the preview callback with buffers (setPreviewCallbackWithBuffer) to avoid a low frame rate.
- Use buffer queues where appropriate rather than allocating a new byte array for every frame, to avoid frequent GC.
- Do not do this work on the main thread. While learning I saw some people encode and decode directly in the preview callback, and even pass -1 to dequeueOutputBuffer, i.e. wait indefinitely for it to return.
5. Code Download
Some readers asked for the code. It was originally on CSDN, but I requested its removal because of the download-points system. While tidying up an old computer today I found the code again. It was written many years ago and is quite ugly; treat it as a casual read.
Github:https://github.com/sahooz/MediaCodecSample
At the time I intended to complete a whole Android multimedia series covering codecs, RTSP, RTMP, SDL and so on, but a job change shelved it. Perhaps I will pick it up again and finish it someday. Thanks for reading and for your comments.