
Recording MP4 with Camera, AudioRecord, MediaCodec, and MediaMuxer

I. Introduction

The earlier articles "AAC audio encoding, saving, decoding and playback" and "Camera video capture with H264 encoding and saving" covered how to record AAC audio with AudioRecord and MediaCodec, and how to record H264 video with Camera and MediaCodec. This article describes how to mux the two streams into an MP4 file with MediaMuxer.

MP4

As introduced in the audio/video development basics article, MP4 (also called MPEG-4) is a standard digital multimedia container format that can hold both audio data and video data. The video codec is commonly H264 or H265; the audio codec is usually AAC.

MediaMuxer

MediaMuxer is the Android API for producing a multimedia file that mixes audio and video. It supports only the following output formats:

public static final int MUXER_OUTPUT_MPEG_4 = 0;
public static final int MUXER_OUTPUT_WEBM = 1;
public static final int MUXER_OUTPUT_3GPP = 2;
public static final int MUXER_OUTPUT_HEIF = 3;
public static final int MUXER_OUTPUT_OGG = 4;

1. Initialization

mMediaMuxer = new MediaMuxer(path, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
           

path is the output path of the MP4 file.

2. Add the audio and video tracks

if (type == AAC_ENCODER) {
    mAudioTrackIndex = mMediaMuxer.addTrack(mediaFormat);
}
if (type == H264_ENCODER) {
    mVideoTrackIndex = mMediaMuxer.addTrack(mediaFormat);
}

           

The MediaFormat object passed in is obtained from the MediaCodec encoder.

3. Start muxing

mMediaMuxer.start();
           

4. Write sample data

mMediaMuxer.writeSampleData(avData.trackIndex, avData.byteBuffer, avData.bufferInfo);
           

5. Stop and release resources

mMediaMuxer.stop();
mMediaMuxer.release();
           

II. Recording MP4

The flow of recording MP4 with AudioRecord, Camera, MediaCodec, and MediaMuxer is shown in the figure below:

[Figure: recording flow with Camera, AudioRecord, MediaCodec, and MediaMuxer]
1. Audio recording

Audio capture uses AudioRecord:

public class AudioRecorder {

    private int mAudioSource;
    private int mSampleRateInHz;
    private int mChannelConfig;
    private int mAudioFormat;
    private int mBufferSizeInBytes;

    private AudioRecord mAudioRecord;
    private volatile boolean mIsRecording;
    private Callback mCallback;
    private byte[] mBuffer;

    public void setCallback(Callback callback) {
        mCallback = callback;
    }

    public interface Callback {
        void onAudioOutput(byte[] data);
    }

    public AudioRecorder(int audioSource, int sampleRateInHz, int channelConfig, int audioFormat, int bufferSizeInBytes) {
        mAudioSource = audioSource;
        mSampleRateInHz = sampleRateInHz;
        mChannelConfig = channelConfig;
        mAudioFormat = audioFormat;
        mBufferSizeInBytes = bufferSizeInBytes;

        mAudioRecord = new AudioRecord(audioSource, sampleRateInHz, channelConfig, audioFormat, bufferSizeInBytes);
        mIsRecording = false;
        int minBufferSize = AudioRecord.getMinBufferSize(sampleRateInHz, channelConfig, AudioFormat.ENCODING_PCM_16BIT);

        mBuffer = new byte[Math.min(2048, minBufferSize)];
    }

    public void start() {
        if (mIsRecording) {
            return;
        }
        new Thread(new Runnable() {
            @Override
            public void run() {
                onStart();
            }
        }).start();
    }

    public void onStart() {
        if (mAudioRecord == null) {
            mAudioRecord = new AudioRecord(mAudioSource, mSampleRateInHz, mChannelConfig, mAudioFormat, mBufferSizeInBytes);
        }
        mAudioRecord.startRecording();
        mIsRecording = true;
        while (mIsRecording) {
            int len = mAudioRecord.read(mBuffer, 0, mBuffer.length);
            if (len > 0 && mCallback != null) {
                // copy only the bytes actually read: mBuffer is reused by the next read()
                mCallback.onAudioOutput(Arrays.copyOf(mBuffer, len));
            }
        }
        mAudioRecord.stop();
        mAudioRecord.release();
        mAudioRecord = null;
    }

    public void stop() {
        mIsRecording = false;
    }
}

           
2. Audio encoding

Audio encoding buffers PCM data in a BlockingQueue and encodes it to AAC:

public class AacEncoder {
    public static final int AAC_ENCODER = 2;
    private MediaCodec mAudioEncoder;
    private MediaFormat mMediaFormat;
    private BlockingQueue<byte[]> mDataQueue;
    private volatile boolean mIsEncoding;
    private Callback mCallback;

    private static final String AUDIO_MIME_TYPE = "audio/mp4a-latm"; // i.e. AAC

    public void setCallback(Callback callback) {
        mCallback = callback;
    }

    public interface Callback {
        void outputMediaFormat(int type, MediaFormat mediaFormat);

        void onEncodeOutput(ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo);

        void onStop(int type);
    }

    public AacEncoder(int sampleRateInHz, int channelConfig, int bufferSizeInBytes) {
        try {
            mAudioEncoder = MediaCodec.createEncoderByType(AUDIO_MIME_TYPE);
            mMediaFormat = MediaFormat.createAudioFormat(AUDIO_MIME_TYPE, sampleRateInHz, channelConfig == AudioFormat.CHANNEL_IN_MONO ? 1 : 2);
            mMediaFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
            mMediaFormat.setInteger(MediaFormat.KEY_CHANNEL_MASK, AudioFormat.CHANNEL_IN_STEREO); // CHANNEL_IN_STEREO: stereo
            // note the parentheses: * binds tighter than ==, so without them the bit rate is wrong
            int bitRate = sampleRateInHz * 16 * (channelConfig == AudioFormat.CHANNEL_IN_MONO ? 1 : 2);
            mMediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
            mMediaFormat.setInteger(MediaFormat.KEY_CHANNEL_COUNT, channelConfig == AudioFormat.CHANNEL_IN_MONO ? 1 : 2);
            mMediaFormat.setInteger(MediaFormat.KEY_SAMPLE_RATE, sampleRateInHz);
            mAudioEncoder.configure(mMediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        } catch (IOException e) {
            e.printStackTrace();
        }
        mDataQueue = new ArrayBlockingQueue<>(10);
        mIsEncoding = false;
    }

    public void start() {
        if (mIsEncoding) {
            return;
        }
        new Thread(new Runnable() {
            @Override
            public void run() {
                onStart();
            }
        }).start();
    }

    public void stop() {
        mIsEncoding = false;
    }

    private void onStart() {
        mIsEncoding = true;
        mAudioEncoder.start();
        byte[] pcmData;
        int inputIndex;
        ByteBuffer inputBuffer;
        ByteBuffer[] inputBuffers = mAudioEncoder.getInputBuffers();

        int outputIndex;
        ByteBuffer outputBuffer;
        ByteBuffer[] outputBuffers = mAudioEncoder.getOutputBuffers();

        MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
        while (mIsEncoding || !mDataQueue.isEmpty()) {
            pcmData = dequeueData();
            if (pcmData == null) {
                continue;
            }
            long pts = System.currentTimeMillis() * 1000 - AVTimer.getBaseTimestampUs();
            inputIndex = mAudioEncoder.dequeueInputBuffer(10_000);
            if (inputIndex >= 0) {
                inputBuffer = inputBuffers[inputIndex];
                inputBuffer.clear();
                inputBuffer.limit(pcmData.length);
                inputBuffer.put(pcmData);
                mAudioEncoder.queueInputBuffer(inputIndex, 0, pcmData.length, pts, 0);
            }

            outputIndex = mAudioEncoder.dequeueOutputBuffer(bufferInfo, 10_000);

            if (outputIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                outputBuffers = mAudioEncoder.getOutputBuffers();
            } else if (outputIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                MediaFormat newFormat = mAudioEncoder.getOutputFormat();
                if (null != mCallback) {
                    mCallback.outputMediaFormat(AAC_ENCODER, newFormat);
                }
            }
            while (outputIndex >= 0) {
                outputBuffer = outputBuffers[outputIndex];
                if (mCallback != null) {
                    mCallback.onEncodeOutput(outputBuffer, bufferInfo);
                }
                mAudioEncoder.releaseOutputBuffer(outputIndex, false);
                outputIndex = mAudioEncoder.dequeueOutputBuffer(bufferInfo, 10_000);
            }
        }
        mAudioEncoder.stop();
        mAudioEncoder.release();
        mAudioEncoder = null;
        if (mCallback != null) {
            mCallback.onStop(AAC_ENCODER);
        }
    }

    private byte[] dequeueData() {
        if (mDataQueue.isEmpty()) {
            return null;
        }
        try {
            return mDataQueue.take();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        return null;
    }

    public void queueData(byte[] data) {
        if (!mIsEncoding) {
            return;
        }
        try {
            mDataQueue.put(data);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}
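The encoder above stamps each input buffer with wall-clock time (`System.currentTimeMillis() * 1000` minus a base timestamp from the project's AVTimer helper). As an alternative sketch (my assumption, not code from this project), the audio PTS can be derived from the number of PCM samples fed in, which cannot drift with thread scheduling:

```java
// Sketch: sample-count-based audio PTS (assumes 16-bit PCM, i.e. 2 bytes/sample)
public final class AudioPtsCounter {
    private final int sampleRate;   // e.g. 44100
    private final int channelCount; // e.g. 2
    private long totalSamplesPerChannel;

    public AudioPtsCounter(int sampleRate, int channelCount) {
        this.sampleRate = sampleRate;
        this.channelCount = channelCount;
    }

    // Returns the presentation time (µs) for the buffer about to be queued,
    // then advances the counter by the samples that buffer contains.
    public long nextPtsUs(int pcmBytes) {
        long pts = totalSamplesPerChannel * 1_000_000L / sampleRate;
        totalSamplesPerChannel += pcmBytes / (2 * channelCount);
        return pts;
    }
}
```

The returned value would replace the `pts` computed before `queueInputBuffer` in `onStart()`.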

           
3. Video recording

Video is captured with Camera. Because Camera frames are delivered in landscape orientation by default, they also have to be rotated through YUVEngine:

public class H264VideoRecord implements CameraHelper.PreviewCallback, H264Encoder.Callback {

    private CameraHelper mCameraHelper;
    private H264Encoder mH264Encoder;

    private Callback mCallback;

    public void setCallback(Callback callback) {
        mCallback = callback;
    }

    public interface Callback {
        void outputMediaFormat(int type, MediaFormat mediaFormat);

        void outputVideo(ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo);

        void onStop(int type);
    }

    public H264VideoRecord(Activity activity, SurfaceView surfaceView) {
        mCameraHelper = new CameraHelper(surfaceView, activity);
        mCameraHelper.setPreviewCallback(this);
    }

    public void start() {
        mH264Encoder.start();
    }

    public void stop() {
        mH264Encoder.stop();
        mCameraHelper.stop();
    }

    @Override
    public void onFrame(byte[] data) {
        mH264Encoder.queueData(data);
    }

    @Override
    public void outputMediaFormat(int type, MediaFormat mediaFormat) {
        if (mCallback == null) {
            return;
        }
        mCallback.outputMediaFormat(type, mediaFormat);
    }

    @Override
    public void onEncodeOutput(ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo) {
        if (mCallback == null) {
            return;
        }
        mCallback.outputVideo(byteBuffer, bufferInfo);
    }

    @Override
    public void onStop(int type) {
        if (mCallback == null) {
            return;
        }
        mCallback.onStop(type);
    }

    @Override
    public void onOperate(int width, int height, int fps) {
        mH264Encoder = new H264Encoder(width, height, fps);
        mH264Encoder.setCallback(this);
    }
}

           
public class CameraHelper {

    private int mPreWidth;
    private int mPreHeight;
    private int mFrameRate;


    private Camera mCamera;
    private Camera.Size mPreviewSize;
    private Camera.Parameters mCameraParameters;
    private boolean mIsPreviewing = false;
    private Activity mContext;
    private SurfaceView mSurfaceView;
    private SurfaceHolder mSurfaceHolder;
    private CameraPreviewCallback mCameraPreviewCallback;
    private PreviewCallback mPreviewCallback;


    public void setPreviewCallback(PreviewCallback previewCallback) {
        mPreviewCallback = previewCallback;
    }

    public interface PreviewCallback {
        void onFrame(byte[] data);

        void onOperate(int width, int height, int fps);
    }

    public CameraHelper(SurfaceView surfaceView, Activity context) {
        mSurfaceView = surfaceView;
        mContext = context;
        mSurfaceView.setKeepScreenOn(true);
        mSurfaceHolder = mSurfaceView.getHolder();
        mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        mSurfaceHolder.addCallback(new SurfaceHolder.Callback() {
            @Override
            public void surfaceCreated(SurfaceHolder surfaceHolder) {
                doOpenCamera();
                doStartPreview(mContext, surfaceHolder);
            }

            @Override
            public void surfaceChanged(SurfaceHolder surfaceHolder, int i, int i1, int i2) {

            }

            @Override
            public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
                if (mCamera == null) {
                    return;
                }
                mCamera.stopPreview();
                mCamera.release();
                mCamera = null;
            }
        });

    }

    private void doOpenCamera() {
        if (mCamera != null) {
            return;
        }
        mCamera = Camera.open();
    }

    private void doStartPreview(Activity activity, SurfaceHolder surfaceHolder) {
        if (mIsPreviewing) {
            return;
        }
        mContext = activity;
        setCameraDisplayOrientation(activity, Camera.CameraInfo.CAMERA_FACING_BACK);
        setCameraParameters(surfaceHolder);
        try {
            mCamera.setPreviewDisplay(surfaceHolder);
        } catch (IOException e) {
            e.printStackTrace();
        }
        mCamera.startPreview();
        mIsPreviewing = true;
        mPreviewCallback.onOperate(mPreWidth, mPreHeight, mFrameRate);

    }

    public void stop() {
        if (mCamera != null) {
            mCamera.setPreviewCallbackWithBuffer(null);
            if (mIsPreviewing) {
                mCamera.stopPreview();
            }
            mIsPreviewing = false;
            mCamera.release();
            mCamera = null;
        }
    }

    private void setCameraDisplayOrientation(Activity activity, int cameraId) {
        Camera.CameraInfo info = new Camera.CameraInfo();
        Camera.getCameraInfo(cameraId, info);
        int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
        int degrees = 0;
        switch (rotation) {
            case Surface.ROTATION_0:
                degrees = 0;
                break;
            case Surface.ROTATION_90:
                degrees = 90;
                break;
            case Surface.ROTATION_180:
                degrees = 180;
                break;
            case Surface.ROTATION_270:
                degrees = 270;
                break;
        }
        int result = 0;
        if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            result = (info.orientation + degrees) % 360;
            result = (360 - result) % 360;
        } else {
            result = (info.orientation - degrees + 360) % 360;
        }
        mCamera.setDisplayOrientation(result);
    }

    private void setCameraParameters(SurfaceHolder surfaceHolder) {
        if (!mIsPreviewing && mCamera != null) {
            mCameraParameters = mCamera.getParameters();
            mCameraParameters.setPreviewFormat(ImageFormat.NV21);
            List<Camera.Size> supportedPreviewSizes = mCameraParameters.getSupportedPreviewSizes();
            Collections.sort(supportedPreviewSizes, new Comparator<Camera.Size>() {
                @Override
                public int compare(Camera.Size o1, Camera.Size o2) {
                    Integer left = o1.width;
                    Integer right = o2.width;
                    return left.compareTo(right);
                }
            });

            DisplayMetrics displayMetrics = mContext.getResources().getDisplayMetrics();
            for (Camera.Size size : supportedPreviewSizes) {
                if (size.width >= displayMetrics.heightPixels && size.height >= displayMetrics.widthPixels) {
                    if ((1.0f * size.width / size.height) == (1.0f * displayMetrics.heightPixels / displayMetrics.widthPixels)) {
                        mPreviewSize = size;
                        break;
                    }
                }
            }
            if (mPreviewSize != null) {
                mPreWidth = mPreviewSize.width;
                mPreHeight = mPreviewSize.height;
            } else {
                mPreWidth = 1280;
                mPreHeight = 720;
            }


            mCameraParameters.setPreviewSize(mPreWidth, mPreHeight);

            //set fps range.
            int defminFps = 0;
            int defmaxFps = 0;
            List<int[]> supportedPreviewFpsRange = mCameraParameters.getSupportedPreviewFpsRange();
            for (int[] fps : supportedPreviewFpsRange) {
                if (defminFps <= fps[PREVIEW_FPS_MIN_INDEX] && defmaxFps <= fps[PREVIEW_FPS_MAX_INDEX]) {
                    defminFps = fps[PREVIEW_FPS_MIN_INDEX];
                    defmaxFps = fps[PREVIEW_FPS_MAX_INDEX];
                }
            }
            //set the camera preview frame rate
            mCameraParameters.setPreviewFpsRange(defminFps, defmaxFps);
            mFrameRate = defmaxFps / 1000;
            surfaceHolder.setFixedSize(mPreWidth, mPreHeight);
            mCameraPreviewCallback = new CameraPreviewCallback();
            mCamera.addCallbackBuffer(new byte[mPreHeight * mPreWidth * 3 / 2]);
            mCamera.setPreviewCallbackWithBuffer(mCameraPreviewCallback);
            List<String> focusModes = mCameraParameters.getSupportedFocusModes();
            for (String focusMode : focusModes) { //check the supported focus modes
                if (focusMode.contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
                    mCameraParameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
                } else if (focusMode.contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE)) {
                    mCameraParameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
                } else if (focusMode.contains(Camera.Parameters.FOCUS_MODE_AUTO)) {
                    mCameraParameters.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
                }
            }
            mCamera.setParameters(mCameraParameters);
        }
    }


    class CameraPreviewCallback implements Camera.PreviewCallback {
        private CameraPreviewCallback() {

        }

        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {
            if (!mIsPreviewing || mCamera == null) {
                return;
            }
            Camera.Size size = camera.getParameters().getPreviewSize();
            //data from this callback is the raw preview frame;
            //it is handed to the encoder thread for H264 encoding with MediaCodec
            if (data != null) {
                if (mPreviewCallback != null) {
                    mPreviewCallback.onFrame(data);
                }
                camera.addCallbackBuffer(data);
            } else {
                camera.addCallbackBuffer(new byte[size.width * size.height * 3 / 2]);
            }
        }
    }
}
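setCameraDisplayOrientation above implements the formula from the Camera.setDisplayOrientation documentation. Extracted as pure arithmetic (the class and parameter names here are mine, for illustration), it can be checked in isolation:

```java
// Display-orientation math from the android.hardware.Camera.setDisplayOrientation docs
public final class CameraOrientation {
    public static int compute(boolean frontFacing, int sensorOrientation, int displayRotationDegrees) {
        if (frontFacing) {
            int result = (sensorOrientation + displayRotationDegrees) % 360;
            return (360 - result) % 360; // compensate for the front camera's mirroring
        }
        return (sensorOrientation - displayRotationDegrees + 360) % 360;
    }
}
```

For a typical back camera with a 90° sensor and the phone held upright, this yields 90, matching what `setCameraDisplayOrientation` passes to `setDisplayOrientation`.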

           
4. Video encoding

Video encoding buffers frames in a BlockingQueue and encodes them to H264:

public class H264Encoder {
    public static final String VIDEO_MIME_TYPE = "video/avc"; // i.e. H264
    public static final int H264_ENCODER = 1;
    private MediaCodec mMediaCodec;
    private MediaFormat mMediaFormat;
    private BlockingQueue<byte[]> mQueue;
    private MediaCodecInfo mMediaCodecInfo;
    private int mColorFormat;
    private int mWidth;
    private int mHeight;
    private int mBitRate;
    private byte[] mYUVBuffer;
    private byte[] mRotatedYUVBuffer;

    private int[] mOutWidth;
    private int[] mOutHeight;
    private ExecutorService mExecutorService;
    private volatile boolean mIsEncoding;

    private Callback mCallback;

    public void setCallback(Callback callback) {
        mCallback = callback;
    }

    public interface Callback {
        void outputMediaFormat(int type, MediaFormat mediaFormat);

        void onEncodeOutput(ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo);

        void onStop(int type);
    }

    public H264Encoder(int width, int height, int fps) {
        Log.d("H264Encoder", "width:" + width + " height:" + height + " fps:" + fps);
        mWidth = width;
        mHeight = height;
        mQueue = new LinkedBlockingQueue<>();
        mMediaCodecInfo = selectCodecInfo();
        mColorFormat = selectColorFormat(mMediaCodecInfo);
        mBitRate = (mWidth * mHeight * 3 / 2) * 8 * fps;
        // width and height are swapped because every frame is rotated 90° before encoding
        mMediaFormat = MediaFormat.createVideoFormat(VIDEO_MIME_TYPE, mHeight, mWidth);
        mMediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, mBitRate); // without KEY_BIT_RATE, configureCodec fails with error -38
        mMediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, fps);
        mMediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, mColorFormat);
        mMediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);

        try {
            mMediaCodec = MediaCodec.createByCodecName(mMediaCodecInfo.getName());
        } catch (IOException e) {
            e.printStackTrace();
        }
        mMediaCodec.configure(mMediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        mExecutorService = Executors.newFixedThreadPool(1);

        mYUVBuffer = new byte[YUVUtil.getYUVBuffer(width, height)];
        mRotatedYUVBuffer = new byte[YUVUtil.getYUVBuffer(width, height)];
        mOutWidth = new int[1];
        mOutHeight = new int[1];
        YUVEngine.startYUVEngine();
    }


    public void start() {
        if (mIsEncoding) {
            return;
        }


        mExecutorService.execute(new Runnable() {
            @Override
            public void run() {
                mIsEncoding = true;
                mMediaCodec.start();
                while (mIsEncoding) {
                    byte[] data = dequeueData();
                    if (data == null) {
                        continue;
                    }
                    encodeVideoData(data);
                }

                mMediaCodec.stop();
                mMediaCodec.release();
                if (mCallback != null) {
                    mCallback.onStop(H264_ENCODER);
                }

            }
        });
    }

    public void stop() {
        mIsEncoding = false;
    }

    private byte[] dequeueData() {
        if (mQueue.isEmpty()) {
            return null;
        }
        try {
            return mQueue.take();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        return null;
    }

    public void queueData(byte[] data) {
        if (data == null || !mIsEncoding) {
            return;
        }
        try {
            mQueue.put(data);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }

    private void encodeVideoData(byte[] data) {
        MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
        mRotatedYUVBuffer = transferFrameData(data, mYUVBuffer, mRotatedYUVBuffer);
        ByteBuffer[] inputBuffers = mMediaCodec.getInputBuffers();
        int inputIndex = mMediaCodec.dequeueInputBuffer(10_000);
        if (inputIndex >= 0) {
            ByteBuffer byteBuffer = inputBuffers[inputIndex];
            byteBuffer.clear();
            byteBuffer.put(mRotatedYUVBuffer);
            long pts = System.currentTimeMillis() * 1000 - AVTimer.getBaseTimestampUs();
            mMediaCodec.queueInputBuffer(inputIndex, 0, mRotatedYUVBuffer.length, pts, 0);
        }

        ByteBuffer[] outputBuffers = mMediaCodec.getOutputBuffers();
        int outputIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 10_000);
        if (outputIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            outputBuffers = mMediaCodec.getOutputBuffers();
        } else if (outputIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            MediaFormat newFormat = mMediaCodec.getOutputFormat();
            if (null != mCallback) {
                mCallback.outputMediaFormat(H264_ENCODER, newFormat);
            }
        }
        while (outputIndex >= 0) {
            ByteBuffer byteBuffer = outputBuffers[outputIndex];
            if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                bufferInfo.size = 0;
            }
            if (bufferInfo.size != 0 && mCallback != null) {
//                boolean keyFrame = (bufferInfo.flags & MediaCodec.BUFFER_FLAG_KEY_FRAME) != 0;
//                Log.i(TAG, "is key frame :%s"+keyFrame);
                mCallback.onEncodeOutput(byteBuffer, bufferInfo);
            }
            mMediaCodec.releaseOutputBuffer(outputIndex, false);
            outputIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 10_000);
        }
    }

    private byte[] transferFrameData(byte[] data, byte[] yuvBuffer, byte[] rotatedYuvBuffer) {
        //Camera delivers NV21; convert it to the format the MediaCodec encoder supports
        switch (mColorFormat) {
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar: //Camera preview format I420 (YUV420P)
                YUVEngine.Nv21ToI420(data, yuvBuffer, mWidth, mHeight);
                YUVEngine.I420ClockWiseRotate90(yuvBuffer, mWidth, mHeight, rotatedYuvBuffer, mOutWidth, mOutHeight);
                break;
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar: //Camera preview format NV12
                YUVEngine.Nv21ToNv12(data, yuvBuffer, mWidth, mHeight);
                YUVEngine.Nv12ClockWiseRotate90(yuvBuffer, mWidth, mHeight, rotatedYuvBuffer, mOutWidth, mOutHeight);
                break;
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar: //Camera preview format NV21
                System.arraycopy(data, 0, yuvBuffer, 0, mWidth * mHeight * 3 / 2);
                YUVEngine.Nv21ClockWiseRotate90(yuvBuffer, mWidth, mHeight, rotatedYuvBuffer, mOutWidth, mOutHeight);
                break;
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar: //Camera preview format YV12
                YUVEngine.Nv21ToYV12(data, yuvBuffer, mWidth, mHeight);
                YUVEngine.Yv12ClockWiseRotate90(yuvBuffer, mWidth, mHeight, rotatedYuvBuffer, mOutWidth, mOutHeight);
                break;
        }
        return rotatedYuvBuffer;
    }

    private MediaCodecInfo selectCodecInfo() {
        int numCodecs = MediaCodecList.getCodecCount();
        for (int i = 0; i < numCodecs; i++) {
            MediaCodecInfo codecInfo = MediaCodecList.getCodecInfoAt(i);
            if (!codecInfo.isEncoder()) {
                continue;
            }
            String[] types = codecInfo.getSupportedTypes();
            for (int j = 0; j < types.length; j++) {
                if (types[j].equalsIgnoreCase(VIDEO_MIME_TYPE)) {
                    return codecInfo;
                }
            }
        }
        return null;
    }

    //query the supported input color formats
    private int selectColorFormat(MediaCodecInfo codecInfo) {
        if (codecInfo == null) {
            return -1;
        }
        MediaCodecInfo.CodecCapabilities capabilities = codecInfo.getCapabilitiesForType(VIDEO_MIME_TYPE);
        int[] colorFormats = capabilities.colorFormats;
        for (int i = 0; i < colorFormats.length; i++) {
            if (isRecognizedFormat(colorFormats[i])) {
                return colorFormats[i];
            }
        }
        return -1;
    }

    private boolean isRecognizedFormat(int colorFormat) {
        switch (colorFormat) {
            // the color formats we know how to convert to from NV21
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar: //Camera preview format I420 (YUV420P)
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar: //Camera preview format NV12
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar: //Camera preview format NV21
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar: //Camera preview format YV12
                return true;
            default:
                return false;
        }
    }
}
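YUVEngine is the native conversion library from the earlier article, so its internals are not shown here. For illustration, here are pure-Java stand-ins (my own sketch, assumed to match the native behavior) for two of its calls, Nv21ToNv12 and Nv21ClockWiseRotate90. NV21 stores the full-resolution Y plane followed by an interleaved V/U plane at half resolution:

```java
// Pure-Java sketches of two assumed YUVEngine operations on NV21 frames
public final class YuvOps {
    // NV21 -> NV12: same Y plane; swap each V/U byte pair to U/V
    public static byte[] nv21ToNv12(byte[] nv21, int width, int height) {
        byte[] nv12 = new byte[nv21.length];
        int ySize = width * height;
        System.arraycopy(nv21, 0, nv12, 0, ySize);
        for (int i = ySize; i < nv21.length; i += 2) {
            nv12[i] = nv21[i + 1];     // U
            nv12[i + 1] = nv21[i];     // V
        }
        return nv12;
    }

    // Rotate an NV21 frame 90° clockwise; output dimensions become height x width
    public static byte[] nv21Rotate90CW(byte[] nv21, int width, int height) {
        byte[] out = new byte[nv21.length];
        // Y plane: src (col i, row j) -> dst (col height-1-j, row i); dst stride = height
        for (int j = 0; j < height; j++) {
            for (int i = 0; i < width; i++) {
                out[i * height + (height - 1 - j)] = nv21[j * width + i];
            }
        }
        // chroma plane: rotate the (width/2 x height/2) grid of V/U byte pairs
        int ySize = width * height;
        int halfW = width / 2, halfH = height / 2;
        for (int j = 0; j < halfH; j++) {
            for (int i = 0; i < halfW; i++) {
                int src = ySize + (j * halfW + i) * 2;
                int dst = ySize + (i * halfH + (halfH - 1 - j)) * 2;
                out[dst] = nv21[src];         // V
                out[dst + 1] = nv21[src + 1]; // U
            }
        }
        return out;
    }
}
```

This also makes the width/height swap in `createVideoFormat` concrete: after a 90° rotation, a width x height frame becomes height x width.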

           
5. Muxing the MP4

MediaMuxer requires every addTrack to happen before start, but the audio encoder callback and the video encoder callback arrive on different threads, so object.wait is used to wait until both tracks have been added.

public class Mp4Record implements H264VideoRecord.Callback, AacAudioRecord.Callback {
    private static int index = 0;
    private H264VideoRecord mH264VideoRecord;
    private AacAudioRecord mAacAudioRecord;
    private MediaMuxer mMediaMuxer;

    private boolean mHasStartMuxer;
    private boolean mHasStopAudio;
    private boolean mHasStopVideo;
    private int mVideoTrackIndex = -1;
    private int mAudioTrackIndex = -1;
    private final Object mLock;
    private BlockingQueue<AVData> mDataBlockingQueue;
    private volatile boolean mIsRecoding;

    public Mp4Record(Activity activity, SurfaceView surfaceView, int audioSource, int sampleRateInHz, int channelConfig, int audioFormat, int bufferSizeInBytes, String path) {
        mH264VideoRecord = new H264VideoRecord(activity, surfaceView);
        mAacAudioRecord = new AacAudioRecord(audioSource, sampleRateInHz, channelConfig, audioFormat, bufferSizeInBytes);
        mH264VideoRecord.setCallback(this);
        mAacAudioRecord.setCallback(this);
        try {
            mMediaMuxer = new MediaMuxer(path, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        } catch (IOException e) {
            e.printStackTrace();
        }
        mHasStartMuxer = false;
        mLock = new Object();
        mDataBlockingQueue = new LinkedBlockingQueue<>();
    }

    public void start() {
        mIsRecoding = true;
        mAacAudioRecord.start();
        mH264VideoRecord.start();
    }

    public void stop() {
        mAacAudioRecord.stop();
        mH264VideoRecord.stop();
        mIsRecoding = false;
    }


    @Override
    public void outputAudio(ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo) {
        writeMediaData(mAudioTrackIndex, byteBuffer, bufferInfo);
    }

    @Override
    public void outputMediaFormat(int type, MediaFormat mediaFormat) {
        checkMediaFormat(type, mediaFormat);
    }

    @Override
    public void outputVideo(ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo) {
        writeMediaData(mVideoTrackIndex, byteBuffer, bufferInfo);
    }

    private void writeMediaData(int trackIndex, ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo) {
        mDataBlockingQueue.add(new AVData(index++, trackIndex, byteBuffer, bufferInfo));
    }

    private void checkMediaFormat(int type, MediaFormat mediaFormat) {
        synchronized (mLock) {
            if (type == AAC_ENCODER) {
                mAudioTrackIndex = mMediaMuxer.addTrack(mediaFormat);
            }
            if (type == H264_ENCODER) {
                mVideoTrackIndex = mMediaMuxer.addTrack(mediaFormat);
            }
            startMediaMuxer();
        }
    }

    private void startMediaMuxer() {
        if (mHasStartMuxer) {
            return;
        }
        if (mAudioTrackIndex != -1 && mVideoTrackIndex != -1) {
            Log.e(TAG, "video track index:" + mVideoTrackIndex + ", audio track index:" + mAudioTrackIndex);
            mMediaMuxer.start();
            mHasStartMuxer = true;
            new Thread(new Runnable() {
                @Override
                public void run() {
                    while (mIsRecoding || !mDataBlockingQueue.isEmpty()) {
                        AVData avData = mDataBlockingQueue.poll();
                        if (avData == null) {
                            continue;
                        }
                        boolean keyFrame = (avData.bufferInfo.flags & MediaCodec.BUFFER_FLAG_KEY_FRAME) != 0;
                        Log.e(TAG, avData.index + "trackIndex:" + avData.trackIndex + ",writeSampleData:" + keyFrame);
                        mMediaMuxer.writeSampleData(avData.trackIndex, avData.byteBuffer, avData.bufferInfo);
                    }
                }
            }).start();
            mLock.notifyAll();
        } else {
            try {
                mLock.wait();
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }

    @Override
    public void onStop(int type) {
        synchronized (mLock) {
            if (type == H264_ENCODER) {
                mHasStopVideo = true;
            }
            if (type == AAC_ENCODER) {
                mHasStopAudio = true;
            }
            if (mHasStopAudio && mHasStopVideo && mHasStartMuxer) {
                mHasStartMuxer = false;
                mMediaMuxer.stop();
            }
        }
    }

    private class AVData {
        int index = 0;
        int trackIndex;
        ByteBuffer byteBuffer;
        MediaCodec.BufferInfo bufferInfo;

        public AVData(int index, int trackIndex, ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo) {
            this.index = index;
            this.trackIndex = trackIndex;
            this.byteBuffer = byteBuffer;
            this.bufferInfo = bufferInfo;
            boolean keyFrame = (bufferInfo.flags & MediaCodec.BUFFER_FLAG_KEY_FRAME) != 0;
            Log.e(TAG, index + "trackIndex:" + trackIndex + ",AVData:" + keyFrame);
        }
    }
}
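The writer thread in startMediaMuxer busy-spins on poll() whenever the queue is momentarily empty. A timed poll (a sketch with my own names, not the article's code) yields the CPU while still draining everything left in the queue after recording stops:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;

public final class MuxerDrain {
    // Consume items while recording is active, then drain whatever is left.
    public static <T> List<T> run(BlockingQueue<T> queue, AtomicBoolean recording) throws InterruptedException {
        List<T> written = new ArrayList<>();
        while (recording.get() || !queue.isEmpty()) {
            T item = queue.poll(10, TimeUnit.MILLISECONDS); // blocks briefly instead of spinning
            if (item != null) {
                written.add(item); // in Mp4Record this is where writeSampleData would run
            }
        }
        return written;
    }
}
```

The loop condition mirrors Mp4Record's `mIsRecoding || !mDataBlockingQueue.isEmpty()`, so pending samples are still written after `stop()`.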

           
6. Problems encountered

The recorded video plays back too fast

This was caused by dropped video frames and was fixed by adjusting the key-frame interval parameter:

mMediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
           

MediaMuxer reports "Video skip non-key frame"

The first video key frame was never written: the muxer consumes the queue on a different thread, and by the time the queued sample was processed, the original output buffer and BufferInfo had already been released back to the codec.
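One way to guard against that released-buffer problem (a sketch of mine, not the article's fix) is to deep-copy the encoded bytes, and the BufferInfo fields, before the output buffer goes back to MediaCodec. The ByteBuffer part is plain java.nio:

```java
import java.nio.ByteBuffer;

public final class BufferCopy {
    // Copy the readable region of src into an independent buffer,
    // leaving src's position and limit untouched.
    public static ByteBuffer deepCopy(ByteBuffer src) {
        ByteBuffer view = src.duplicate(); // shares content but has its own position/limit
        ByteBuffer copy = ByteBuffer.allocate(view.remaining());
        copy.put(view);
        copy.flip(); // ready for reading
        return copy;
    }
}
```

In Mp4Record, the AVData constructor would store `deepCopy(byteBuffer)` plus a fresh `MediaCodec.BufferInfo` populated via `set(offset, size, presentationTimeUs, flags)`, so `releaseOutputBuffer` can safely run before `writeSampleData`.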

GitHub demo
