
Android Hard Codec MediaCodec Analysis - Starting from the Story of the Pork Restaurant (2)

Author: Audio and Video Development Brother T

Previous review:

The previous article, Android hard codec MediaCodec analysis - starting from the story of the pork restaurant (1), described the MediaCodec workflow and its lifecycle state machine. Today we get hands-on and analyze MediaCodec in detail from the code's perspective. If you have not read the previous part, it is recommended to read it first so it connects seamlessly with this article.

MediaCodec code example

The code example explained this time is grafika, Google's official MediaCodec learning project. grafika consists of multiple demos, such as video decoding and playback, recording the camera in real time and encoding the video to H.264 for local storage, screen recording, and so on; each demo focuses on one particular technique.

The following is the home screen of the grafika app; each entry represents a demo:

[Image: grafika app home screen]

Today we will start with the first and most basic demo: decoding a local MP4 video.

[GIF: decoding and playing a local MP4 video]

As the GIF shows, this is a very simple demo: it decodes an MP4 video and renders the decoded data to the screen. The corresponding code is in com.android.grafika.PlayMovieActivity, and the basic flow is shown in the diagram below:

[Figure: basic flow of PlayMovieActivity]

The core decoding code is in MoviePlayer.

Demux code parsing

The first concept to understand is multiplexing, also called encapsulation: compressed and encoded video data and audio data are packaged together in a certain format. MP4, MKV, RMVB, TS, FLV, and AVI are the multiplexed (container) formats we are all familiar with.

For example, an FLV file typically packages an H.264-encoded video stream together with an AAC-encoded audio stream.

The FLV format consists of an FLV header followed by a series of tags; the tags carry the audio data and video data. The structure of FLV is shown in the figure below (the figure is from Introduction to Video and Audio Data Processing: FLV Package Format Analysis).

[Figure: FLV file structure]
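For a quick, concrete feel for the header part of that structure, here is a minimal, purely illustrative sketch of my own (not from grafika) that reads the 9-byte FLV header; in this article MediaExtractor does all the demultiplexing for us, so nothing like this is actually needed:

import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

// Illustrative only: print the fields of the 9-byte FLV header.
public class FlvHeaderPeek {
    public static void main(String[] args) throws IOException {
        try (DataInputStream in = new DataInputStream(new FileInputStream(args[0]))) {
            byte[] signature = new byte[3];
            in.readFully(signature);                 // should be the characters 'F', 'L', 'V'
            int version = in.readUnsignedByte();     // usually 1
            int flags = in.readUnsignedByte();       // bit 2: has audio, bit 0: has video
            int dataOffset = in.readInt();           // header size in bytes, 9 for FLV version 1
            System.out.println("signature=" + new String(signature, "US-ASCII")
                    + " version=" + version
                    + " hasAudio=" + ((flags & 0x04) != 0)
                    + " hasVideo=" + ((flags & 0x01) != 0)
                    + " dataOffset=" + dataOffset);
        }
    }
}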


So before decoding the video, the H.264 video data must first be extracted from the container, and the Android platform provides MediaExtractor as a convenient demultiplexing tool.

The following is the MediaExtractor usage template provided in the official documentation:

MediaExtractor extractor = new MediaExtractor();
 extractor.setDataSource(...);
 int numTracks = extractor.getTrackCount();
 
 //Iterate over every track (audio or video stream) in the container, check its MIME type, and select the one we want to process
 for (int i = 0; i < numTracks; ++i) {
   MediaFormat format = extractor.getTrackFormat(i);
   String mime = format.getString(MediaFormat.KEY_MIME);
   if (weAreInterestedInThisTrack) {
     //Select the track whose MIME type we are interested in
     extractor.selectTrack(i);
   }
 }
 ByteBuffer inputBuffer = ByteBuffer.allocate(...)
 
 //Read the selected audio or video stream into inputBuffer in a loop
 while (extractor.readSampleData(inputBuffer, ...) >= 0) {
   int trackIndex = extractor.getSampleTrackIndex();
   long presentationTimeUs = extractor.getSampleTime();
   ...
   extractor.advance();
 }

 extractor.release();
 extractor = null;           

The comments are already fairly detailed, so the code should be mostly self-explanatory.

Let's first take a look at MediaFormat, a class that describes media formats through a series of key-value pairs. For example, the generic media format keys:

[Table: generic MediaFormat keys]

Video-specific format keys:

[Table: video-specific MediaFormat keys]

Audio-specific format keys:

[Table: audio-specific MediaFormat keys]

In the template code above, the value of KEY_MIME is read to determine the type of each track.
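Since those key tables are shown as images, here is a minimal sketch of how such keys are typically read from a MediaFormat (assuming a track format obtained from the extractor as in the template above; in real code it is safer to guard optional keys with containsKey):

MediaFormat format = extractor.getTrackFormat(i);
String mime = format.getString(MediaFormat.KEY_MIME);       // e.g. "video/avc" or "audio/mp4a-latm"
if (mime != null && mime.startsWith("video/")) {
    int width  = format.getInteger(MediaFormat.KEY_WIDTH);
    int height = format.getInteger(MediaFormat.KEY_HEIGHT);
    Log.d(TAG, "video track " + i + ": " + mime + " " + width + "x" + height);
} else if (mime != null && mime.startsWith("audio/")) {
    int sampleRate   = format.getInteger(MediaFormat.KEY_SAMPLE_RATE);
    int channelCount = format.getInteger(MediaFormat.KEY_CHANNEL_COUNT);
    Log.d(TAG, "audio track " + i + ": " + mime + " " + sampleRate + "Hz x" + channelCount);
}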

Common video MIME types are as follows:

"video/x-vnd.on2.vp8" - VP8 video (i.e. video in .webm) "video/x-vnd.on2.vp9" - VP9 video (i.e. video in .webm) "video/avc" - H.264/AVC video "video/hevc" - H.265/HEVC video "video/mp4v-es" - MPEG4 video "video/3gpp" - H.263 video

Since we are mainly dealing with H.264 here, the MIME type we care about is "video/avc".

In grafika's MoviePlayer constructor, com.android.grafika.MoviePlayer#MoviePlayer, MediaExtractor is used to obtain the width and height of the video:

//Demultiplex
MediaExtractor extractor = null;
try {
    extractor = new MediaExtractor();
    //Pass in the path of the video file
    extractor.setDataSource(sourceFile.toString());
    int trackIndex = selectTrack(extractor);
    if (trackIndex < 0) {
        throw new RuntimeException("No video track found in " + mSourceFile);
    }
    //Select the track we found (the video track); everything that follows operates on this track
    extractor.selectTrack(trackIndex);
    //Get the video's width and height from this track's MediaFormat
    MediaFormat format = extractor.getTrackFormat(trackIndex);

    Log.d(TAG, "extractor.getTrackFormat format" + format);
    //Width and height of the video
    mVideoWidth = format.getInteger(MediaFormat.KEY_WIDTH);
    mVideoHeight = format.getInteger(MediaFormat.KEY_HEIGHT);
    if (VERBOSE) {
        Log.d(TAG, "Video size is " + mVideoWidth + "x" + mVideoHeight);
    }
} finally {
    if (extractor != null) {
        extractor.release();
    }
}           
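The selectTrack(extractor) helper called above is not shown in this article; a minimal sketch of what such a helper typically looks like (the actual implementation in grafika may differ in details) is simply to find the first track whose MIME type starts with "video/":

// Hypothetical sketch of a selectTrack-style helper: return the index of the first video track, or -1.
private static int selectVideoTrack(MediaExtractor extractor) {
    int numTracks = extractor.getTrackCount();
    for (int i = 0; i < numTracks; i++) {
        MediaFormat format = extractor.getTrackFormat(i);
        String mime = format.getString(MediaFormat.KEY_MIME);
        if (mime != null && mime.startsWith("video/")) {
            return i;
        }
    }
    return -1;  // no video track found
}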

In the actual playback method, com.android.grafika.MoviePlayer#play, a MediaCodec decoder is created based on the MIME type:

MediaFormat format = extractor.getTrackFormat(trackIndex);
Log.d(TAG, "EgetTrackFormat format:" + format);

// Create a MediaCodec decoder, and configure it with the MediaFormat from the
// extractor.  It's very important to use the format from the extractor because
// it contains a copy of the CSD-0/CSD-1 codec-specific data chunks.
String mime = format.getString(MediaFormat.KEY_MIME);
Log.d(TAG, "createDecoderByType mime:" + mime);
//Initialize the decoder based on the video's MIME type
MediaCodec decoder = MediaCodec.createDecoderByType(mime);           
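createDecoderByType picks a default decoder for the given MIME type. If you want to know which concrete codec would actually be used for this exact format, one alternative (API 21+, a sketch under that assumption) is to query MediaCodecList first:

// Ask the framework which installed decoder supports this exact MediaFormat.
MediaCodecList codecList = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
String decoderName = codecList.findDecoderForFormat(format);   // null if no decoder supports it
MediaCodec decoder;
if (decoderName != null) {
    decoder = MediaCodec.createByCodecName(decoderName);
} else {
    decoder = MediaCodec.createDecoderByType(mime);             // fall back to the default for the MIME type
}

Note that the documentation for findDecoderForFormat warns that on API 21 the format should not contain a frame-rate value; clear it first if present.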

At this point MediaCodec is in the Uninitialized sub-state of the Stopped state. Next we configure and start MediaCodec (the boss tidies up the kitchen, tables and chairs, getting ready to open the shop):

//Configure the decoder, specifying the MediaFormat and the output Surface; the decoder enters the Configured state
  decoder.configure(format, mOutputSurface, null, 0);
  //Start the decoder; it enters the Executing state
  // Immediately after start() the codec is in the Flushed sub-state, where it holds all the buffers
  decoder.start();
  //The actual decoding loop
  doExtract(extractor, trackIndex, decoder, mFrameCallback);

Notice that the configure method is passed the mOutputSurface object. As Android hard codec MediaCodec analysis - starting from the story of the pork restaurant (1) mentioned, for raw video data:

Video codecs support three color formats, the second of which is the native raw video format COLOR_FormatSurface, which can be used to handle data input and output in Surface mode.

This Surface object is obtained from the activity's TextureView:

//MoviePlayer renders the decoded raw video data onto the TextureView through this Surface
SurfaceTexture st = mTextureView.getSurfaceTexture();
Surface surface = new Surface(st);
MoviePlayer player = null;
try {
     player = new MoviePlayer(
            new File(getFilesDir(), mMovieFiles[mSelectedMovie]), surface, callback);
} catch (IOException ioe) {
    Log.e(TAG, "Unable to play movie", ioe);
    surface.release();
    return;
}           
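One detail worth noting: getSurfaceTexture() only returns a usable SurfaceTexture after the TextureView reports that it is available. A minimal sketch of listening for that, assuming a TextureView field named mTextureView and a boolean flag of your own (both names are hypothetical here):

mTextureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture st, int width, int height) {
        mSurfaceTextureReady = true;   // now it is safe to call getSurfaceTexture() and start playback
    }
    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture st, int width, int height) { }
    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture st) {
        mSurfaceTextureReady = false;
        return true;                   // returning true lets the TextureView release the SurfaceTexture
    }
    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture st) { }
});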

Decode code parsing

At this point MediaCodec has been started and enters the big loop of feeding the input side and draining the output side (picture the buyer putting raw pork into a basket and handing it to the chef over and over, while the chef finishes cooking and plates the dish for the customer). The key code is in com.android.grafika.MoviePlayer#doExtract:

/**
 * Work loop.  We execute here until we run out of video or are told to stop.
 */
private void doExtract(MediaExtractor extractor, int trackIndex, MediaCodec decoder,
                       FrameCallback frameCallback) {
    // We need to strike a balance between providing input and reading output that
    // operates efficiently without delays on the output side.
    //
    // To avoid delays on the output side, we need to keep the codec's input buffers
    // fed.  There can be significant latency between submitting frame N to the decoder
    // and receiving frame N on the output, so we need to stay ahead of the game.
    //
    // Many video decoders seem to want several frames of video before they start
    // producing output -- one implementation wanted four before it appeared to
    // configure itself.  We need to provide a bunch of input frames up front, and try
    // to keep the queue full as we go.
    //
    // (Note it's possible for the encoded data to be written to the stream out of order,
    // so we can't generally submit a single frame and wait for it to appear.)
    //
    // We can't just fixate on the input side though.  If we spend too much time trying
    // to stuff the input, we might miss a presentation deadline.  At 60Hz we have 16.7ms
    // between frames, so sleeping for 10ms would eat up a significant fraction of the
    // time allowed.  (Most video is at 30Hz or less, so for most content we'll have
    // significantly longer.)  Waiting for output is okay, but sleeping on availability
    // of input buffers is unwise if we need to be providing output on a regular schedule.
    //
    //
    // In some situations, startup latency may be a concern.  To minimize startup time,
    // we'd want to stuff the input full as quickly as possible.  This turns out to be
    // somewhat complicated, as the codec may still be starting up and will refuse to
    // accept input.  Removing the timeout from dequeueInputBuffer() results in spinning
    // on the CPU.
    //
    // If you have tight startup latency requirements, it would probably be best to
    // "prime the pump" with a sequence of frames that aren't actually shown (e.g.
    // grab the first 10 NAL units and shove them through, then rewind to the start of
    // the first key frame).
    //
    // The actual latency seems to depend on strongly on the nature of the video (e.g.
    // resolution).
    //
    //
    // One conceptually nice approach is to loop on the input side to ensure that the codec
    // always has all the input it can handle.  After submitting a buffer, we immediately
    // check to see if it will accept another.  We can use a short timeout so we don't
    // miss a presentation deadline.  On the output side we only check once, with a longer
    // timeout, then return to the outer loop to see if the codec is hungry for more input.
    //
    // In practice, every call to check for available buffers involves a lot of message-
    // passing between threads and processes.  Setting a very brief timeout doesn't
    // exactly work because the overhead required to determine that no buffer is available
    // is substantial.  On one device, the "clever" approach caused significantly greater
    // and more highly variable startup latency.
    //
    // The code below takes a very simple-minded approach that works, but carries a risk
    // of occasionally running out of output.  A more sophisticated approach might
    // detect an output timeout and use that as a signal to try to enqueue several input
    // buffers on the next iteration.
    //
    // If you want to experiment, set the VERBOSE flag to true and watch the behavior
    // in logcat.  Use "logcat -v threadtime" to see sub-second timing.


    //Timeout for dequeueing decoded output data
    final int TIMEOUT_USEC = 0;
    //Input ByteBuffer array (newer versions of MediaCodec replace this with getInputBuffer, which fetches a buffer directly)
    ByteBuffer[] decoderInputBuffers = decoder.getInputBuffers();
    //Counts how many chunks of data have been submitted
    int inputChunk = 0;
    //Used to log the decode startup time
    long firstInputTimeNsec = -1;

    boolean outputDone = false;
    boolean inputDone = false;
    while (!outputDone) {
        if (VERBOSE) Log.d(TAG, "loop");
        if (mIsStopRequested) {
            Log.d(TAG, "Stop requested");
            return;
        }

        // Feed more data to the decoder.
        if (!inputDone) {
            //Get the index of an available input ByteBuffer
            int inputBufIndex = decoder.dequeueInputBuffer(TIMEOUT_USEC);
            if (inputBufIndex >= 0) {
                if (firstInputTimeNsec == -1) {
                    firstInputTimeNsec = System.nanoTime();
                }
                //Get the corresponding input ByteBuffer from its index
                ByteBuffer inputBuf = decoderInputBuffers[inputBufIndex];
                Log.d(TAG, "decoderInputBuffers inputBuf:" + inputBuf + ",inputBufIndex:" + inputBufIndex);

                // Read the sample data into the ByteBuffer.  This neither respects nor
                // updates inputBuf's position, limit, etc.


                //Read one sample from the media file; chunkSize is the number of bytes read
                int chunkSize = extractor.readSampleData(inputBuf, 0);
                if (chunkSize < 0) {
                    //Reached the end of the file: set the flag and send an empty frame so the decoder knows where the stream ends
                    // End of stream -- send empty frame with EOS flag set.

                    //When you queue an input buffer with the end-of-stream marker, the codec transitions
                    // to the End-of-Stream sub-state. In this state the codec no longer accepts further
                    // input buffers, but still generates output buffers until the end-of-stream is reached
                    // on the output.
                    decoder.queueInputBuffer(inputBufIndex, 0, 0, 0L,
                            MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                    Log.d(TAG, "queueInputBuffer");

                    inputDone = true;
                    if (VERBOSE) Log.d(TAG, "sent input EOS");
                } else {
                    if (extractor.getSampleTrackIndex() != trackIndex) {
                        Log.w(TAG, "WEIRD: got sample from track " +
                                extractor.getSampleTrackIndex() + ", expected " + trackIndex);
                    }
                    //Get the presentation time of the current sample
                    long presentationTimeUs = extractor.getSampleTime();
                    //Submit the data in the buffer at inputBufIndex to MediaCodec
                    decoder.queueInputBuffer(inputBufIndex, 0, chunkSize,
                            presentationTimeUs, 0 /*flags*/);
                    Log.d(TAG, "queueInputBuffer inputBufIndex:" + inputBufIndex);

                    if (VERBOSE) {
                        Log.d(TAG, "submitted frame " + inputChunk + " to dec, size=" +
                                chunkSize);
                    }
                    //Count the chunk we just submitted
                    inputChunk++;
                    //Advance the extractor's read cursor
                    extractor.advance();
                }
            } else {
                if (VERBOSE) Log.d(TAG, "input buffer not available");
            }
        }

        if (!outputDone) {
            //If decoding succeeded, this returns the index of the decoded buffer among the output buffers,
            // and the buffer's metadata is placed in mBufferInfo.
            // If not, one of the decoder status codes is returned instead.
            int outputBufferIndex = decoder.dequeueOutputBuffer(mBufferInfo, TIMEOUT_USEC);
            Log.d(TAG, "dequeueOutputBuffer decoderBufferIndex:" + outputBufferIndex + ",mBufferInfo:" + mBufferInfo);
            if (outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
                // no output available yet
                if (VERBOSE) Log.d(TAG, "no output from decoder available");
            } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                // not important for us, since we're using Surface
                if (VERBOSE) Log.d(TAG, "decoder output buffers changed");
            } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                MediaFormat newFormat = decoder.getOutputFormat();
                if (VERBOSE) Log.d(TAG, "decoder output format changed: " + newFormat);
            } else if (outputBufferIndex < 0) {
                throw new RuntimeException(
                        "unexpected result from decoder.dequeueOutputBuffer: " +
                                outputBufferIndex);
            } else { // decoderStatus >= 0
                if (firstInputTimeNsec != 0) {
                    // Log the delay from the first buffer of input to the first buffer
                    // of output.
                    long nowNsec = System.nanoTime();
                    Log.d(TAG, "startup lag " + ((nowNsec - firstInputTimeNsec) / 1000000.0) + " ms");
                    firstInputTimeNsec = 0;
                }
                boolean doLoop = false;
                if (VERBOSE) Log.d(TAG, "surface decoder given buffer " + outputBufferIndex +
                        " (output mBufferInfo size=" + mBufferInfo.size + ")");

                //Check whether the end of stream has been reached; the MediaCodec.BUFFER_FLAG_END_OF_STREAM flag set above is detected here
                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    if (VERBOSE) Log.d(TAG, "output EOS");
                    if (mLoop) {
                        doLoop = true;
                    } else {
                        outputDone = true;
                    }
                }

                //If the decoded buffer size is greater than 0, it needs to be rendered
                boolean doRender = (mBufferInfo.size != 0);

                // As soon as we call releaseOutputBuffer, the buffer will be forwarded
                // to SurfaceTexture to convert to a texture.  We can't control when it
                // appears on-screen, but we can manage the pace at which we release
                // the buffers.
                if (doRender && frameCallback != null) {
                    //Pre-render callback; the implementation sleeps for a while to try to keep the frame rate steady
                    frameCallback.preRender(mBufferInfo.presentationTimeUs);
                }
                //Get the output buffer array (replaced by getOutputBuffer in newer versions)

                ByteBuffer[] decoderOutputBuffers = decoder.getOutputBuffers();
                Log.d(TAG, "decoderOutputBuffers.length:" + decoderOutputBuffers.length);

             
                //Render the buffer at outputBufferIndex in the output buffer array to the Surface; when doRender is true it is drawn to the configured Surface
                decoder.releaseOutputBuffer(outputBufferIndex, doRender);

                if (doRender && frameCallback != null) {
                    //Post-render callback
                    frameCallback.postRender();
                }

                if (doLoop) {
                    Log.d(TAG, "Reached EOS, looping");
                    //To loop, reset the extractor's cursor back to the beginning.
                    extractor.seekTo(0, MediaExtractor.SEEK_TO_CLOSEST_SYNC);
                    inputDone = false;
                    //Reset the decoder to the Flushed state, otherwise a new round of playback cannot start.
                    // You can move back to the Flushed sub-state at any time while
                    // in the Executing state using flush().
                    decoder.flush();    // reset decoder state
                    frameCallback.loopReset();
                }
            }
        }
    }
}           

The code carries both the official comments and the detailed ones I added; here are a few key points:

1. The buyer asks the chef whether a basket is free: first ask MediaCodec whether an input buffer is available:

int inputBufIndex = decoder.dequeueInputBuffer(TIMEOUT_USEC);           

The method definition is:

/**
 * Returns the index of an input buffer to be filled with valid data
 * or -1 if no such buffer is currently available.
 * This method will return immediately if timeoutUs == 0, wait indefinitely
 * for the availability of an input buffer if timeoutUs < 0 or wait up
 * to "timeoutUs" microseconds if timeoutUs > 0.
 * @param timeoutUs The timeout in microseconds, a negative timeout indicates "infinite".
 * @throws IllegalStateException if not in the Executing state,
 *         or codec is configured in asynchronous mode.
 * @throws MediaCodec.CodecException upon codec error.
 */
public final int dequeueInputBuffer(long timeoutUs) {
    int res = native_dequeueInputBuffer(timeoutUs);
    if (res >= 0) {
        synchronized(mBufferLock) {
            validateInputByteBuffer(mCachedInputBuffers, res);
        }
    }
    return res;
}           

TIMEOUT_USEC is the wait timeout. When the returned inputBufIndex is greater than or equal to 0, an input buffer is currently available, and inputBufIndex is the index of that buffer inside MediaCodec. If no usable buffer is found within TIMEOUT_USEC, a value less than 0 is returned, and we try to get a buffer again on the next iteration of the loop.

2. The buyer loads raw pork into the basket and hands it to the chef: each call to MediaExtractor's readSampleData method reads one piece of data into the ByteBuffer, and the ByteBuffer is then handed to MediaCodec for internal processing via MediaCodec's queueInputBuffer.

//Read one sample from the media file into inputBuf; the return value is its size
int chunkSize = extractor.readSampleData(inputBuf, 0);           

Method definition:

/**
 * Retrieve the current encoded sample and store it in the byte buffer
 * starting at the given offset.
 * <p>
 * <b>Note:</b>As of API 21, on success the position and limit of
 * {@code byteBuf} is updated to point to the data just read.
 * @param byteBuf the destination byte buffer
 * @return the sample size (or -1 if no more samples are available).
 */
public native int readSampleData(@NonNull ByteBuffer byteBuf, int offset);           

As Android hard codec MediaCodec analysis - starting from the story of the pork restaurant (1) said, according to the official documentation, for video data you should generally not pass MediaCodec anything that is not a complete frame, unless the data is marked with BUFFER_FLAG_PARTIAL_FRAME. So it can be inferred that readSampleData reads one frame of data, which I will verify later.
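One simple way to eyeball this yourself is to log each sample's size, timestamp and flags while extracting; the SAMPLE_FLAG_SYNC flag marks sync samples (key frames). A minimal sketch, reusing the variables from the loop above:

int size = extractor.readSampleData(inputBuf, 0);
if (size >= 0) {
    int sampleFlags = extractor.getSampleFlags();   // e.g. MediaExtractor.SAMPLE_FLAG_SYNC for key frames
    Log.d(TAG, "sample size=" + size
            + " pts=" + extractor.getSampleTime()
            + " keyFrame=" + ((sampleFlags & MediaExtractor.SAMPLE_FLAG_SYNC) != 0));
}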

The return value is the size of the data read; if it is not negative, a sample was read and the data is passed into MediaCodec:

//Get the presentation time of the current sample
long presentationTimeUs = extractor.getSampleTime();
//Submit the data in the buffer at inputBufIndex to MediaCodec
decoder.queueInputBuffer(inputBufIndex, 0, chunkSize,
        presentationTimeUs, 0 /*flags*/);           

As for the queueInputBuffer method, its doc comment is too long to quote; simply put, it submits chunkSize bytes, starting at offset 0, of the buffer at inputBufIndex to MediaCodec, and specifies the presentation time of this frame, presentationTimeUs. As Analysis of H264 video coding principle - starting from Son Ye-jin's movie (1) once said:

Here, because B frames are introduced, the order in which frames are encoded can differ from the order in which they are played, which is why the two timestamps pts and dts (presentation time and decoding time) exist.

The presentationTimeUs here is the pts: because the order in which frames come out of the decoder may differ from the playback order, presentationTimeUs is needed to specify when each frame should be presented. The last parameter, flags, describes the submitted data and is only needed in special cases; 0 can be passed here.

If readSampleData returns a negative value, the end of the video file has been reached. queueInputBuffer is still called, but with special handling:

decoder.queueInputBuffer(inputBufIndex, 0, 0, 0L,
        MediaCodec.BUFFER_FLAG_END_OF_STREAM);           

An empty frame is sent with the BUFFER_FLAG_END_OF_STREAM flag to tell MediaCodec that the end of the file has been reached and there is no data left; that is, the buyer tells the chef there is no more raw pork.

After this end-of-stream empty frame is sent, no more data can be submitted to the input side until MediaCodec moves back to the Flushed state, or enters the Stopped state and is started again.

That is all for the input side. Then, without pausing, we immediately move to the output side and try to get an output buffer (the customer walks up to the chef and asks whether the pork is done):

int outputBufferIndex = decoder.dequeueOutputBuffer(mBufferInfo, TIMEOUT_USEC);           

If it does not succeed (the chef tells the customer the pork is not done yet), you get one of several decoder status codes; the project code handles the following common ones:

1. MediaCodec.INFO_TRY_AGAIN_LATER: we waited for TIMEOUT_USEC and no decoded data is available yet. Generally this means either the wait was too short, or the input was a B frame and the following P frame is needed as a reference before it can be decoded (for details about B frames and P frames, see Analysis of H264 video coding principle - starting from Son Ye-jin's movie (1)).

2. MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED: the output buffer array is out of date and should be refreshed. This status is obsolete in newer versions, where getOutputBuffer is used to fetch output buffers directly.

3. MediaCodec.INFO_OUTPUT_FORMAT_CHANGED: the MediaFormat of the output data has changed (see the sketch below).
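For completeness, a minimal sketch of how an INFO_OUTPUT_FORMAT_CHANGED result is often handled when decoding to ByteBuffers (when rendering to a Surface, as in this demo, simply reading the new format is usually enough; the keys actually present depend on the codec):

else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    MediaFormat newFormat = decoder.getOutputFormat();
    // Re-read the video parameters the decoder will output from now on.
    int newWidth  = newFormat.getInteger(MediaFormat.KEY_WIDTH);
    int newHeight = newFormat.getInteger(MediaFormat.KEY_HEIGHT);
    Log.d(TAG, "decoder output format changed to " + newWidth + "x" + newHeight);
}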

If decoding is successful, we get the index of the decoded data's buffer within the output buffers, and the information about that buffer is placed in mBufferInfo. Then a very critical line of code is executed:

decoder.releaseOutputBuffer(outputBufferIndex, doRender);           

This renders the buffer at outputBufferIndex of the output buffer array to the Surface (remember the Surface object passed to the configure method). When doRender is true, the frame is drawn to the configured Surface. You can think of this line as roughly analogous to Canvas's draw method in Android: it draws one frame and then recycles the buffer.
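As a side note, on API 21+ there is also a releaseOutputBuffer(index, renderTimestampNs) variant that lets the codec and Surface pipeline handle the render timing themselves; a minimal sketch (renderTimeNs is a hypothetical System.nanoTime()-based timestamp of your choosing):

long renderTimeNs = System.nanoTime();                        // "render as soon as possible" in this example
decoder.releaseOutputBuffer(outputBufferIndex, renderTimeNs);
// The two-argument boolean form used in grafika simply renders immediately when true
// and discards the buffer without rendering when false.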

Summary

Good times are always short. I think the key decoding code has been covered in reasonable detail by now~

To keep the article from getting so long that readers doze off, I will stop here. The next post, Android hard codec MediaCodec analysis - starting from the story of the pork restaurant (3), will explain some key points and caveats that follow from the code in this article. Stay tuned~~

Reference:

Introduction to Video and Audio Data Processing: FLV Package Format Analysis

MediaCodec official documentation

Android decoder MediaCodec parsing

Author: Cat in the Peninsula Tin Box
Link: https://juejin.cn/post/7111340889691127815/
Source: Juejin (Rare Earth Nuggets)
The copyright belongs to the author. For commercial reproduction, please contact the author for authorization; for non-commercial reproduction, please credit the source.
