
Analysis: playback of recorded video runs too fast because of quality control

Author: Li Guoshuai (李國帥)

Background:

Notes from several years ago. When playing recorded video files with DirectShow, playback sometimes ran too fast. One cause was the quality-control mechanism.

Problem analysis:

If the file is played at high speed, the video cannot be displayed properly either.

During remote playback the fast-play symptom appears: the actual playback time is much shorter than the intended duration.

Sleep cannot be used to pause playback; with that approach the video also fails to display properly.

An EC_QUALITY_CHANGE event may also be raised. Once it occurs, the video plays too fast until the end of the file, and then the picture stops updating:

[1520] CMyPlayer::OnNotifyGraph eventCode=EC_QUALITY_CHANGE 11

[1520] CVideoOutPin::FillBuffer:Last WasteInMS Need=39 True=6,rtBeginPos=1199888 m_rtlastBeginPos=800000 rtEndPos=1999888,TimeStamp last=39600,now=46800

[1520] CVideoOutPin::FillBuffer:Last WasteInMS Need=80 True=6,rtBeginPos=1999888 m_rtlastBeginPos=1199888 rtEndPos=2799888,TimeStamp last=46800,now=54000

[1520] CVideoOutPin::FillBuffer:Last WasteInMS Need=80 True=6,rtBeginPos=2799888 m_rtlastBeginPos=1999888 rtEndPos=3200000,TimeStamp last=54000,now=57601

Every elapsed time above is 6 ms. For data received over the network, dropping a frame or two is normal, but stopping playback entirely because of it is unreasonable.

Therefore it is best to pace playback by the known frame rate, rather than inferring a frame rate from timestamps and playing by that: the latter is too complicated and hard to control.
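The frame-rate pacing idea can be sketched as follows. This is a minimal, platform-neutral sketch, not code from the player; the helper names FramePresentationMs and WaitBeforeDeliverMs are mine. Given a fixed, known frame rate, each frame's due time is computed directly, and the wait before delivering it is the due time minus the wall-clock time already elapsed.

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical helper: scheduled presentation time of frame n, in ms from
// stream start, for a fixed frame rate. Pacing by a known rate avoids
// guessing the rate from incoming timestamps.
int64_t FramePresentationMs(int64_t frameIndex, int framesPerSecond)
{
    return frameIndex * 1000 / framesPerSecond;
}

// Remaining wait before delivering the frame: due time minus elapsed wall
// time, never negative (a late frame is delivered immediately, not skipped).
int64_t WaitBeforeDeliverMs(int64_t frameIndex, int framesPerSecond,
                            int64_t elapsedMs)
{
    int64_t due = FramePresentationMs(frameIndex, framesPerSecond);
    return due > elapsedMs ? due - elapsedMs : 0;
}
```

For example, at 25 fps the frame interval is 40 ms, so if only 6 ms have elapsed when frame 1 is ready, the pin should wait another 34 ms instead of delivering it immediately.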


EC_QUALITY_CHANGE


The graph is dropping samples, for quality control.

A filter sends this event if it drops samples in response to a quality control message.

It sends the event only when it adjusts the quality level, not for each sample that it drops.

This message normally means that the renderer is dropping samples (i.e., they are arriving too late, meaning the CPU is overloaded).

What if the display quality of the remote renderer is raised?

Do the resolution and frame rate need to be lowered? Using CIF had no effect; still only a single frame is displayed.

When this message is encountered, does the graph need to be rebuilt?

Is it the decoder filter or the renderer filter that is dropping frames? Insert an intermediate filter between them to find out.

Quality control: from the Microsoft DirectX 9.0 CHM documentation

http://www.directshows.ru/htm/defaultqualitycontrol.htm

Quality-Control Management

This is preliminary documentation and subject to change.

Quality control is a mechanism for adjusting the rate of data flow through the filter graph in response to run-time performance.

If a renderer filter is receiving too much data or too little data, it can send a quality message.

The quality message requests an adjustment in the data rate.

By default, quality messages travel upstream from the renderer until they reach a filter that can respond (if any).

An application can also implement a custom quality manager. In that case, the renderer passes quality messages directly to the application's quality manager.

--

Default Quality Control


The DirectShow Base Classes implement some default behaviors for video quality control.

Quality messages start at the renderer. The base class for video renderers is CBaseVideoRenderer, which has the following behavior:

When the video renderer receives a sample, it compares the time stamp on the sample with the current reference time.

The video renderer generates a quality message. In the base class, the Proportion member of the quality message is limited to a range of 500 (50%) to 2000 (200%). Values outside this range could result in abrupt quality changes.
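The clamping behavior described above can be sketched in a few lines. This is a simplified illustration of the rule stated in the documentation, not the actual base-class source:

```cpp
#include <cassert>

// Sketch of the base renderer's clamp: the Proportion member of a quality
// message is kept within 500 (50%) to 2000 (200%), so a single message
// cannot request an abrupt quality change.
long ClampProportion(long proportion)
{
    if (proportion < 500)  return 500;
    if (proportion > 2000) return 2000;
    return proportion;
}
```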

By default, the video renderer sends the quality message to the upstream output pin (the pin connected to its input pin). Applications can override this behavior by calling the SetSink method.

What happens next depends on the upstream filter. Typically, this is a transform filter. The base class for transform filters is CTransformFilter, which uses the CTransformInputPin and CTransformOutputPin classes to implement input and output pins. Together, these classes have the following behavior:

The CTransformOutputPin::Notify method calls CTransformFilter::AlterQuality, a private method on the filter base class.

Derived filters can override AlterQuality to handle the quality message. By default, AlterQuality ignores the quality message.

If AlterQuality does not handle the quality message, the output pin calls CBaseInputPin::PassNotify, a private method on the filter's input pin.

PassNotify passes the quality message to the appropriate place—the next upstream output pin, or a custom quality manager.

Assuming that no transform filter handles the quality message, the message eventually reaches the output pin on the source filter. In the base classes, CBasePin::Notify returns E_NOTIMPL. How a particular source filter handles quality messages depends on the nature of the source. Some sources, such as live video capture, cannot perform meaningful quality control. Other sources can adjust the rate at which they deliver samples.

The following diagram illustrates the default behavior.

The base video renderer implements IQualityControl::Notify, which means you can pass quality messages to the renderer itself. If you set the Proportion member to a value less than 1000, the video renderer inserts a wait period between each frame that it renders, in effect slowing down the renderer. (You might do this to reduce system usage, for example.) For more information, see CBaseVideoRenderer::ThrottleWait. Setting the Proportion member to a value greater than 1000 has no effect.

----

EC_QUALITY_CHANGE: material from the web

Windows 7 - Intercept EC_QUALITY_CHANGE event

Asked by fungi821 on 12-Jul-07 01:06 AM:

Hi,

The documentation says the EC_QUALITY_CHANGE event originates from the video renderer. I have a situation where, whenever the FGM receives such an event, the decoder drops frames constantly.

If I want to intercept the event so that it won't be sent to the decoder, how should I go about achieving this? Write a trans-in-place filter between the renderer and the decoder? Which function do I need to override?

Thanks!

In fact, although the event message and the dropped frames occur at the same time, they are not directly cause and effect.

The quality message that causes the decoder to drop frames is a signal sent directly from the VR to the decoder's Notify function.

You can trap this, but if you just want to avoid dropping frames, you are better off turning off the graph clock (query the graph itself for IMediaFilter and call SetSyncSource(NULL)).

--------------- MSDN: the reference clock

http://msdn.microsoft.com/en-us/library/windows/desktop/dd407202(v=vs.85).aspx

The IBaseFilter interface inherits from IMediaFilter.

IMediaFilter::SetSyncSource method

HRESULT SetSyncSource([in] IReferenceClock *pClock);

pClock [in]

Pointer to the clock's IReferenceClock interface, or NULL. If this parameter is NULL, the filter graph does not use a reference clock, and all filters run as quickly as possible.

IGraphBuilder *pGraph = 0;
IReferenceClock *pClock = 0;

HRESULT hr = CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER, IID_IGraphBuilder, (void **)&pGraph);

// Build the graph.
pGraph->RenderFile(L"C:\\Example.avi", 0);

// Create your clock (CreateMyPrivateClock is an application-defined factory).
hr = CreateMyPrivateClock(&pClock);
if (SUCCEEDED(hr))
{
    // Set the graph clock.
    IMediaFilter *pMediaFilter = 0;
    pGraph->QueryInterface(IID_IMediaFilter, (void**)&pMediaFilter);
    pMediaFilter->SetSyncSource(pClock);
    pClock->Release();
    pMediaFilter->Release();
}

You can also set the filter graph to run with no clock, by calling SetSyncSource with the value NULL. If there is no clock, the graph runs as quickly as possible. With no clock, renderer filters do not wait for a sample's presentation time. Instead, they render each sample as soon as it arrives. Setting the graph to run without a clock is useful if you want to process data quickly, rather than previewing it in real time.

----------------------

Question from sathaadj

Is there a possibility that the MJPEG Decompressor can generate the EC_QUALITY_CHANGE event on its own?

I considered that it could be derived from the CVideoTransformFilter base-class code, which by default contains quality-management code. According to this code (vtrans.cpp), a filter that derives from it should call ShouldSkipFrame() every time before sending a frame to the output pin.

The ShouldSkipFrame() code in vtrans.cpp is written in such a way that it will not consider doing quality control when there are no time stamps set on the incoming samples, so I decided to put a transform filter in between the camera and the MJPEG Decompressor that removes the timestamps from each IMediaSample by calling SetTime(NULL, NULL);

BOOL CVideoTransformFilter::ShouldSkipFrame(IMediaSample *pIn)
{
...
    NotifyEvent(EC_QUALITY_CHANGE, 0, 0);
...
    return m_bSkipping;
}

If I set the timestamps to NULL, then GetTime should return failure, and hence there should not be any frame drops by the MJPEG Decompressor filter. But I still have the same problem. I also coded an AlterQuality() function for my new filter, which sits between the camera and the MJPG Decompressor, but it never received any notification. Does that mean the message is handled inside the MJPEG Decompressor filter itself?

I also tried SetSyncPoint(TRUE) on every sample sent to the MJPEG Decompressor, which also doesn't work; the MJPEG Decompressor still drops all the key frames.

I have also previously tried setting the quality sink to my encoder filter, which likewise gets no notification about quality changes.

Can anyone please help me narrow down this problem?

-- Stefan's answer

If your CPU hits 100%, the VMR7 will most likely send quality messages.

So if you override AlterQuality(), it will be called. I just wrote a message-killer filter which returns S_OK to stop quality messages from being passed further upstream, and it works.
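The "message killer" idea can be sketched without the DirectShow headers. This is a simplified stand-in, assuming invented mock types (Quality_t, KillerPin); the real override would be of IQualityControl::Notify (or AlterQuality in the transform base class), which takes different parameters:

```cpp
// Minimal stand-in types; the real ones come from the DirectShow headers.
typedef long HRESULT_t;
const HRESULT_t S_OK_t = 0;

struct Quality_t { long Proportion; };

// Sketch of a "message killer": a Notify override that accepts the quality
// message and deliberately does NOT pass it on, so no upstream filter
// (e.g. the decoder) ever sees it and starts dropping frames.
struct KillerPin {
    bool forwarded = false;  // stays false: the message is swallowed here
    HRESULT_t Notify(Quality_t /*q*/)
    {
        // Returning S_OK tells the sender the message was handled, which
        // stops the default PassNotify propagation chain.
        return S_OK_t;
    }
};
```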

Here are some questions to consider:

How do you know that the MJPG Decompressor freezes ?

Does this also happen with just the VMR7 connected to the Encoder ?

What transformation performs the Encoder ?

How does the Encoder react to quality messages ?

---- sathaadj's follow-up

Hi Stefan,

Thanks for your reply. I have removed the VMR7 from the graph, so my graph now looks like:

Camera ---> MJPG Decompressor --> Inf Tee Filter --> Encoder --> Network

There are no VMRs in this graph; even so, I am getting this event and the MJPG Decompressor freezes.

Here are the answers to your questions:

>>>> How do you know that the MJPG Decompressor freezes?

I have checked whether I get any data on the encoder's input pin after the EC_QUALITY_CHANGE message; I don't get any buffers after the event occurs.

>>>> Does this also happen with just the VMR7 connected to the Encoder?

>>>> What transformation does the Encoder perform?

My encoder converts the RGB24 input to a network-streamable video (H.264), which is streamed through my network sink filter. I have given NULL media types for my encoder's output pin, which makes it impossible to connect it to the VMR7.

>>>> How does the Encoder react to quality messages?

I just wanted to check whether they originate from the VMR7 filter, so I simply returned S_OK from the encoder's AlterQuality() function; in any case, I did not get the EC_QUALITY_CHANGE event there.

Any sort of help is most welcome... ;)

--------

That sounds as if quality messages are not part of your problem.

Did you try

Camera ---> MJPG Decompressor --> Inf Tee Filter --> Encoder --> Null Renderer ?

and

Camera ---> MJPG Decompressor --> Inf Tee Filter --> Null Renderer ?

That should give you a direction as to which filter causes the problem.

The Infinite Pin Tee filter uses a queue on every output pin, which can switch from direct sample delivery to asynchronous (threaded) delivery. This behaviour depends on the implementation of the ReceiveCanBlock() method of the downstream filters. What do your encoder and network sink return from this?

Regards,

Stefan.

March 25, 2009, 13:20

----------sathaadj

Stefan,

The graph is meant to show the local video as well as send it to the far end. When the camera is connected at VGA to the MJPG Decompressor filter, I am able to see the local video through the VMR7 properly. Only when I start the encoder to send the data out to the network do I see the problem.

There are no problems when the camera runs at QVGA or CIF (as long as CPU utilization stays low). Only at high resolutions, (640*480) and up, do I start to see the problems. I have tested on another system where VGA encoding does not take much CPU, and it works fine there.

Is there a way to find out which filter has sent this EC_QUALITY_CHANGE event, other than setting the quality sink, which won't be passed the event if the filter can handle it on its own?

I still believe that the MJPG decoder does some performance measurement on its own and decides to stop the streaming. Any input on this view? Are there any other hidden interfaces, not documented in MSDN, to support these situations?

>>>> What do your encoder and network sink return from this?

The encoder filter is derived from CTransformFilter, and I haven't coded a ReceiveCanBlock() function. The network sink filter is derived from CBaseFilter, and I have overridden CBaseInputPin::ReceiveCanBlock() to return S_FALSE. When I start the graph I hit the network sink's ReceiveCanBlock() call, so I hope the Infinite Pin Tee filter sees this result and acts on it.

Will the Infinite Tee start a new thread to deliver data to the encoder filter if it sees S_OK from a ReceiveCanBlock() call? I have coded ReceiveCanBlock() in the encoder as well as the network sink to return S_OK and found the same results.

Thanks for your timely reply,

Satheesh.

The final solution:

// Member of CGraphBase, initialized to NULL:
IMediaFilter* mMediaFilter = NULL;

HRESULT CGraphBase::ConnectFilters()
{
//... connect all the filters in the graph, then:
  if (mMediaFilter && m_pVideoRenderer->GetRendererType() == Try_VMR7)
  {
    // Run the graph without a reference clock, so the renderer presents
    // samples as they arrive instead of throttling for quality control.
    mMediaFilter->SetSyncSource(NULL);
  }
  return NOERROR;
//...
}

HRESULT CGraphBase::CreateGraph(HWND inWindow)
{
//... create the graph
  hr |= m_pGraph->QueryInterface(IID_IMediaFilter, (void **)&mMediaFilter);
  ATLASSERT(mMediaFilter != NULL);
//..
}
HRESULT CGraphBase::ReleaseGraph()
{
//...
  SAFE_RELEASE(mMediaFilter);
//.. destroy the graph
}