Friday, October 17, 2008

Problem: Source filter's Stop() fn was called directly without its Pause() fn being called


Problem:
----------------
  The source filter's Stop() fn was called directly; its Pause() fn was never called.
 
 
Solution steps:
------------------------
         For this problem I tried two approaches:

          1. I checked that the source filter output pin's GetMediaType() fn and CheckMediaType() fn succeeded, and that the decoder filter's CheckInputType() fn and CheckTransform() fn succeeded. So there is no problem with the pin connection.
        2. I developed a player application and rendered only the video output pin, and still got the same error.

So the problem is not in the pin connection or in rendering.
 
 
Solution:
------------------
 
Importance of biPlanes in BITMAPINFOHEADER:
-------------------------------------------------------------------------

                In the source filter, GetMediaType() returns a media type.
This media type contains a BITMAPINFOHEADER, and BITMAPINFOHEADER has a biPlanes member.
 If we set it to zero, the filter graph goes directly to the filter's Stop() fn...
   If I modified it to 1, then the video played fine.
The MSDN DirectShow docs prescribe that this member must be set to 1.
 
            So it is better to set biPlanes to 1 in the media type returned from the GetMediaType() fn in our filters.
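As a sketch of the fix (this uses a simplified stand-in struct for the Win32 BITMAPINFOHEADER so it stays self-contained; in the real filter these fields live inside the VIDEOINFOHEADER your GetMediaType() fills in):

```cpp
#include <cstdint>
#include <cstring>

// Simplified stand-in for the Win32 BITMAPINFOHEADER; only the
// members relevant to this note are shown.
struct BitmapHeaderSketch {
    uint32_t biSize;
    int32_t  biWidth;
    int32_t  biHeight;
    uint16_t biPlanes;    // MSDN: must always be set to 1
    uint16_t biBitCount;
};

// Fill the header the way GetMediaType() should. Leaving biPlanes
// at 0 is what sent the graph straight to Stop(); setting it to 1
// is the fix described above.
void InitVideoHeader(BitmapHeaderSketch& bih,
                     int32_t width, int32_t height, uint16_t bitCount)
{
    std::memset(&bih, 0, sizeof(bih));
    bih.biSize     = sizeof(bih);
    bih.biWidth    = width;
    bih.biHeight   = height;
    bih.biPlanes   = 1;        // the one-line fix
    bih.biBitCount = bitCount;
}
```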
 
 
 
 
 
 
 

How to find the frame rate of a 3GP file?

 
 
How to find the frame rate of a 3GP file?
Is there any particular box present in a 3gp file that specifies the frame rate?
 
Solution:
--------------
      
A 3gp file is actually a QuickTime-family file. There is no dedicated frames-per-second field, so you have to
calculate it by dividing the number of frames in the stream by the stream's time
duration.
 

mDurationInSecs = pMediaTrkInfo->trackDuration / lMovieInfo.timescale;

fFrameRate = pMediaTrkInfo->totalNumOfFrames / (*apVidInfoLst)[index].mDurationInSecs;

// Frame rate calculation: if 10 frames are played in 2 secs, then FPS is 5 (10/2)

AvgTimePerFrame calculation

 
// Frame rate calculation
Header file is reftime.h
-----------------------------
const LONGLONG MILLISECONDS = (1000);            // 10 ^ 3
const LONGLONG NANOSECONDS = (1000000000);       // 10 ^ 9
const LONGLONG UNITS = (NANOSECONDS / 100);      // 10 ^ 7
//const REFERENCE_TIME FPS_25  = UNITS / 25;
//const REFERENCE_TIME FPS_30  = UNITS / 30;
//AvgTimePerFrame  = UNITS / frameRate;
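Putting the pieces above together, a minimal sketch (the field names `trackDuration`, `timescale`, and `totalNumOfFrames` are assumed from the parser structures above):

```cpp
#include <cstdint>

// 1 reference-time unit = 100 ns, as in DirectShow's reftime.h.
const long long UNITS = 10000000LL; // 10 ^ 7

// Frame rate = total frames / (trackDuration / timescale), as the
// note above describes for QuickTime/3gp tracks.
double FrameRate(long long totalNumOfFrames,
                 long long trackDuration, long long timescale)
{
    double durationInSecs = (double)trackDuration / (double)timescale;
    return (double)totalNumOfFrames / durationInSecs;
}

// AvgTimePerFrame for the DirectShow VIDEOINFOHEADER, in 100-ns units.
long long AvgTimePerFrame(double frameRate)
{
    return (long long)((double)UNITS / frameRate);
}
```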

 

VS 2005 Dshow Error PVOID64

 Error C2146: syntax error: missing ';' before identifier PVOID64

VS 2005 error with Dshow baseclasses.

C:\Program Files\Microsoft Visual Studio 8\VC\PlatformSDK\include\winnt.h(222) : error

C2146: syntax error : missing ';' before identifier 'PVOID64'


Solution:
--------------

The DirectX Include directory contains an early version of BaseTsd.h which does not include the definition for POINTER_64. You should instead use the version of BaseTsd.h in the Platform SDK, either the one that ships with Visual Studio 2005 (C:\Program Files\Microsoft Visual Studio 8\VC\PlatformSDK\Include\BaseTsd.h) or in an updated Microsoft SDK installation. In order to get the compiler to use the right file, just remove BaseTsd.h from the DirectX Include directory.


 

Error

 
 
Error :
---------
AKY=00080001 PC=03f6dc24(coredll.dll+0x00021c24) RA=29b314b8(tcpip6.dll+0x000414b8)
BVA=390614b8 FSR=000000f5
and explain the scenario

Solution:
------------
 1. This is a memory error: somewhere we are using memory beyond its
boundary.
 For example, I allocated 30 bytes of memory and tried to copy more than 40 bytes with
the memcpy() fn. That causes the crash.
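A hedged sketch of the defensive fix: clamp the copy length to the destination's capacity (the function name is mine, not from the original code):

```cpp
#include <cstring>
#include <cstddef>
#include <algorithm>

// Copy at most dstSize bytes so the destination buffer can never be
// overrun, unlike the raw memcpy() that caused the crash above.
size_t SafeCopy(void* dst, size_t dstSize, const void* src, size_t srcSize)
{
    size_t n = std::min(dstSize, srcSize);
    std::memcpy(dst, src, n);
    return n;
}
```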
 

 

Difference between Normal file and PD streamable clip

Difference between Normal file and PD streamable clip:
----------------------------------------------------------------------------------
 
                   A normal video file can have the file index information at the end (video track, audio track, durations and all other information available at the end; if so, we can play the file only after downloading its entire contents).
                 A Progressive Download (PD) streamable clip has the index information at the beginning. In the case of a 3gp file, the moov atom holds the file index positions and information.
              If the moov atom is at the end of the file, then the file is not streamable using PD.
     An RTSP-streamable 3gp file also has hint tracks at the beginning to convey the file index positions.
                

Assumptions:
------------------

1. QuickTime, mp4, and 3gp files having a hint track can be transferred through http or ftp protocols.
 
2. The moov atom index must be at the beginning of the file, so that the player can play the file before downloading the entire content.

3. A self-contained movie can be streamable.

 

Solutions:
-----------
  Any QuickTime, mp4, or 3gp file can support Progressive Download streaming
if the moov atom comes first, so that the player can play the file before downloading the entire content.

 


 For Example


3gp file atoms are like this:
------------------------------
 ftyp
 moov
 mdat

 We can stream it using PD streaming.

If the 3gp atoms are as follows:

 ftyp
 mdat
 moov


 this indicates the moov atom is at the end of the file, so it can't be streamed using Progressive Download.
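The atom-order check above can be sketched as a walk over the top-level boxes (a minimal sketch: it ignores 64-bit box sizes, where size == 1, and truncated files):

```cpp
#include <cstdint>
#include <cstddef>
#include <string>

// Read a 32-bit big-endian value (MP4/3GP box sizes are big-endian).
uint32_t ReadBE32(const uint8_t* p)
{
    return ((uint32_t)p[0] << 24) | ((uint32_t)p[1] << 16) |
           ((uint32_t)p[2] << 8)  |  (uint32_t)p[3];
}

// Walk the top-level boxes and report whether 'moov' appears
// before 'mdat', i.e. whether the clip is PD streamable.
bool IsPdStreamable(const uint8_t* data, size_t len)
{
    size_t pos = 0;
    while (pos + 8 <= len) {
        uint32_t size = ReadBE32(data + pos);
        std::string type((const char*)data + pos + 4, 4);
        if (type == "moov") return true;   // index first: streamable
        if (type == "mdat") return false;  // media first: not PD streamable
        if (size < 8) break;               // malformed box, stop walking
        pos += size;
    }
    return false;
}
```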

 

PD Stack:
-----------
 Within the PD stack, we wait for the file size and for information about the video and audio tracks.

 If it does not receive the video/audio track info and the file information, it will not proceed to download the content.

 


  

Without calling Pause () fn we are not able to set seek position...

 


Without calling the Pause() fn we are not able to set the seek position
--------------------------------------------------------------------------

Normally we should be able to do SetPosition (set the seek position) on the fly while the video is running.

    In our source filter we were not able to set the seek position. Before setting the seek position, we had to do the

following:

        Pause() of IMediaControl;
 SetPosition();
 Run();

Only then were we able to set the seek position on the source filter.

 

Solution:
--------------
       The reason we are not able to seek while the video is running is that we are not handling
SetPosition properly.


  If the source filter's output pin thread exists, the video is running; then:

          i) Call Stop on the source filter's output pins.

  if (m_paStreams[i]->ThreadExists())
  {
      if (m_paStreams[i]->IsConnected())
      {
          hr = m_paStreams[i]->DeliverBeginFlush();
      }

      if (m_State != State_Stopped)
      {
          hr = m_paStreams[i]->Stop();
      }

      if (m_paStreams[i]->IsConnected())
      {
          hr = m_paStreams[i]->DeliverEndFlush();
      }
  }
 

         ii) Set the seek position.

 iii) Call the Pause() fn on each output pin of the source filter.


 
If the filter graph is stopped, the video renderer does not update the image after a seek operation. To the user, it will appear as if the seek did not happen. To update the image, pause the graph after the seek operation. Pausing the graph cues a new video frame for the video renderer. You can use the IMediaControl::StopWhenReady method, which pauses the graph and then stops it.
 

When does the source filter output pin connect to the decoder?

 
Source Filter:
-------------------------
For the media type, check the following:
1. Only if GetMediaType() and CheckMediaType() succeed will the
source filter output pin connect to any decoder filter.

H.264 Nonstandard clip Crash issue in Quicktime 7.0.3

H.264 Nonstandard clip Crash issue in Quicktime 7.0.3 :
=========================================================

 i) Normally the width and height of H.264 video are multiples of 16.
 ii) A non-standard clip is one where they are not multiples of 16, like 450x360.

 iii) In QuickTime Player, the QuickTime file parser gives the original width and height
  (450x360), and the decoder allocated the output video buffer as
450x360. This leads to a crash. The decoder must allocate the output buffer as
464x368 (width and height aligned up to multiples of 16).

 In H.264, if we encode a video frame as 100x100, the coded width and height may
be 112x112 while the actual video width and height remain 100x100.

 iv) If the decoder allocates only a 450x360 output buffer, the crash might happen
in the decoder or in the renderer which renders the data.

 QuickTime may have solved this issue in the latest version, QuickTime 7.5. Older
versions may allocate the output width and height as 450x360 and crash.
 Our decoder behaved in the same way too.
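The macroblock alignment described above is a simple round-up to the next multiple of 16:

```cpp
// H.264 codes video in 16x16 macroblocks, so output buffers should be
// allocated with width and height rounded up to a multiple of 16.
int AlignTo16(int n)
{
    return (n + 15) & ~15;
}
```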

 

 


 

 

RGB565 video renderer performance and how can we tackle it ?

RGB565 video renderer performance and how can we tackle it ?

Reason:
=========
 In the case of RGB565 scaling and rotation, retrieving the R, G, B components
requires a calculation for each and every pixel, which increases the time taken.
RGB565 scaling and rotation therefore take a lot of time. With YV12, however, we are able to do scaling and rotation at twice the speed of RGB565.
 
Solution:
============
        If we need more performance or more effective execution for RGB565 scaling and rotation, do the following:
             i)  Convert the RGB565 to YV12.
      ii) Do the scaling or rotation in YV12.
      iii) Convert the YV12 back to RGB565.
This will increase the performance.
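The per-pixel work referred to above is the bit unpacking each RGB565 pixel needs (5 bits red, 6 bits green, 5 bits blue):

```cpp
#include <cstdint>

// RGB565 packs red in bits 15-11, green in bits 10-5, blue in bits 4-0.
// Scaling or rotating directly in RGB565 repeats this unpack (and the
// matching repack) for every pixel, which is the overhead noted above.
void Rgb565Unpack(uint16_t px, uint8_t& r, uint8_t& g, uint8_t& b)
{
    r = (px >> 11) & 0x1F;
    g = (px >> 5)  & 0x3F;
    b =  px        & 0x1F;
}
```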
 
 
 

Why WMP calls the IFileSourceFilter's Load fn Twice

Why WMP calls the IFileSourceFilter's Load fn Twice:
-----------------------------------------------------------

If we implement IFileSourceFilter in our filter, WMP calls the IFileSourceFilter's Load() function twice.

Reason:
----------
 WMP calls Load() two times. The first time is just to check whether the filter is a proper DLL or not.
 WMP then calls Load() again a second time. Only on this second call should we create any pins on the filter.
In the case of a source filter, we create the pins only on the second call.
      

WMP player is not loading the Filter DLL in Wifi

WMP player is not loading the Filter DLL in Wifi :
-------------------------------------------------------------------------

1. We enabled Wifi on a mobile device.
Through the Wifi, we try to reach the streaming server.

 We give the rtsp/PD server address in WMP.
In WMP, if we specify http or rtsp, WMP should automatically load the
filter.
         But that does not happen, and our filter is not getting loaded.


Reason:
----------
 The WMP player will not load the filter DLL before checking the host.

WMP Behavior:
-------------
 WMP sends some form of HTTP GET or similar request to the server.

 For example: rtsp://10.203.92.78:8080/1.3gp

 WMP sends a network request to the 10.203.92.78 server; only if the server acknowledges WMP does WMP try to load the filter DLL for the corresponding format.

 In the case of Wifi, WMP may send the HTTP GET command, but the server may not respond to the request, or some form of communication gap happens, so WMP does not get the response.

  But in the case of GPRS streaming, WMP loads the filter DLL and we are able to stream well.
 
Another possible reason: WMP may not be able to detect the connected network intelligently.


Solution:
==========
      Develop our own player application to insert the streaming source filter and render
it...
 


Note:
------- 
  If our player application uses RenderFile(), that might cause the same problem.


 

 

WinCE macro and Log Location

WinCE macro:
----------------------
 
#ifndef WINCE
 fp = fopen("C:\\Test2ByteAligned.Dump", "wb");
#else
 fp = fopen("\\My Documents\\Test2ByteAligned.Dump", "wb");
#endif

RTSP Audio video Sync


Problem:
------------

 Issues in RTSP/live-stream audio and video sync.

 RTSP video-on-demand clips play fine, but live-stream audio and video
sync is not happening. The streaming server sends RTSP as the metadata channel (just for the DESCRIBE,
PLAY, PAUSE and TEARDOWN commands); the streaming server sends the video and audio data over
RTP.

Analysis:
----------

 If there is any audio/video sync issue in RTSP/live-stream RTSP,
it is because of the streaming server.

 

 RTSP streaming servers are specialized servers; they have to do the audio and video
sync. In the case of RTSP streaming, it is enough to render the audio and video data using the
RTP timestamps; there is no strict need to do audio/video sync with RTCP (though if we do RTCP
sync, that is better).
 But RTCP sync timestamp calculation takes much more time because of the
floating-point operations involved in it. So low-end devices such as mobiles will
not use RTCP for sync; it is enough to render the video and audio with the RTP
timestamps.


 
Solution:
----------
 Make the streaming server give its video and audio output already synchronized.
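Rendering off RTP timestamps, as suggested above, needs only integer math (a sketch; 90000 Hz is the usual video clock rate, and the unsigned subtraction handles timestamp wrap-around):

```cpp
#include <cstdint>

// Convert an RTP timestamp to a presentation time in milliseconds,
// relative to the first packet's timestamp. clockRate is the media
// clock (e.g. 90000 for video, 8000 for narrow-band audio).
int64_t RtpToMs(uint32_t ts, uint32_t firstTs, uint32_t clockRate)
{
    uint32_t delta = ts - firstTs;   // unsigned wrap-around is well defined
    return (int64_t)delta * 1000 / clockRate;
}
```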