Compaq Multimedia Services
for OpenVMS Alpha
Programmer's Guide



4.2.8.3 Sending a Data Buffer to a Video Device Channel

Use the videoStreamAddBuffer function to send an empty data buffer for capture and a filled data buffer for playback to a video device channel.

See the description of the videoStreamAddBuffer function in Section 4.7 for more information about sending a data buffer to a video device channel.

4.2.8.4 Starting Streaming on a Video Device Channel

Use the videoStreamStart function to start streaming on a video device channel. The videoStreamStart function call starts capturing video data and copying it to buffers supplied by the calling application, or playing back video data in the buffers supplied by the application.

See the description of the videoStreamStart function in Section 4.7 for more information about starting video streaming.

4.2.8.5 Getting the Current Position of a Video Stream

To obtain the current position of the specified video stream, use the videoStreamGetPosition function.

See the description of the videoStreamGetPosition function in Section 4.7 for more information about getting the current position of a video stream.

4.2.8.6 Using Callback Functions to Manage Video Streaming

To specify a callback function to process messages sent by the device during video streaming, set the CALLBACK_FUNCTION flag in the dwFlags argument and specify the address of the callback function in the dwCallback argument of the videoStreamInit function. The callback function is required for VIDEO_IN and VIDEO_OUT channels.

Note

Multimedia Services for OpenVMS does not support the use of a callback window. Only callback functions are supported.

Callback functions are used to notify the application when the device is finished opening or closing a video device channel, when a buffer is available, or when an error occurs during video streaming.

Note

Applications must not execute any Multimedia Services for OpenVMS API function calls from a callback function.

It is important to write callback functions carefully to adhere to the Multimedia Services for OpenVMS callback scheme. See the description of the videoStreamInit function in Section 4.7 for more information about using callback functions.

4.2.8.7 Returning Video Streaming Errors

Use the videoStreamGetError function to return the most recent error encountered during video streaming.

See the description of the videoStreamGetError function in Section 4.7 for more information about returning video streaming errors.

4.2.8.8 Stopping Streaming on a Video Device Channel

Use the videoStreamStop function to stop data streaming on a video device channel. The buffers are retained by the device for later use.

See the description of the videoStreamStop function in Section 4.7 for more information about stopping streaming on a video device channel.

4.2.8.9 Unpreparing a Data Buffer After Video Streaming

Use the videoStreamUnprepareHeader function to clean up the preparation performed by the videoStreamPrepareHeader function. Call the videoStreamUnprepareHeader function after the video streaming operation is stopped.

See the description of the videoStreamUnprepareHeader function in Section 4.7 for more information about unpreparing a video data buffer.

4.2.8.10 Terminating Video Data Streaming

Use the videoStreamFini function to terminate streaming on a video device channel. This is the last streaming function sent to a video device channel.

If buffers sent with the videoStreamAddBuffer function have not been returned to the application, this operation fails. Use the videoStreamReset function to stop streaming on a video device channel, mark all pending buffers as done, and reset the current position to zero before calling videoStreamFini.

See the descriptions of the videoStreamFini and videoStreamReset functions in Section 4.7 for more information about terminating streaming on a video device channel.
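Taken together, Sections 4.2.8.1 through 4.2.8.10 imply the following call order for a streaming session. This outline is a sketch only; argument lists are omitted, and it assumes no errors occur.

```
videoOpen                  /* open a channel on the device           */
videoStreamInit            /* dwFlags = CALLBACK_FUNCTION            */
videoStreamPrepareHeader   /* once per buffer                        */
videoStreamAddBuffer       /* empty for capture, filled for playback */
videoStreamStart           /* begin streaming                        */
   ...callback signals returned buffers; the application
      resubmits them with videoStreamAddBuffer...
videoStreamStop            /* buffers are retained by the device     */
videoStreamReset           /* mark pending buffers done, position 0  */
videoStreamUnprepareHeader /* once per buffer                        */
videoStreamFini            /* last streaming call on the channel     */
videoClose
```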

4.2.9 Modifying the Rendering Attributes of Video Images

A number of video functions can be used to modify the attributes applied when rendering video data to 8-bit dithered images. These attributes apply when capturing or looping back 8-bit X image data. They do not operate on JPEG, 24-bit X image, or YUV data.

The following attributes can be modified, each through a corresponding pair of get and set functions:

Brightness (videoGetBrightness and videoSetBrightness)
Contrast (videoGetContrast and videoSetContrast)
Mirroring (videoGetMirror and videoSetMirror)
Saturation (videoGetSaturation and videoSetSaturation)
Sharpening (videoGetSharpening and videoSetSharpening)

See the descriptions of these functions in Section 4.7 for more information.

4.2.10 Modifying the Quality of Video Data

You can use the videoGetQuality and videoSetQuality functions to obtain and set the current quality value. These functions control the quality of the compression of the captured data. They do not control the quality of the noncompressed captured data or the playback data, even when the playback data is compressed data, because the quality is encoded within the compressed data already. Various vendors may implement quality differently in their capture device implementations.

4.2.11 Closing a Video Device

Use the videoClose function to close a video device.

See the description of the videoClose function in Section 4.7 for more information about closing a video device.

4.2.12 Handling Errors

The video capture and playback functions return specific error codes. Multimedia Services for OpenVMS provides a function that converts error codes into textual descriptions of the errors. An application must first examine an error code to determine how to proceed, and then display the textual description of the error to users.

Use the videoGetErrorText function to retrieve textual descriptions of specified video input and output errors.

Use the videoStreamGetError function to return the error most recently encountered while streaming video data.

See the descriptions of the videoGetErrorText and videoStreamGetError functions in Section 4.7 for more information about handling video capture and playback errors.

4.3 Video Capture and Playback Header Files

The function prototypes, constants, flags, and data structures that applications can use to access the video capture and playback services are defined in the mme_api.h header file provided with Multimedia Services for OpenVMS. Include this header file in all application programs that use video capture and playback functions.

4.4 Video Capture and Playback Data Structures

This section contains the data structures used by the video capture and playback functions. The data structures are presented in alphabetical order. Each structure is defined, followed by a description of each field.

Note

All video data structures must be allocated with the mmeAllocMem function. All video data buffers must be allocated with the mmeAllocBuffer or the mmeAllocBufferAndGetShminfo function.

See Chapter 2 for more information about these memory allocation functions.

4.4.1 CHANNEL_CAPS Data Structure

The CHANNEL_CAPS data structure is used with the videoGetChannelCaps function to return the capabilities of a video capture device channel to an application.

Example 4-1 shows the CHANNEL_CAPS data structure definition.

Example 4-1 CHANNEL_CAPS Data Structure Definition

typedef struct channel_caps_tag  { 
    DWORD dwFlags;            /* Flags for channel caps info */ 
    DWORD dwSrcRectXMod;      /* Granularity: horiz src rectangle */ 
    DWORD dwSrcRectYMod;      /* Granularity: vert src rectangle */ 
    DWORD dwSrcRectWidthMod;  /* Granularity: src rectangle width */ 
    DWORD dwSrcRectHeightMod; /* Granularity: src rectangle height */ 
    DWORD dwDstRectXMod;      /* Granularity: horiz dest rectangle */ 
    DWORD dwDstRectYMod;      /* Granularity: vert dest rectangle */ 
    DWORD dwDstRectWidthMod;  /* Granularity: dest rectangle width */ 
    DWORD dwDstRectHeightMod; /* Granularity: dest rectangle height */ 
} CHANNEL_CAPS; 

The CHANNEL_CAPS data structure has the following fields:

dwFlags
Specifies flags giving information about the channel. The following flags are defined:

VCAPS_OVERLAY
Indicates the channel is capable of overlay. This flag is used only for VIDEO_EXTERNALOUT channels.

VCAPS_SRC_CAN_CLIP
Indicates that the source rectangle can be set smaller than the maximum dimensions.

VCAPS_DST_CAN_CLIP
Indicates that the destination rectangle can be set smaller than the maximum dimensions.

VCAPS_CAN_SCALE
Indicates that the source rectangle can be a different size than the destination rectangle.

dwSrcRectXMod
Returns the granularity allowed when positioning the source rectangle in the horizontal direction.

dwSrcRectYMod
Returns the granularity allowed when positioning the source rectangle in the vertical direction.

dwSrcRectWidthMod
Returns the granularity allowed when setting the width of the source rectangle.

dwSrcRectHeightMod
Returns the granularity allowed when setting the height of the source rectangle.

dwDstRectXMod
Returns the granularity allowed when positioning the destination rectangle in the horizontal direction.

dwDstRectYMod
Returns the granularity allowed when positioning the destination rectangle in the vertical direction.

dwDstRectWidthMod
Returns the granularity allowed when setting the width of the destination rectangle.

dwDstRectHeightMod
Returns the granularity allowed when setting the height of the destination rectangle.

Some channels can use source and destination rectangles that fall only on two-, four-, or eight-pixel boundaries. Similarly, some channels accept only capture rectangle widths and heights that are multiples of a fixed value. Use the values contained in the CHANNEL_CAPS data structure when calculating and specifying the source and destination rectangles.

4.4.2 VIDEOHDR Data Structure

The VIDEOHDR data structure defines the header used to identify a video data buffer. The following functions use the VIDEOHDR data structure:

videoFrame
videoStreamAddBuffer
videoStreamPrepareHeader
videoStreamUnprepareHeader

Example 4-2 shows the VIDEOHDR data structure definition.

Example 4-2 VIDEOHDR Data Structure Definition

typedef struct videohdr_tag { 
    LPSTR lpData;             /* Pointer to a video data buffer */ 
    DWORD dwBufferLength;     /* Length of the video data buffer */ 
    DWORD dwBytesUsed;        /* Bytes used in the video data buffer */ 
    DWORD dwTimeCaptured;     /* Time (in ms) of frame capture */ 
    DWORD dwUser;             /* Data supplied by the user */ 
    DWORD dwFlags;            /* Flags for video data buffer info */ 
    DWORD dwReserved[4];      /* Reserved for use by device interface */ 
    LPSTR lpData2[2];         /* Pointers to two video data buffers */ 
    DWORD dwBufferLength2[2]; /* Length of the two video data buffers */ 
    DWORD dwBytesUsed2[2];    /* Bytes used in two video data buffers */ 
} VIDEOHDR; 

The VIDEOHDR data structure has the following fields:

lpData
Specifies a pointer to a video data buffer.

dwBufferLength
Specifies the length (in bytes) of the data buffer.

dwBytesUsed
Specifies the number of bytes used in the data buffer.

dwTimeCaptured
Specifies the time (in milliseconds) when the frame was captured relative to the first frame in the stream.

dwUser
Specifies 32 bits of user data.

dwFlags
Specifies flags giving information about the data buffer. The following flags are defined:

VHDR_DONE
Set by the device interface to indicate it is finished with the data buffer and it is returning the buffer to the application.

VHDR_FIELD_EVEN (videoFrame and videoStreamAddBuffer functions only)
Tells the device to use an even field with the buffer.

VHDR_FIELD_ODD (videoFrame and videoStreamAddBuffer functions only)
Tells the device to use an odd field with the buffer.

VHDR_PREPARED
Not supported by Compaq Multimedia Services for OpenVMS Alpha.

VHDR_INQUEUE
Not supported by Compaq Multimedia Services for OpenVMS Alpha.

VHDR_KEYFRAME
Not supported by Compaq Multimedia Services for OpenVMS Alpha.

dwReserved[4]
Reserved for use by the device interface.

lpData2[2]
Specifies pointers to two video data buffers. This is a Compaq extension to the multimedia API specification. It provides dual VIDEO_IN and VIDEO_OUT buffer support for video option modules that support dual buffering. See Appendix B for specific device details.

dwBufferLength2[2]
Specifies the length in bytes of the two lpData2 video data buffers. This is a Compaq extension to the multimedia API specification. It provides dual VIDEO_IN and VIDEO_OUT buffer support for video option modules that support dual buffering. See Appendix B for specific device details.

dwBytesUsed2[2]
Specifies the number of bytes used in the lpData2 video data buffers. This is a Compaq extension to the multimedia API specification. It provides dual VIDEO_IN and VIDEO_OUT buffer support for video option modules that support dual buffering. See Appendix B for specific device details.

In Microsoft Video For Windows (VFW), the VIDEOHDR data structure fields dwFlags and dwBytesUsed are frequently updated to reflect the state of the buffer as it is processed by the driver. On OpenVMS systems, these fields are updated only once when the buffer is returned to the client.

4.5 Using Dual Buffers

Dual buffering can be used during capture to retrieve a JPEG buffer while looping back a dithered buffer for simultaneous viewing, and during playback to play back one buffer while looping back a dithered buffer for simultaneous viewing. In addition, during capture, both buffers can be stored.

For capture, two different formats can be captured: one must be JPEG, and the other can be an uncompressed data format. For playback, the playback buffer is restricted to JPEG, and the loopback buffer must be in an uncompressed data format.

Compaq has added several extensions to the API specification to provide the software support required for dual buffers as follows:

The VIDEOHDR data structure has been extended by three fields: lpData2, dwBufferLength2, and dwBytesUsed2. These fields have been added to support dual buffers. See Section 4.4.2 for more information.

An application can pass two buffers for each videoFrame and videoStreamAddBuffer call. One buffer must be in an uncompressed format, and the other buffer must be in a compressed format; the two buffers cannot be of the same type (that is, both compressed or both uncompressed).

The buffer types are specified in the BITMAPINFOHEADER data structures during the required call to the videoConfigure function. An application can pass two BITMAPINFOHEADER data structures, one in the lpData1 argument and the other in the lpData2 argument.

The order in which the BITMAPINFOHEADER data structures are specified during a call to the videoConfigure function determines the buffer type order expected by the videoFrame and videoStreamAddBuffer functions. For example, if, during a call to the videoConfigure function:

    lpData1 is specified as BICOMP_DECXIMAGEDIB
    lpData2 is specified as JPEG_DIB

then, during subsequent calls to the videoFrame or videoStreamAddBuffer function:

    lpData2[0] is filled with BICOMP_DECXIMAGEDIB data
    lpData2[1] contains JPEG_DIB data (filled or played back)

Note

The lpData2, dwBufferLength2, and dwBytesUsed2 fields are ignored if a single buffer is requested during a call to the videoConfigure function. Instead, the lpData, dwBufferLength, and dwBytesUsed fields are used.

4.6 Video Capture and Playback Function Overview

Table 4-1 provides a summary of the video capture and playback functions.

Table 4-1 Video Capture and Playback Function Summary
Application Function Description
Functions for opening, closing, and communicating with a video capture device:
videoClose Closes the specified video device channel
videoGetErrorText Retrieves a description of the error identified by the error code
videoOpen Opens a channel on the specified video device
Functions for controlling the configuration of a video capture device:
videoConfigure Sets or retrieves a configurable driver option
videoGetChannelCaps Retrieves a description of the capabilities of a channel
videoGetNumDevs Returns the number of video devices installed
Function for operating on a single frame:
videoFrame Transfers a single frame from or to a video device channel
Functions for controlling video capture streaming:
videoStreamAddBuffer Sends a buffer to a video device
videoStreamFini Terminates streaming from the specified device channel
videoStreamGetError Returns the most recent error encountered
videoStreamGetPosition Retrieves the current position of the specified video device channel
videoStreamInit Initializes a video device channel for streaming
videoStreamPrepareHeader Prepares a buffer for video streaming
videoStreamReset Stops streaming on the specified video device channel, returns all video buffers from the driver, and resets the current position to zero
videoStreamStart Starts streaming on the specified video device channel
videoStreamStop Stops streaming on the specified video device channel
videoStreamUnprepareHeader Cleans up the preparation performed by the videoStreamPrepareHeader function
Functions for modifying the rendering attributes of uncompressed rendered images:
videoGetBrightness Obtains the current brightness value
videoGetContrast Obtains the current contrast value
videoGetMirror Obtains the current mirroring value
videoGetSaturation Obtains the current saturation value
videoGetSharpening Obtains the current sharpening value
videoSetBrightness Sets the current brightness value
videoSetContrast Sets the current contrast value
videoSetMirror Sets the current mirroring value
videoSetSaturation Sets the current saturation value
videoSetSharpening Sets the current sharpening value
Functions for modifying the quality of video data:
videoGetQuality Obtains the current quality value
videoSetQuality Sets the current quality value
Functions for specifying and retrieving the hardware port:
videoGetPortNum Obtains the current port number value
videoSetPortNum Sets the current port number value
Functions for specifying and retrieving the video standard type:
videoGetStandard Obtains the current input standard type
videoSetStandard Sets the current input standard type
Functions for obtaining and setting the video field mode:
videoGetFieldMode Obtains the current video field mode
videoSetFieldMode Sets the current video field mode
Function for setting the LUT controls for YUV data on the J300:
videoj300SetVideoOutYUVLut Sets the current LUT controls for YUV data to video output on the J300

