Compaq Multimedia Services
for OpenVMS Alpha
Run-Time Environment Guide


Glossary


Adaptive Differential Pulse Code Modulation: See ADPCM.

ADPCM: Adaptive Differential Pulse Code Modulation. An encoding format for storing audio information in digital form; it records the difference between successive samples using an adaptive step size, so it requires less space than PCM.

algorithm: In compression software, a specific formula used to compress or decompress video.

API: Application programming interface. A collection of routines used to access functions of software modules or layers.

application programming interface: See API.

audio: The sound component of a multimedia system. Audio frequencies range from 15 Hz to 20,000 Hz.

audio stream: A sequence of frames of audio data.

Audio/Video Interleaved file format: See AVI file format.

AVI file format: Audio/Video Interleaved file format. A RIFF file specification used with applications that capture, edit, and play back audio/video sequences.

bitmap: A representation of characters or graphics by individual pixels arranged in row and column order. Each pixel can be represented by either one bit (for simple black and white) or up to 32 bits (for high-definition color).
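
The storage an uncompressed bitmap requires follows directly from its dimensions and bit depth; for example, a 640 x 480 image at 8 bits/pixel occupies 307,200 bytes. The following sketch shows the calculation in C (illustrative only; the function name is not part of any documented API):

    /* Illustrative only: bytes needed for an uncompressed bitmap.
       Example: 640 x 480 pixels at 8 bits/pixel = 307,200 bytes. */
    unsigned long bitmap_bytes(unsigned width, unsigned height,
                               unsigned bits_per_pixel)
    {
        /* Round each row up to a whole number of bytes. */
        unsigned long row_bytes = ((unsigned long)width * bits_per_pixel + 7) / 8;
        return row_bytes * height;
    }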

blurring: The state in which an image is no longer in focus. No definite edges exist and colors blend together.

brightness: The intensity of light. See also luminance.

chrominance: The color information in an image or video signal, as distinct from luminance; its exact form depends on the particular application.

colormap: The range of colors that can be used for display. It is the same as palette but colormap is the term used in the X Window System. A colormap is a list of red, green, and blue values. The values are referenced by an index into the list: entry 0 might be pink, entry 1 might be blue, entry 2 might be black, and so forth.

X image data does not contain the actual color information but it does contain the index. This is either to reduce the amount of information contained in the image or because a display cannot handle true color. An 8 bits/pixel image can use 256 colors (2^8) and a 4 bits/pixel image can use 16 colors (2^4). A 24 bits/pixel image is true color and does not use a colormap.
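
The following sketch shows how indexed image data is resolved through a colormap; the structure and function names are illustrative and are not part of the X Window System or Multimedia Services API:

    /* Illustrative colormap lookup: each pixel stores only an index, and
       the colormap supplies the actual red, green, and blue values. */
    struct color { unsigned char red, green, blue; };

    struct color lookup_pixel(const unsigned char *image_8bpp,   /* 8 bits/pixel image data */
                              const struct color colormap[256],  /* 256-entry colormap      */
                              unsigned width, unsigned x, unsigned y)
    {
        unsigned char index = image_8bpp[y * width + x];  /* index into the colormap     */
        return colormap[index];                           /* e.g., entry 0 might be pink */
    }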

compress: To reduce, using a variety of computer algorithms and other techniques, the amount of data required to accurately represent a video image or segment, thereby reducing the amount of space required to store it. Most types of compression, such as JPEG, cause some data to be lost. Compare with decompress.

contrast: The difference between the highest level and the lowest level of luminance in a picture.

data type: The characteristic of the data being stored. For example, numeric, alphanumeric, dates, and logical (true/false) are typical data types. When data is assigned a particular type, it can be manipulated only according to the rules of that type.

decompress: To reverse the procedure conducted by compression software, thereby returning compressed data to its original size and condition. Compare with compress.

device-independent bitmap format: See DIB format.

DIB format: Device-independent bitmap format. A file format that represents bitmap images in a form that is independent of a specific device. Bitmaps can be represented as 1, 4, and 8 bits/pixel, with a palette containing colors represented in 24 bits. Bitmaps can also be represented as 24 bits without a palette and in run-length encoded format.
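
For reference, the following sketch shows the layout of the DIB information header, using plain C types in place of the Windows DWORD/LONG/WORD typedefs (assumed here to be 32, 32, and 16 bits wide); the field names follow the widely published Microsoft definition:

    /* Sketch of the DIB bitmap information header.  A 1, 4, or 8 bits/pixel
       DIB is followed by its palette; a 24 bits/pixel DIB has no palette. */
    typedef struct {
        unsigned int   biSize;           /* size of this header in bytes        */
        int            biWidth;          /* image width in pixels               */
        int            biHeight;         /* image height in pixels              */
        unsigned short biPlanes;         /* number of planes; must be 1         */
        unsigned short biBitCount;       /* 1, 4, 8, or 24 bits/pixel           */
        unsigned int   biCompression;    /* 0 = uncompressed; 1, 2 = run-length */
        unsigned int   biSizeImage;      /* image size in bytes                 */
        int            biXPelsPerMeter;  /* horizontal resolution               */
        int            biYPelsPerMeter;  /* vertical resolution                 */
        unsigned int   biClrUsed;        /* number of palette entries used      */
        unsigned int   biClrImportant;   /* number of important palette entries */
    } dib_header;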

edge: A point of transition. For images, an edge is denoted by a change in color. An in-focus image has clearly defined edges. See also sharpening.

frame: A single, complete picture in a video or film recording. A video frame consists of two interlaced fields of either 525 lines (NTSC) or 625 lines (PAL and SECAM), running at 30 frames per second (NTSC) or 25 frames per second (PAL and SECAM). Movie theater films run at 24 frames per second.

hue: The distinctive characteristics of a color that allow it to be assigned a position in the spectrum; a particular shade of a color.

image: The computerized representation of a picture or a graphic.

International Organization for Standardization: See ISO.

ISO: International Organization for Standardization. A worldwide group responsible for establishing and managing various standards committees and expert groups, including several image-compression standards.

Joint Photographic Experts Group: See JPEG.

JPEG: Joint Photographic Experts Group. A working committee under the auspices of the International Organization for Standardization (ISO) that is defining a universal standard for the digital compression and decompression of still images for use in computer systems.

luminance: The measure of the energy that produces the sensation of brightness.

mirroring: Rotating an image 180° around the vertical axis at the center of the image so that left and right are interchanged.

motion video: A sequence of images (frames) that are displayed so rapidly that the viewer sees the images as a continuously moving picture.

multimedia: Pertaining to the delivery of information that combines different content formats, such as motion video, audio, still images, graphics, animation, text, and so forth.

MPC: Multimedia PC. A personal computer that meets a minimum set of specifications and as a result is considered to be multimedia capable. One of the requirements for an MPC is a CD-ROM drive with a data transfer rate of at least 150 KB/s.

Multimedia PC: See MPC.

National Television Standards Committee: See NTSC.

NTSC: National Television Standards Committee. A committee of the Electronics Industries Association (EIA) that prepared the standard of specifications for commercial color broadcasting, which was approved by the Federal Communications Commission (FCC) in December 1953. NTSC is the TV standard for the U.S., Japan, and other countries. See also PAL format, SECAM format.

PAL format: Phase Alternation Line format. One of the two European video standards. The PAL format uses interlaced scans with 25 frames per second and 625 lines per screen. See also frame, NTSC, SECAM format.

palette: The range of colors that can be used for display. See also colormap.

PCM: Pulse code modulation. The most common method of encoding an analog signal into a digital bit stream. PCM is a digitization technique, not a universally accepted standard.
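
A minimal sketch of the PCM idea follows: the analog voltage is sampled at regular intervals and each sample is quantized to a fixed number of bits. The 16-bit sample width below is an illustrative choice, not a requirement, and the function name is not part of any documented API:

    /* Illustrative PCM encoding: sample an analog signal at a fixed rate
       and quantize each sample to a signed 16-bit integer. */
    void pcm_encode(double (*signal)(double t),   /* analog source, range -1.0 to +1.0 */
                    short *samples, int count, double sample_rate)
    {
        int i;
        for (i = 0; i < count; i++) {
            double v = signal(i / sample_rate);   /* sample at regular intervals */
            if (v > 1.0)  v = 1.0;                /* clip out-of-range values    */
            if (v < -1.0) v = -1.0;
            samples[i] = (short)(v * 32767.0);    /* quantize to 16 bits         */
        }
    }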

Phase Alternation Line format: See PAL format.

pixel: The shortened form of picture element. A pixel is the minimum raster display element, represented as a point with a specified color or intensity level. One way to measure picture resolution is by the number of pixels used to create images.

pulse code modulation: See PCM.

quality: The clarity of an image resulting from the amount of compression it undergoes; greater compression generally yields lower quality.

resolution: The number of pixels per unit of area. A display with a finer grid contains more pixels and thus has a higher resolution capable of reproducing more detail in an image.

Resource Interchange File Format: See RIFF.

RGB: Red, green, and blue. A type of computer color display output signal consisting of separately controllable red, green, and blue signals.

RIFF: Resource Interchange File Format. A platform-independent multimedia specification (published by Microsoft and others in 1990) that allows audio, image, animation, and other multimedia elements to be stored in a common format.
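
The following sketch shows the chunk layout that RIFF-based files (such as AVI and WAVE files) share; the structure name is illustrative:

    /* A RIFF file is a tree of chunks.  Every chunk begins with a
       four-character code and a 32-bit little-endian size, followed by
       the chunk data, padded if necessary to an even number of bytes. */
    struct riff_chunk_header {
        char         ckID[4];    /* four-character code, e.g. "RIFF" or "fmt " */
        unsigned int ckSize;     /* number of data bytes that follow           */
    };
    /* The outermost chunk has the ID "RIFF"; its data begins with a form
       type such as "WAVE" or "AVI ", followed by the nested chunks. */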

sample: In the context of audio, the digital representation of the analog voltage.

saturation: The degree of purity of a color, as measured by its freedom from mixture with white; the intensity of a hue.

SECAM format: Séquentiel Couleur à Mémoire (sequential color with memory) format. The French color TV system, also adopted by Russia. The basis of operation is the sequential recording of primary colors in alternate lines. See also NTSC, PAL format.

Séquentiel Couleur à Mémoire: See SECAM format.

sharpening: The emphasis of edge information in an image. Compare with blurring.

SMPTE format: Society of Motion Picture and Television Engineers format. A video encoding format developed by the Society of Motion Picture and Television Engineers.

Society of Motion Picture and Television Engineers format: See SMPTE format.

S-video: A type of video signal used in the Hi8 and S-VHS video tape formats. It transmits luminance and color portions separately using multiple wires, thus avoiding the NTSC encoding process and its inevitable loss of picture quality.

video: See motion video.

video capture: The input of video data through the input jack of a video capture device. Video capture is used to input video frames from a video camera, video player, television tuner, or other video source.

video playback: The output of video data through the output jack of a video playback device. Video playback is used to output video frames to an external monitor, a VCR, or other video playback device.

Waveform Audio (WAVE) file format: See WAVE file format.

WAVE file format: Waveform Audio file format. A RIFF file specification used with applications that record and play back waveform audio data.
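
As an illustration, the "fmt " chunk inside a WAVE file's RIFF container describes PCM data as sketched below; the field names follow the commonly published layout, and the structure name is illustrative:

    /* Contents of the "fmt " chunk for PCM data in a WAVE file. */
    struct wave_format_pcm {
        unsigned short wFormatTag;       /* 1 = PCM                          */
        unsigned short nChannels;        /* 1 = mono, 2 = stereo             */
        unsigned int   nSamplesPerSec;   /* e.g. 11025, 22050, or 44100 Hz   */
        unsigned int   nAvgBytesPerSec;  /* nSamplesPerSec * nBlockAlign     */
        unsigned short nBlockAlign;      /* nChannels * (wBitsPerSample / 8) */
        unsigned short wBitsPerSample;   /* 8 or 16                          */
    };
    /* The audio samples themselves follow in a separate "data" chunk. */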

YUV format: An encoding technique in which luminance and chrominance are encoded separately. The YUV format is as follows: Y1 U12 Y2 V12, where Y1 and Y2 specify the luminance of two adjacent pixels and U12 and V12 specify the chrominance shared by both.
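
An illustrative sketch of unpacking this interleaved layout follows; the function and variable names are assumptions, not part of any documented API:

    /* In the Y1 U12 Y2 V12 layout, every four bytes describe two pixels:
       each pixel has its own luminance value (Y1, Y2) and the pair shares
       one set of chrominance values (U12, V12). */
    void yuv_unpack_pair(const unsigned char packed[4],
                         unsigned char y[2], unsigned char *u, unsigned char *v)
    {
        y[0] = packed[0];   /* Y1: luminance of the first pixel  */
        *u   = packed[1];   /* U12: shared chrominance component */
        y[1] = packed[2];   /* Y2: luminance of the second pixel */
        *v   = packed[3];   /* V12: shared chrominance component */
    }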

