Symbian
Symbian Developer Library

SYMBIAN OS V9.4


#include <devvideoconstants.h>
This item is not part of the S60 5th Edition SDK

Enum TDevVideoPanicCodes

TDevVideoPanicCodes

Description

DevVideo Panic Codes

EDevVideoPanicPreConditionViolation

A pre-condition on a method has been violated.

EDevVideoPanicPostConditionViolation

A post-condition on a method has been violated.

EDevVideoPanicInvalidHwDeviceId

An invalid hardware device ID has been supplied.
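
For illustration, a hardware device implementation could use these codes together with the KDevVideoPanicCategory literal defined later on this page. The following sketch panics the current thread when a pre-condition check fails; the AssertPreCondition() helper is hypothetical and not part of the API.

#include <e32std.h>
#include <devvideoconstants.h>

// Hypothetical helper: panic the calling thread with the DevVideo
// category when a pre-condition check fails.
static void AssertPreCondition(TBool aCondition)
    {
    if (!aCondition)
        {
        User::Panic(KDevVideoPanicCategory, EDevVideoPanicPreConditionViolation);
        }
    }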

Enum TImageDataFormat

TImageDataFormat

Description

Specifies the data format used for an uncompressed picture. The values are bit patterns that can be combined with other format definition constants.

ERgbRawData

Raw RGB picture data in a memory area.

ERgbFbsBitmap

RGB picture data stored in a Symbian OS CFbsBitmap object.

EYuvRawData

Raw YUV picture data stored in a memory area. The data storage format depends on the YUV sampling pattern and data layout used.
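
As a sketch of what the bit-pattern property allows (the mask parameter and helper below are illustrative, not part of the API), a set of supported uncompressed formats can be combined into a single mask and tested with bitwise operations.

#include <devvideoconstants.h>

// Illustrative only: aSupportedFormats is a combined mask, for example
// (ERgbFbsBitmap | EYuvRawData). Membership is a simple bitwise test.
static TBool SupportsYuvRawData(TUint32 aSupportedFormats)
    {
    return (aSupportedFormats & EYuvRawData) != 0;
    }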

Enum TRgbFormat

TRgbFormat

Description

RGB uncompressed image format alternatives.

ERgb16bit444

16-bit RGB data format with four bits per component. The data format is the same as used in Symbian EColor4K bitmaps, with each pixel using two bytes with the bit layout [ggggbbbb xxxxrrrr] where "x" indicates unused bits. (This corresponds to "XRGB" 16-bit little-endian halfwords)

ERgb16bit565

16-bit RGB data format with five bits per component for red and blue and six bits for green. The data format is the same as used in Symbian EColor64K bitmaps, with each pixel using two bytes with the bit layout [gggbbbbb rrrrrggg] (This corresponds to "RGB" 16-bit little-endian halfwords)

ERgb32bit888

32-bit RGB data format with eight bits per component. This data format is the same as is used in Symbian EColor16MU bitmaps. The bit layout is [bbbbbbbb gggggggg rrrrrrrr xxxxxxxx] where "x" indicates unused bits. (This corresponds to "XRGB" 32-bit little-endian words)

EFbsBitmapColor4K

CFbsBitmap object with EColor4K data format.

EFbsBitmapColor64K

CFbsBitmap object with EColor64K data format.

EFbsBitmapColor16M

CFbsBitmap object with EColor16M data format.

EFbsBitmapColor16MU

CFbsBitmap object with EColor16MU data format.
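
For illustration, the hypothetical helper below (not part of the API, and assuming 8-bit input components) packs R, G and B values into one ERgb16bit565 pixel; when the returned halfword is written to memory as a little-endian value, the byte layout is the [gggbbbbb rrrrrggg] pattern described above.

#include <e32def.h>

// Illustrative only: pack 8-bit R, G and B components into a 16-bit
// RGB565 value (RRRRRGGGGGGBBBBB as an integer, [gggbbbbb rrrrrggg]
// as little-endian bytes in memory).
static TUint16 PackRgb565(TUint aRed, TUint aGreen, TUint aBlue)
    {
    return static_cast<TUint16>(((aRed   >> 3) << 11) |
                                ((aGreen >> 2) << 5)  |
                                 (aBlue  >> 3));
    }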

Enum TYuvSamplingPattern

TYuvSamplingPattern

Description

YUV (YCbCr) uncompressed image data sampling pattern.

EYuv420Chroma1

4:2:0 sampling format. 4 luminance sample positions correspond to one chrominance sample position. The four luminance sample positions are on the corners of a square. The chrominance sample position is vertically half-way between the luminance sample positions and horizontally aligned with the left side of the square. This is the MPEG-2 and the MPEG-4 Part 2 sampling pattern.

EYuv420Chroma2

4:2:0 sampling format. 4 luminance sample positions correspond to one chrominance sample position. The four luminance sample positions are on the corners of a square. The chrominance sample position is vertically and horizontally in the middle of the luminance sample positions. This is the MPEG-1 sampling pattern.

EYuv420Chroma3

4:2:0 sampling format. 4 luminance sample positions correspond to one chrominance sample position. The four luminance sample positions are on the corners of a square. The chrominance sample position is co-located with the top-left corner of the square. This sampling format is one of the options in Annex E of H.264 | MPEG-4 AVC.

EYuv422Chroma1

4:2:2 sampling format. 2 luminance sample positions correspond to one chrominance sample position. The luminance sample positions reside on the same pixel row. The chrominance sample position is co-located with the left one of the luminance sample positions. This is the MPEG-2 4:2:2 sampling pattern.

EYuv422Chroma2

4:2:2 sampling format. 2 luminance sample positions correspond to one chrominance sample position. The luminance sample positions reside on the same pixel row. The chrominance sample position is in the middle of the luminance sample positions. This is the MPEG-1 4:2:2 sampling pattern.
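
As an illustration of the arithmetic implied by these patterns, the hypothetical helper below (not part of the API, and assuming even picture dimensions) gives the number of samples in each chrominance component for a picture with the given luminance dimensions.

#include <e32def.h>
#include <devvideoconstants.h>

// Illustrative only: number of samples per chrominance component for a
// picture of aWidth x aHeight luminance samples.
static TUint ChromaSampleCount(TYuvSamplingPattern aPattern, TUint aWidth, TUint aHeight)
    {
    switch (aPattern)
        {
        case EYuv420Chroma1:
        case EYuv420Chroma2:
        case EYuv420Chroma3:
            return (aWidth / 2) * (aHeight / 2);   // one chroma sample per 2x2 luma block
        case EYuv422Chroma1:
        case EYuv422Chroma2:
            return (aWidth / 2) * aHeight;         // one chroma sample per 1x2 luma pair
        default:
            return 0;
        }
    }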

Enum TYuvDataLayout

TYuvDataLayout

Description

Defines the YUV data layout in a decoded picture.

EYuvDataPlanar

The data is stored in planar mode. The memory buffer contains first all Y component data for the whole picture, followed by U and V, making the data format Y00Y01Y02Y03...U0...V0... For YUV 4:2:0 data, this is the same data format as EFormatYUV420Planar in the Onboard Camera API.

EYuvDataInterleavedLE

The data is stored in interleaved mode, with all components interleaved in a single memory block. Interleaved layout is only supported for YUV 4:2:2 data. The data byte order is Y1VY0U, corresponding to "UY0VY1" little-endian 32-bit words. This is the same data format as EFormatYUV422Reversed in the Onboard Camera API.

EYuvDataInterleavedBE

The data is stored in interleaved mode, with all components interleaved in a single memory block. Interleaved layout is only supported for YUV 4:2:2 data. The data byte order is UY0VY1, corresponding to "UY0VY1" big-endian 32-bit words. This is the same data format as EFormatYUV422 in the Onboard Camera API.

EYuvDataSemiPlanar

The data is stored in semi-planar mode. The memory buffer contains first all Y component data for the whole picture, followed by the U and V components, which are interleaved, making the data format Y00Y01Y02Y03...U0V0U1V1... For YUV 4:2:0 data, this is the same data format as EFormatYUV420SemiPlanar in the Onboard Camera API.
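
For illustration, the hypothetical helper below (not part of the API) computes where the first U and V samples start for the planar and semi-planar 4:2:0 layouts, assuming 8-bit samples, even picture dimensions and no padding between planes.

#include <devvideoconstants.h>

// Illustrative only: byte offsets of the first U and V samples within
// a YUV 4:2:0 buffer for the planar and semi-planar layouts.
static void Yuv420ChromaOffsets(TYuvDataLayout aLayout, TUint aWidth, TUint aHeight,
                                TUint& aUOffset, TUint& aVOffset)
    {
    const TUint lumaSize   = aWidth * aHeight;
    const TUint chromaSize = (aWidth / 2) * (aHeight / 2);
    if (aLayout == EYuvDataSemiPlanar)
        {
        // Y plane first, then interleaved chroma: U0V0U1V1...
        aUOffset = lumaSize;
        aVOffset = lumaSize + 1;
        }
    else if (aLayout == EYuvDataPlanar)
        {
        // Y plane first, then the whole U plane, then the V plane.
        aUOffset = lumaSize;
        aVOffset = lumaSize + chromaSize;
        }
    else
        {
        // The interleaved layouts are only defined for 4:2:2 data and
        // are not handled in this sketch.
        aUOffset = 0;
        aVOffset = 0;
        }
    }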

Enum TPictureEffect

TPictureEffect

Description

Defines the picture effect used for an input picture. Please refer to ITU-T H.264 | ISO/IEC MPEG-4 AVC for the definitions of the transition effects.

EEffectNone

No effect.

EEffectFadeFromBlack

Fade from black.

EEffectFadeToBlack

Fade to black.

EEffectUnspecifiedThroughConstantColor

Unspecified transition from or to constant colour.

EEffectDissolve

Dissolve.

EEffectWipe

Wipe.

EEffectUnspecifiedMixOfTwoScenes

Unspecified mixture of two scenes.

Enum TRgbRange

TRgbRange

Description

Defines the data value range used for RGB data. Used for determining the correct color space conversion factors.

ERgbRangeFull

The RGB data uses the full 8-bit range of [0…255].

ERgbRange16to235

The RGB data uses the nominal range of [16…235]. Individual samples can still contain values beyond that range; the rest of the 8-bit range is used for headroom and footroom.
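
As a sketch of the difference between the two ranges (these helpers are illustrative, not part of the API; real colour space conversion folds this scaling into its conversion coefficients), a full-range sample can be mapped to the nominal range and back as follows.

#include <e32def.h>

// Illustrative only: map a full-range sample in [0...255] to the
// nominal range [16...235] used by ERgbRange16to235.
static TUint8 FullToNominal(TUint aFullRange)
    {
    return static_cast<TUint8>(16 + (aFullRange * 219 + 127) / 255);
    }

// Illustrative only: map a nominal-range sample back to [0...255],
// clamping any headroom or footroom values.
static TUint NominalToFull(TUint8 aNominal)
    {
    TInt full = ((static_cast<TInt>(aNominal) - 16) * 255 + 109) / 219;
    if (full < 0)   full = 0;
    if (full > 255) full = 255;
    return static_cast<TUint>(full);
    }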

Enum TVideoDataUnitType

TVideoDataUnitType

Description

Defines possible data unit types for encoded video data. The data unit types are used both for encoded video input for playback and for encoded video output from recording.

EDuCodedPicture

Each data unit is a single coded picture.

EDuVideoSegment

Each data unit is a coded video segment. A coded video segment is a part of the coded video data that forms an independently decodable part of a coded video frame. For example, a video packet in MPEG-4 Part 2 and a slice in H.263 are coded video segments.

EDuSeveralSegments

Each data unit contains an integer number of video segments consecutive in decoding order, possibly more than one. The video segments shall be a subset of one coded picture.

EDuArbitraryStreamSection

Each data unit contains a piece of raw video bitstream, not necessarily aligned at any headers. The data must be written in decoding order. This data unit type can be used for playback if the client does not have information about the bitstream syntax, and just writes data in random-sized chunks. For recording this data unit type is useful if the client can handle arbitrarily split data units, giving the encoder maximum flexibility in buffer allocation. For encoded data output, each data unit must still belong to exactly one output picture.

Enum TVideoDataUnitEncapsulation

TVideoDataUnitEncapsulation

Description

Defines possible encapsulation types for coded video data units. The encapsulation information is used both for encoded video input for playback and for encoded video output from recording.

EDuElementaryStream

The coded data units can be chained in a bitstream that can be decoded. For example, MPEG-4 Part 2 elementary streams, H.263 bitstreams, and H.264 | MPEG-4 AVC Annex B bitstreams fall into this category.

EDuGenericPayload

The coded data units are encapsulated in a general-purpose packet payload format whose coded data units can be decoded independently but cannot be generally chained into a bitstream. For example, the Network Abstraction Layer Units of H.264 | MPEG-4 AVC fall into this category.

EDuRtpPayload

The coded data units are encapsulated in RTP packet payload format. The RTP payload header may contain codec-specific items, such as a redundant copy of a picture header in the H.263 payload specification RFC2429.

Enum THrdVbvSpecification

THrdVbvSpecification

Description

Defines the HRD/VBV specification used in a stream.

EHrdVbvNone

No HRD/VBV specification.

EHrdVbvCodingStandard

The HRD/VBV specification in the corresponding coding standard.

EHrdVbv3GPP

Annex G of 3GPP TS 26.234 Release 5.

Enum TPrePostProcessType

TPrePostProcessType

Description

Defines the pre-processor and post-processor types available in the system. One pre-processor or post-processor can implement multiple operations simultaneously, and thus the types are defined as bit values that can be combined as a bitfield.

EPpInputCrop

Input cropping, used for pan-scan cropping in video playback and digital zoom in video recording. Pan-scan cropping is useful, for example, for displaying arbitrary-sized pictures with codecs that only support image dimensions that are multiples of 16 pixels.

EPpMirror

Horizontal mirroring, flips the image data around a vertical line in its center.

EPpRotate

Picture rotation, supports rotation by 90 or 180 degrees, clockwise and anticlockwise.

EPpScale

Picture scaling to a new size, includes both upscaling and downscaling. The supported scaling types and scale factors depend on the pixel processor.

EPpOutputCrop

Crops the picture to a final output rectangle.

EPpOutputPad

Pads the output picture to a defined size. Used in video recording to pad pictures to suit the encoder input requirements.

EPpYuvToRgb

YUV to RGB color space conversion. Supported only for video playback.

EPpRgbToYuv

RGB to YUV color space conversion. Supported only for video recording.

EPpYuvToYuv

YUV to YUV data format conversion. Supported only for video recording.

EPpNoiseFilter

Noise filtering. Noise filtering is typically used to enhance the input picture from the camera, and is usually only supported for video recording.

EPpColorEnhancement

Color enhancement. Color enhancement is typically used to enhance the input picture from the camera, and is usually only supported for video recording.

EPpFrameStabilisation

Frame stabilisation. Supported only for video recording.

EPpDeblocking

Deblocking is typically used to remove artefacts from the output picture that result from high compression or a noisy input signal. Only supported for video playback.

EPpDeringing

Deringing is typically used to remove artefacts from the output picture that result from a noisy input signal corrupting motion estimates. Only supported for video playback.

EPpCustom

Custom hardware device specific processing.
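
For illustration (the helper below is hypothetical, not part of the API), a processor that crops, scales and converts YUV to RGB could describe itself with the combined bitfield EPpInputCrop | EPpScale | EPpYuvToRgb, and a client can test for an individual operation with a bitwise AND.

#include <devvideoconstants.h>

// Illustrative only: aCombinedTypes is a bitfield of TPrePostProcessType
// values, for example (EPpInputCrop | EPpScale | EPpYuvToRgb).
static TBool IncludesOperation(TUint32 aCombinedTypes, TPrePostProcessType aOperation)
    {
    return (aCombinedTypes & aOperation) != 0;
    }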

Enum TDitherType

TDitherType

Description

Dithering types.

EDitherNone

No dithering.

EDitherOrdered

Ordered dither.

EDitherErrorDiffusion

Error diffusion dither.

EDitherOther

Other hardware device specific dithering type.

Enum TRotationType

TRotationType

Description

Rotation types for pre-processors and post-processors.

ERotateNone

No rotation.

ERotate90Clockwise

Rotate the picture 90 degrees clockwise.

ERotate90Anticlockwise

Rotate the picture 90 degrees anticlockwise.

ERotate180

Rotate the picture 180 degrees.

Enum TBitrateControlType

TBitrateControlType

Description

Defines possible encoding bit-rate control modes.

EBrControlNone

The encoder does not control the bit-rate, but uses the specified target picture quality and picture rate as such. The coded data stream must still remain compliant with the standard and buffer settings in use, if any, and thus HRD/VBV settings can limit the possible bit-rate.

EBrControlStream

The encoder controls the coded bit-rate of the stream. The caller indicates target bit-rate, target picture quality, target frame rate, spatial-temporal trade-off, and latency-quality trade-off.

EBrControlPicture

The encoder controls the coded bit-rate of each picture. The caller gives the target number of bits per frame. Each given input frame is coded. This type of operation is only applicable with memory-buffer-based input.

Enum TScalabilityType

TScalabilityType

Description

Defines the scalability type for a single bit-rate scalability layer.

EScalabilityTemporal

The layer uses temporal scalability. Using the layer increases the picture rate.

EScalabilityQuality

The layer uses quality scalability. Using the layer improves picture quality.

EScalabilitySpatial

The layer uses spatial scalability. Using the layer increases picture resolution.

EScalabilityFineGranularity

The layer is a fine-granularity scalability layer. In fine granularity scalability, the output quality increases gradually as a function of decoded bits from the enhancement layer.

EScalabilityQualityFG

The layer is a fine-granularity quality scalability layer.

Enum TErrorControlStrength

TErrorControlStrength

Description

Forward error control strength used for an unequal error protection level. Other values between EFecStrengthNone and EFecStrengthHigh can also be used; the encoder will round the values to the levels it supports.

EFecStrengthNone

No error control.

EFecStrengthLow

Low error control strength.

EFecStrengthNormal

Normal error control strength.

EFecStrengthHigh

High error control strength.
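
As a sketch of such an intermediate value (illustrative only; the numeric spacing of the levels is defined in devvideoconstants.h), a strength half-way between two named levels could be requested as follows.

#include <devvideoconstants.h>

// Illustrative only: a strength between two named levels; the encoder
// rounds such a value to the nearest level it supports.
const TErrorControlStrength KCustomFecStrength =
    static_cast<TErrorControlStrength>((EFecStrengthLow + EFecStrengthNormal) / 2);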

Enum TInLayerScalabilityType

TInLayerScalabilityType

Description

Defines the scalability type for in-layer bit-rate scalability.

EInLScalabilityTemporal

Temporal scalability, such as B-pictures.

EInLScalabilityOther

Other scalability type.

Enum TFramePortion

TFramePortion

Description

Defines what part of a frame is contained within a video buffer.

EFramePortionUnknown

The frame portion is unknown.

EFramePortionWhole

An entire frame.

EFramePortionStartFragment

A fragment of a frame containing the start but not the end.

EFramePortionMidFragment

A fragment of a frame containing neither the start nor the end.

EFramePortionEndFragment

A fragment of a frame containing the end but not the start.


devvideoconstants.h Global variables

KDevVideoPanicCategory

const TLitC<sizeof(L"DevVideo")/2> KDevVideoPanicCategory={sizeof(L"DevVideo")/2-1,L"DevVideo"};

Description

DevVideo Panic Category

KUidDevVideoDecoderHwDevice

const TUid KUidDevVideoDecoderHwDevice={ 0x101fb4be };

Description

Video Decoder HW Device Plugin Interface UID

KUidDevVideoPostProcessorHwDevice

const TUid KUidDevVideoPostProcessorHwDevice={ 0x101fb4bf };

Description

Video Post Processor HW Device Plugin Interface UID

KUidDevVideoEncoderHwDevice

const TUid KUidDevVideoEncoderHwDevice={ 0x101fb4c0 };

Description

Video Encoder HW Device Plugin Interface UID

KUidDevVideoPreProcessorHwDevice

const TUid KUidDevVideoPreProcessorHwDevice={ 0x101fb4c1 };

Description

Video Pre Processor HW Device Plugin Interface UID

KUidDevVideoPlayHwDeviceExtensionScanCopy

const TUid KUidDevVideoPlayHwDeviceExtensionScanCopy={ 0x101FFA18 };

Description

MMMFVideoPlayHwDeviceExtensionScanCopy Custom Interface UID

KPictureRate5

const TReal KPictureRate5=5.0;

Description

Picture frame rate constants

Using these constants is recommended when the picture rate is known to match one of them, to ensure that floating point equality comparisons work as expected.

Note that the MSL video APIs currently only deal with non-interlaced frames. For interlaced video, all references to the term "picture" should be considered to refer to complete frames. As such, the term "picture rate" here refers to the frame rate for interlaced video.
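
For illustration (the helper below is hypothetical, not part of the API), a rate check written against the named constant uses exactly the same value on both sides of the floating point comparison, as the recommendation above suggests.

#include <devvideoconstants.h>

// Illustrative only: compare against the named constant rather than a
// locally written literal.
static TBool IsFivePicturesPerSecond(TReal aPictureRate)
    {
    return aPictureRate == KPictureRate5;
    }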