See ITU-R BT.2020
See ITU-R BT.601
See ITU-R BT.709
The Advanced Authoring Format was an industry initiative, launched in 1998, to create a file interchange standard for the easy sharing of media data and metadata among digital production tools and content creation applications, regardless of platform. It includes EBU/SMPTE metadata and management of pluggable effects and codecs. It allows open connections between equipment where video, audio and metadata, including information on how the content is composed, where it came from, etc., are transferred. It can fulfill the role of an all-embracing EDL or offer the basis for a media archive that any AAF-enabled system can use. Quantel products make extensive use of AAF.
In 2007 the AAF Association, Inc. changed its name to the Advanced Media Workflow Association (AMWA), with the tagline ‘Putting AAF and MXF to work’. Today AAF is a container (or wrapper) format, with a focus on file-based workflows including MXF and other formats. The association was involved with the MXF Mastering Format Project, which provided real-world solutions for key workflows, focusing on creating a single MXF master file from which multiple versions of a program may be created.
The Audio Engineering Society (AES) and the EBU (European Broadcasting Union) together have defined a standard for Digital Audio, now adopted by ANSI (American National Standards Institute). Commonly referred to as ‘AES/EBU’ and officially as AES3, this digital audio standard permits a variety of sampling frequencies, for example CDs at 44.1 kHz, or DATs and digital VTRs at 48 kHz. 48 kHz is widely used in broadcast TV production although 32-192 kHz are allowed. One cable and connector, usually an XLR, carries two channels of digital audio.
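The serial data rate follows directly from the AES3 frame structure: each sample period carries one 64-bit frame made up of two 32-bit subframes, one per audio channel. A quick sketch of the arithmetic (illustrative only; the line rate on the cable is doubled again by biphase-mark coding):

```python
def aes3_bit_rate(sampling_hz: int) -> int:
    """Serial data rate of an AES3 stream: one 64-bit frame
    (two 32-bit subframes, one per channel) per sample period."""
    BITS_PER_FRAME = 64
    return sampling_hz * BITS_PER_FRAME

# At the 48 kHz rate common in broadcast TV production:
print(aes3_bit_rate(48_000))   # 3072000 b/s, i.e. 3.072 Mb/s per AES3 pair
```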
See also: Word clock
The Advanced Media Workflow Association is an open, community-driven forum that creates specifications and technologies for networked media workflows. These cover both business aspects and the development of standards for better technical interoperability. It focuses on file-based workflows to help content creators and distributors in the film, television, advertising, Internet and post-production industries.
In 2014 AMWA, working with the UK’s Digital Production Partnership, created a standard specification for file delivery from post-production to the broadcaster. The resulting AS-11 Contribution Application Specification (AS-11) defines a constrained MXF file and video and audio codecs along with required metadata.
See also: DPP
American Standard Code for Information Interchange. This is a standard computer character set used throughout the industry to represent keyboard characters as digital information. The ASCII table contains 128 characters (codes 0–127), covering all the upper and lower case characters and non-displayed controls such as carriage return, line feed, etc. Variations and extensions of the basic code are used in special applications.
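The character-to-code mapping is easy to explore from any programming language; a quick illustration in Python:

```python
# Of the 128 ASCII codes, 33 are non-displayed controls (0-31 and 127)
# and 95 are printable characters (32-126).
printable = [chr(c) for c in range(32, 127)]
print(len(printable))        # 95 printable characters
print(ord('A'), ord('a'))    # 65 97: upper and lower case differ by 32
print(ord('\r'), ord('\n'))  # 13 10: carriage return and line feed
```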
The (US) Advanced Television Systems Committee. Established in 1982 to co-ordinate the development of voluntary national technical standards for the generation, distribution and reception of high definition television. In 1995 the ATSC published “The Digital Television Standard”, which describes the US Advanced Television System, referred to as ATSC A/53. This uses MPEG-2 compression for the video and AC-3 for the audio, and includes a range of video resolutions (as described in ATSC Table 3) and audio services (Table 2). It uses 8-VSB and 16-VSB modulation respectively for terrestrial and cable transmission.
ATSC M/H (mobile/handheld) allows digital TV to be received by mobile devices. As it uses 8-VSB modulation it is subject to the inherent problems of Doppler shift and multipath interference, so additional channel coding has been added.
ATSC 2.0 is a backward-compatible revision of the standard, enabling interactive and hybrid television by connecting the TV with the Internet and allowing interactive elements. Video compression uses the more efficient AVC. There is also audience measurement, enhanced programming guides, video-on-demand services, and the ability to store information on new receivers, including non-realtime content.
ATSC 3.0 is work in progress. Many ideas are being examined, including the use of OFDM modulation (as used in DVB) in place of VSB, HEVC video compression, and the ability to deliver HD, 4K and 3D, and probably SD too. Watch this space!
A family of two HD codecs from Panasonic that were designed to be compliant with H.264/MPEG-4 AVC, and use only intra-frame coding (GOP of 1), making the coded material easily editable at every frame. AVC-Intra was aimed at professional users and was adopted by Panasonic for its P2 cameras (AVC-Intra P2), offering considerably more efficient compression than the original DVCPRO HD codec – maybe by as much as 2:1. This was at a time when long GOP coding was being used in products including HDV and XDCAM HD. With increased coding efficiency some believed the use of long GOP coding in professional recorders would fade.
There are two classes: AVC-Intra 50 and AVC-Intra 100. The former produces a nominal 50 Mb/s for the 1920 x 1080 and 1280 x 720 formats using 4:2:0 10-bit sampling, with the frames horizontally reduced to 0.75 of the original line length. AVC-Intra 100 produces up to 100 Mb/s with 4:2:2 sampling for the same two frame formats, but without any size reduction. Both codecs offer a range of popular frame rates, and both are now included in Panasonic’s AVC-Ultra range.
Advanced Video Codec High Definition, a joint development between Panasonic and Sony, applies MPEG-4’s AVC video coding and Dolby Digital (AC-3) or linear PCM audio coding, to meet the needs of the high definition consumer market with 1080i and 720p formats. The use of AVC provides at least twice the efficiency of MPEG-2 coding, used in HDV and MiniDV, to offer longer recording times or better pictures – or both. Possible recording media include standard DVD disks, flash memory and hard drives.
Further developments have expanded the applications of AVCHD technology. The AVCHD Format Version 2.0 adds specifications for 3D and 1080/60P and 50P and supporting trademarks; AVCHD 3D, AVCHD Progressive and AVCHD 3D/Progressive.
Audio Video Interleave, a Microsoft multimedia container format introduced in 1992 as part of its Video for Windows technology. AVI files can hold audio and video data in a standard container and provide synchronous video/audio replay. Most AVI files also use the OpenDML file format extensions, forming AVI 2.0 files.
Some consider AVI outdated, as there are significant overheads when using it with popular MPEG-4 codecs that unduly increase file sizes. Despite that, it remains popular among file-sharing communities, probably due to its high compatibility with existing video editing and playback software, such as Windows Media Player.
The amount of information (data) that can be passed in a given time. In television a large bandwidth is needed to show sharp picture detail in realtime, and so is a factor in the quality of recorded and transmitted images. For example, ITU-R BT.601 and SMPTE RP 125 allow analog luminance bandwidth of 5.5 MHz and chrominance bandwidth of 2.75 MHz for standard definition video. 1080-line HD has a luminance bandwidth of 30 MHz (ITU-R BT.709).
Digital image systems generally require large bandwidths, which is why many storage and transmission systems resort to compression techniques to accommodate the signal.
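After digitization these bandwidths correspond to ITU-R BT.601’s 13.5 MHz luminance and 6.75 MHz color-difference sampling rates, so the uncompressed data rate is easy to estimate. A sketch of the arithmetic (blanking and ancillary data ignored):

```python
def bt601_data_rate(bit_depth: int = 8) -> float:
    """Uncompressed ITU-R BT.601 4:2:2 data rate in Mb/s.
    Luminance is sampled at 13.5 MHz and each of the two
    color-difference signals at 6.75 MHz (27 M samples/s in total)."""
    samples_per_sec = 13.5e6 + 2 * 6.75e6
    return samples_per_sec * bit_depth / 1e6

print(bt601_data_rate(8))    # 216.0 Mb/s with 8-bit sampling (270.0 with 10-bit)
```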
An analog component-video VTR system for PAL and NTSC television introduced in 1982, using a half-inch tape cassette very similar to the domestic Betamax. This was developed by Sony and was marketed by them and several other manufacturers. Betacam records the Y, R-Y and B-Y component signals onto tape; many machines were operated with coded (PAL or NTSC) video in and out. Initially developed for the industrial and professional markets the system was enhanced to offer models with full luminance bandwidth (Betacam SP 1986), PCM audio and SDI connections with a great appeal to the broadcast market.
Founded in 2005, three years after the introduction of the Blu-ray Disc system, the BDA is a voluntary membership group for those interested in creating, manufacturing, or promoting the BD formats and products, as well as those seeking more information about the format as it evolves.
The BDA aims to develop BD specifications, ensure products are correctly implemented, promote wide adoption of the formats and provide useful information to those interested in supporting those formats.
See also: Blu-ray Disc
This optical disk, designed for HD, can hold 25 GB on a single-layer CD-sized (12cm) disk using 405 nanometer blue-violet lasers. Dual layer disks hold up to 50 GB. Also available are triple layer (100 GB) and quadruple layer (128 GB) disks, which may accommodate 4K UHD video. The companies that established the basic specifications were: Hitachi Ltd., LG Electronics Inc., Matsushita Electric Industrial Co. Ltd., Pioneer Corporation, Royal Philips Electronics, Samsung Electronics Co. Ltd., Sharp Corporation, Sony Corporation, and Thomson Multimedia.
Players must be able to decode MPEG-2, H.264/AVC (MPEG-4 part 10) and SMPTE VC-1 coded material. MPEG-2 offers backward compatibility for DVDs while the other two more modern codecs are at least 50 percent more efficient, using less disk space or producing higher quality results. Audio codecs supported are Linear PCM, Dolby Digital, Dolby Digital Plus, Dolby TrueHD, DTS Digital Surround, DTS-HD.
The baseline data rate is 36 Mb/s – giving over one-and-a-half hours recording of HD material on a single layer, or about 13 hours of SD. For Blu-ray Disc movies (BD-ROM) the maximum transfer rate is 54 Mb/s for audio and video, with a maximum of 40 Mb/s for video. Random access allows easy video editing and simultaneous record and playback.
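The quoted recording times can be checked with simple arithmetic, assuming decimal gigabytes and a constant stream rate:

```python
def recording_hours(capacity_gb: float, rate_mbps: float) -> float:
    """Approximate recording time: disk capacity (decimal GB)
    divided by a constant stream rate in Mb/s."""
    bits = capacity_gb * 1e9 * 8
    return bits / (rate_mbps * 1e6) / 3600

# Single-layer disk (25 GB) at the 36 Mb/s baseline rate:
print(round(recording_hours(25, 36), 2))  # 1.54 hours - 'over one-and-a-half'
```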
Ultra HD Blu-ray is the 4K Blu-ray format, expected for delivery Christmas 2015. Handling 4K UHD video at up to 60 f/s, the specification includes HEVC (H.265) video compression, a wider color gamut (than HD) as well as High Dynamic Range (HDR) and 10-bit video sampling. Disc capacities are set at 66 GB (dual layer) and 100 GB (triple layer). The system will also be able to play legacy standards including Blu-ray, DVD and CD. The final specification is expected in mid-2015.
Short-range, up to 100m, wireless data connection in a Personal Area Network (PAN). Bluetooth is used in products such as phones, printers, modems and headsets and is acceptable where two or more devices are in proximity to each other and not needing high bandwidth (3 Mb/s max.). It is easy to set up without configuration, as Bluetooth devices advertise all the services they provide, making those services easily accessible without network addresses, permissions and all the other considerations that go with typical networks.
The Comité Consultatif International des Radiocommunications is history: it has been absorbed into the ITU, where its work continues under the ITU-R prefix.
See also: ITU
International Telegraph and Telephone Consultative Committee. As the name suggests, this was initially set up to establish standards for the telephone industry in Europe. It has now been superseded by ITU-T, putting both radio frequency matters (ITU-R) and telecommunications under one overall United Nations body.
See also: ITU
International Commission on Illumination (Commission Internationale de l’Eclairage) is devoted to international cooperation and exchange of information among its member countries on all matters relating to the science and art of lighting. It is a technical, scientific and cultural, non-profit autonomous organization that has grown out of the interests of individuals working in illumination – lighting. It is recognized by ISO as an international standardization body.
See also: X´Y´Z´
Color science has been an area of scientific research for over 100 years. Brought together in 1931 by the CIE (Commission Internationale de l’Eclairage), this sphere of science studies all aspects of human perception of color and brightness. Early uses included studying how dyes could be mixed, the problems of color printing, and the effects of different viewing conditions on perception. A large amount of research and many books have been published on the subject.
The definitions for all our television and cinema viewing standards are rooted in color science. The numeric definition of R, G and B and the conversions to Y, Cr, Cb are examples of its practical use. In today’s multimedia world, media transportability and the maintenance of creative intent would not be possible without color science defining the solid core of math that supports the industry.
See also: Color Space
The ITU has defined common image formats. A standard definition image of 352 x 240 pixels is described for computers. For HDTV production the HD-CIF preferred format is defined in ITU-R BT.709-4 as 1920 x 1080 pixels, 16:9 aspect ratio with progressive frame rates of 24, 25 and 30 Hz (including segmented scan) and interlace field rates of 50 and 60 Hz. This has helped to secure the 1920 x 1080 format as the basis for international program exchange.
See also: ITU-R BT.709
A digital video tape recording format working to the ITU-R BT.601, 4:2:2 standard using 8-bit sampling. The tape is 19 mm wide and allows up to 94 minutes to be recorded on a cassette.
Introduced in 1986, Sony’s D1 VTR set a benchmark as it was the first uncompressed component digital tape format. It offered very high quality, only small degradation over many re-record generations and, with its high chrominance bandwidth, allowed excellent chroma keying in post production. Despite the advantages, D1 use was limited by high cost and is rarely found today. However the term ‘D1’ is still occasionally used to imply uncompressed component digital recording – ‘D1’ quality.
This refers to Sony’s MPEG IMX VTRs that record I-frame-only, 4:2:2-sampled MPEG-2 SD video at 50 Mb/s onto half-inch tape. In bit rate, IMX sits between Betacam SX and Digital Betacam. A Gigabit Ethernet card is available, which has caused some to dub it the eVTR, as it can be considered more as a ‘storage medium’ for digital operations.
The HDCAM VTR format has been assigned D11.
This is assigned to DVCPRO HD.
A VTR standard for digital composite (coded) PAL or NTSC signals. It uses 19 mm tape and records up to 208 minutes on a single cassette. Neither cassettes nor recording formats are compatible with D1. Being relatively costly and not offering the advantages of component operation the format has fallen from favor. VTRs have not been manufactured for many years.
A VTR standard using half-inch tape cassettes for recording digitized composite (coded) PAL or NTSC signals sampled at 8 bits. Cassettes record 50 to 245 minutes. Since this uses a composite PAL or NTSC signal, the characteristics are generally as for D2 except that the half-inch cassette size allowed a full family of VTR equipment to be realized in one format, including a camcorder. D3 is rarely used today.
There is no D4. Most DVTR formats hail from Japan where 4 is regarded as an unlucky number.
A VTR format, introduced in 1994 by Panasonic, that uses the same cassette as D3 but recording uncompressed component signals sampled to ITU-R BT.601 recommendations at 10-bit resolution. With internal decoding, D5 VTRs can play back D3 tapes and provide component outputs.
D5 offers all the performance benefits of D1, making it suitable for high-end post production as well as more general studio use. Besides servicing the 625- and 525-line TV standards, the format extends to HDTV recording by use of about 4:1 compression (HD-D5).
A little used digital tape format which uses a 19mm helical-scan cassette tape to record non-compressed HDTV material. The Thomson VooDoo Media Recorder is the only VTR based on D6 technology. The format has passed into history.
This is assigned to DVCPRO.
This is assigned to Digital-S.
SMPTE Task Force On Digital Cinema, intended to aid digital cinema development by determining standards for picture formats, audio standards and compression, etc.
Digital Cinema Initiatives, LLC was formed in 2002 with members including Disney, Fox, MGM, Paramount, Sony Pictures Entertainment, Universal and Warner Bros. Studios. Its purpose was to establish and document specifications for an open architecture for Digital Cinema components that ensures a uniform and high level of technical performance, reliability and quality control. It published the Digital Cinema System Specification in July 2005 (freely available at their website) and established a set of technical specifications that allowed the industry to roll-out Digital Cinema. It is a measure of the DCI’s success that now well over half of the world’s cinemas are digital.
There are three levels of images, all with a 1:1 pixel aspect ratio and 12-bit 4:4:4 sampling in X´Y´Z´ color space.
|Level|Picture Size|Aspect Ratio|Frame Rate|
|1|4096 x 2160|1.90:1|24 Hz|
|2|2048 x 1080|1.90:1|48 Hz|
|3|2048 x 1080|1.90:1|24 Hz|
The specification includes requirements for JPEG 2000 image compression, X´Y´Z´ color space and a maximum playout bit rate of 250 Mb/s. To prevent piracy by copying the media files there is AES 128 encryption (Advanced Encryption Standard able to use keys of 128, 192, and 256 bits to encrypt and decrypt data in blocks of 128 bits). There is also forensic marking to deter and trace the bootlegger’s camcorder pointed at the screen. Such schemes include Philips’ forensic watermarking or Thomson’s NexGuard watermarking.
DSM → DCDM → DCP → DCDM* → Image and Sound
DCI describes a workflow from the output of the feature post production or DI, termed the Digital Source Master (DSM), to the screen. The Digital Cinema Distribution Master (DCDM) is derived from the DSM by a digital cinema post production process, and played directly into a digital cinema projector and audio system for evaluation and approval.
The approved DCDM is then compressed, encrypted and packaged for distribution as the Digital Cinema Package (DCP). At the theater, it is unpackaged, decrypted and decompressed to create a DCDM* with images visually indistinguishable from those of the original DCDM.
Digital Leader and the Digital PROjection VErifier (DPROVE) are two products that are based on SMPTE RP 428-6-2009. The Digital Leader is aimed at digital movie post production and cinemas. In post it can be added as a leader and/or footer (end) of Digital Cinema Distribution Master (DCDM) ‘reels’ so allowing a quick quality check.
DPROVE is a set of Digital Cinema Packages (DCPs) that help check projector performance and alignment, as well as the sound’s synchronization with the pictures.
See also: DCI
The Digital Production Partnership was formed by leading UK public service broadcasters to help the television industry maximize the opportunities, and be aware of the challenges, of digital television production. It works in two areas: technology, and shared thinking, information and best practice.
In 2011 it created common technical standards for the delivery of SD and HD video to the major broadcasters. In 2012 a common format, structure and wrapper for the delivery of programs by digital file, including metadata, was agreed. The DPP file-based delivery standard became the UK’s standard in late 2014. For this the DPP turned to AMWA to create a subset of its AS-11 specification: files that can be edited for breaks, re-timed, and have additional language tracks, captions and subtitles wrapped into the file container.
See JPEG 2000
Digital Living Network Alliance is a nonprofit organization founded by Sony in 2003 that aims to deliver an interoperability framework of design guidelines based on open industry standards to complete cross-industry digital convergence.
The resulting ‘digital home’ should then be a network of consumer electronic, mobile and PC devices that transparently co-operate to deliver simple, seamless interoperability that enhances and enriches users’ experience.
Digital Video Broadcasting, the group, with over 200 members in 25 countries, which developed the preferred scheme for digital broadcasting in Europe. Initially the DVB Group put together a portfolio of broadcast standards; the major ones including a satellite system, DVB-S, and now the more efficient DVB-S2, a matching cable system, DVB-C (and now DVB-C2), and a digital terrestrial system, DVB-T (and now DVB-T2). DVB-H is a newer broadcast standard designed for terrestrial operation with hand-held devices, typically mobile TVs, phones and tablets where power must be conserved.
DVB-S (1995) is the original DVB forward error coding and modulation standard for satellite television. DVB-S is used for both broadcast network feeds and for direct broadcast satellite services.
DVB-S2 (2003) is used for all new European digital satellite multiplexes, and satellite receivers will be equipped to decode both DVB-S and DVB-S2. Currently its main use is to distribute HDTV. DVB-S2 is based on DVB-S and adds two key features: Variable Coding and Modulation (VCM), which allows encoding parameters to be changed in realtime, and Adaptive Coding and Modulation (ACM), which optimizes the transmission parameters for individual users, for a claimed net performance gain of 30 percent (i.e. more data transmitted for more channels).
DVB-T is a transmission scheme for digital terrestrial television (DTT). Its specification was approved by ETSI in February 1997 and DVB-T services started in the UK in autumn 1998.
As with the other DVB standards, MPEG-2 sound and vision coding are used. DVB-T uses Coded Orthogonal Frequency Division Multiplexing (COFDM) modulation, which enables effective operation in very strong multipath environments (the cause of picture ‘ghosting’ in analog TV reception). This means it can operate an overlapping network of transmitting stations using the same frequency: in the areas of overlap, the weaker received signals are rejected, and where transmitters carry the same programming the overlapping signals provide more reliable reception. This is known as a single-frequency network (SFN).
DVB-T2 (2009). The DVB TM-T2 technical group worked on a more advanced DTT standard focusing on modulation, channel coding and signal layout. The resulting DVB-T2 offers a 50 percent increase in payload capacity under similar reception circumstances. Its error correction coding, shared with DVB-S2 and DVB-C2, combines LDPC (Low Density Parity Check) coding with BCH (Bose-Chaudhuri-Hocquenghem) coding, offering a very robust signal. Along with other changes it is more flexible, supporting SD, HD, UHD, mobile TV, radio, or any combination thereof.
DVB-C (1994) for digital transmission via cable transmits an MPEG-2 or MPEG-4 family digital audio/digital video stream, using a QAM modulation with channel coding.
DVB-C2 (2010) almost doubles the payload, relieving the many cable networks that were running near capacity.
The DVB digital TV standards are used around the world with notable exceptions being ATSC in the USA and Canada, ISDB in Japan, DMB-T/H (Digital Multimedia Broadcast-Terrestrial/ Handheld) in China, and T-DMB in South Korea.
There are several additional DVB transmission standards that can be found on the DVB website. These include DVB-RCS2, which provides an air interface specification for low-cost two-way satellite broadband VSAT (very small aperture terminal) systems, offering dynamic, demand-assigned transmission capacity for a wide range of users. It provides a broadband Internet connection with no need for local terrestrial infrastructure. Data speeds of several tens of Mb/s down to terminals, and 10 Mb/s or more on the return link, can be achieved.
DVB-CPCM, DVB Content Protection and Copy Management, is a digital rights management standard under development. It is intended as a practical rights management system primarily for European digital television, though other countries may adopt it.
CPCM allows adding information to digital content, such as TV programs, that shows how content may be used by other CPCM-enabled devices. Content providers can store flags with the content to indicate how it may be used. All CPCM-enabled devices should obey these flags, allowing or denying its movement, copying to other CPCM devices, controlling use on other equipment, and observing time limits.
The full technical specification of DVB-CPCM is available for free downloading at the DVB website.
The European Telecommunications Standards Institute. Its mission is to produce lasting telecommunications standards for Europe and beyond. ETSI has 655 members from 59 countries inside and outside Europe, and represents administrations, network operators, manufacturers, service providers, research bodies and users.
An integrated set of standards developed by ANSI, designed to improve data speeds between workstations, supercomputers, storage devices and displays while providing a single standard for networking, storage and data transfer. It can be used point-to-point, switched, or in an arbitrated loop (FC-AL) connecting up to 126 devices.
Planned in 1997 to run on a fiber-optic or twisted-pair cable at an initial data rate of 1 Gb/s, it has been consistently upgraded to make 2, 4, 8 and 14 Gb/s (14GFC) available. Expect both 28 and 4 x 28 Gb/s in 2015. There is a road map sketched to 2028 with the possibility of about an 8-fold further increase in speed. These are nominal wire speeds, but 8b/10b encoding is used to improve transmission characteristics, provide more accuracy and better error handling. With every 8-bit data byte for transmission converted into a 10-bit Transmission Character, the useful data rate is reduced by 20 percent.
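The 8b/10b overhead is simple to quantify; a sketch:

```python
def usable_rate(wire_rate_gbps: float) -> float:
    """Payload rate after 8b/10b line coding: every 8-bit byte
    travels as a 10-bit Transmission Character, so 20 percent
    of the wire rate is coding overhead."""
    return wire_rate_gbps * 8 / 10

print(usable_rate(1.0))  # 0.8 Gb/s of user data on a nominal 1 Gb/s link
```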
Because of its close association with disk drives, its TV application is mostly, but not always, in the creation of storage networking. It can interface with the SCSI disk interface, which is key to its operation in storage networking such as SAN.
See also: SAN
Framework for Interoperable Media Services aims to define standards which enable building media systems using a Service Oriented Architecture (SOA), to provide flexibility, efficiency and scalability that have not been possible with traditional architectures. FIMS is a task force jointly managed by AMWA and EBU. It has over 100 participants, mostly broadcast industry manufacturers.
FIMS recognizes that the television business has radically changed. It aims to replace the rigid, hard-wired program production chains of today with a new environment where each process is considered to be a “service” which can be used when needed and, when released, is available for others. It has set out to provide a solution that is flexible, cost-effective, reliable, expandable and future-proof, enables best-of-breed products to be employed, and allows integration with media business systems. Potentially this could offer a very efficient and comprehensive service.
See IEEE 1394
A high-speed digital subscriber line (DSL) standard for short local loops (the connection between the customer’s premises and the telecoms network) that is expected to deliver data rates from 150 Mb/s up to 1 Gb/s over copper, and is slated as matching fiber at distances up to 400 metres. This means that many consumers can receive fast internet without fiber being run to the house. The protocol is defined in Recommendation ITU-T G.9701. Such data rates are ample to support live 4K and even 8K UHD streaming to viewers’ homes. The first consumer installations of this technology are expected in late 2015.
Hybrid Broadband Broadcast TV (HbbTV) is a European initiative to provide both broadcast and broadband/web content on viewers’ screens. It combines linear (normal) channels with internet content, providing interactivity and the ability to deliver service packages to all relevant devices.
Differences in viewing between computer and TV do not help: viewing distance (lean forward for a PC, lean back for a TV), the lack of a mouse and keyboard on TVs, different color conventions (black text on white on a PC, the reverse on TV) and the limited computing power of TVs.
See also: Second screen
Short for HDTV.
High Definition Television. A television format with higher definition than SDTV. While DTV at 625 (576) or 525 (480) lines is usually superior to analog PAL and NTSC, it is generally accepted that 720-line and upward is HD. This also has a picture aspect ratio of 16:9.
While there are many picture HDTV formats there is a consensus that 1920 x 1080 is a practical standard for global exchange of television material; a common image format. Many productions are made in this format.
See IEEE 1394
Developed by Apple and produced since 1994, IEEE 1394 is a standard for a peer-to-peer serial digital interface which can operate at 400 Mb/s (1394a) up to 3200 Mb/s (1394b), typically over shielded twisted-pair cable up to 4.5 m, and 100 m on optical fiber.
Practically it can send A/V media over 100 m of Cat 5 cable at 100 Mb/s, while consumers connect DV devices over longer distances using readily available low-cost cables. IEEE 1394c raises the data rate to 800 Mb/s over Cat 5 cable and combines 1394 and GigE on one cable.
The high speed and low cost of IEEE 1394a makes it popular in multimedia and digital video applications. Uses include peer-to-peer connections for digital dub editing between camcorders, as well as interfacing video recorders, printers, PCs, TVs and digital cameras.
IEEE 1394 is recognized by SMPTE and EBU as a networking technology for transport of packetized video and audio. Its isochronous data channel can provide guaranteed bandwidth for frame-accurate realtime (and faster) transfers of video and audio while its asynchronous mode can carry metadata and support I/P. Both modes may be run simultaneously.
IEEE 1394 is known as FireWire by Apple, i.LINK by Sony and Lynx by Texas Instruments. Future developments of FireWire are expected to increase data speed to 6.4 Gb/s.
This describes a Precision Time Protocol (PTP) that enables distributed clocks to be synchronized to within 1 microsecond over Ethernet networks, with relatively low demands on local clocks, the network and computing capacity. There are many applications, for example in automation, where it can synchronize the elements of a production line (without timing belts).
PTP runs on IP networks, transferring precision time to slave devices via a 1 GHz virtual clock (timebase). Independent masters can be locked to one master clock, creating wide, or even global locking. SMPTE has been assessing the possibilities of using PTP as a synchronizing source for television applications.
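The core of PTP is a four-timestamp exchange between master and slave; assuming a symmetric network path, the slave’s clock offset and the path delay can both be recovered. A minimal sketch of the arithmetic:

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Classic two-step PTP exchange:
    t1 = master sends Sync, t2 = slave receives it,
    t3 = slave sends Delay_Req, t4 = master receives it.
    With a symmetric path, the four timestamps yield both
    the slave's clock offset and the one-way path delay."""
    offset = ((t2 - t1) - (t4 - t3)) / 2
    delay = ((t2 - t1) + (t4 - t3)) / 2
    return offset, delay

# A slave clock running 15 us ahead over a 5 us path (times in us):
offset, delay = ptp_offset_and_delay(100, 120, 130, 120)
print(offset, delay)  # 15.0 5.0
```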
Standard that defines wired Ethernet.
The Interoperable Master Format has been developed by the Entertainment Technology Center at the University of Southern California. It is intended as a ‘grand master’ format, from which all kinds of deliverables can be created. An IMF may contain the images, audio, subtitles and captioning, technical metadata and the playlist for the content.
The IPDC (Internet Protocol Data Cast) Forum was launched in 2002 to promote and explore the capabilities of IP-based services over digital broadcast platforms (DVB and DAB). Participating companies include service providers, technology providers, terminal manufacturers and network operators. The Forum aims to address business, interoperability and regulatory issues and encourage pilot projects.
See also: IP over DVB
Integrated Services Digital Broadcasting. Standard for digital broadcasting used in Japan. ISDB has many similarities to DVB including OFDM modulation for transmission and the flexibility to trade signal robustness against delivered data rate. ISDB-T (terrestrial) is applicable to all channel bandwidth systems used worldwide: 6, 7, and 8 MHz. The transmitted signal comprises OFDM blocks (segments) allowing flexible services where the transmission parameters, including modulation and error correction, can be set segment-by-segment for each OFDM segment group of up to three hierarchical layers in a channel. Within one channel, the hierarchical system allows both robust SD reception for mobile and portable use and less robust HD; a form of graceful degradation.
Intersociety Digital Cinema Forum
The Internet Streaming Media Alliance is a coalition of industry leaders dedicated to the adoption and deployment of open standards for streaming rich media such as video, audio, and associated data, over Internet protocols.
International Organization for Standardization. An international body that develops and publishes international standards, including those for networking protocols, compression systems, disks, etc.
International Telecommunication Union. The United Nations regulatory body covering all forms of communication. The ITU sets mandatory standards and regulates the radio frequency spectrum. ITU-R (previously CCIR) deals with radio spectrum management and regulation, while ITU-T (previously CCITT) deals with telecommunications standardization.
Suffix ‘BT.’ denotes Broadcasting Television.
ITU-T SG 9 (integrated broadband cable networks and television and sound transmission), at its meeting of October/November 2007, gave consent for a draft new recommendation on IPTV to go through the Alternate Approval Process (AAP). The draft recommendation J.700, titled “IPTV Service Requirements and Framework for Secondary Distribution”, is now at the Last Call Judgment (LJ) stage. It describes the service requirements and functional framework architecture for the support of IPTV services, covering requirements for network elements and customer premises equipment (CPE). It also leverages existing deployed technologies to provide a smooth path for operators to integrate IPTV into their networks.
This defines the parameters of UHDTV (Ultra High Definition Television), including display resolution, frame rate, chroma sub-sampling, bit depth, color space and audio system. The image sizes are 4K (3840 x 2160) and 8K (7680 x 4320), with frame rates 23.976, 24, 25, 29.97, 30, 50, 59.94, 60 and 120 Hz. All scans are progressive. The system offers a wider dynamic range with the images’ colorimetry including a wider gamut than HDTV, which is already wider than SD. Sampling may be 10 or 12-bit and 4:4:4, 4:2:2 or 4:2:0 to suit the application.
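The scale of these formats is easy to underestimate. A small Python sketch of the uncompressed active-picture data rates (active pixels only; no blanking or audio included):

```python
# Uncompressed data rate for a BT.2020 image format.
# Samples per pixel: 3 for 4:4:4, 2 for 4:2:2, 1.5 for 4:2:0.

def uncompressed_gbps(width, height, fps, bit_depth, samples_per_pixel):
    return width * height * fps * bit_depth * samples_per_pixel / 1e9

rate_4k = uncompressed_gbps(3840, 2160, 50, 10, 2)    # ~8.3 Gb/s
rate_8k = uncompressed_gbps(7680, 4320, 120, 12, 3)   # ~143 Gb/s
```

The 8K figure makes clear why efficient compression such as HEVC is essential for UHDTV distribution.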
This standard defines the digital encoding parameters of SD television for studios. It is the international standard for digitizing component television video in both 525 and 625 line systems and is derived from SMPTE RP125. ITU-R BT.601 deals with both color difference (Y, R-Y, B-Y) and RGB component video and defines sampling systems, RGB/Y, R-Y, B-Y matrix values and filter characteristics. It does not actually define the electro-mechanical interface; see ITU-R BT. 656.
ITU-R BT.601 is normally taken to refer to color difference component digital video (rather than RGB), for which it defines 4:2:2 sampling at 13.5 MHz with 720 luminance samples per active line (the ‘4’ of 4:2:2). The color difference signals R-Y and B-Y are sampled at 6.75 MHz with 360 samples per active line each (the ‘2’s). Sample depth may be 8 or 10 bits.
Some headroom is allowed so, with 10-bit sampling, black level is at 64 (not 0) and white at level 940 (not 1023) – to minimize clipping of noise and overshoots. With 2^10 levels each for Y (luminance), Cr and Cb (the digitized color difference signals) – 2^30 in all – over a billion unique colors can be defined.
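As a small Python illustration of this coding range (the function name is ours, not from the standard):

```python
def y_code_10bit(level):
    """Map normalized luma (0.0 = black, 1.0 = white) to a 10-bit code.
    Black sits at code 64 and white at 940, leaving headroom below
    black and above white for noise and overshoots."""
    return round(64 + 876 * level)

# y_code_10bit(0.0) -> 64, y_code_10bit(1.0) -> 940
black, white = y_code_10bit(0.0), y_code_10bit(1.0)
```

Signals that stray slightly below black or above white still land on valid codes instead of being clipped.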
The sampling frequency of 13.5 MHz was chosen to provide a politically acceptable common sampling standard between 525/59.94 and 625/50 systems, being a multiple of 2.25 MHz, the lowest common frequency to provide a static sampling pattern for both.
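The arithmetic behind that choice can be checked directly; a Python sketch:

```python
# 13.5 MHz is a whole multiple of the line frequency of both systems,
# so each delivers a fixed, whole number of samples per total line
# and the sampling pattern is static (orthogonal) in both.
f_line_625 = 15625.0          # 625 lines x 25 Hz
f_line_525 = 4.5e6 / 286      # ~15734.266 Hz (525/59.94 system)

samples_625 = 13.5e6 / f_line_625          # 864 samples per total line
samples_525 = round(13.5e6 / f_line_525)   # 858 samples per total line

# 2.25 MHz is the lowest frequency that divides both line rates evenly:
assert 2.25e6 / f_line_625 == 144
assert round(2.25e6 / f_line_525) == 143
```

Multiplying 2.25 MHz by six gives the 13.5 MHz luminance rate, and by three the 6.75 MHz color difference rate of 4:2:2 sampling.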
The international standard for interconnecting digital television equipment operating to the 4:2:2 standard defined in ITU-R BT.601. It defines blanking, embedded sync words, the video multiplexing formats used by both the parallel (now rare) and serial interfaces (SDI), the electrical characteristics of the interface and the mechanical details of the connectors.
In 2000, ITU-R BT.709-4 recommended the 1080 active-line high definition television standard for 50 and 60 Hz interlace scanning with sampling at 4:2:2 and 4:4:4. Actual sampling rates are 74.25 MHz for luminance Y, or R, G, B and 37.125 MHz for color difference Cb and Cr, all at 8 bits or 10 bits, and these should be used for all new productions. It also defines these 1080-line square-pixel standards as common image formats (CIF) for international exchange.
The original ITU-R BT.709 recommendation was for 1125/60 and 1250/50 (1035 and 1152 active lines) HDTV formats defining values and a ‘4:2:2’ and ‘4:4:4’ sampling structure that is 5.5 times that of ITU-R BT.601. Note that this is an ‘expanded’ form of 601 and so uses non-square pixels.
See also: Common Image Format
See Dual link
This is another image compression system from the Joint Photographic Experts Group (ISO/ITU-T). JPEG 2000 is very different from the original JPEG; whereas JPEG is DCT-based and examines images in a series of 8 x 8 pixel blocks, JPEG 2000 is wavelet-based using Discrete Wavelet Transform (DWT), to analyze the detail of pictures in a different way. Both coding and decoding require far more processing than JPEG, MPEG-2 or MPEG-4. Also JPEG 2000 is intra-frame only; there are no predictive frames (as in MPEG). Whereas MPEG tends to show macro blocks as it starts to fail, and the original JPEG shows ‘mosquito wings’ or ringing effects, JPEG 2000 can switch to lower data rates that can cause a softening of picture areas, which is far less noticeable. There are two file-name extensions; .JP2 is for ISO/IEC 15444-1 files and .JPX for ISO/IEC 15444-2 files.
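A one-level Haar transform on a 1-D signal gives the flavor of the wavelet principle, although JPEG 2000 itself uses 5/3 and 9/7 wavelet filters over 2-D image tiles; a minimal Python sketch:

```python
# One level of a Haar wavelet transform: split a signal into low-pass
# averages and high-pass details. (Illustration of the DWT principle
# only; not the filters used by JPEG 2000.)

def haar_1d(signal):
    pairs = list(zip(signal[0::2], signal[1::2]))
    averages = [(a + b) / 2 for a, b in pairs]
    details = [(a - b) / 2 for a, b in pairs]
    return averages, details

lo, hi = haar_1d([10, 12, 8, 8, 20, 22, 5, 5])
# lo = [11.0, 8.0, 21.0, 5.0]; hi = [-1.0, 0.0, -1.0, 0.0]
```

Smooth picture areas produce near-zero detail coefficients, which code very compactly; repeating the transform on the averages builds the multi-resolution analysis wavelets are known for.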
JPEG 2000 is about twice as efficient as the equivalent I-only MPEG-2, and excels at high bit rates. It is used at up to 250Mb/s for DCI Digital Cinema applications, usually showing 24 pictures per second in 2K and 4K formats. It lends itself to a wide range of uses from portable digital cameras through to advanced pre-press and television acquisition – as well as Digital Cinema. Some favor it for use in TV distribution. The company intoPix, a specialist in JPEG 2000 technology, offers a video-over-IP solution using JPEG 2000 for HD and 4K UHD via 1Gb/s media networks with 10ms of latency. Its further technology developments are aimed at expanding the use of JPEG 2000 in TV.
Joint Photographic Experts Group (ISO/ITU-T). It has defined many types of image compression. JPEG is a DCT-based data compression standard for individual pictures (intra-frame). It offers compression of between two and 100 times and has three levels of processing which are defined as: baseline, extended and lossless encoding.
JPEG baseline compression coding, which is overwhelmingly the most common in both the broadcast and computer environments, starts with applying DCT to 8 x 8 pixel blocks of the picture, transforming them into frequency and amplitude data. This itself may not reduce data but then the generally less visible high frequencies can be divided by a high ‘quantizing’ factor (reducing many to zero), and the more visible low frequencies by a much lower factor. The ‘quantizing’ factor can be set according to data size (for constant bit rate) or picture quality (constant quality) requirements – effectively adjusting the compression ratio. The final stage is Huffman coding which is lossless but can further reduce data by 2:1 or more.
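The quantizing step can be sketched in a few lines of Python. The coefficient values and table below are illustrative only, not those of the JPEG standard (which works on full 8 x 8 blocks):

```python
def quantize(coeffs, q_table):
    """Divide each DCT coefficient by its quantizing factor and round."""
    return [[round(c / q) for c, q in zip(c_row, q_row)]
            for c_row, q_row in zip(coeffs, q_table)]

# 4 x 4 toy block: large low-frequency terms (top-left), small
# high-frequency terms; the quantizing table is coarser bottom-right.
coeffs = [[240, 30, 4, 1],
          [ 25, 10, 2, 0],
          [  3,  2, 1, 0],
          [  1,  0, 0, 0]]
q     = [[ 16, 16, 32, 64],
         [ 16, 24, 40, 80],
         [ 32, 40, 64, 96],
         [ 64, 80, 96, 99]]
quantized = quantize(coeffs, q)
# Most high-frequency terms become zero, ready for run-length and
# Huffman coding; scaling the q table adjusts the compression ratio.
```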
Baseline JPEG coding creates .jpg files and it is very similar to the I-frames of MPEG, the main difference being they use slightly dissimilar Huffman tables.
See also: Motion JPEG
KLV is a data encoding protocol (SMPTE 336M). The Key is a unique, registered sequence of bits that defines the type of content that is coming (video, audio, EDL, etc) and Length – number of bytes ahead of Value, the content ‘payload’ itself. Compliance to KLV means that a wider range of equipment and applications can understand each others’ files.
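The Key-Length-Value layout is simple to walk through in code. A minimal Python sketch of a parser, assuming a 16-byte key and BER-encoded lengths (the packet bytes below are made up for illustration):

```python
def parse_klv(data):
    """Return (key, value, remaining bytes) for the first KLV triplet."""
    key, rest = data[:16], data[16:]        # SMPTE UL keys are 16 bytes
    first = rest[0]
    if first < 0x80:                        # short-form BER: length in one byte
        length, rest = first, rest[1:]
    else:                                   # long-form: next (first & 0x7F) bytes
        n = first & 0x7F
        length = int.from_bytes(rest[1:1 + n], "big")
        rest = rest[1 + n:]
    return key, rest[:length], rest[length:]

packet = bytes(16) + bytes([0x03]) + b"abc" + b"..."
key, value, remainder = parse_klv(packet)   # value == b"abc"
```

A reader that does not recognize a key can still use the length to skip cleanly to the next triplet, which is what makes KLV streams extensible across equipment from different vendors.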
Multichannel Audio Digital Interface, widely used among audio professionals, defines the data format and electrical characteristics of an interface carrying multiple digital audio channels, as in the Audio Engineering Society’s AES10-2008. It is popular for its large channel capacity: 28, 56, or 64 channels at up to 96 kHz, 24 bits per channel, and up to 3000m connections over optical fiber (or 100m over coax).
MHEG is the Multimedia and Hypermedia Experts Group. MHEG-5 is an open standard for TV middleware – or application program interface (API) – that enables broadcasters to offer interactive / hybrid services with a wide audience appeal, as well as video. Its TV variant, the MHEG Interaction Channel (MHEG-IC), is used by the UK’s Freeview and Freesat and is also specified by Freeview NZ and Freeview Australia.
Multimedia Home Platform – DVB-MHP is open middleware from the DVB project for interactive television. It enables the reception and execution of interactive, Java-based applications on a TV set that can be delivered over a broadcast channel, together with the audio and video streams. The applications can provide information services such as games, interactive voting, e-mail, SMS and shopping. Some may require using an IP return channel.
Early deployments included DVB-T in Italy, DVB-S in Korea and Poland and DVB-C in Belgium. There have also been trials in other countries.
This is where broadcasters and mobile (cell) telcos come together to provide consumers with access to video content on their mobile phones and tablet computers. This includes downloads to flash memory, 3G and 4G streaming and mobile on-demand broadcast TV. The landscape is complex as there are many competing formats including DVB-H, DVB-SH, MediaFLO, ISDB-T, S-DMB/T-DMB in different regions and backed by different hardware manufacturers, technology suppliers, content providers and mobile operators. Also there are any number of screen resolutions and aspect ratios to be catered for. China is adding its homegrown China Multimedia Mobile Broadcasting (CMMB). In Europe, the European Union has decided to support the DVB-H standard for mobile TV. DVB-H uses a separate broadcast network, rather than a phone network, to send TV content to phones or mobile devices.
Media Object Server (protocol) – a communications protocol for newsroom computer systems (NCS) and broadcast production equipment. It is a collaborative effort between many companies to enable journalists to see, use, and control a variety of devices from their desktop computers, effectively allowing access to all work from one screen. Such devices include video and audio servers and editors, still stores, character generators and special effects machines.
MOS uses a TCP/IP-based protocol and is designed to allow integration of production equipment from multiple vendors with newsroom computers via LANs, WANs and the Internet. It uses a ‘one-to-many’ connection strategy – multiple MOSs can be connected to a single NCS, or a single MOS to many NCSs.
A high-performance, perceptual audio compression coding scheme which exploits the properties of the human ear and brain while trying to maintain perceived sound quality. MPEG-1 and 2 define a family of three audio coding systems of increasing complexity and performance – Layer-1, Layer-2 and Layer-3. MP3 is shorthand for Layer-3 coding. MPEG defines the bitstream and the decoder but, to allow for future improvements, not an encoder. MP3 is claimed to achieve ‘CD quality’ at 112-128 kb/s – a compression ratio of between 10:1 and 12:1. Not all listeners agree with that.
See also: Auditory masking
A compression scheme designed to work at 1.2 Mb/s, the basic data rate of CD-ROMs, so that video could be played from CDs. Its quality is not up to modern standards and it is not much used.
ISO/IEC 13818. A family of inter- and intra-frame compression systems designed to cover a wide range of requirements from ‘VHS quality’ all the way to HDTV through a series of compression algorithm ‘profiles’ and image resolution ‘levels’. With data rates from below 4 to 100 Mb/s, this family includes the compression system that currently delivers digital TV to homes and that puts SD video onto DVDs as well as putting HD onto 6.35mm videotape for HDV.
In all cases MPEG-2 coding starts with analyzing 8×8-pixel DCT blocks and applying quantizing to achieve intra-frame compression that is very similar to JPEG. This compression is referred to as I-frame only MPEG-2. Producing much higher compression involves analyzing the frame-to-frame movement of 16×16-pixel ‘macroblocks’ to produce vectors that show the distance and direction of macroblock movement. The accuracy of these vectors is a measure of a coder’s quality and efficiency. This vector data is carried in the P (predictive) and B (bi-directional predictive) frames that exist between I-frames (see diagram). SDTV transmissions and DVDs typically contain two I-frames per second and use about 4 Mb/s or less – a big difference from the 180 Mb/s of uncompressed SD video. The set of images between I-frames is a Group of Pictures (GOP) – usually about 12 for 576/50I and 15 for 480/60I transmissions. These are called ‘long GOP’. The GOP length can vary during transmission – an I-frame may be forced at the start of a new sequence, such as after a video cut, or on other occasions where there is a big change at the input.
MPEG-2 12 frame GOP
*Note: for transmission the last ‘I’ frame is played out ahead of the last two ‘B’ frames to form the sequence I1, B1, B2, P1, B3, B4, P2, B5, B6, P3, I2, B7, B8
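The reordering in the note above can be sketched in Python. This implements only the I-frame reorder described here; a full MPEG-2 coder also sends each P frame ahead of the B frames that predict from it:

```python
def transmission_order(display):
    """Move each I frame ahead of the run of B frames just before it,
    so the decoder has the anchor before the B frames that need it."""
    out, b_run = [], []
    for frame in display:
        if frame.startswith("B"):
            b_run.append(frame)
        elif frame.startswith("I"):
            out.append(frame)          # I frame jumps ahead of its B run
            out.extend(b_run)
            b_run = []
        else:                          # P frames stay put in this sketch
            out.extend(b_run)
            b_run = []
            out.append(frame)
    return out + b_run

gop = ["I1", "B1", "B2", "P1", "B3", "B4", "P2",
       "B5", "B6", "P3", "B7", "B8", "I2"]
# transmission_order(gop) ends ... P3, I2, B7, B8: I2 arrives before
# the two B frames that are bi-directionally predicted from it.
```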
Levels and profiles: MPEG-2 is a single compression standard that can operate on many different levels – picture source formats ranging from about VCR quality to full HDTV, and profiles – a collection of compression tools that make up a coding system. Current interest includes the Main Profile @ Main Level (MP@ML) covering current 525/60 and 625/50 broadcast television as well as DVD-video and Main Profile @ High Level (MP@HL) for HDTV. Besides the transmission/delivery applications which use 4:2:0 sampling, the 422 Profile (4:2:2 sampling) was designed for studio use and offers greater chrominance bandwidth which is useful for post production.
Blocking and ‘blockiness’: MPEG-2 artifacts generally show as momentary rectangular areas of picture with distinct boundaries. Their appearance generally depends on the amount of compression, the quality and nature of the original pictures as well as the quality of the coder. The visible blocks may be 8 x 8 DCT blocks or, most likely, ‘misplaced blocks’ – 16 x 16 pixel macroblocks, due to the failure of motion prediction/estimation in an MPEG coder or other motion vector system, e.g. a standards converter.
Audio: Digital audio compression uses auditory masking techniques. MPEG-1 audio specifies mono or two-channel audio, which may be Dolby Surround coded, at bit rates from 32 kb/s to 384 kb/s. MPEG-2 audio specifies up to 7.1 channels (but 5.1 is more common), rates up to 1 Mb/s, and supports variable bit-rate as well as constant bit-rate coding. MPEG-2 handles backward compatibility by encoding a two-channel MPEG-1 stream, then adding the 5.1/7.1 audio as an extension.
MPEG-21 (.m21 or .mp21), standardized as ISO/IEC 21000, creates descriptions for a multimedia framework to provide a ‘big picture’ of how the system elements relate to each other and fit together. The resulting open framework for multimedia delivery and consumption includes content creators and content consumers as focal points, to give creators and service providers equal opportunities in an MPEG-21 open market. It can also give consumers access to a large variety of content in a practical manner. MPEG-21 defines a Digital Item as a basic unit of transaction. It is a structured digital object, including a standard representation, identification and metadata.
ISO/IEC 14496. MPEG-4 covers three areas, digital television, interactive graphics applications (synthetic content) and interactive multimedia (Web distribution and access to content). It provides the standardized technological elements enabling the integration of the production, distribution and content access of the three fields.
Since its first publication in 1999, MPEG-4 video compression achieved quality targets with ever-lower bit rates. Like MPEG-2 the compression is DCT-based and uses inter- and intra-field compression but implements many refinements, such as a choice of block sizes and motion compensation accuracy of one-eighth of a pixel against MPEG-2’s half pixel.
MPEG-4 is guilty of generating too many names and versions. The highest quality MPEG compression technology is known by ISO and IEC as MPEG-4 AVC (Advanced Video Coding); it is also known by the ITU-T as H.264 or MPEG-4 part 10. Notable predecessors are MPEG-4 part 2 (ASP) and H.263. Significantly, MPEG-4 AVC achieves up to a 64 percent bit rate reduction over MPEG-2 for the same quality, which opened possibilities for HD DVDs and transmission, etc., as well as room to offer more SD DTV channels, or more quality. MPEG-4 also specifies low bit rates (5-64 kb/s) for mobile and Internet applications with frame rates up to 15 Hz and images up to 352 x 288 pixels.
MPEG-4 AVC video coding and decoding are far more complex than MPEG-2, but Moore’s Law absorbed that technical challenge. QuickTime and RealPlayer were among early adopters of MPEG-4. While established services remain committed to MPEG-2, most if not all later video services use MPEG-4.
The interactive multimedia side of MPEG-4 includes storage, access and communication as well as viewer interaction and 3D broadcasting. Aural and visual objects (AVOs) represent the content which may be natural – from cameras or microphones, or synthetic – generated by computers. Their composition is described by the Binary Format for Scene description (BIFS) – scene construction information to form composite audiovisual scenes from the AVOs. Hence, a weather forecast could require relatively little data – a fixed background image with a number of cloud, sun, etc, symbols appearing and moving, audio objects to describe the action and a video ‘talking head’ all composed and choreographed as defined by the BIFS. Viewer interactivity is provided by the selection and movement of objects or the overall point of view – both visually and aurally.
Audio: This builds on previous MPEG standards and includes High Efficiency Advanced Audio Coding (HE-AAC). This nearly doubled the efficiency of MPEG-4 Audio, improving on the original AAC and offering better quality than the ubiquitous MP3 codec (from MPEG-2) at the same bit rate. Stereo CD-quality at 48 kb/s and excellent quality at 32 kb/s are reported. HE-AAC is not a replacement for AAC, but rather a superset that extends the reach of high-quality MPEG-4 audio to much lower bit rates. High Efficiency AAC decoders will decode both types of AAC for backward compatibility.
DVB has approved two MPEG-4 codecs for use for broadcast transport streams: H.264/AVC video codec (MPEG-4 Part 10) and the High Efficiency Advanced Audio Coding (HE-AAC) audio codec. This mandates support of Main Profile for H.264/AVC SDTV receivers, with an option for the use of High Profile. The support of High Profile is mandated for H.264/AVC HDTV receivers.
The value of information often depends on how easily it can be found, retrieved, accessed, filtered and managed. MPEG-7, formally named ‘Multimedia Content Description Interface’, provides a rich set of standardized tools to describe multimedia content. Both human users and automatic systems that process audiovisual information are within its scope. It was intended to be the standard for the description and search of large volumes of audio and visual content, including that from private databases, broadcast and the Web. Applications include retrieval from digital libraries and other databases, broadcast channel selection, multimedia editing and multimedia directory services.
MPEG-7 offers a set of audiovisual Description Tools (the metadata elements, their structure and relationships that are defined as Descriptors and Description Schemes). It specifies a Description Definition Language (DDL) so that material with associated MPEG-7 data can be indexed and allow fast and efficient searches. These searches will permit not only text-based inquiries, but also for scene, motion and visual content. Material may include stills, graphics, 3D models, audio, speech and video as well as information about how these elements are combined. Besides uses in program-making MPEG-7 could help viewers by enhancing EPGs and program selection.
Moving Picture Experts Group. This is a working group of ISO/IEC for the development of international standards for compression, decompression, processing, and coded representation of moving pictures, audio and their combination. It has also extended into metadata. Four MPEG standards were originally planned but the accommodation of HDTV within MPEG-2 has meant that MPEG-3 is now redundant. MPEG-4 is very broad and extends into multimedia applications. MPEG-7 is about metadata and MPEG-21 describes a ‘big picture’ multimedia framework.
MPEG High Efficiency Video Coding was developed to achieve twice the efficiency of MPEG-4 AVC. Apart from having the potential to halve the bandwidth currently used to transmit HDTV services, it also halves the data needed to be transmitted for UHD. That means that a 4K UHD channel can fit into one DVB-T2 multiplex – the bandwidth that was used for one analog PAL TV channel.
Beyond helping to enable terrestrial, satellite and cable 4K transmissions, it is also a part of the Ultra HD Blu-ray specification.
See also: Display Resolution