I

I-frame only (a.k.a. I-only)

Video compression schemes in which every frame is intra-frame (I-frame) compressed, i.e. individually coded without depending on any other frames for decoding. There are no P (predictive) or B (bi-directional) frames in such compression schemes. This is considered preferable for studio use as edits can be made at any frame boundary without involving any processing beyond decoding the individual frames.

All DV compression is I-frame only. MPEG-2 and MPEG-4 with a GOP of 1 are I-frame only; these are used in Sony's IMX VTRs and HDCAM SR respectively. JPEG 2000, as used in DCI cinema, is I-frame only.

See also: Cut (edit), D11, GOP, Intra-frame (compression), JPEG 2000, MPEG-2, MPEG-4


IDTV

Integrated Digital TV receiver. For viewers to receive DTV services they require a receiver, either in the form of a new television set with the tuner and digital decoder built in (IDTV) or a set-top box. IDTVs typically include provision for all widely available terrestrial DTV services, so cable and satellite still require a set-top box. Note that although the set may be able to receive HD, the screen may not be able to display the full-size 1920 x 1080 HD picture. In this case processing is included to re-size the pictures to fit the screen.

See also: IRD, Table 3

IEEE 1394a/b (a.k.a. FireWire, i.LINK)

Developed by Apple and in products since 1994, it is a standard for a peer-to-peer serial digital interface operating at 400 Mb/s (1394a) up to 3200 Mb/s (1394b), typically over shielded twisted-pair cable up to 4.5m, or 100m on optical fiber.

Practically, it can send A/V media over 100m of Cat 5 cable at 100 Mb/s, and consumers connect DV devices over longer distances using readily available low-cost cables. IEEE 1394c offers a data rate of 800 Mb/s over Cat 5 cable and combines 1394 and GigE on one cable.

The high speed and low cost of IEEE 1394a makes it popular in multimedia and digital video applications. Uses include peer-to-peer connections for digital dub editing between camcorders, as well as interfacing video recorders, printers, PCs, TVs and digital cameras.

IEEE 1394 is recognized by SMPTE and EBU as a networking technology for transport of packetized video and audio. Its isochronous data channel can provide guaranteed bandwidth for frame-accurate realtime (and faster) transfers of video and audio while its asynchronous mode can carry metadata and support I/P. Both modes may be run simultaneously.

IEEE 1394 is known as FireWire by Apple, i.LINK by Sony and Lynx by Texas Instruments. Future developments of FireWire are expected to increase the data speed to 6.4 Gb/s.

See also: Asynchronous, Isochronous


IEEE 1588

This describes a Precision Time Protocol (PTP) that enables distributed clocks to be synchronized to within 1 microsecond over Ethernet networks, with relatively low demands on local clocks, the network and computing capacity. There are many applications, for example in automation, to synchronize the elements of a production line (without timing belts).

PTP runs on IP networks, transferring precision time to slave devices via a 1 GHz virtual clock (timebase). Independent masters can be locked to one master clock, creating wide, or even global locking. SMPTE has been assessing the possibilities of using PTP as a synchronizing source for television applications.
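At the heart of PTP is a simple timestamp exchange: the master sends a Sync message (sent at t1, received by the slave at t2) and the slave replies with a Delay_Req (sent at t3, received by the master at t4). From these four values the slave can compute both its clock offset and the mean path delay. A minimal sketch of the arithmetic in Python (an illustrative simplification; a real PTP implementation also handles correction fields, best-master clock selection and more):

```python
def ptp_offset_delay(t1, t2, t3, t4):
    """Compute slave clock offset and mean path delay from the four
    PTP timestamps: t1 = master send (Sync), t2 = slave receive,
    t3 = slave send (Delay_Req), t4 = master receive."""
    offset = ((t2 - t1) - (t4 - t3)) / 2
    delay = ((t2 - t1) + (t4 - t3)) / 2
    return offset, delay

# Example (times in microseconds): the slave clock is 100 us ahead of
# the master and the one-way path delay is 50 us.
offset, delay = ptp_offset_delay(t1=0, t2=150, t3=300, t4=250)
assert offset == 100.0 and delay == 50.0
```

With symmetrical paths the offset is exact; asymmetry between the two directions is the main residual error source.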


See also: Black and burst, Tri-level sync

Illegal colors

Colors that lie outside the limits, or gamut, of a particular defined color space. These can be generated when transferring images from one color space to another, as they all have different boundaries, or as the result of color processing. For example, removing the luminance from a high intensity blue or adding luminance to a strong yellow in a paint system may well send a subsequent video signal too high or low, producing at least inferior results and maybe causing technical problems. Out-of-gamut detectors can be used to warn of possible problems and correction is also available. Some broadcasters reject material with illegal colors.
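The paint-system example above can be illustrated numerically. The sketch below uses the BT.601 R'G'B'-to-Y'CbCr matrix (coefficients rounded to three places) to show how adjusting luminance after conversion can push a legal blue outside the RGB gamut; it illustrates the principle only and is not a broadcast-legal gamut checker:

```python
def rgb_to_ycbcr(r, g, b):
    """Convert 8-bit R'G'B' (0-255) to Y'CbCr (BT.601, rounded matrix)."""
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.169 * r - 0.331 * g + 0.500 * b
    cr =  0.500 * r - 0.419 * g - 0.081 * b
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    """Inverse conversion; the results may fall outside 0-255."""
    r = y + 1.402 * cr
    g = y - 0.344 * cb - 0.714 * cr
    b = y + 1.772 * cb
    return r, g, b

def is_legal_rgb(r, g, b):
    """True if all components lie inside the 8-bit RGB gamut."""
    return all(0 <= v <= 255 for v in (r, g, b))

y, cb, cr = rgb_to_ycbcr(0, 0, 255)   # a strong, legal blue
# Removing luminance drives the red channel negative - an illegal color
assert not is_legal_rgb(*ycbcr_to_rgb(y - 25, cb, cr))
```

An out-of-gamut detector works on the same basis: convert, then flag any component outside its legal range.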


IMF

The Interoperable Master Format has been developed by the Entertainment Technology Center at the University of Southern California. It is intended as a 'grand master' format, from which all kinds of deliverables can be created. An IMF may contain the images, audio, subtitles and captioning, technical metadata and the playlist for the content.

In-server editing

Editing at a workstation which directly edits material stored in a server. For this the workstation does not need large-scale video and audio storage but depends totally on the server store. The arrangement allows background loading of new material to the server, via several ports if required, and playout of finished results, while avoiding any need to duplicate storage or to transfer material to and from the workstation. Any number of connected workstations can share work. The efficiency of in-server editing allows fast throughput and is especially attractive to news, as well as to post production where jobs can be instantly available in rooms, or moved between rooms.

This depends on using a server that can act as an edit store and perform reliable video replay and record functions. It also requires a powerful interface to the edit workstation.

Quantel’s edit workstations with sQ servers operate this way. The workstation/server connection is by Gigabit Ethernet.


See also: Delta editing


InfiniBand

InfiniBand defines an input/output architecture that can connect servers, communications infrastructure equipment, storage and embedded systems. It can achieve very high data transfer rates, up to 120 Gb/s over copper and optical fiber connections, with the benefits of low latency and only a low processing overhead. It is used in many data centers, high-performance computer clusters, connecting supercomputers, and in embedded applications that scale from two nodes up to a single cluster interconnecting thousands of nodes.


ING

IT-News Gathering. Coined by Panasonic to highlight their use of Secure Digital (SD) memory as the in-camera media store for their DVCPRO P2 (P2 Cam) news cameras.

See also: DV, ENG

Inter-frame (compression)

Video compression which involves more than one frame to code and decode. Inter-frame compression compares consecutive frames to remove common elements and arrive at ‘difference’ information to describe the frames between the (integral) I-frames. MPEG-2 and MPEG-4 use two types of inter-frame processed pictures – the ‘P’ (predictive) and ‘B’ (bi-directional) frames. As ‘P’ and ‘B’ frames are not complete in themselves but relate to other adjacent frames, they cannot be edited independently.

See also: Cut edit, I-frame, MPEG-2, MPEG-4

Interactive Television (iTV)

A service that may be enabled with DTV which allows viewers to participate or access more information about the program. The interactivity may be implemented by selecting different TV channels (unknown to the viewer) or by a return control path to the service provider. Besides using a phone line, DVB has devised return control paths for satellite (DVB-RCS), cable (DVB-RCC) and terrestrial (DVB-RCT). Some consider interactivity is the future of television – the ‘killer app(lication)’ that will make DTV a commercial success. Others talk of lean back (viewing) and lean forward (interaction) being very different attitudes of both body and mind and question whether the two belong in the same place.

See also: Return control

Interaxial distance (Stereoscopic)

The distance between the centers of the lenses of two recording cameras. A typical distance would be 63.5 mm (approximating average adult eye spacing). The term ‘interaxial’ is sometimes also used interchangeably with ‘interocular’ (when referring to eyesight, ‘interpupillary’ is often used).

Interlace Factor

The reduction in vertical definition during vertical image movement due to interlaced (rather than progressive) scans. When interlace is used in broadcast television it means that a field of all the odd lines (1, 3, 5, etc.) is sent, followed by the even lines (2, 4, 6, etc.) that fit between the odd ones. Experimentally, the interlace factor is found to be about 30%. Note that when scanning film frame-per-frame (i.e. 24 or 25fps, not 3:2 pull-down to 60fps), or a succession of electronic frames each representing a single snapshot in time, there is no vertical movement between fields, so in these cases the Interlace Factor has no effect.

See also: 24PsF

Interlace (scan)

Method of scanning lines down a screen (vertical refresh). It is still used in most of today's television broadcasts but was originally designed to suit the needs of CRT displays and analog broadcasts. Interlace is indicated in television scan formats by an 'I', e.g. 1080I (though the use of a lower-case 'i' is common). Each displayed picture comprises two interlaced fields: field two fills in between the lines of field one; one field displays the odd lines, then the other shows the even lines. For analog systems, this is the reason for having an odd number of lines in a TV frame, e.g. 525 and 625, so that each of the two fields contains a half-line, causing the constant vertical scan to place the lines of one field between those of the other.
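The splitting of a frame into odd and even fields, and the 'weave' that reassembles them, can be sketched as simple list slicing (frames represented as lists of scan lines; a toy model that ignores the half-line offsets of analog systems):

```python
def split_fields(frame):
    """Split a progressive frame (a list of scan lines, index 0 = line 1)
    into its two interlaced fields: odd lines first, then even."""
    field1 = frame[0::2]   # lines 1, 3, 5, ...
    field2 = frame[1::2]   # lines 2, 4, 6, ...
    return field1, field2

def weave(field1, field2):
    """Interleave two fields back into a full frame."""
    frame = []
    for a, b in zip(field1, field2):
        frame.extend([a, b])
    # An odd line count leaves one extra line in field1 (the half-line case)
    if len(field1) > len(field2):
        frame.append(field1[-1])
    return frame

lines = [f"line{n}" for n in range(1, 6)]   # a 5-line toy frame
f1, f2 = split_fields(lines)
assert weave(f1, f2) == lines
```

Weaving is only artifact-free when nothing moved between the two field times, which is exactly the condition noted under Interlace Factor.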

The technique greatly improves the portrayal of motion and reduces picture flicker without having to increase the picture rate, and therefore the bandwidth or data rate. Disadvantages are that it reduces vertical definition of moving images by about 30% (see Interlace Factor) of the progressive scan definition and tends to cause some horizontal picture detail to ‘dither’ – causing a constant liveliness even in still pictures.

Interlaced video requires extra care in processing, such as DVE picture manipulation of size, rotation, etc., as any movement between fields has to be detected if frame-based processing, which can produce higher-quality results, is used. Frame freezes and slow motion also need 'de-interlace' processing.

There is continuing debate about the use of interlaced and progressive scans for digital television formats. This has intensified now that the increasingly popular panel displays all use progressive scans. Interestingly, the latest television standard, ITU-R BT.2020 for 4K and 8K UHD, only includes progressive scans.

See also: Interlace Factor, Progressive


Internegative

As a part of the chemical lab film intermediate process, internegatives are created by contact printing from interpositives. These very much resemble the cut negative. The stock is the same as for interpositives: slow, very fine grain with a gamma of 1, and the developed film is orange-based. To increase numbers, several internegatives are copied from each interpositive. These are then delivered to production labs for large-scale manufacture of release prints.

See also: Film basics (Tutorial 2)

Interocular distance (Stereoscopic)

The distance between the centers of eyes. A typical distance for humans would be 63.5 mm (approximating average adult eye spacing). The term ‘interaxial’ is sometimes also used interchangeably with ‘interocular’ (when referring to eyesight, ‘interpupillary’ is also occasionally used).


Interoperability

The ability of systems to interoperate; to understand and work with information passed from one to another. Applied to digital media this means video, audio and metadata from one system can be used directly by another. Digital signals may be originated in various formats and subjected to different types of compression, so care is needed to maintain interoperability.

Interpolation (spatial)

Defining the value of a new pixel from those of its near neighbors. For example, when re-positioning or re-sizing a digital image, for dramatic effect or to change picture format, more, fewer or different pixels are required from those in the original image. Simply repeating or removing pixels causes unwanted artifacts. For far better results the new pixels have to be interpolated, calculated by making suitably weighted averages of adjacent input pixels, to produce a more accurate result. The quality will depend on the techniques used; bi-cubic interpolation is generally accepted as being good, and the number of pixels (points) taken into account (hence 16-point interpolation), or area of original picture that is used to calculate the result all affect the quality of the result.
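As a minimal illustration of the principle, the sketch below implements bilinear interpolation – the simplest weighted-average scheme, using only the four nearest pixels rather than the 16 points of bi-cubic:

```python
def bilinear(img, x, y):
    """Sample a grey-scale image `img` (a list of rows of pixel values)
    at fractional coordinates (x, y) by weighting the four
    surrounding pixels by their distance from the sample point."""
    x0, y0 = int(x), int(y)
    x1 = min(x0 + 1, len(img[0]) - 1)   # clamp at the right/bottom edges
    y1 = min(y0 + 1, len(img) - 1)
    fx, fy = x - x0, y - y0
    top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
    bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
    return top * (1 - fy) + bot * fy

img = [[0, 100],
       [100, 200]]
# Half-way between all four pixels: the evenly weighted average
assert bilinear(img, 0.5, 0.5) == 100.0
```

Re-sizing an image then amounts to evaluating such a function at every output pixel position; bi-cubic replaces the linear weights with cubic ones over a 4 x 4 neighborhood.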

See also: Anti-aliasing, Interpolation (temporal), Sub-pixel

Interpolation (temporal)

Interpolation between the same point in space (pixel) on successive frames. It can be used to provide motion smoothing and is extensively used in standards converters to reduce the judder caused by changes of field, or frame, rates such as between 50 and 60 Hz. The technique can also be adapted to create frame averaging for special effects and slow motion. Various qualities of processing are used. It can be very complex, attempting to work out how each element in successive pictures is moving in order to synthesize ‘between’ images (e.g. to convert 50 pictures into 60 pictures while still showing smooth motion).
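The simplest form of temporal interpolation is plain frame blending: a weighted average of the two nearest input frames at each output time. The sketch below converts a sequence of 'frames' (here just lists of grey values) to a new frame count this way; real motion-compensated standards converters are far more sophisticated:

```python
def blend(frame_a, frame_b, t):
    """Weighted average of two frames at temporal position t (0..1)."""
    return [a * (1 - t) + b * t for a, b in zip(frame_a, frame_b)]

def rate_convert(frames, n_out):
    """Resample a sequence of frames to n_out frames by blending the
    two nearest input frames at each output time."""
    out = []
    n_in = len(frames)
    for i in range(n_out):
        pos = i * (n_in - 1) / (n_out - 1) if n_out > 1 else 0
        j = min(int(pos), n_in - 2)        # index of the earlier frame
        out.append(blend(frames[j], frames[j + 1], pos - j))
    return out

# Five input 'frames' (each a single grey value) become six outputs
frames = [[0.0], [10.0], [20.0], [30.0], [40.0]]
converted = rate_convert(frames, 6)
assert converted[0] == [0.0] and converted[-1] == [40.0]
```

Blending gives smooth but slightly soft motion; motion-compensated converters instead estimate how each picture element moves and shift it to its interpolated position.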


Interpositive

This is the first part of the chemical lab intermediate process, where a positive print of film is produced from the cut (edited) camera negative. Interpositives are made by contact printing onto another orange-base stock. In order to preserve as much detail as possible from the negative, including its dynamic range, interpositive material is very fine grain, slow and has a gamma of 1. During the copying process, grading controls are used to position the image density in the center of the interpositive material's linear range. As a part of the process of going from one camera negative to, possibly, thousands of prints, a number of interpositives are copied from the negative.

See also: Film basics (Tutorial 2)

Intra-frame (compression)

Compression that uses just one picture. The compression process is designed only to remove what it considers to be redundant and visually less significant information from within the frame itself. No account is taken of other frames. JPEG and the 'I' frames of MPEG-2 are coded in this way and use DCT. In an MPEG-2 sequence of frames, edits can only be made at I-frames as they are the only independent frames.


See also: DCT, I-frame only, JPEG, MPEG-2


IP

1) Intellectual Property – this can be very valuable, and there are regular court cases in which owners of this type of IP sue others they believe have stolen it.

2) Internet Protocol – the de facto standard for networking and the most widely used of the network protocols that carry data and lie on top of physical networks and connections. Besides its Internet use it is also the main open network protocol supported by all major computer operating systems. IP, or specifically IPv4, describes the packet format for sending data using a 32-bit address to identify each of nearly 4.3 billion devices on a network, written as four eight-bit numbers separated by dots. Each IP data packet contains a source and destination address as well as a payload of data. There is now IPv6 which brings, among many other enhancements, 128-bit addressing – allowing 2^128 addresses, plenty for all the connected devices on planet Earth, and thus relieving IPv4's address shortage.
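The 32-bit versus 128-bit arithmetic is easy to verify with Python's standard ipaddress module (the addresses used below are from the RFC 5737 and RFC 3849 documentation ranges, chosen purely for illustration):

```python
import ipaddress
import struct

# An IPv4 address is one 32-bit number, written as four 8-bit fields
addr = ipaddress.IPv4Address("192.0.2.1")
packed = addr.packed                       # the 4 bytes sent on the wire
assert packed == struct.pack("!BBBB", 192, 0, 2, 1)
assert int(addr) == (192 << 24) | (0 << 16) | (2 << 8) | 1

# IPv6 widens the address to 128 bits
v6 = ipaddress.IPv6Address("2001:db8::1")
assert len(v6.packed) == 16                # 16 bytes = 128 bits
assert ipaddress.IPv6Network("::/0").num_addresses == 2 ** 128
```

The 2^32 limit (about 4.3 billion) is exactly the "address shortage" that IPv6's 2^128 space removes.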

Above IP are two transport layers. TCP (Transmission Control Protocol) provides reliable data delivery, efficient flow control, full duplex operation and multiplexing – simultaneous operation with many sources and destinations. It establishes a connection and detects corrupt or lost packets at the receiver and re-sends them. Thus TCP/IP, the most common form of IP, is used for general data transport but is relatively slow and not ideal for video.

The other transport layer is UDP (User Datagram Protocol), which uses a series of 'ports' to connect data to an application. Unlike TCP, it adds no reliability, flow-control or error-recovery functions, but it can detect and discard corrupt packets by using checksums. This simplicity means its headers contain fewer bytes and consume less network overhead than TCP, making it useful for streaming video and audio where continuous flow is more important than replacing corrupt packets.

There are other IP applications that live above these protocols such as File Transfer Protocol (FTP), Telnet for terminal sessions, Network File System (NFS), Simple Mail Transfer Protocol (SMTP) and many more.

Video over IP – Watching video over the internet is commonplace. It represents a very large, and growing, part of internet traffic, and fits well with the rising population of Smart TVs. There are several suitable streaming protocols in use, including those offering variable bit rates such as HTTP Live Streaming from Apple, and HTTP Smooth Streaming from Microsoft. These offer a good chance of providing uninterrupted viewing, even when the internet connection gets a bit slow.


IP Datacast Forum (IPDC)

The IPDC (Internet Protocol Data Cast) Forum was launched in 2002 to promote and explore the capabilities of IP-based services over digital broadcast platforms (DVB and DAB). Participating companies include service providers, technology providers, terminal manufacturers and network operators. The Forum aims to address business, interoperability and regulatory issues and encourage pilot projects.

See also: IP over DVB


IP over DVB

The delivery of IP data and services over DVB broadcast networks. Also referred to as datacasting, this takes advantage of the very wideband data delivery systems designed for the broadcast of digital television, to deliver IP-based data services – such as file transfers, multimedia, Internet and carousels, which may complement, or be instead of, TV.

Due to DVB-T’s ability to provide reliable reception to mobile as well as fixed receivers, a new standard DVB-H has been added to send IP-style service to people on the move – typically to phones. For interactivity, a return path can be established by the phone.

See also: IP Datacast Forum, Data carousel


IPTV

Internet Protocol Television refers to the use of the IP packetized data transport mechanism for the delivery of streamed realtime (live streaming) and downloaded television signals across a network. This is a huge subject as video accounts for an increasingly large part of internet traffic. Cisco predicts that, excluding video peer-to-peer file sharing, 79 percent of domestic internet traffic will be video by 2018, up from 66 percent in 2013, and that, including file sharing, it will take between 80-90 percent of global consumer traffic in 2018.



IRD

Integrated Receiver Decoder. A device that has both a demodulator and a decoder (e.g. for MPEG-2) built in. This could be in a digital television set or a set-top box.

See also: IDTV


ISA

Integrated Server Architecture is a Quantel term for the technology used in its sQ servers to manage the contents of several separate servers simultaneously. ISA operates at two levels: one locks browse and full-quality material together under a single ISA database. The other level allows all material held on several sQ servers to be handled as if on a single server, effectively as if on a single database. This facilitates system scaling of users and storage by adding servers and makes possible the 'ZoneMagic' operation where two separated servers are kept in step.

See also: NAS, SAN


ISDB

Integrated Services Digital Broadcasting. Standard for digital broadcasting used in Japan. ISDB has many similarities to DVB including OFDM modulation for transmission and the flexibility to trade signal robustness against delivered data rate. ISDB-T (terrestrial) is applicable to all channel bandwidth systems used worldwide: 6, 7, and 8 MHz. The transmitted signal comprises OFDM blocks (segments) allowing flexible services where the transmission parameters, including modulation and error correction, can be set segment-by-segment for each OFDM segment group of up to three hierarchical layers in a channel. Within one channel, the hierarchical system allows both robust SD reception for mobile and portable use and less robust HD; a form of graceful degradation.

See also: COFDM, DVB



ISMA

The Internet Streaming Media Alliance is a coalition of industry leaders dedicated to the adoption and deployment of open standards for streaming rich media, such as video, audio and associated data, over Internet protocols.



ISO

International Organization for Standardization. An international body that develops and publishes international standards, including those for networking protocols, compression systems, disks, etc.



Isochronous

A form of data transfer that carries timing information with the data. Data is specified to arrive over a time window, but not at any specific rate within that time. ATM, IEEE 1394 and Fibre Channel can provide isochronous operation, where links can be booked to provide specified transfer performance. For example, 60 TV fields can be specified for every second but their arrival may not be evenly spread through the period. As this is a guaranteed transfer it can be used for 'live' video but is relatively expensive on resources.

See: ATM, Asynchronous, Fibre Channel, IEEE 1394, Synchronous


ITU

International Telecommunications Union. The United Nations regulatory body covering all forms of communication. The ITU sets mandatory standards and regulates the radio frequency spectrum. ITU-R (previously CCIR) deals with radio spectrum management issues and regulation, while ITU-T (previously CCITT) deals with telecommunications standardization.

Suffix ‘BT.’ denotes Broadcasting Television.


ITU IPTV standard Rec J.700

ITU-T Study Group 9 (integrated broadband cable networks and television and sound transmission), at its meeting of October/November 2007, gave consent for a draft new recommendation on IPTV to go through the Alternative Approval Process (AAP). The draft recommendation J.700, titled "IPTV Service Requirements and Framework for Secondary Distribution", is now at the Last Call Judgment (LJ) stage. It describes the service requirements and functional framework architecture for the support of IPTV services. Requirements for network elements and customer premises equipment (CPE) are covered. The recommendation also leverages existing deployed technologies to provide a smooth path for operators to integrate IPTV technology into their networks.


ITU-R BT.2020

This defines the parameters of UHDTV (Ultra High Definition Television), including display resolution, frame rate, chroma sub-sampling, bit depth, color space and audio system. The image sizes are 4K (3840 x 2160) and 8K (7680 x 4320), with frame rates of 23.976, 24, 25, 29.97, 30, 50, 59.94, 60 and 120 Hz. All scans are progressive. The system offers a wider dynamic range, with the images' colorimetry including a wider gamut than HDTV, which is already wider than SD. Sampling may be 10 or 12-bit and 4:4:4, 4:2:2 or 4:2:0 to suit the application.

ITU-R BT.601

This standard defines the digital encoding parameters of SD television for studios. It is the international standard for digitizing component television video in both 525 and 625 line systems and is derived from SMPTE RP125. ITU-R BT.601 deals with both color difference (Y, R-Y, B-Y) and RGB component video and defines sampling systems, RGB/Y, R-Y, B-Y matrix values and filter characteristics. It does not actually define the electro-mechanical interface; see ITU-R BT. 656.

ITU-R BT.601 is normally taken to refer to color difference component digital video (rather than RGB), for which it defines 4:2:2 sampling at 13.5 MHz with 720 luminance samples per active line (the '4' of 4:2:2). The color difference signals R-Y and B-Y are sampled at 6.75 MHz with 360 samples per active line each (the '2's). Sample depth may be 8 or 10 bits.

Some headroom is allowed so, with 10-bit sampling, black level is at code 64 (not 0) and white at 940 (not 1023) – to minimize clipping of noise and overshoots. With 2^10 levels each for Y (luminance), Cr and Cb (the digitized color difference signals), 2^30 – over a billion – unique colors can be defined.
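These coding levels are easy to check in a few lines (a sketch of the 10-bit luma mapping only; the chroma coding range differs slightly):

```python
# 10-bit BT.601 luma coding range: black at code 64, white at code 940,
# leaving footroom and headroom for noise and overshoots
BLACK, WHITE = 64, 940

def quantize_luma(v):
    """Map normalized luma (0.0 = black, 1.0 = white) to a 10-bit code."""
    return round(BLACK + v * (WHITE - BLACK))

assert quantize_luma(0.0) == 64
assert quantize_luma(1.0) == 940
assert quantize_luma(0.5) == 502    # mid-grey

# Total combinations with 10 bits each for Y, Cb and Cr:
assert (2 ** 10) ** 3 == 2 ** 30 == 1_073_741_824   # over a billion
```

Values below 64 and above 940 are still representable, which is exactly what lets overshoots survive without hard clipping.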

The sampling frequency of 13.5 MHz was chosen to provide a politically acceptable common sampling standard between 525/59.94 and 625/50 systems, being a multiple of 2.25 MHz, the lowest common frequency to provide a static sampling pattern for both.
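The arithmetic behind that choice can be verified directly: 2.25 MHz divides into a whole number of samples per line period in both systems, and 13.5 MHz is 6 x 2.25 MHz:

```python
# Line frequencies of the two SD systems
f_line_525 = 525 * 30000 / 1001   # 15734.27 Hz (525/59.94)
f_line_625 = 625 * 25             # 15625 Hz (625/50)

# 2.25 MHz is a whole multiple of both line rates, so sampling at any
# multiple of it gives a static (line-locked) sampling pattern in both
base = 2.25e6
assert abs(base / f_line_525 - 143) < 1e-6   # 143 sample periods per line
assert base / f_line_625 == 144              # 144 sample periods per line

# 13.5 MHz is the sixth multiple, high enough for luminance bandwidth
assert 13.5e6 == 6 * base
```

Any lower multiple of 2.25 MHz would also give a static pattern in both systems, but would not satisfy the Nyquist requirement for the luminance bandwidth.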

See also: 13.5 MHz, 4:2:2, Frequency, Into digits (Tutorial 1)

ITU-R BT.656

The international standard for interconnecting digital television equipment operating to the 4:2:2 standard defined in ITU-R BT.601. It defines blanking, embedded sync words, the video multiplexing formats used by both the parallel (now rare) and serial interfaces (SDI), the electrical characteristics of the interface and the mechanical details of the connectors.

ITU-R BT.709

In 2000, ITU-R BT.709-4 recommended the 1080 active-line high definition television standard for 50 and 60 Hz interlace scanning with sampling at 4:2:2 and 4:4:4. Actual sampling rates are 74.25 MHz for luminance Y, or R, G, B and 37.125 MHz for color difference Cb and Cr, all at 8 bits or 10 bits, and these should be used for all new productions. It also defines these 1080-line square-pixel standards as common image formats (CIF) for international exchange.

The original ITU-R BT.709 recommendation was for 1125/60 and 1250/50 (1035 and 1152 active lines) HDTV formats defining values and a ‘4:2:2’ and ‘4:4:4’ sampling structure that is 5.5 times that of ITU-R BT.601. Note that this is an ‘expanded’ form of 601 and so uses non-square pixels.

See also: Common Image Format