A general term for a device that creates a digital representation of film for direct use in digital television or for digital intermediate work. For television, film scanners replaced traditional telecines, which had to work in real time. For digital film production, they should capture the full detail of the film so that, when the result is transferred back to film, the film-digital-film chain appears essentially lossless. To achieve this, film scanners operate at greater than HD resolution (1920 x 1080); 4K is now predominant in the movie business. The output is data files rather than the digital video expected from traditional telecines.
For movie production the output must retain as much of the negative’s latitude as possible, so the material is transferred with a best-light pass and recorded by linear electronic light sensors, typically of CMOS technology, to at least 13 bits of accuracy (describing 8192 possible levels). Using a LUT, this can be converted into 10-bit log data, which retains as much of the useful information but does not ‘waste’ data by assigning too many digital levels to dark areas of pictures.
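The LUT approach can be illustrated with a small sketch. This is a toy example only: the bit depths (13-bit linear in, 10-bit log out) follow the text, but the simple logarithmic curve used here is an assumption for illustration; real scanners use defined transfer curves (such as printing-density-based log encodings), not this formula. The point it shows is that each of the 8192 linear codes is requantised to one of 1024 log codes through a precomputed table, redistributing the available levels along a logarithmic rather than a linear scale.

```python
import math

LINEAR_BITS = 13                     # linear sensor data, per the text
LOG_BITS = 10                        # log output data, per the text
LINEAR_MAX = 2 ** LINEAR_BITS - 1    # 8191 (8192 levels including 0)
LOG_MAX = 2 ** LOG_BITS - 1          # 1023 (1024 levels including 0)

# Build the lookup table once: index = 13-bit linear code,
# value = 10-bit log code. A toy log2 curve stands in for a real
# scanner transfer curve, which this is not.
lut = [0] * (LINEAR_MAX + 1)
for v in range(1, LINEAR_MAX + 1):
    lut[v] = round(math.log2(v) / math.log2(LINEAR_MAX) * LOG_MAX)

def linear_to_log(sample: int) -> int:
    """Requantise one 13-bit linear sample to 10-bit log via the LUT."""
    return lut[sample]
```

Because the table is built once and conversion is a single array lookup per sample, this kind of mapping can run at full scanning data rates; the table's shape, not the lookup mechanism, is what a real system would define differently.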
Note that this is different from using a telecine to transfer film to video, where the film is normally graded as the transfer takes place. Additional latitude is not required in this digital state, so 10- or 8-bit linear coding is sufficient.