Thursday, 21 February 2008

The Modulation Transfer Function and (Imagined) Reality

I’m going to try to describe an important insight I’ve had into digitality. Loosely, the insight has a simple end result: in describing the resolution of an image using terms derived from the analogue realm, we are missing a crucial element. That element equates to looking at a landscape on a bright sunny day through a fog. That digital fog is described at the end of what follows.

The further you delve into a subject, the more you learn, of course; but there is also a sense in which your enquiry widens the deeper you go. Visualize an iceberg: the tip that sits above sea level is what one might know from day to day. Beneath the surface is the rest of the enquiry – both known and unknown to oneself. As the journey of discovery goes on, you find other relationships between what you have known and what you come to understand as important about your subject.

I am not a mathematician, yet my enquiry into High Definition keeps indicating the importance of mathematics to any deep understanding of the subject. Here I’m talking about a Newtonian world view, in which you understand why this or that happens because it depends on a set of causes and effects – that’s why I’m using the term Newtonian, as opposed to Euclidean or, to bring it up to date, Einsteinian.

The thing about the cogs and ratchets of reality is that as we enquire into the meaning and function they represent, they actually tell us about a quantum world, and do not necessarily tell us about the significance of things at a non-quantum level.

So there’s a problem with going deeply into your subject – you start to dissociate from reality as it is lived.

Mathematics is useful if you can bring its insights back to the 'surface'.

Film, using a description based on an understanding that is prior to analogue descriptions, could be said to be 'alchemical'. A description of reality based upon digital data technology now speaks to us. Recombining elements that rely on signal values, though it emulates what we believe we know about our own signal processor, the brain, is a best-guess scenario – we don’t actually know how we interface with the brain, the common sense behind all our other senses. A philosopher might say that we actually do not know, and cannot know, anything at all.

But I’m not a philosopher; in the end I’m a mystic – someone who actually believes in the reality of inner knowledge, through design you might say. I believe in in-tuition, in-spiration, in-dependence, inner revelation, in-sight.

I have always intuitively known about system noise. Any analogue system has system noise. Turn the volume up on an amplifier and speaker and you’ll hear hiss and hum, the former known as white noise. The effect of the system simply being there, standing by to perform, is an integral element of its performance. The signal-to-noise ratio of the system – that is, how much noise the system generates just by being on – is a measure of its efficiency. There is generally more noise in a cheap system than an expensive one, because better components are used in the expensive system (though of course this is not always true).
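
To put a rough number on that idea, here is a minimal Python sketch – my own illustration, with made-up signal and noise levels – of how a signal-to-noise ratio is usually expressed in decibels:

```python
import numpy as np

def snr_db(signal, noise):
    """Signal-to-noise ratio in decibels: how far the wanted signal
    stands above the system's own hiss (a ratio of mean powers)."""
    signal_power = np.mean(np.square(signal))
    noise_power = np.mean(np.square(noise))
    return 10.0 * np.log10(signal_power / noise_power)

# A 1 kHz tone sampled at 48 kHz, plus a little white noise ("hiss").
rate = 48000
t = np.arange(rate) / rate
tone = 0.5 * np.sin(2 * np.pi * 1000 * t)
hiss = 0.005 * np.random.randn(rate)

print(f"SNR ~ {snr_db(tone, hiss):.1f} dB")  # roughly 37 dB for these made-up levels
```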

Digital systems are different, though. Analogue systems use their own elemental characteristics to reproduce sounds or images; digital systems use the neutral form of data description to represent sounds and images. What I mean here is that the analogue realm is analogous to the world it describes – it tries to imitate it using what it is itself made of. Vibrations gathered at a mike are sent through a chain that tries to imitate the original, then output through the reverse of the gathering unit (the mike at one end, the speaker at the other). Though digital systems use the same front and back ends, as soon as the signal is gathered it is rendered into a set of values that seek to reduce the essential nature of the original to a mathematical description. As I have illustrated elsewhere in this article, up until recently they have inherited the prior, analogue-era mathematics to do the work – the Fourier-derived discrete cosine transform. Only now are we moving on to wavelet transforms, which are more sympathetic to what the digital realm essentially is.
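
As a rough illustration of what 'rendering the signal into a set of values' means, here is a small Python sketch using SciPy's discrete cosine transform. The sample values are invented; the point is only that the signal becomes a handful of frequency coefficients, and that discarding the smallest of them is, loosely, what compression does:

```python
import numpy as np
from scipy.fft import dct, idct

# Eight samples of a gathered signal (arbitrary values, for illustration only).
samples = np.array([12.0, 14.0, 15.0, 13.0, 9.0, 6.0, 5.0, 7.0])

# The discrete cosine transform re-describes the samples as a set of
# frequency coefficients rather than as the samples themselves.
coeffs = dct(samples, type=2, norm='ortho')

# Zeroing the smallest coefficients is, loosely, what compression does.
compressed = np.where(np.abs(coeffs) > 1.0, coeffs, 0.0)
restored = idct(compressed, type=2, norm='ortho')

print(np.round(coeffs, 2))
print(np.round(restored, 1))  # close to the original, described by fewer numbers
```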

Digits work best at describing life as it is if they rise and fall in an analogue way. That’s difficult if you are essentially a packet of data: you’re either there or not there, whereas analogue rises and falls in a manner sympathetic to the way our senses apprehend the world. Imagine a train whistle in the distance approaching us, shifting pitch as it passes, then fading away. The pure analogue response uses the medium of the air itself to transmit through.
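
A minimal sketch of that 'there or not there' quality, assuming nothing more than NumPy: the same tone as a smooth curve, and as a handful of samples forced onto a fixed set of levels:

```python
import numpy as np

# One cycle of a smooth "analogue" tone, evaluated very finely...
t_fine = np.linspace(0.0, 1.0, 1000)
analogue = np.sin(2 * np.pi * t_fine)
print(np.round(analogue[:5], 4))        # rises gradually, point by point

# ...versus the digital version: a handful of samples, each forced onto
# one of a fixed set of levels (here 4 bits, i.e. 16 steps).
t_samples = np.linspace(0.0, 1.0, 16)
sampled = np.sin(2 * np.pi * t_samples)
levels = 2 ** 4
quantised = np.round((sampled + 1) / 2 * (levels - 1)) / (levels - 1) * 2 - 1
print(np.round(quantised, 3))           # a staircase: each value is either there or not there
```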

The wavelet transform organizes data in a way that is analogous to the analogue approach. Quite simply, it is a more intelligent use of the digital form.
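
For the curious, here is a small sketch using the PyWavelets library (my choice for illustration, not one named in this article). A single level of the discrete wavelet transform turns a signal into a coarse approximation plus details that stay localized in time, rather than being smeared across the whole signal the way a single Fourier coefficient is:

```python
import numpy as np
import pywt  # PyWavelets

# A signal that rises and falls, a little like the passing train whistle.
t = np.linspace(0.0, 1.0, 64)
signal = np.sin(2 * np.pi * 5 * t) * np.exp(-4 * (t - 0.5) ** 2)

# One level of the discrete wavelet transform: a smooth approximation
# plus details that are tied to particular moments in time.
approx, detail = pywt.dwt(signal, 'haar')
print(len(approx), len(detail))          # each half the length of the input

# And nothing is lost: the inverse transform restores the original.
restored = pywt.idwt(approx, detail, 'haar')
print(np.allclose(restored, signal))     # True
```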

I was reading more deeply about the way in which we make comparisons and judgments about resolution. The old analogue method used pairs of black and white lines of equal width, at smaller and smaller widths, to establish how much resolution a system could reproduce. If we said a TV system had 800 lines of resolution, we were saying that above 800 lines you saw a blur, while below 800 lines you could still distinguish the white and black lines. It was quite simple in the old days.
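
For illustration, here is a tiny sketch of the kind of test pattern that measurement implies – alternating black and white line pairs across a picture, at whatever fineness you choose. The 800-pixel width is just an assumption for the example:

```python
import numpy as np

def line_pairs(n_pairs, width=800):
    """One row of a test chart: n_pairs alternating black/white line pairs
    across a picture `width` pixels wide (0.0 = black, 1.0 = white)."""
    x = np.arange(width)
    # A square wave: each full black-plus-white cycle is one "line pair".
    return (np.floor(2 * n_pairs * x / width) % 2).astype(float)

coarse = line_pairs(10)    # wide bars, easy to resolve
fine = line_pairs(400)     # one pixel per line: the limit of this raster
print(coarse[:12])
print(fine[:12])
```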

Now we understand something a little different about systems and resolution. I’m going to quote from Norman Koren’s excellent article on the Modulation Transfer Function at http://www.normankoren.com/Tutorials/MTF.html. Please bear with this: even if you don’t understand this sort of stuff, and never want to, the language itself says something about the study:

Systems for reproducing information, images, or sound typically consist of a chain of components. For example, audio reproduction systems consist of a microphone, mike preamp, digitizer or cutting stylus, CD player or phono cartridge, amplifier, and loudspeaker. Film imaging systems consist of a lens, film, developer, scanner, image editor, and printer (for digital prints) or lens, film, developer, enlarging lens, and paper (for traditional darkroom prints). Digital camera-based imaging systems consist of a lens, digital image sensor, de-mosaicing program, image editor, and printer. Each of these components has a characteristic frequency response; MTF (Modulation Transfer Function) is merely its name in photography. The beauty of working in frequency domain is that the response of the entire system (or group of components) can be calculated by multiplying the responses of each component.

The response of a component or system to a signal in time or space can be calculated by the following procedure: (i) Convert the signal into the frequency domain using a mathematical operation known as the Fourier transform, which is fast and easy to perform on modern computers using the FFT (Fast Fourier Transform) algorithm. The result of the transform is called the frequency components or FFT of the signal. Images differ from time functions like sound in that they are two-dimensional. Film has the same MTF in any direction, but not lenses. (ii) Multiply the frequency components of the signal by the frequency response (or MTF) of the component or system. (iii) Inverse transform the signal back into time or spatial domain. Doing this in time or spatial domain requires a cumbersome mathematical operation called convolution. If you try it, you'll know how the word "convoluted" originated. And you'll know for sure why frequency domain is widely appreciated.
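
To make the quoted procedure concrete, here is a short sketch of both ideas: the system response as the product of the component responses, and the three-step recipe (transform, multiply, inverse transform) applied to a simple one-dimensional edge. The two Gaussian-shaped component responses are invented purely for illustration, standing in for a real lens and sensor MTF:

```python
import numpy as np

# A one-dimensional "scene": a sharp black-to-white edge.
n = 256
scene = np.zeros(n)
scene[n // 2:] = 1.0

# Hypothetical frequency responses for two components (say, a lens and a
# sensor), modelled as simple Gaussian roll-offs (invented shapes).
freqs = np.fft.rfftfreq(n)                  # spatial frequency, cycles per pixel
mtf_lens = np.exp(-(freqs / 0.25) ** 2)
mtf_sensor = np.exp(-(freqs / 0.35) ** 2)

# Koren's point: the response of the whole chain is the product of the parts.
mtf_system = mtf_lens * mtf_sensor

# The three-step procedure from the quote:
spectrum = np.fft.rfft(scene)               # (i) into the frequency domain
filtered = spectrum * mtf_system            # (ii) multiply by the system MTF
rendered = np.fft.irfft(filtered, n)        # (iii) back into the spatial domain

print(round(scene[n // 2] - scene[n // 2 - 1], 2))        # 1.0: a hard edge
print(round(rendered[n // 2] - rendered[n // 2 - 1], 2))  # smaller: the edge has softened
```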


What Norman Koren is talking about here is the way in which what we used to call system noise has its digital counterpart. Look at the picture at the front of this article. You’ll see a series of lines, both black and white – that’s the ideal. Look at the picture at the end of this article. It has a series of lines, both black and white, sitting on a field of grey. As the lines become thinner (therefore requiring more resolution) they decrease in contrast until they merge into the background, because the system itself has a limit on what contrast it can represent. As Norman describes above, “the response of the entire system (or group of components) can be calculated by multiplying the responses of each component”.
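
That fading into grey can be demonstrated in a few lines. The sketch below – again with an invented Gaussian blur standing in for a real system’s overall MTF – passes ever finer line pairs through the blur and measures their Michelson contrast; the finer the lines, the closer black and white sink towards the same mid grey:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def contrast_after_blur(n_pairs, width=800, blur_sigma=2.0):
    """Michelson contrast of n_pairs black/white line pairs after passing
    through a blurring 'system' (a Gaussian stand-in for a real MTF)."""
    x = np.arange(width)
    pattern = (np.floor(2 * n_pairs * x / width) % 2).astype(float)
    seen = gaussian_filter1d(pattern, blur_sigma, mode='wrap')
    return (seen.max() - seen.min()) / (seen.max() + seen.min())

for pairs in (25, 50, 100, 200):
    print(pairs, round(contrast_after_blur(pairs), 2))
# Contrast falls as the lines get finer: the blacks and the whites drift
# towards the same mid grey, the fog described at the head of the article.
```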

I want to revisit the sentence I wrote at the head of this article to bring this insight into focus: “The further you delve into a subject, the more you learn, of course; but there is also a sense in which your enquiry widens the deeper you go.”

The simple insight that resolution deteriorates as its representation loses contrast at higher frequencies now powers my enquiry along and gives me inspiration and the resilience to continue. Using the metaphor at the head of the article – looking at a landscape on a bright sunny day through a fog – the grey background into which the black and white lines of resolution deteriorate is the fog I was talking about. As with all metaphors, it only points at the situation and does not fully describe it – but it is helpful.

That other metaphor – the technical description itself, which like all statements of 'fact' is as much a metaphor as the fog – is simply another way of getting at the 'truth' of a situation. The description of the Modulation Transfer Function has brought Fourier into my field of vision yet again, and this alone tells me that at some point a revelation concerning the actual nature of digitality is waiting to reveal itself. It’s just a question of taking a deep enough breath before diving ever deeper into the subject, so that one remembers what one is diving for.