ESSAY - Masayuki Kawai - Analogue is the Future (2019)

A great short essay here by video artist Masayuki Kawai on analogue, digital and data -

I’d be interested to know what people think. I like the fact that he focuses less on the technology itself and more on how analogue and digital act upon data.

“As it recurs, digital data feedback converges on utility and meaning and strengthens its control over information… Analog data feedback is constantly affected by uncertainty, generates difference, and thereby always deviates from institutionality.”

I only just came across him today whilst looking at this festival -

You can see some of his video feedback works here:

note - the pdf link is from his ‘writings’ page, not his ‘videofeedback’ page (which seems broken at present)


i think anyone who comes down hard on one side or the other of the tired “analog vs digital” false dichotomy is missing the point. each kind of signal has various strengths and weaknesses. it is vastly unlikely that anyone working in video feedback these days has a purely analog signal flow unless they are using exclusively mixers and TBCs from the 70s and tube cameras only (CCD and CMOS sensors are discrete, and even a tube sensor is a blend of discrete and continuous). also, speaking as someone who has experimented heavily in digital feedback worlds, i can honestly say that the sheer possibilities of working with purely digital feedback systems seem to dwarf the potentials of analog-only feedback systems.

i do honestly believe that hybrid digital/analog computing is actually the future. pretty much any kind of video feedback system one is likely to be playing with (unless you have a Sandin and some tube cameras) is most likely of this sort anyhow.
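the “converges on fixed values” behaviour of a digital feedback stage can be sketched as a toy simulation. this is a hypothetical illustration of my own, not any particular system: the gain and offset are made up, and a single row of pixels stands in for a frame. each pass attenuates, shifts, and re-quantizes the previous output to 8-bit values, the way a digital stage in a hybrid loop would.

```python
# toy digital video feedback loop: each iteration the "output frame"
# is transformed and fed back in as the next input (hypothetical sketch)

def feedback_step(frame, gain=0.9, offset=0.05):
    # simulate re-capturing the display: attenuate and shift the signal,
    # then quantize back to 8-bit levels (the "digital" part of the loop)
    return [min(255, max(0, int(p * gain + offset * 255))) for p in frame]

frame = [0, 64, 128, 255]          # a 4-pixel "frame"
for _ in range(100):               # let the loop recur
    frame = feedback_step(frame)

print(frame)                       # the loop has settled onto fixed values
```

because every pass re-quantizes to the same 256 levels, the recursion lands on exact fixed points and then repeats itself identically forever, which is one concrete sense in which purely digital feedback “converges”; an analog stage would keep drifting with noise instead.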


“As it recurs, digital data feedback converges on utility and meaning and strengthens its control over information… Analog data feedback is constantly affected by uncertainty, generates difference, and thereby always deviates from institutionality.”

this does not seem to actually mean anything as far as i can tell. i have a bit of a problem translating anything out of critical theory, though it seems like they are trying to say something along the lines of ‘analog is warm and alive, digital is cold and dead’, which i think is unsupported by any evidence whatsoever aside from human prejudice.


cheers for your thoughts on this - it’s true that analogue equipment has always had digital components.

Maybe the quote above is better read in context, but I take it to mean that digital data ‘fixes’ information - e.g., a digital image is essentially a spreadsheet, as opposed to the chemistry of film - which then has great utility (and can be exploited).
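the ‘spreadsheet’ reading can be made literal with a toy example (the values here are arbitrary, just for illustration): a digital image is a grid of numbers, each exactly addressable and exactly copyable.

```python
# a digital image really is just a grid of numbers - here a 3x3
# grayscale "image" with 8-bit values, exactly like a spreadsheet:
image = [
    [  0, 128, 255],
    [ 64, 192,  32],
    [255,   0, 128],
]

# every pixel is directly addressable and exactly reproducible,
# which is one concrete sense of digital data 'fixing' information:
print(image[1][2])    # row 1, column 2 -> 32
copy = [row[:] for row in image]
print(copy == image)  # True: a perfect copy, no generation loss
```

the chemistry of film has no equivalent operation: copying a frame always passes through a continuous physical process and picks up difference along the way.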

it is a misconception that digital and analog signals have any inherent qualities other than one being discrete and the other being continuous. there is nothing inherent in a digital signal path that makes it ‘fix’ anything; what the signal path does is purely on the design side. film and phosphor displays are also essentially discrete in nature: they exploit general levels of ‘fuzziness’ in how the discrete spatio-temporal blocks interact with one another and can pass through the liminal zones of what humans can perceive as discrete.


without commenting much on the nature of analog/digital, i just want to lay out my understanding of the word ‘analog’, because i’m pretty sure it has conflicting meanings (and often i am confused by it):

  • the word analog has a technical meaning in electronics to mean a continuous parameter or signal, for example how an analog-to-digital converter reads a continuous voltage signal.
  • this use comes from the english word analogy, i.e. to compare two things - in this case the (continuous) signal is comparable to the output. for example, a telephone line carries a varying voltage signal which is analogous to the varying pressure of the sound waves at the receiver. a counterpoint would be a morse-code telegram, where the (discrete) signal is encoded and decoded, not analogous to the message.
  • as we know, this electrical use of analog also applies to (tube) television, where the varying voltage in a video signal is analogous to the intensity of the electron gun (and thus the brightness of the resulting image)
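the analog-to-digital conversion mentioned in the first bullet can be illustrated with a toy uniform quantizer. the parameters below (5 V reference, 8 bits) are illustrative, not any specific ADC:

```python
def adc_sample(voltage, v_ref=5.0, bits=8):
    """quantize a continuous voltage in [0, v_ref) to an n-bit code,
    the way an idealized analog-to-digital converter would."""
    levels = 2 ** bits                    # 256 discrete codes for 8 bits
    code = int(voltage / v_ref * levels)  # truncate down to a level boundary
    return max(0, min(levels - 1, code))  # clamp to the valid code range

# two slightly different "continuous" voltages land on the same code:
print(adc_sample(2.5))    # -> 128
print(adc_sample(2.51))   # -> 128: the difference is lost to quantization
```

this is the whole analog/digital distinction in miniature: on the analog side 2.5 V and 2.51 V are genuinely different states, while after conversion they are the same number.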

it seems analog has since taken on another, different meaning, as a retronym to describe something not digital. my guess is this came from the naming of analog clocks (since the hands do move continuously) to distinguish them from digital clocks, but it is now applied (without any connection to the electrical/original meanings) to:

  • analog photography , to now distinguish from digital photography
  • analog film, to now distinguish from digital film.

the film part is particularly confusing because analog film creates the illusion of motion by flickering through discrete (chemically exposed) images - much closer to how digital video works than to analog video, imo. (unless there’s another reason why we call chemically processed film analog that i haven’t heard?)

maybe we should have started calling it acoustic film to distinguish from electric film instead ? :sweat_smile:


it’s actually kind of the other way around: the original meaning of an analog signal was in terms of the basic concept of an analog computer, where a system is (at the very least liminally, for humans) a continuous representation of a continuous variable. stonehenge, a pipe organ, hydraulic logic, and hourglasses are all examples of analog computing techniques.

as analog computers became more highly developed, due to how well they could quickly and accurately solve complex systems of differential equations (tide tables, weather prediction, and aiming systems for giant guns on navy ships were the main uses, and analog computers were still used heavily for these purposes up until the 80s), the terms ‘analog signal’ and ‘electrical signal whose varying voltage is mapped to a continuous variable’ became synonymous.

the reduction in price and size of transistors hit a tipping point in the 80s, and digital microprocessors started to dominate the field. even though they did, and still do, kind of suck for solving differentials, they at least sucked the exact same way every time and didn’t require a team of occult engineers to regularly fine-tune the systems in order to get consistent predictions. digital microprocessors were also far more general in what kinds of problems they could solve, whereas analog computers were highly specialized (a tide-table computer couldn’t really predict the weather; it is difficult to play tetris on stonehenge).

nowadays the most common analog computer systems are modular synthesizers! but yeah, if you said ‘computer’ in 1940, most folks would have thought you were talking about either a human with an abacus or an analog differential analyzer, unlike nowadays, when someone says ‘computer’ they mean a digital computer with von neumann architecture.
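for a sense of what those differential analyzers actually did, here is a digital (euler-step) simulation of a single analog integrator solving dx/dt = -x. a real analog computer would do this with an op-amp integrator in continuous time; this sketch is my own illustration, with an arbitrary step size:

```python
import math

# digitally simulate what an analog differential analyzer does in
# continuous time: integrate dx/dt = -x with a fixed small time step.
def integrate_decay(x0, t_end, dt=1e-4):
    x, t = x0, 0.0
    while t < t_end:
        x += -x * dt      # euler step: the integrator accumulates dx/dt
        t += dt
    return x

approx = integrate_decay(1.0, 1.0)
exact = math.exp(-1.0)    # analytic solution: x(t) = x0 * e^(-t)
print(approx, exact)      # close, but the discrete step introduces error
```

the point the post makes falls out of this: the digital version only approximates the continuous solution, and gets better only by taking smaller steps (more compute), whereas the analog machine solves the equation by literally being the equation - at the cost of drift, noise, and those occult engineers.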
