
A Case for Analog

Analog vs digital is like image vs text: images are more difficult to replicate, but they are richer! Text can be more formal.

I have heard my dad self-deprecatingly use the adjective “analog” to show how out of touch he is with modern technology, specifically modern digital computers. ‘Analog’, to him and to almost everybody today, connotes all that old-fashioned tech like the abacus (the abacus, by the way, is a digital computer), while ‘digital’ means the shiny new machines like the MacBook Pro. However, I have recently seen a few projects1 that explore analog computing as a more energy-efficient alternative to digital computers. And broader trends point in the same direction.

It seems that, beginning in the 1970s, the technology world picked up the digital computing paradigm and ran very fast with it, growing at the rate predicted by Moore’s law until now, when people are predicting that physical limitations will bring about the end of the law. There are also predictions that the growth in computing capacity will be driven by other areas, including AI (machine learning) and quantum computing. These two fields are anything but digital. Quantum computing can be seen as a whole different computing paradigm, standing alongside the other two: analog and digital.

Machine learning, on the other hand, seems to me like a field that would really benefit from analog computing. The term analog itself comes from a Greek word that could be translated to mean “same ratio”. In effect, analog systems try to create a measurable model of the real world: the analog clock is a model of the Earth’s rotation, the thermometer is a model of a body’s temperature. The most advanced analog model of reality is found in the human brain, which has gradually evolved over millions of years into a complex and intriguing piece of nature. Machine learning researchers frequently look to the human brain for inspiration while developing artificial intelligence. For instance, these days, neural networks often look for patterns in large amounts of data to enable them to make predictions. This is similar to how the brain uses heuristics (aka biases), acquired over many years of evolution, to make many of its decisions. These decisions may not be logical (the area where digital computers are king) or rational or even correct, but they are a good reflection of how the individual perceives the world. It is a model of the imperfect world.
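To make the “same ratio” idea a little more concrete, here is a minimal Python sketch (the numbers, step size, and scale factor are made up purely for illustration): an analog instrument keeps a physical quantity in continuous proportion to what it measures, while a digital readout snaps the same value to discrete steps.

    # Hypothetical thermometer example, not a real device model.

    def mercury_height_mm(temp_c, mm_per_degree=1.5):
        # Analog: the column height stays in the same ratio to the temperature.
        return temp_c * mm_per_degree

    def digital_readout(temp_c, step_c=0.5):
        # Digital: the reading is quantized to the nearest step, losing detail.
        return round(temp_c / step_c) * step_c

    temp = 36.73  # a continuous, real-world value
    print(mercury_height_mm(temp))  # 55.095 mm -- tracks the value continuously
    print(digital_readout(temp))    # 36.5 C   -- snapped to a discrete level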

For financial transactions or some security systems, precise calculations are needed, and digital computers could remain the de facto tools for those. But when we require tools that make predictions and recommendations, analog may be the most energy-efficient way to go.

Further material: