Peter Cochrane's Uncommon Sense: Analogue or Digital?
"We simply cannot define what is alive and what is intelligent."
Interaction between humans and computers is evolving and Peter Cochrane can't wait for a brave new world where machines beget machines...
As a scientist and engineer I've always struggled with the definition and understanding of analogue and digital. In broad terms it is easy to define these modes in the following manner: if something is continuous and well behaved then it is analogue; if it is discontinuous, with a limited number of states, then we consider it to be digital.
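To make that distinction concrete, here is a minimal sketch of my own (an illustration, not something from the column): a continuous, well-behaved sine wave stands in for the analogue signal, and quantising it to a handful of discrete levels gives the digital view of the same thing.

```python
import math

# Illustrative sketch: an "analogue" signal is continuous; a "digital"
# one is confined to a limited number of states. Here we sample a sine
# wave and quantise it to at most 4 levels (2 bits).

def analogue(t):
    """Continuous, well-behaved signal: a 1 Hz sine wave."""
    return math.sin(2 * math.pi * t)

def quantise(x, levels=4):
    """Map a value in [-1, 1] onto one of a limited number of states."""
    step = 2.0 / (levels - 1)
    return round((x + 1) / step) * step - 1

samples = [quantise(analogue(t / 8)) for t in range(8)]
distinct_states = sorted(set(samples))
print(distinct_states)  # only a handful of discrete values survive
```

The continuous function can take any value between -1 and 1; after quantisation only a few fixed states remain - which is the whole of the digital idea.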
As I speak into this voice recorder my neurons are firing in a digital manner to create a continuum of spoken words in the form of a pressure wave to be converted and impressed electromagnetically onto a tape in an analogue format. Later my PA will replay these words through an analogue machine directly to her ears and in turn her neurons will fire in a digital manner to see her rapidly pressing keys on her laptop to create the column you are now reading.
So here we have multiple digital and analogue operations that at first sight look inherently and predominantly analogue. As a young student I had the good fortune to study both analogue and digital computing at a time when analogue was the predominant mode for most large scientific investigations. I found myself both programming digital machines using machine code and physically wiring analogue computers to model physical situations.
Interestingly, some of these early analogue computers did not involve electronics but water. It may seem quaint but the reality is that a few of those machines and techniques are still used, for the very reason that the complexity of the problems remains beyond our biggest digital machines. But there is a very rapid trend towards the digitisation of analogue computing, even for extremely complex models.
So where is all of this going? I look to the future with an increasing conviction that this current digital revolution is set to continue for some considerable time. I think we can look forward to digital computers at least a billion times more powerful than those we currently enjoy - and well within the next 30 years.
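That billion-times figure is consistent with simple doubling arithmetic - my illustration, with an assumed growth rate, not a calculation from the column: if capacity doubles roughly once a year, 30 doublings compound to just over a billion.

```python
import math

# Illustrative arithmetic (assumed rates, not the column's): how much
# growth does repeated doubling deliver over 30 years?

years = 30
growth = 2 ** years          # one doubling per year for 30 years
print(growth)                # 1073741824 - just over a billion

# At the classic Moore's-law pace of one doubling every 18 months,
# the same billion-fold factor would take rather longer.
doublings_needed = math.log2(1e9)          # about 29.9 doublings
years_at_18_months = doublings_needed * 1.5
print(round(years_at_18_months, 1))        # about 45 years
```

Either assumed rate puts a billion-fold gain within a human lifetime, which is the point of the claim.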
At the same time I look at the human race and see no progress whatsoever in our abilities - relatively speaking we have stopped evolving. It seems abundantly clear that without some technological augmentation, our species will run out of steam and be surpassed by its own technology. Already we see products emanating from Russia that have a rudimentary ability to invent new technologies on the basis of our past history and knowledge subsumed into modest computers.
We still need that spark of human innovation to make the final decisions, to define the real requirements and identify the economic routes to solution and market. But in this arena I can see the man-machine gap closing rapidly and certainly within the next 25 years we could find ourselves pushed out of the loop.
Perhaps we shouldn't resist. Instead we could identify those areas in which we excel and offload those where we do not to the machines. Let them take over.
History has taught us that it would be foolish indeed not to have accepted the spinning and weaving machines, the lathe, milling machine and so on. Today we are fundamentally incapable of building television sets, cameras, PCs or automobiles. At the most fundamental level our machines do it for us.
And we all drive Rolls Royce quality vehicles in terms of the precision engineering and reliability compared to 30 years ago. In computing terms, what we are now using is also miraculous compared to 30 years ago. Without our machines we would know nothing about the genome or indeed modern medicines and our societies would suffer significantly. There are vast areas where we have yet to let our machines loose so that they can furnish us with better solutions than we are fundamentally capable of creating.
In the military domain we have already gone down the route of modelling conflicts in great detail before we fire the first bullet and we have a reasonable surety of a win before a war even starts. Unfortunately, as has been witnessed by recent events, we have not become smart enough to think through and model what happens when a conflict comes to a rapid completion.
What is really required is a far greater dedication of effort focused on the modelling of the human condition and society. As a species we face tremendous opportunities and risks at the same time. We currently waste vast resources worrying about and investigating problems with probabilities that are thousands of times less than the likelihood of the planet being destroyed by the impact of a meteor. At the same time we neglect the very real problems associated with the distribution of food and resources and the overall wellbeing of some 80 per cent of our fellow men who are significantly disadvantaged by history, location and politics.
Without my laptop I would have virtually no means of communication, no means of running my business and no means of storing the vast amount of data that is well beyond my brainpower to accommodate. For every sheet of paper I retain there are now thousands of pages stored on hard drives. I am watching the average file size grow from a few kilobytes to several megabytes as the nature of information gradually creeps away from the written word and towards the pictographic.
The single biggest problem is finding what I know I have stored somewhere. I really could do with some artificial intelligence embedded in my machine to augment my predominantly analogue brain in this predominantly digital environment. Unlike most people I don't see machines as a threat. I see them as a friend and I don't think we should spend too much time worrying about us being analogue and them being digital, rather how we get those two domains and modes of operation to coalesce to our very definite advantage.
As a general rule I try and avoid all discussions about machine intelligence and life for the very reason that we have no adequate description, definition, quantification or measure of either quantity. We simply cannot define what is alive and what is intelligent.
But I will give you this very crude observation. With my sensory network of eyes and ears, nose and touch I take in gigabytes of information, while through my fingers and my mouth I output just a few tens of bytes per second.
In contrast my laptop, with no sensory input other than the keyboard, receives a few tens of bytes per second but outputs gigabytes on its display. When our machines are given the benefit of sensory input beyond a keyboard - and I mean sight and sound and possibly feel - augmented by hardware and software that can adapt to those inputs, I think we are going to find intelligence and life itself will emerge in this silicon domain.
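The crude observation above can be put in back-of-envelope numbers - my own assumed figures for typing speed and screen size, not measurements from the column:

```python
# Back-of-envelope figures (assumed, not the column's) for the
# input/output asymmetry between a person and a laptop.

# A fast typist: ~60 words per minute, ~5 characters per word,
# one byte per character.
typing_bytes_per_sec = 60 * 5 / 60
print(typing_bytes_per_sec)   # a few bytes per second

# A modest laptop display: 1024x768 pixels, 3 bytes per pixel,
# refreshed 60 times a second.
display_bytes_per_sec = 1024 * 768 * 3 * 60
print(display_bytes_per_sec // 10**6)  # over a hundred megabytes/second
```

Even with these modest assumptions the display pushes out tens of millions of times more data than the keyboard takes in - the asymmetry is real however you round the numbers.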
At that point the nature of computing and our relationship with it will change forever. For many years I've been aware of the fact that this machine trains me inasmuch as I learn what to do and what not to do, in order that I don't upset it and it runs and functions efficiently.
In my future world I would like machines to learn about me and what I need to operate efficiently - and for them to adapt to my idiosyncrasies instead of the other way round.
This column was typed on flight LH3543 between Vienna and Frankfurt and despatched to silicon.com from Phoenix Arizona via a free Wi-Fi hotel service.