Hi, I'm Tim Tyler - and today I will be discussing my take on "the
singularity".
I have noticed that a number of people have adopted the habit of referring
to the imminent rise of machine intelligence as "the singularity".
The use of this terminology began with a science fiction writer - but
has spread out into the wider community - and now there is a
singularity institute, a singularity university, singularity conferences,
a singularity movie, and a singularity book.
I see a problem with this. The problem is that I think the terminology
is ridiculous. There is no precedent for using such a term to
describe historical events. The term is overloaded with multiple
contradictory meanings, and none of them make much sense.
Historians tend to use the term "revolution" to refer to radical
changes in human history. So, we have the "agricultural revolution",
the "industrial revolution", the "information revolution" - and so on.
In the future, we may see an "intelligence revolution" and
probably a "nanotechnology revolution" not long afterwards. That
kind of terminology is simple, obvious and consistent with history.
The chance of any credible future historian ever referring to the
events in question as "the technological singularity" seems slim.
So, what is the case for the "singularity" terminology? As far as I
can see there isn't any serious scientific case. This is more something
that has arisen out of science fiction.
The term singularity comes from mathematics - where it refers to a
discontinuity - a point where strange or discontinuous behaviour
arises. The classic example is division by zero - where 1/x leads not
to another finite value, but blows up towards infinity as x approaches zero.
Some proponents have concluded that the rate of progress will
actually do something like that. Here is an example of that, from
one of the singularity advocates:
If computing speeds double every two years, what
happens when computer-based AIs are doing the research?
Computing speed doubles every two years.
Computing speed doubles every two years of work.
Computing speed doubles every two subjective years of work.
Two years after Artificial Intelligences reach
human equivalence, their speed doubles.
One year later, their speed doubles again.
Six months - three months - 1.5 months ... Singularity.
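Taken at face value, the arithmetic in that quote is just a geometric
series. Here is a minimal sketch of it in Python - my own illustration,
not the advocate's code - assuming, as the quote does, a fixed two
subjective years of work per doubling:

    # A sketch of the quoted argument's arithmetic (my illustration).
    # Assumption: each doubling takes two *subjective* years, and running
    # faster shrinks the wall-clock time proportionally.

    def doubling_intervals(subjective_years=2.0, steps=20):
        """Yield the wall-clock duration of each successive doubling."""
        speed = 1.0  # multiples of human-equivalent thinking speed
        for _ in range(steps):
            yield subjective_years / speed
            speed *= 2.0

    intervals = list(doubling_intervals())
    print([round(t, 3) for t in intervals[:5]])  # [2.0, 1.0, 0.5, 0.25, 0.125]
    print(round(sum(intervals), 3))              # approaches 4.0

The intervals sum to four years - which is why the quoted scenario
reaches its "Singularity" at a finite date.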
While machines will construct smarter machines, the proposed timescale
here seems rather over-optimistic. One problem lies with the third of
the three steps - the move to subjective years. Progress does not just
depend on thinking.
It also depends on the ability to sense and modify the world - in
order to perform experiments - and robot capabilities are not doubling
every two years. Another problem is the implicit assumption that
current growth does not already depend, in a significant way, on
computers contributing to the design of computers. In fact, designing
and optimising integrated circuits is an incredibly compute-intensive
task - and the involvement of computers in the process is already enormous.
Machines don't yet do everything - but already they do a lot - and
their capabilities matter.
However fast progress goes, it won't ever become "infinitely" fast. We
will run into physical limits first. So, the hypothesised "singularity"
will never actually be reached.
The "singularity" term has subsequently been liberally redefined and
reused. That is not necessarily bad - but it does present
critics with something of a moving target. It is hard to be critical
when you don't know what someone is talking about. Here is Eliezer Yudkowsky describing some of
the various interpretations of the idea that have been proposed:
"The thing about these three logically distinct schools of
Singularity thought is that, while all three core claims support each
other, all three strong claims tend to contradict each other."
Indeed.
Prediction horizon
One idea is that the singularity represents a kind of "prediction
horizon". Here is Ray Kurzweil on that topic:
[Ray Kurzweil footage]
It is true that the far future is difficult to predict. One problem
here is that different things are unpredictable on different time
scales. Another is that far-future details have always been
unpredictable - and will probably remain challenging to predict for a
long time to come. So, this "prediction horizon" seems to recede as we
approach it - and therefore does not correspond to any particular
future event.
Another problem with the idea of a "prediction horizon" is that it
seems more closely analogous to an event horizon than to a
singularity - and so seems like poor justification for using
the latter term.
Future breakdown
Another idea is that our "model of the future" will break down. Is
such an event actually likely to happen? Humans predict the future
using many different kinds of models - and not very many of those
models seem likely to be "broken" by the advent of highly intelligent
machines. Here's Robin Hanson on the issue:
[Robin Hanson footage]
My perspective is that our ability to predict the future appears to be
improving over time - as our ability to collect data improves, our
models improve, and we develop bigger computers. This effect seems to
be more significant than the fact that some of the things that are
being modelled are themselves growing more complex. This effect will
tend to push the point beyond which we can't predict farther out as
time passes - and so that point will never arrive. Our models of the
future will not "break down". Rather, they will get better and
better.
It is the same in physics - where the term singularity is used to
describe the way that the laws of physics, as they were understood in
the 1950s, failed with a kind of division by zero when applied to
black holes. However, rather obviously, this was a flaw in those
theories - and not a real physical phenomenon. The candidate theories
that aim to unite relativity and quantum physics are generally
expected to avoid such singularities.
Particular thinkers may have found that their brains melt down when
contemplating these far-future events - but that is best seen as a
property of their brains - and not of the future events
themselves. If your model breaks, then that's a problem with your
model - and not some kind of an event in the future.
Exponential growth
What about the idea that exponential growth will lead to rapid
technological change? Here is Terence McKenna describing that:
[Terence McKenna footage]
Another enthusiast for this idea is Ben Goertzel:
[Ben Goertzel footage]
It is likely to be true that the future will exhibit rapid change -
but there appears to be nothing "singular" about exponential growth.
An exponential gets large quickly, yet it remains finite at every
finite time - a genuine mathematical singularity would require growth
that blows up at some finite date. Exponential growth is what you get
when mice find a grain pile - there just isn't anything "singular"
about it.
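To make that distinction concrete, here is a small Python illustration -
my own sketch, where the blow-up date T is an arbitrary assumption. An
exponential is large but finite at every finite time, while a
hyperbolic curve genuinely diverges at T:

    # My illustration: exponential vs. genuinely "singular" growth.
    import math

    T = 10.0  # arbitrary hypothetical blow-up date for the hyperbolic curve
    for t in [0.0, 5.0, 9.0, 9.9, 9.99]:
        exponential = math.exp(t)    # grows fast, but finite for every t
        hyperbolic = 1.0 / (T - t)   # diverges as t approaches T
        print(f"t={t:5.2f}  exp(t)={exponential:14.2f}  1/(T-t)={hyperbolic:10.2f}")

Only the second kind of curve exhibits the division-by-zero behaviour
that the word "singularity" actually describes.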
Intelligence explosion
What about the idea that there will be an intelligence explosion -
and that self-improving systems will rapidly explode - and take over the
world?
I think that the "intelligence explosion" concept is a reasonable one
- and note that it already has a perfectly good name. However, if you
look at the effect, it is one which is already going on. Machines are
already dramatically augmenting human intelligence - and
have been doing so for some time. Most machine code in the world is
now written by computers. Much of the world's refactoring is also done
by computers. Companies are self-improving intelligent systems. The
intelligence explosion is not really an event localised in the future,
but is much more spread out - and we can clearly see the beginnings of
it happening now. I've made another video about this, entitled: The
intelligence explosion is happening now.
The proposed "singular" aspect apparently involves the intelligence of
computers suddenly reaching human-level, and then taking over
from there. The problem here is that there is no such thing as
"human-level" intelligence. Workers have varying levels of
intelligence required for their jobs - and consequently automation
takes their jobs gradually. Already we have automated bank
tellers, checkout assistants, cleaners, factory workers, telephone
operators - and so on. In the job market, machines will
gradually catch up with humans - not overtake them suddenly.
Functionally, human intelligence goes from Einstein - right down to
heavily disabled individuals that are practically comatose. The
distribution of human intelligence forms a bell-shaped curve - not a
sharp spike - and in the job market - which is where humans and
machines compete - the spread of intelligence required to perform
tasks is even broader. Many humans are currently doing jobs that
could be done by much less intelligent machines.
"Singular" just means "unique"
What about the idea that the coming events are "singular" - because
they are unique - and have never happened before? Many events fit the
description of being unique. The second world war was unique. The
millennium was unique. The moon landings were unique. The discovery of
DNA was unique. Yet we do not describe these events as
singularities - and nor should we.
Robin Hanson has even tried rebranding the agricultural revolution
and the industrial revolution as singularities in a recent article.
Robin wrote: "whatever the Industrial Revolution was, clearly it was
an event worthy of the name 'singularity'".
I disagree - I think calling the industrial revolution a
singularity represents pandering to pseudoscience. I see the same
problem as with all other usage of the term - the word is simply
inaccurate and inappropriate. "Revolution" is a much
better-established term. If the events are sufficiently dramatic, then
we have precedent for using the term "takeover" - and there are
various other terms that would fall within existing scientific
traditions.
I think the best perspective on the "singularity" phenomenon is that
it is not science, but marketing. Usually the singularity folk have
something to sell. Whether it is that the future is going to be bad,
and we have to work to stop it - or that the future is going to be
great, and we should unite to make it happen - or that the future is
going to be weird, and that you need assistance understanding it -
there is usually something that we need to contribute to, or
invest in.
I find it all rather nauseating. I would prefer it if people would use
established, sensible, scientific terminology, instead of nonsensical,
fantastical concepts derived from science fiction. Whenever I hear
people talking about the singularity, I wonder if they know
what they are talking about. Whenever I see institutions associating
themselves with the singularity, my assessment of their credibility
plummets. The term makes me wonder what people are selling.
I don't mind marketing - but here we are mostly talking about ideas
that are either confused, confusing, misleading, wrong - or just have
nothing to do with the term being used. "The singularity" is
pseudoscientific mystical-sounding mumbo-jumbo.
My counsel for the singularity enthusiasts is to reconsider your
position. Scientifically speaking, your terminology sucks. It is not
cool or sexy - it is mostly just bad marketing masquerading as
science. For the sake of your credibility, my recommendation is to
drop it.