About Me

Australian philosopher, literary critic, legal scholar, and professional writer. Based in Newcastle, NSW. My latest books are The Tyranny of Opinion: Conformity and the Future of Liberalism (2019); At the Dawn of a Great Transition: The Question of Radical Enhancement (2021); and How We Became Post-Liberal: The Rise and Fall of Toleration (2024).

Saturday, September 12, 2009

Interview with Greg Egan

The new issue (#42) of Aurealis contains an interview that I did with Greg Egan last year (i.e., I was interviewing Greg, not the other way around, much as that might also have been an interesting exercise). If you're going to ask me whether it's available on the net, well, I believe not. You might just have to buy a copy, much as that's a radical thought.

I'll give you a teaser, though. We got into some deep philosophical waters, including when I asked Greg about the various blows to human exceptionalism that have come from science over the past four or five hundred years (beginning perhaps with Copernicus). After some other thoughts on the subject, Greg adds:

... I think there's a limit to this process of Copernican dethronement: I believe that humans have already crossed a threshold that, in a certain sense, puts us on an equal footing with any other being who has mastered abstract reasoning. There's a notion in computing science of "Turing completeness", which says that once a computer can perform a set of quite basic operations, it can be programmed to do absolutely any calculation that any other computer can do. Other computers might be faster, or have more memory, or have multiple processors running at the same time, but my 1988 Amiga 500 really could be programmed to do anything my 2008 iMac can do — apart from responding to external events in real time — if only I had the patience to sit and swap floppy disks all day long. I suspect that something broadly similar applies to minds and the class of things they can understand: other beings might think faster than us, or have easy access to a greater store of facts, but underlying both mental processes will be the same basic set of general-purpose tools. So if we ever did encounter those billion-year-old aliens, I'm sure they'd have plenty to tell us that we didn't yet know — but given enough patience, and a very large notebook, I believe we'd still be able to come to grips with whatever they had to say.

Discuss!
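For anyone who'd like a more concrete picture of the "Turing completeness" idea that Greg is leaning on, here's a small sketch of my own (not anything from the interview; the toy "increment" program below is just an invented illustration). The point is that a handful of primitive operations -- read a symbol, write a symbol, move the head one step, change state -- is already enough, in principle, to carry out any computation a faster or fancier machine could.

def run_turing_machine(program, tape, state="start", blank="_", max_steps=10_000):
    """Run a transition table: (state, symbol) -> (new_state, symbol_to_write, move)."""
    cells = dict(enumerate(tape))  # sparse tape; unwritten cells read as blanks
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, write, move = program[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    result = "".join(cells.get(i, blank) for i in range(min(cells), max(cells) + 1))
    return result.strip(blank)

# A toy program: add 1 to a binary number, with the head starting at the left end.
increment = {
    ("start", "0"): ("start", "0", "R"),  # scan right to the end of the digits
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("carry", "_", "L"),  # ran off the end; go back and carry
    ("carry", "1"): ("carry", "0", "L"),  # 1 plus a carry becomes 0, keep carrying
    ("carry", "0"): ("halt",  "1", "L"),  # 0 plus a carry becomes 1, and we're done
    ("carry", "_"): ("halt",  "1", "L"),  # overflow: write a new leading 1
}

print(run_turing_machine(increment, "1011"))  # prints "1100", i.e. 11 + 1 = 12

The machine itself never gets any smarter; everything interesting lives in the transition table, and a bigger table (plus a great deal of patience, or floppy-disk swapping) is all that separates this toy from anything my desktop can compute.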

Meanwhile, over on his own site, Greg has posted a fascinating four-part trip diary covering his visit to Iran last year. Though his visit, with all its adventures and misadventures, was well before the recent election and subsequent protests, it provides wonderful insight into contemporary Iran. Greg writes with his usual lucidity about how the country seems on the ground ... to an educated and liberal Westerner who has managed to pick up a basic knowledge of Farsi from his work with refugees in Australia. I can't recommend this too highly. Do have a look.

Coming up soon from Greg is his essay "Born Again, Briefly" in 50 Voices of Disbelief: Why We Are Atheists.

5 comments:

Athena Andreadis said...

Greg Egan is formally correct: one thing they tell you early on in Computing 101 is that a ticker-tape machine (i.e., a Turing machine) is equivalent to any computer, regardless of its power.

On the other hand, conscious minds may also be equivalent in terms of ability to decipher, understand, etc. -- but the way they think and the conclusions they reach may be fundamentally different depending on the chassis and environment.

That Guy Montag said...

Actually, there's a precedent on this point in analytic philosophy, from Donald Davidson. I'm very fond of his attack on conceptual schemes because it has two major implications, at least in my life. The first is that if the aliens are thinking at all, then, as Egan says, they're translatable given enough time. The second is that you get to call bullshit when someone tells you that religion and science are fundamentally 'different ways of knowing' and that we therefore can't compare their conclusions.

D said...

We already know that an Einstein was able to explain to us (well, those of us who aren't too stupid, which is sort of the point) everything he thought up, and I don't doubt that an alien who's only moderately smarter than an Einstein would manage to explain to him everything it knew. All of this is consistent with me, Einstein, and the alien all being Turing machines, so one wonders how important the point really is:
- That stuff about me being able to understand all the thoughts of Edward Witten, given arbitrary amounts of time, doesn't count for much when my time is quite limited.
- Intelligence seems to be more about the stuff you're running in your brain than about the equipment it runs on. This is true even of computer software: both Chessmaster and Rybka run on the same hardware, but the latter plays chess about 400 Elo points better.

wsinda said...

I seriously doubt that Turing completeness has any practical relevance here. By the same argument you could say that earthworms have the same hardware as us (neurons, sensors, etc.), so in theory they should be able to understand us. But the complexity of our brain is several orders of magnitude higher than theirs. And I can also envisage aliens having much more complex brains than ours, with modes of thinking that we simply cannot grasp.

Harald Striepe said...

I believe one issue the Turing principle does not cover is that conscious understanding requires a certain amount of simultaneous modeling -- e.g., sufficient working memory -- to grasp complex relationships. Our system is in fact highly parallel.

This is where simple machines and simple minds are at a disadvantage.

It is different from executing a single stream of tasks.