
March 22, 2013

Do Androids Dream of Electric Sheep?


Humans have always been fascinated by the idea of talking with machines. Hardly any mainstream science fiction movie exists that does not pick up the idea of audio-visual communication with computers; the most well-known are probably 2001: A Space Odyssey, Blade Runner, Star Trek, The Hitchhiker’s Guide to the Galaxy and I, Robot.

In all of these movies, humans communicate directly with computers using voice commands (“Tea. Earl Grey, hot”) or more elaborate sentences, and the computers talk back. In some of the movies, the computer is a disembodied entity (like the ship’s main computer in Star Trek), but sometimes it is even humanoid (like Cmdr. Data in Star Trek or the Replicants in Blade Runner).

In a Pecha Kucha presentation I recently delivered at the OOP conference in Munich, I took on this theme and opened my talk with an imaginative conversation between two computer programs most of you know very well: Eliza and Siri.

Do Androids Dream of Electric Sheep? from Peter Friese on Vimeo.

In this Pecha Kucha session, Siri and Eliza join me live on stage to explain why we do not (yet) tap the full potential of our smartphones.

Much has been said about Siri, so let’s focus on Eliza.

Eliza started as an experiment in 1964, when Joseph Weizenbaum, then a professor of computer science at MIT, wrote it as an early attempt to study natural language communication between man and computer (see his ACM paper “ELIZA – A Computer Program For the Study of Natural Language Communication Between Man And Machine”). At its heart, Eliza is a program that scans the sentences entered by a human user for specific keywords and responds to those keywords according to a predefined script. It is the script that essentially makes up Eliza’s “personality”, the most famous one being DOCTOR, which simulates a Rogerian psychotherapist.
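To make the keyword-and-script idea concrete, here is a minimal sketch in JavaScript (fitting, since ElizaApp will embed a JavaScript engine). This is not Weizenbaum’s original script format – the rules, the `$1` placeholder convention and the sample keywords are all made up for illustration, and a real Eliza additionally ranks keywords and cycles through alternative responses:

```javascript
// A toy Eliza-style responder: each rule pairs a keyword with a canned
// response; "$1" is filled with whatever follows the keyword in the input.
const rules = [
  { keyword: "mother", response: "Tell me more about your family." },
  { keyword: "i am",   response: "Why do you say you are$1?" },
];
const fallback = "Please go on.";

function respond(input) {
  const text = input.toLowerCase();
  for (const rule of rules) {
    const pos = text.indexOf(rule.keyword);
    if (pos >= 0) {
      // Take the rest of the sentence after the keyword, minus punctuation.
      const rest = input
        .slice(pos + rule.keyword.length)
        .replace(/[.?!]$/, "");
      return rule.response.replace("$1", rest);
    }
  }
  // No keyword matched: fall back to a content-free prompt.
  return fallback;
}
```

With these rules, `respond("I am feeling sad")` reflects the input back as “Why do you say you are feeling sad?”, while a sentence with no keyword gets the generic “Please go on.” – which is pretty much the whole trick behind the perceived “personality”.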

(Weizenbaum was rather surprised to see how people thought that Eliza had a real personality and a capacity for empathy. In reaction, he started to raise uncomfortable questions about our dependence on computers.)

In my Pecha Kucha, I went on to urge my audience to make better use of the capabilities of our smartphones. Most people have stopped thinking about it, but we are, after all, carrying a supercomputer in our pockets.

So to set a good example, I decided to create ElizaApp – an app that listens to what you say and answers in spoken language. This is going to be a great project, because I’ll show several interesting things:

  1. How to analyze spoken language on a mobile device
  2. How to synthesize speech on mobile devices
  3. How to integrate a JavaScript engine in your mobile app
  4. How to create a Siri-look-alike chat UI

Over the course of the next few weeks, I will write several posts covering these topics. If there is anything that interests you in particular, feel free to add a comment.

Of course, in the end Eliza will be available on the App Store. In the meantime, be sure to check out ElizaApp and register for early access!

Thanks for reading this post. Follow me on Twitter here to be notified about updates and other posts I write. Or, subscribe to my RSS feed here. If you want to get in touch with me, use the contact form.
