The Mischievous Nerd's Guide to World Domination
Author: Stephen Oberauer

Chapter 16
Artificial vision


By November my little computer program had become far more advanced. Having gained experience with 3D graphics during my last year at school, I found it easy to convert my AI program from 2D to 3D.

If I clicked on an artificial idiot on my screen, I could see in 3D, in a window, exactly what it would see. I had created a 3D world for them to move around in, with corridors, obstacles and food. I know I said that I wanted to make them seem as human as possible, but I decided against the need for artificial toilets; it was something they could do without. We have five senses, and so far I had given them only one, so I still had a long way to go. The next thing I had to do was find a way to communicate with them, which was possibly impossible.

As always, I started small. I added some functionality to my program so that I could type things into the computer and they could respond by displaying words on the screen. A baby learns to speak by hearing words, repeating them, remembering what happened when certain words were spoken, and using those words in the same contexts it heard them in. My program was taught to do the same thing.

The first thing that happened when I activated the new features was that my little creatures sent me completely random sequences of letters like ‘IFMAUNFEWUJL SDFHZXPOQ’. I had to teach them to speak properly, so when their hunger ratings were low I would type ‘food’ and then place an item of food next to them. They learnt the word immediately and repeated the text ‘food’ more times per second than I could register. I gave them another item of food each, and the word ‘food’ continued to be displayed.
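The word-learning trick described here can be sketched in a few lines of Python. This is purely an illustrative toy, not the book's actual program: the class name, ratings and numbers are all my own assumptions. The idea is just that a word heard right before feeding gets credited, and a hungry creature repeats its best food-predicting word instead of babbling.

```python
import random

class Creature:
    """Toy model of the word-learning behaviour described above: a word
    typed just before food appears gets linked to the drop in hunger,
    and is repeated whenever the creature is hungry again."""

    def __init__(self):
        self.hunger = 1.0            # 1.0 = starving, 0.0 = full
        self.known_words = {}        # word -> how strongly it predicts food
        self.last_word_heard = None

    def hear(self, word):
        self.last_word_heard = word

    def feed(self):
        # Hunger drops; whatever word was just heard gets the credit.
        self.hunger = max(0.0, self.hunger - 0.5)
        if self.last_word_heard:
            score = self.known_words.get(self.last_word_heard, 0.0)
            self.known_words[self.last_word_heard] = score + 1.0

    def speak(self):
        # Hungry creatures repeat the word most associated with food;
        # otherwise they babble random letters, as in the story.
        if self.hunger > 0.5 and self.known_words:
            return max(self.known_words, key=self.known_words.get)
        return "".join(random.choice("ABCDEFGHIJKLMNOPQRSTUVWXYZ")
                       for _ in range(8))

c = Creature()
c.hear("food")
c.feed()          # hunger 1.0 -> 0.5, "food" gets credited
c.hunger = 0.9    # let it get hungry again
print(c.speak())  # -> food
```

Once the association exists, the creature says ‘food’ whenever hunger climbs, which also matches the story's detail that a few minutes after feeding stopped, ‘food’ still kept surfacing among the random letters.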

I stopped feeding them and after a few minutes it was back to random letters, but every third word was still ‘food’. I had to teach them more words.

The method I came up with was to add another rating to the list which, like the hunger rating, affected their happiness. It was an appreciation rating, a value that I could increase or decrease by clicking a button to show them that I appreciated what they were doing. I then spent a couple of weeks building virtual 3D objects that they could recognise. I would show them an object, type its name, increase the appreciation rating and then remove the object. Then I would show them an object and, if they sent me the correct word, increase the rating; otherwise I would decrease it. It worked like magic. Soon they could recognise all of the objects that I had designed. Unfortunately, they still didn’t know what those objects were used for, but I had to do this one small step at a time.
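The appreciation button is essentially a reward signal, and the loop above can be sketched as a tiny reinforcement routine. Again, this is a minimal sketch under my own assumptions (the scoring scheme and names are invented, not taken from the book): each (object, word) pair keeps a score, a reward click strengthens the pair the creature just used, and a penalty click weakens it.

```python
import random

class Namer:
    """Sketch of the appreciation-rating loop: the creature keeps a
    score per (object, word) pair; +1 strengthens the last guess,
    -1 weakens it."""

    def __init__(self, vocabulary):
        self.vocabulary = vocabulary
        self.scores = {}          # (object_id, word) -> score
        self.last_guess = None

    def see(self, object_id):
        # Use the best-scoring word for this object, or explore randomly
        # while nothing has been learnt yet.
        best = max(self.vocabulary,
                   key=lambda w: self.scores.get((object_id, w), 0.0))
        if self.scores.get((object_id, best), 0.0) == 0.0:
            best = random.choice(self.vocabulary)
        self.last_guess = (object_id, best)
        return best

    def appreciate(self, delta):
        # The button from the story: +1 for a correct name, -1 otherwise.
        self.scores[self.last_guess] = (
            self.scores.get(self.last_guess, 0.0) + delta)

# Training: show the object, reward the right word, punish wrong ones.
namer = Namer(["cube", "sphere", "food"])
for _ in range(50):
    guess = namer.see("obj_cube")
    namer.appreciate(1 if guess == "cube" else -1)
print(namer.see("obj_cube"))   # after training, almost surely "cube"
```

The key property, as in the story, is that the trainer never explains anything; the creature only ever sees which guesses made the rating go up.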

The next couple of months were spent making them more complex. I added three extra virtual senses and gave each object taste, smell and touch information. Each food item would give them a different taste rating, so some food would taste bad but still decrease their hunger, and they were forced to decide whether to eat a particular type of food or search for another type. They learned that the faster they were travelling when they bumped into a wall, the more virtual pain they would feel, and they therefore became more aware of their speed and of what was in front of them. I also created different levels and introduced the concept of gravity so that they could learn to be afraid of heights.
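The eat-or-keep-searching decision is a simple trade-off, and it can be written down as a one-line utility comparison. This is only a sketch of the kind of rule described; the weights and names are illustrative assumptions, not taken from the book.

```python
def should_eat(hunger, taste, nutrition, search_cost=0.1):
    """Weigh the displeasure of a bad taste against the relief of
    eating now: the hungrier the creature, the more a food item's
    nutrition is worth, so starving creatures accept foul food.
    All weights here are illustrative assumptions."""
    eat_value = nutrition * hunger + taste     # taste may be negative
    keep_searching_value = -search_cost        # searching costs a little
    return eat_value > keep_searching_value

# A starving creature accepts foul food; a nearly full one holds out.
print(should_eat(hunger=0.9, taste=-0.4, nutrition=1.0))  # True
print(should_eat(hunger=0.1, taste=-0.4, nutrition=1.0))  # False
```

The same shape of rule works for the pain lesson: scale the penalty for hitting a wall by speed, and slowing down near obstacles falls out of the comparison by itself.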

Hearing was going to be tricky. I had to research what kinds of information a computer would process when someone spoke into a microphone, and simulate those. I had previously written software to simulate drums for my bands, so I had experience working with computer audio, and it wasn’t too difficult. After another month I had given some of my objects sounds. I had made a virtual cat which randomly roamed the 3D hallways. It was a copy of John with a few of its attributes changed: I made it feel fluffy, removed its attraction to female virtual humans, limited its messages to ‘meow’, and used a picture of our pet cat instead of a picture of myself. I made the cat produce a virtual meowing sound, so my artificial idiots soon associated the sound with the cat and walked towards it in order to get a little bit of happiness by feeling its fluffy coat.

Another 4 months of effort was enough to figure out how to make them speak whole sentences. John’s first sentence was ‘Give me food’, and Jill’s was ‘Give me John’.


"This extract remains the exclusive property of the author who retains all copyright and other intellectual property rights in the work. It may not be stored, displayed, published, reproduced or used by any person or entity for any purpose without the author's express permission and authority."
