Educating a robot

Daniel Eduardo
6 min read · Jan 7, 2021

I grew up thinking that the workings of any technological gadget were magic. Just like how ancient civilizations used to invent gods and myths to explain the natural phenomena around them, I grew up believing in works of the gods like “computers” and myths like “computers are smart and therefore they work.”

I wasn’t able to see that behind a computer lies code, or more importantly a programmer. However, some doubts did arise, such as when the microwave or the VHS player (I’m old, I know) would get unplugged and I’d have to reprogram the time. If the computer god was so smart, why did I have to keep teaching it how to tell time?

Photo by Lorenzo Herrera on Unsplash

The pursuit of an answer to that question has led me down an interesting path, and recently to a career change to study programming. The first time I put that question to an “expert” (an employee at an internet cafe in my hometown), his answer was so complex and convoluted that I thought he must be more lost than I was. Later on, I asked the same question of a friend of my older sister, who threw some sentences together about a “processor” that stored information on a “hard disk” via an “operating system.” That, of course, led me to start watching 90’s-era documentaries on YouTube to try to learn how a microchip works, then transistors, punch cards, vacuum tubes, until I finally got to binary code. At that point I had just about given up, starting to think I would need a master’s in information sciences and a PhD in nuclear physics to finally understand how a computer works.

To continue, I’ll give a high-level, practical summary of how a computer works and name what, for me, were the key milestones in getting to where we are today. We’ll take apart the myth of magic computers and show that their “magic” is the product of centuries of evolution of the marvelous human brain, compressed into less than 100 years, and culminating in what will probably be the most significant (positive, at least) legacy of our species.

1. The language of machines: Binary code

Machines understand only 0’s and 1’s, so the only way to communicate with them is binary code. Every letter, number, or any other piece of information we want to give a computer has its own translation into binary.

For example, the number 37 in binary looks like this:

100101

That is, one 32, plus one 4, plus one 1.

If we want to write a word in binary, we have to define every letter:

ASCII table (instructables.com)

So “Hi” would be:

H = 0100 1000
i = 0110 1001
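
If you’d like to check this yourself, here is a minimal sketch in Ruby (the language we’ll use for examples later in this article):

# A quick sketch: Ruby can convert numbers and text to binary for us
puts 37.to_s(2)   # => "100101"
puts "Hi".bytes.map { |b| b.to_s(2).rjust(8, "0") }.join(" ")
# => "01001000 01101001"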

Even every color has its own numeric value, so to communicate to a computer which color we want, we translate that color into binary, where each hexadecimal digit of the color code becomes four bits:

Peach: #FFCC99

F: 1111
F: 1111
C: 1100
C: 1100
9: 1001
9: 1001
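
The same trick works in code; a minimal Ruby sketch, treating each hex digit as a 4-bit value:

# Expand a hex color code into binary, one 4-bit nibble per digit
color = "FFCC99"
puts color.chars.map { |d| d.to_i(16).to_s(2).rjust(4, "0") }.join(" ")
# => "1111 1111 1100 1100 1001 1001"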

Even though modern computers can interpret things as complex as sound, at their core they still translate all of that information into 0’s and 1’s to process it.

2. Communicating with machines: Electric circuit

The Talking Machine by Clare Briggs, 1922

So how does a computer know what is a 0 and what is a 1? Behind binary code is the electric circuit.

Imagine that we have a lamp:

Every time we turn it on, that represents a 1

Every time we turn it off, that represents a 0

If we wanted to translate “This is a parallelepiped” into binary to send it through an electric circuit, we would have to turn the lamp off and on 192 times: 24 letters and spaces, 8 bits each.
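
To verify that count, here is a minimal Ruby sketch that turns the sentence into its stream of lamp flashes:

# Each character becomes 8 bits; each bit is one "on" (1) or "off" (0) of the lamp
sentence = "This is a parallelepiped"
bits = sentence.bytes.map { |b| b.to_s(2).rjust(8, "0") }.join
puts bits.length   # => 192 flashes for one short sentence
puts bits[0, 16]   # => "0101010001101000" (just the letters "Th")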

This is the essence of how an electric circuit transmits information, which is the primary function of a chip.

By turning off and on in particular patterns, it communicates information in a kind of Morse code that we know as binary.

Vacuum tube, the grandparent of the microchip

If we look at some of the earliest computers, we can see how their electric circuits were fitted inside large vacuum tubes with minimal capacity, which partly explains why the computers in old movies were so big and covered in flickering lights.

Over the years, as electric circuits got smaller, we could integrate more components into them to process even more information. Now, increasingly small and efficient computers are universalizing access to programming, which in turn means more people specializing in working with them. In the last few decades we have been able to create ever more complex methodologies, programs, and languages that let us reach even more impressive goals.

3. Human code vs machine code

In the 1950s, programmers used punch cards to tell a computer what they wanted it to do. A perforation meant a 1, and a blank space meant a 0.

Punched card

Using this format, if we wanted to know the result of 36772 times 278, we would need a lot of time (and a lot of space) to do the calculation. In 1951, Grace Hopper wrote the first compiler, A-0. A compiler is a program that turns a language’s statements into the 0’s and 1’s a computer understands. This led to faster programming, as the programmer no longer had to do the translation by hand.
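For contrast, here is that same calculation in a modern high-level language (Ruby, as a small sketch); the interpreter handles all the translation into 0’s and 1’s for us:

# One line replaces an entire stack of punch cards
puts 36772 * 278   # => 10222616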

This type of innovation saved programmers a lot of time, and innovation kept pace with the growing processing power of computers (Moore’s law). As operations became more complex, languages started to include more methods and functions that condense the information we communicate to machines.

Let’s see what this simple code does:

x = 0
while x < 5 do
  puts "#{x + 3}"
  x += 1
end

OK, let’s explain what is happening here.

x = 0   

The value of “x” is 0.

while x < 5  

As long as “x” is less than 5, do the following:

puts "Hello world" 

The word “puts” ask for display “Hello word” into the screen

x += 1
end

After displaying the value on the screen, “x += 1” adds 1 to “x”, changing it from 0 to 1, so the next time through the loop “x” will be 1.

The “end” marks the edge of the loop, and the loop keeps repeating as long as the condition x < 5 is true. Once “x” reaches 5, the loop stops, having printed 3, 4, 5, 6, and 7.
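
As a side note, Ruby also offers a more idiomatic way to write the same loop; this sketch produces identical output:

# The same loop using Ruby's times iterator: prints 3, 4, 5, 6, 7
5.times { |x| puts x + 3 }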

4. Educating a robot

When I finally developed a basic understanding of how computers work, I could understand that computers aren’t super intelligent or magic. They’re just good students who follow the instructions they’re given by human teachers.

For this reason, the human brain continues to be fundamental to the functioning of machines. Despite what we might think, imagination and creativity are now more essential than ever for creating better technology. After all, the calculations can be left to the computers.

As for programmers: we are all programmers, from 5-year-old me inputting the correct time into the VHS player, to a software engineer in Silicon Valley developing an advanced application. All of us, in some form or another, are teachers to computers. The difference lies only in the subject, the complexity, and the level of instruction, just like in a conventional education system. Five-year-old me would be a kindergarten teacher, and the Silicon Valley engineer would be the university professor.


Daniel Eduardo

Software engineering student, taking my first steps into the awesome world of methods and loops