Computers – The Digital World Primer

Computers are a complex idea to wrap our heads around. They play games, run calculations, browse the web, handle email, emulate hardware, play video and store documents. At supermarkets, cashiers type away at their consoles while scanning groceries. We interface with touch screens at the gas pumps. In some restaurants, waitresses send orders to the chefs in the back from tablets. Reading this article, you’re likely looking through the display of a computer right now (or you are some kind of wizard). Computers are a foundation of our world.

Despite computers finding everyday use in our lives, few gamers, let alone everyday people, have any idea how all these accomplishments are made behind the scenes. Media, like movies and books, paint us an image of Artificial Intelligence becoming self-aware. They write about ghosts existing in computers and lightning bolts striking systems, giving them sudden intelligence. Few people question these ideas because they have no better concept of how computers work than the screenwriters do.

This article is going to give you a primer on everything you didn’t know about computers. Hopefully you are interested in knowing, since you are reading the article. I may use some GURPS nomenclature, but the focus is on understanding just what it is your computer does behind the scenes. After reading, you will have a clearer picture of how a computer works, and that might help you work them into your tabletop games.

Some parts will get a bit technical, but I provide examples to help illustrate what I am talking about.

Computers are Dumb

Even at software companies, like the one I work at, people get into discussions about the intelligence of a computer. People will debate whether the human mind is more or less intelligent than a computer. They will talk about ideas such as emotional capacity and creativity. There will even be existential debates about self-awareness.

Those debates are utter nonsense. Computers have no intelligence at all. Relating a computer to the idea of intelligence is like comparing a rock to a human. They’re both made of matter, and we might have minerals in common, but the differences are enough that few people will compare the thought capabilities of a stone to a human… unless they are calling you as dumb as a rock. The comparison likely happens because of our nature to give objects human qualities in our minds.

The First Commandment

Our current computer architecture works on the principle of instructions. These are similar to the idea of a recipe book, only far less detailed. The computer blindly follows these instructions, one by one. It accomplishes this task with a processor chip. These chips only understand so many instructions. Each one of these instructions is called an operation, and operations are identified by numbers called opcodes.

To put that in English: you give a computer an ordered series of numbers like 15, 23, 10, 2, 223, and each number tells the computer what to do next as it processes them. But what do these numbers mean? What instructions is a computer capable of?

These operations are far more simplistic than you could imagine. Addition, subtraction, move data from here to there, read this data, divide that number, write data and compare two numbers are all examples of instructions.

You may be imagining a calculator at the moment. I’m not even talking about what you see on a calculator display. These actions happen inside the CPU. When two numbers are added together by the CPU, it does put the result somewhere. But displaying the result of those operations for your eyes is even more complex than you might imagine. I’ll get to that later.

Teach Me a Little Bit!

To understand how stupidly simple these operations are you’ll first have to visualize the data in a way a computer does.

You have all seen data stored on your hard disks, on CD-ROM drives, transferred over networks and messed with in RAM. You might have concerned yourself with file sizes on your hard drives. You might also have concerned yourself with gigabit transfers to and from your smartphone. It’s time to peel back the veil of mystery from what decides the size of data!

Every bit of data I spoke of above follows the same concept. The smallest increment of data on a binary system is a bit. A bit stores two states. Depending on where the bit is used it might be on/off, 0/1, true/false and so on. Each one of these ideas is right! Generally we use 0 and 1 for mathematical reasons. As long as you understand that a bit is two states, you get the idea. Since all data and processing in a computer is made up of bits, we call this binary.

A byte is eight bits strung together. You can look at it as something like 00010111, 00000000, 11111111, 01010101, etc. If you were to figure out the number of combinations possible for a byte you would get 256. Since a bit has two states, and there are eight of them, the formula for determining the possibilities is 2 x 2 x 2 x 2 x 2 x 2 x 2 x 2 or 2^8 which is 256.

The computer looks at this as bits, but we look at this as numbers. Actually the computer doesn’t look at anything because it doesn’t have eyes, the stupid box. But I’m going to anthropomorphize it a bit. To a computer, a byte can store 00000000 to 11111111; to humans this would be 0 to 255. The computer counts upwards like 00000000, 00000001, 00000010, 00000011, etc. We count upwards like 0, 1, 2, 3, etc.

(Yes, those bit values are the computer also counting 0 to 3.)
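
If you’d like to see the computer’s way of counting next to ours, here is a tiny C++ sketch (C++ only because that’s the language I show later; the names in it are just for illustration). It works out the 256 combinations and prints the first few byte values in both binary and decimal.

#include <bitset>
#include <cstdio>
#include <string>

int main()
{
    // A byte has 2 x 2 x 2 x 2 x 2 x 2 x 2 x 2 = 256 possible combinations.
    int combinations = 1;
    for (int i = 0; i < 8; ++i)
        combinations *= 2;
    printf("a byte has %d combinations (0 to %d)\n", combinations, combinations - 1);

    // Count the way the computer does (binary) next to the way we do (decimal).
    for (int value = 0; value <= 3; ++value)
        printf("%s = %d\n", std::bitset<8>(value).to_string().c_str(), value);

    return 0;
}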

Byte Examples

Knowing bytes as numbers is cool, but how does this help you understand computers? Let me walk you through some examples of how you use bytes every day and how clever programmers hide the concept from you.

You are looking at a computer screen right now. You know that it has pixels. You know that each pixel can be a different color. What you may not have known is that each pixel is a series of three bytes. To visualize this, look in some art software like Microsoft Paint, Photoshop or similar. Go ahead and modify your palette or edit your colors. You will probably see a screen like the one to the right. If you modify the red, green and blue of the color (circled in red), each can only have values 0 to 255!

One byte represents red, one byte represents green, one byte represents blue. We call this 24 bits per pixel, because 8 + 8 + 8 bits are needed to show a pixel. Thus one pixel is 3 bytes!

Now imagine your entire screen resolution. If you have a 1080p screen it’s 1920×1080 pixels, or 2,073,600 pixels. If you think of that as pixel data, that’s 6,220,800 bytes! Starting to understand why all your cat pictures on your hard drive take so much space? All that image data has to be saved! The bigger the image, the more bytes it takes!
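
Here is that arithmetic as a quick back-of-the-envelope sketch, just to show where the 6,220,800 comes from.

#include <cstdio>

int main()
{
    // Uncompressed size of one 1080p frame at 24 bits (3 bytes) per pixel.
    const long long width = 1920, height = 1080, bytesPerPixel = 3;
    const long long pixels = width * height;          // 2,073,600 pixels
    const long long bytes  = pixels * bytesPerPixel;  // 6,220,800 bytes

    printf("%lld pixels, %lld bytes (about %.1f megabytes)\n",
           pixels, bytes, bytes / (1024.0 * 1024.0));
    return 0;
}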

Of course, we have compression algorithms that work on math. This is why 3-hour movies aren’t 2 trillion bytes of data. A simple example of a compression algorithm is Run Length Encoding. The idea is, it would take 100 bytes to store 100 of the same number in a row. An example would be 10 10 10 10 10 10 … one hundred times in a row. But if you just store the count of 100 and then the value 10 after it, that’s two bytes! You just compressed your data instantly!
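
Here is the Run Length Encoding idea as a small sketch. This is the simplest possible flavor of it, nowhere near what a real video codec does, but it shows the count-then-value trick.

#include <cstdio>
#include <vector>

// Compress a buffer into (count, value) pairs, the Run Length Encoding idea.
std::vector<unsigned char> rleCompress(const std::vector<unsigned char>& data)
{
    std::vector<unsigned char> out;
    for (size_t i = 0; i < data.size(); )
    {
        unsigned char value = data[i];
        unsigned char count = 0;
        // Count how many times the same byte repeats (capped at 255, one byte).
        while (i < data.size() && data[i] == value && count < 255) { ++count; ++i; }
        out.push_back(count);
        out.push_back(value);
    }
    return out;
}

int main()
{
    std::vector<unsigned char> data(100, 10);   // one hundred 10s in a row
    std::vector<unsigned char> packed = rleCompress(data);
    printf("%zu bytes compressed down to %zu bytes\n", data.size(), packed.size());
    return 0;
}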

Knowing how pixels are stored helps you understand why we laugh our butts off when writers say “Enhance” for images on cop shows. Your computer can’t invent numbers for image detail that was never there. You can’t take a 320×200 image and make it clearer at 1920×1080! You can maybe blur pixels together or use effects to make it seem sharper, but that never recovers detail the photograph never captured.

Let me give you another example of bytes. If you were to open Notepad on Windows, type one letter into the document and save it, this would be a 1-byte document. This is because of a number-to-letter concept called encoding. ANSI, ASCII, UTF-8 and Unicode are examples of encodings. The computer has no idea what a letter is, but humans do. So we use numbers, which the computer does understand, and map them to letters. Here is the ASCII table. Notice it goes 0 to 255? (Strictly speaking, standard ASCII only covers 0 to 127; the upper half are extended characters.)

A final example, outside of your PC and into the old world of gaming. Some of us were avid gamers on the NES. We used to play Zelda. In Zelda you collect rupees. I bet you noticed you could only carry 255 of them. That’s because they stored the count in a byte.

It’s neat understanding a byte, but the reason we can visually see them on the screen is a bunch of effort on the part of various programmers. See, a byte can’t be displayed on the screen so easily. It’s stored in the RAM and processor, but to look at it is quite the feat! To show you numbers in bytes, like the letters in a file, someone wrote a bunch of instructions that read memory, look up pictures out of font files, copy those pictures onto the video card’s buffer space, and the video card hardware then takes those pictures and renders them to your monitor over the data cable. You see the letter you typed; the computer is just responding to electrical impulses! It has no idea what a letter is, because it’s not conscious! Just like an assembly line at a factory has no idea what a car is. It’s just doing what we tell it (unless it’s made up of people, sorry guys!).

One last example to help you understand the numbers-to-letters idea. If you open the Character Map program in Windows you will see U+0021 or U+0321 and similar codes in the bottom left corner. This is the number value for that graphic in the font. Programs are written to look up those graphics and copy them byte by byte to the screen, as I mentioned above. So if your file has a 65 (which is 41 in hexadecimal, shown in Character Map as 0041), the picture for an ‘A’ is looked up and drawn! Notice 65 (41 in hexadecimal) means the same thing in that ASCII table I linked before? We humans standardized the numbers to mean letters! So if you were to write a text file of 2,000,000 bytes in one of these one-byte encodings, it would have 2,000,000 letters in it! 256 different characters is good enough for English or Latin languages, but languages like Japanese have thousands of characters. That’s why we have Unicode, UTF-8, and other standards. They use more than one byte to represent a character.
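
You can poke at the number-to-letter mapping yourself with a few lines like these. All they do is print the byte value hiding behind each character.

#include <cstdio>

int main()
{
    const char* text = "ABC abc!";

    // Each character in the string is stored as a one-byte number.
    for (const char* p = text; *p != '\0'; ++p)
        printf("'%c' is stored as %d (hex %X)\n", *p, *p, *p);

    return 0;
}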

The byte is the smallest data set most programmers work with (we use bits at times, but not as often), but there are larger data sets. 2 bytes, or 16 bits, is a word. 4 bytes, or 32 bits, is a double word. 8 bytes, or 64 bits, is a quad word. To figure out their maximum numeric value simply use the formula: (2^bit count) - 1. Here are the values, with a little code sketch after the list if you want to check them yourself.

8 bit = 255

16 bit = 65,535

32 bit = 4,294,967,295

64 bit = 18,446,744,073,709,551,615
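
Here is the promised sketch for checking those maximums. The shift trick (1 << n) is just a fast way of computing 2 multiplied by itself n times.

#include <cstdio>

int main()
{
    // (2^bit count) - 1, using a shift: 1ULL << n is 2 multiplied by itself n times.
    int sizes[] = { 8, 16, 32 };
    for (int bits : sizes)
        printf("%2d bit max = %llu\n", bits, (1ULL << bits) - 1);

    // Shifting a 64-bit value by 64 isn't allowed, so take all ones directly.
    printf("64 bit max = %llu\n", ~0ULL);

    return 0;
}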

That’s data in a nutshell. Bytes don’t mean anything until you put the numbers in context. A 65 in a byte for a color isn’t the same as a 65 for a letter. It could be the number of lives you have left in a video game. The computer is too stupid to know this. It’s the programmer who gives it meaning. Time to understand how!

Back to You, Instructions!

Your computer never rests. It’s constantly executing instructions unless it’s powered down or in sleep mode. The processor keeps reading instructions and executing them blindly. If it gets a bad instruction it crashes. Some crashes it can recover from, some it can’t. But what is an instruction?

Bytes in memory hold the numbers that tell the processor what work to do. These instructions are placed in order in memory. Your processor walks each one to know what to do. People who design processors put out documents detailing how these numbers are supposed to be placed in memory to get the processor to do things. Programmers like to break these numbers down into opcode tables. Click here for an example 80x86 opcode table. There are only a couple hundred commands a computer can execute. These commands generally do arithmetic, move bytes around, write bytes, compare bytes, point the computer to different places in memory to execute, and other simple actions.
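
To make the “walk the instructions one by one” idea concrete, here is a toy sketch of a pretend processor. The opcode numbers (1, 2, 3) are invented purely for this example; a real opcode table is far larger, and the real fetch-and-decode work happens in silicon, not in C++.

#include <cstdio>

int main()
{
    // A pretend program in memory. These opcode numbers are invented for the
    // example: 1 = add the next two bytes, 2 = print the running total, 3 = stop.
    unsigned char memory[] = { 1, 5, 7, 2, 1, 10, 20, 2, 3 };

    int total = 0;
    int pc = 0;                          // the "program counter" walks the bytes
    while (true)
    {
        unsigned char opcode = memory[pc++];
        if (opcode == 1)                 // add: read two operand bytes
        {
            total += memory[pc++];
            total += memory[pc++];
        }
        else if (opcode == 2)            // print the current total
            printf("total is %d\n", total);
        else                             // 3 (or anything else): halt
            break;
    }
    return 0;
}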

That’s it! There’s no voodoo or crazy artificial mind in it. You can see why comparing a computer to a brain is like comparing a rock to a brain. The computer is a machine like an assembly line in a factory. It simply repeats actions over and over.

The Programmer is Smart


Zero Cool?! I thought you was black!

The computer is dumb, but the programmer is smart. Programmers set up the sequence of tasks to do something meaningful on the computer. We put the numbers together to do the alchemy you see before you. That may sound ludicrous, but we don’t type in numbers all day. At least, those of us who aren’t designing processors don’t. We use assemblers and compilers to program. I’ll explain assemblers first, then compilers.

Assembly

The human-readable representation of opcodes is called assembly. Instead of 5, 15, 34, 15, we type things like add, mov, sub and such. We use text editors, like Notepad, to write the program. We then execute a program that walks the text file and converts what we typed into the number sequence. This program is called an assembler.

Here is an example of assembly (I stripped out the comments in the code):

push edi
push esi

mov esi,[esp + 010h]
mov ecx,[esp + 014h]
mov edi,[esp + 0Ch]

mov eax,ecx

mov edx,ecx
add eax,esi

cmp edi,esi
jbe 039h

cmp edi,eax
jb 039h


Did you know the movie The Terminator showed assembly code on the Terminator’s HUD?

The command comes first and then what the command is operating on. For example, add edx, ecx would take the number stored in ecx and add it to edx. mov edi, [address] would take whatever is in memory at the address (an address being a location in memory, 0 being the first byte, and n being the size of your RAM) and move it into edi.

The processor has little pieces of memory inside of it called registers. That’s what ecx, edi and such are. There are only so many of them, roughly 24. They are just for temporary storage of bytes and are used to accomplish tasks. RAM is where longer-term working storage happens. Hard disks, flash cards and the like are for more permanent storage.

Your processor can also output data through the motherboard to other locations where other small computers pick up the data and work with it. These locations map to things like your hard drive, your video card, your sound card, USB handling, mouse, keyboard and more! These all have their own processors to do work, most of which software developers don’t touch. That’s why your main processor is called the CPU, or Central Processing Unit: it’s not the only processor in your computer, just the main one that executes software.

Assembly is neat sometimes, but if you had to write utterly complex software like web browsers, video games and Microsoft Office in assembly, you’d go insane. We programmers would too. That’s why someone invented the concept of compilers and high level languages.

Compiler

Compilers take high level concepts and turn them into the numbers the computer can understand. Programmers use languages like C#, Visual Basic, C++ and more to accomplish this. Instead of seeing assembly like:

mov eax, [a]
mov ebx, [b]
cmp eax, ebx
je 0388h

Compilers allow a programmer to type:

#include <cstdio>

int main()
{
    int a = 5;
    int b = 6;

    if (a == b)
        printf("They are equal!");
    else
        printf("They are not equal!");

    return 0;
}

Even if you’re not a programmer, you probably got a small idea of what that does. That’s called a high-level language. Specifically, that was C++ you saw. Complex software has a bunch of people sitting in a room writing things like the above. They end up with hundreds, thousands or millions of lines like you see, and when put together they form your software. A compiler program takes all those typed text files and turns them into executable files! There is no intelligence in the computer.

Hardware

With the ideas of bits, commands, programmers and processors in mind, hardware becomes much easier to understand. Data can be transferred, stored, read and manipulated. Hardware within the computer just moves bits and bytes around.

USB, or Universal Serial Bus, just sends bits one at a time to the computer. These bits contain information from the device. As you move your mouse, it’s sending things like “I moved 2 units left and 3 units down!” This interrupts the processor, much like someone tapping you on the shoulder while you’re working.

Programmers put in little bits of code that are executed when certain interrupts happen. These interrupts are identified by IRQ numbers. I bet some of you have seen IRQs in your boot-up screens, BIOS configuration, or the Windows hardware area. The programmer literally populates a table that says “When the IRQ number for the mouse happens, jump over to this code.” They package this in a driver and Windows loads it up.

So when the mouse interrupts the processor, it stops what it’s doing just long enough to handle the communication, then goes back to what it was doing! This little block of programming reads the 2 units X and 3 units Y, then adds them to your cursor x/y numbers somewhere in RAM. When the OS updates the screen, it reads those x/y numbers and updates the cursor on the screen. If you click a mouse button, the computer is interrupted again; it has little pieces of code that examine which buttons are clicked and tell the OS. The OS uses the previous x/y numbers to walk through all sorts of computations to figure out which window or app you’re over and gives it the event. The app responds by setting where to draw your text cursor, determining which buttons were clicked, sending packets over the network to tell buddies you fired your equipped BFG, or anything else the software is designed to do on mouse clicks.
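
Here is a very rough sketch of that flow. Real drivers are registered through OS-specific interfaces and talk to hardware registers; everything below, including the handler name, is hypothetical and only meant to show the shape of it.

#include <cstdio>

// Cursor position stored somewhere in RAM; the OS reads this when it redraws.
static int cursorX = 0;
static int cursorY = 0;

// A hypothetical handler the driver registers for the mouse's IRQ.
// The OS would call it each time the mouse interrupts the processor.
void onMouseInterrupt(int deltaX, int deltaY, bool leftButton)
{
    cursorX += deltaX;                   // "I moved 2 units left..."
    cursorY += deltaY;                   // "...and 3 units down!"

    if (leftButton)
        printf("click at (%d, %d): hand it to whatever window is here\n",
               cursorX, cursorY);
}

int main()
{
    // Pretend the mouse interrupted us twice.
    onMouseInterrupt(2, 3, false);
    onMouseInterrupt(5, -1, true);
    printf("cursor ends up at (%d, %d)\n", cursorX, cursorY);
    return 0;
}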

Let’s look at your sound card. It has a buffer, or memory with bytes. Sound waves are converted into numbers that shape the wave and uploaded into that buffer. The sound device’s processor just takes over from there, converting these shapes into signals for your speakers, and sound comes out! Sound is rated in bits, or the precision of the sound. 0-255 is less precise for a sound wave than 0-65,535. That’s the difference between 8-bit and 16-bit sound. Then the kHz determines how many of those wave numbers are needed per second to represent the sound. So more bits and more hertz take more memory to store sound! Of course, these bytes of data are compressed with concepts like MP3, so it’s not 50 megs per song on your computer.
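
The arithmetic for raw, uncompressed sound is simple enough to sketch. The numbers below assume CD-style audio: 16-bit samples, two channels, 44,100 samples per second, for a three-minute song.

#include <cstdio>

int main()
{
    // Uncompressed audio: bytes = samples per second * bytes per sample * channels * seconds.
    const long long sampleRate     = 44100;  // 44.1 kHz
    const long long bytesPerSample = 2;      // 16-bit sound
    const long long channels       = 2;      // stereo
    const long long seconds        = 180;    // a three-minute song

    long long bytes = sampleRate * bytesPerSample * channels * seconds;
    printf("%lld bytes, about %.1f megabytes before compression\n",
           bytes, bytes / (1024.0 * 1024.0));
    return 0;
}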

Video cards are a bunch of buffers that represent textures and other images. Programmers upload bytes to the screen buffer to make pixels appear on the screen. They send commands to render triangles for 3D software. Programmers will even upload tiny pieces of software called shaders to video cards so the video card’s processor will execute them! These little programs usually contain complex math formulas for lighting and other visual effects. The video card’s processor runs through all these commands and buffers to show you all the amazing pixels you have on your screen.

Computers Have Hard Disks

The inside of your hard disk. Looks like a record player, doesn’t it?

Hard drives, CD-ROM drives and similar devices receive little byte commands too. These commands tell the drive to move the arm and transfer bytes into or from a block of memory. This lets programmers read sectors, made up of bytes, into RAM so we can save and restore file data! Once it’s in RAM we can use our little instructions to do things to it!

In the old days, programmers had to write all this stuff themselves. As hardware became more complex, operating systems started taking over the work. Instead of programmers having to encode commands for a video card, we tell the operating system “Hey, I want to draw a triangle with these vertices” and the operating system tells the video card driver. The video card driver encodes that into logic for the specific type of video card processor, and it’s done! That’s why every video card has its own driver; why hard drives have drivers; why USB headsets have drivers! It lets us swap hardware in and out and lets the manufacturers do the work. Programmers can just send generic commands and sit back and relax!

To summarize, your computer is an overglorified byte mover that can perform math operations. It’s the programmer who breaks everything down for it to understand. The number of instructions it can handle a second is related to the hertz of the system. A 2 GHz system means it can perform 2 billion cycles a second. Not every instruction is 1 cycle; some are more. However, more hertz means programs get through their work faster, and that means less sluggishness for the user!

GURPS Me Up Good!

As you can imagine, it’d be utterly impossible to write all that in a book for role-playing consumption. How would you begin to stat all those concepts above simply? This is why GURPS created Complexity. It’s a general, relative idea of the work a computer can handle. Software has a Complexity rating because it may be more robust or perform many operations. This is all hidden away behind Complexity, and that’s good enough for gaming.

Artificial Intelligence


Data from Star Trek and the MS-DOS video game Descent use matrices for AI. They use the same math theories for learning!

If computers are so stupid, how does artificial intelligence work? I’m glad you asked. Well, at least I’m glad I asked for you.

Artificial intelligence is a complex subject. What artificial intelligence means depends on the context of the question. Some companies will claim their complex algorithm for converting sound wave numbers into text (speech to text) is artificial intelligence. Others will claim the fact that Goombas walk to the left in Super Mario Brothers is artificial intelligence. Where do we draw the line?

The line is blurrier than you could imagine. Your favorite media sources did just enough research to get an idea of how we could use programming to create intelligence. They then employed it on the big screen. Many AI practices, some of which are quite ingenious, are employed within video games, and have been for longer than you would expect. The difference is, media will break the rules when it helps the story. Programmers don’t get a magic wand to turn an AI into something incredible.

To best understand how AI relates to programming and a computer, I will use GURPS Ultra-Tech to explain. I will talk about each intelligence method and how it is programmed. It’ll be a high-level overview, since whole books are written on the subject. You’re free to read those books when you get a chance.

Cyborg Brain

The cyborg brain, or the brain in a jar, is often the easiest intelligence to understand. You take a living creature and simply wire it to systems. This process would likely involve feeding nerve impulses in and reading them out, converting those impulses into numbers the computer can manage, then simply following code, interrupts and other methods to perform tasks.

Imagine if a nerve normally wired to a hand fires. This could cause programming code to move actuators and twist artificial joints. That leads to the ability to create cybernetic bodies. The key hardware is, of course, nerve interceptors. The second key ingredient is a means to keep a brain alive in a jar. Strangely, this may not be a brain that was once in a living animal. Scientists might actually grow brain cells to stuff in the jar, and you have an infant robot. Sound far-fetched? That’s exactly what a scientist did here.

While this may not seem like artificial intelligence, neurons are a foundation of many AI practices. Growing a mass of cells and putting them together for learning certainly is artificial. This also shows how programming relates to the Cyborg Brain.

Drone

Remember my talk above on how the mouse interface works? The drone follows the same hardware concept. Bytes of data are transferred to some receiver in the drone. This may be received over radio waves, a wire, infrared or maybe magic. The device gets a signal that data has arrived. The software inside will read the bytes and blindly execute commands as it reads them. The sender might be a remote control stick that sends bytes saying “lift flaps, speed up motor, spin treads in opposite directions” and similar to give control.

Either way, the drone can do nothing without input. Nothing at all. Therefore it does not count as intelligent, but it still requires software to interpret the multitude of commands. There might be video feeds sent between the pilot’s HUD and the drone along with these bytes, and even very complex control methods.
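
That “read bytes, blindly execute them” loop might look something like this sketch. The command numbers and their meanings are made up for the example.

#include <cstdio>
#include <vector>

int main()
{
    // Bytes received over the radio link. The meanings are invented for this
    // sketch: 1 = lift flaps, 2 = speed up motor, 3 = spin treads opposite ways.
    std::vector<unsigned char> received = { 2, 2, 1, 3 };

    for (unsigned char command : received)
    {
        if (command == 1)      printf("lift flaps\n");
        else if (command == 2) printf("speed up motor\n");
        else if (command == 3) printf("spin treads in opposite directions\n");
        else                   printf("unknown command, ignore it\n");
    }
    return 0;
}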

GURPS rates the sheer number of commands and interpretation of them as Complexity 3.

Weak Dedicated AI

I think a better term for this AI would be Static AI. The idea behind this AI is that it fulfills a specific task. It may store a few data values temporarily to complete the task, but that hardly makes it able to learn. One way this might be accomplished is with a Finite State Machine (FSM). This is certainly a classic AI for video games.

The idea of the FSM AI is that everything can be defined in states. These states are various values that shift based on input. MMO games and similar RPGs are great examples of the FSM.

You might have a state for your health level. It might say your health is full, and the code just ignores it. It might say your health is halfway, which is dangerous. This might trigger a chance for some subsystem in the code to warn others of the state. This might also cause the AI to seek help if nothing higher priority is happening. If the health is very low, this might cause the code to flee, hide, cast healing spells, or whatever else may happen in these games. The states are rather simple in this example. An android or other object might have hundreds or thousands of states it evaluates to select various actions in priority.

The flow of the AI will be very much like a programmer’s code, because it is. The AI just acts based on the state of things. It cannot learn, because the states never change. It cannot really be called intelligent because it’s basically just following commands like the processor does. It’ll be predictable because it will always respond to the states in predictable patterns. There might be some routines to store subsets of data, like nodes on a pathfinding tree to record important locations. However, it doesn’t learn different behavior. There might be enough of a mind to make people believe this device is thinking behind the scenes, but it really isn’t. You smile at it, it smiles back. You frown, it asks why you are sad. That seems like intelligence, but it could just be facial recognition software triggering vocal responses.
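
In code, a stripped-down version of that health example might look like the sketch below. Real game FSMs track far more than health, but the shape is the same: states in, actions out, nothing learned.

#include <cstdio>

enum class State { Fine, Wounded, Critical };

// Pick a state purely from the input, the classic Finite State Machine idea.
State evaluate(int health, int maxHealth)
{
    if (health * 4 <= maxHealth) return State::Critical;  // below 25%
    if (health * 2 <= maxHealth) return State::Wounded;   // below 50%
    return State::Fine;
}

int main()
{
    int healthValues[] = { 100, 45, 10 };
    for (int health : healthValues)
    {
        switch (evaluate(health, 100))
        {
        case State::Fine:     printf("%3d hp: carry on as normal\n", health); break;
        case State::Wounded:  printf("%3d hp: warn allies, look for help\n", health); break;
        case State::Critical: printf("%3d hp: flee, hide, or heal\n", health); break;
        }
    }
    return 0;
}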

GURPS rates this AI as Complexity (IQ/2). This is reasonable because the software is likely to be relatively simple, but it has to scale with the sheer volume of possibilities. As I said, it could have a few states of being with a few responses, or it could have thousands, even millions, all with situations, priorities and actions to respond to.

Non-Volitional AI

Non-Volitional AI is able to learn, but only enough to perform its various tasks better. This is much like the Weak Dedicated AI, but it has some level of self-improvement. Enemies in various video games with learning AI are examples of this. They stick to their programming and task, but might have some level of freedom in how to accomplish it. This might teach them a better means of aiming; it might also teach them a better piloting maneuver to survive.

Generally the programming has a means of tracking success and failure in a task so it can improve its methods. If you, for example, tell this AI to attack someone, the software might rate its performance by the time it took and how much damage it received. The software might respond by tweaking the path of bullet fire or how it evades incoming attacks to improve results. This can seem like it is growing beyond its code, but at the end of the day, it’s just an advanced, tweakable heuristic system.

This AI might fool some people. Following the smile example above, it might be told to entertain you, so it makes a funny face. You don’t smile, so it tells a pre-recorded joke. You smile, and that gives it feedback that jokes are a better method to make you smile. It selects another joke, but it was bad. You frown. It might start weighting which jokes are entertaining. It might try another funny face. However, pulling a gun and shooting you in the face isn’t among its options.

It always sticks to its directives. Entertaining you was the directive. It can’t go beyond that. It will continue to try until you tell it to stop. It might programmatically determine the best method from its subset of possibilities, but that’s not intelligence.
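
A crude sketch of that feedback loop: each action has a weight, the weight gets nudged up when the person smiles and down when they frown, and the AI keeps picking whatever currently scores best. The numbers are arbitrary.

#include <cstdio>
#include <vector>

struct Action { const char* name; double weight; };

// Pick whichever action currently has the highest weight.
size_t pickBest(const std::vector<Action>& actions)
{
    size_t best = 0;
    for (size_t i = 1; i < actions.size(); ++i)
        if (actions[i].weight > actions[best].weight) best = i;
    return best;
}

int main()
{
    std::vector<Action> actions = { {"funny face", 1.0}, {"joke A", 1.0}, {"joke B", 1.0} };

    // Feedback after each attempt: true means the person smiled.
    bool smiled[] = { false, true, false, true };
    for (bool feedback : smiled)
    {
        size_t choice = pickBest(actions);
        printf("try '%s'... %s\n", actions[choice].name, feedback ? "they smile" : "they frown");

        // Nudge the weight: reward what worked, penalize what didn't.
        actions[choice].weight += feedback ? 0.5 : -0.5;
    }
    return 0;
}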

GURPS rates this AI as Complexity (IQ/2) + 2. This is going to have decently complex programming. Creating a changing system that can work out heuristics and select the best values for accomplishing something is difficult, especially with a limited set of computer resources.

Volitional AI


Both The Terminator and the huge creatures in Black & White by Lionhead Studios used Neural Nets for their Artificial Intelligence.

This is a step into the harder parts of programming. Volitional AI is going to seem every bit as intelligent as it can be. It’ll learn, have strengths, weaknesses and more. It can even have emotions!

The question is, how does this work on a computer that does no thinking? Simple. The program that runs this intelligence will likely have a digital method of charting cause and effect and trying many different things. This might be a neural net that loosely simulates neurons, or it might be matrix-math data sets with continuous updates. Either way, tracking data, altering it, and weighting actions and feedback is a huge part of this.

This AI will really blow people away if given all the inputs of a human for feedback: touch, sight, voice, hearing and more. Something odd could come about if this software is fed different inputs for its “senses”. Input is just hardware in the body. The downside is, someone could get into the working data model and readjust all the weights to change the behavior of this AI. After all, it’s still software running through iterations of data sets. The weights and algorithms are applied to the data sets, and outputs are returned. Change the algorithms and data sets and you change the entire personality.
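
Stripped to its core, the “weights applied to data sets” idea looks something like this sketch of a single artificial neuron. A real neural net has huge layers of these plus a training step that adjusts the weights automatically, but notice how simply changing the weights changes the output for the same inputs. That is the “readjust the weights, change the personality” point from above.

#include <cstdio>
#include <vector>

// One artificial neuron: multiply each input by a weight, sum them up, then
// squash the result into a yes/no style output.
double neuron(const std::vector<double>& inputs, const std::vector<double>& weights)
{
    double sum = 0.0;
    for (size_t i = 0; i < inputs.size(); ++i)
        sum += inputs[i] * weights[i];
    return sum > 0.0 ? 1.0 : 0.0;        // fire or don't fire
}

int main()
{
    // Two made-up "senses" as inputs, and weights someone (or training) chose.
    std::vector<double> senses  = { 0.9, 0.2 };   // e.g. loud noise, dim light
    std::vector<double> weights = { 1.0, -2.0 };  // noise matters, darkness inhibits

    printf("neuron output: %.0f\n", neuron(senses, weights));

    // Change the weights and the same inputs produce a different "personality".
    weights = { -1.0, 2.0 };
    printf("neuron output after re-weighting: %.0f\n", neuron(senses, weights));
    return 0;
}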

GURPS rates this AI as Complexity (IQ/2) + 3. The data becomes enormously large and complex. The software will have to be fast and potent to manage the sheer data sizes. The programmer will also need a strong grasp of math theory, plus hardware that can give feedback on every level.

Mind Emulation

This one becomes very complex to nail down. We’re talking physics-level software able to completely simulate a neuron. The sheer math involved would be staggering. A Core i7 processor might be able to do it, but the amount of time it’d take to do all the math on one neuron would stagger someone. See you in a few years, after the computer finishes computing a split second’s worth of changes on that neuron! After which it’d have to repeat for the next time slice. What about a brain full of neurons? We’re talking a model of physical reality, or close enough to accomplish all the tasks a neuron does.

This is why many sources say quantum computers will be able to create such a simulated environment. They could accomplish complex mathematics instantly or near instantly, which is what is required to emulate billions of neurons in a physics environment. We have managed to get about 10,000 emulated so far. Perhaps quantum computing might bring this to the next level. I’m not the only one who thinks so.

This may be something that can only be accomplished at the hardware level. There may need to be some physical nanoscale material that mimics a neuron and can be fashioned into a brain. Though I’m betting the quantum computer will be fast enough at the math.

Conclusion

Hopefully this article puts computers in digestible terms. Clarifying devices, programming and AI might actually lead to more possibilities. While this speaks purely to current technologies, the future holds wild notions for computers that may come into play. Biological systems have been used for processing in experiments. Quantum computing is on the rise. And you never know what science might come up with.