Script from Bits and Bytes 1

Written on April 21, 2017

Hello, and welcome to the Bits and Bytes meeting, a regular talk about computer science. Since this is our first meeting, we are looking at the essentials of computer science. This month we’re talking about algorithms. Later we will go into more advanced topics like programming, computer hardware, and efficiency.

((Opinion: asking the audience for interaction here would be boring unless people already know what an algorithm is, and even then it would drag. And if they do know, they would be tempted to give the full explanation themselves. It is better to move through this quickly rather than dwell on “what is an algorithm,” because anyone who doesn’t already know won’t learn it from a definition at the start of class; they will understand it after thinking it through.))

If the word algorithm or computer science evokes fear, do not feel ashamed. I too thought computer science was a mystery (a black box). Before my education, I had thought that programming was reserved for a special kind of person and that normal people couldn’t learn how to program. Fortunately, computers were designed for everyday people, so it doesn’t take divine intervention to learn computer science; it just takes some mentoring and effort.

One question you might have is why learn computer science. Computer science today intersects so many different professions that it may one day be a required skill set, the way Microsoft Office is today. One day everyone in the office might be expected to know how to code. Today children are learning to code, and in 10 years we will have high school graduates with that many years of programming experience. That means in the land of tomorrow, the regular fast food employee will know computer science. This is great. And we can be part of this future by accepting the challenge of thinking programmatically [coined from MIT’s CS department of undergraduate studies].

Go to the board and draw the diagram and example, using input, function, and output as terms.

Thinking programmatically is simply a process for solving a problem. Programs in computers solve problems. [Drawing.] A computer program takes a value, places it into a box, and then shoots out an answer. The value we give it is called the input. The box where that input goes is called a function, such as adding the values 2 and 2 to make 4. Here the inputs are 2 and 2, and the function is addition. The number that comes out of the box, 4, is called the output.

Input: 2, 2 → Function: addition → Output: 4

Write “algorithm” under the diagram. Write “Algorithms solve problems.”

This entire procedure is called an algorithm. Algorithms solve problems.
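For anyone following along at a keyboard, here is a minimal Python sketch of the same input, function, output idea from the diagram. The function name `add` is my own illustrative choice, not something from the talk.

```python
# A minimal sketch of the input -> function -> output diagram above.
# The name "add" is an illustrative choice for the "box" (the function).

def add(a, b):
    return a + b       # the function: addition

inputs = (2, 2)        # the input: the values we feed into the box
output = add(*inputs)  # run the algorithm
print(output)          # the output: 4
```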

What is an example of a problem that computers solve? [Don’t wait for response.] Counting. Computers are well suited to counting. But so are we. We have ten fingers that can represent different numbers. [Raise fingers to represent numbers.] One. Two. Three. Ten. After ten, we run into some difficulties. So we can use tallies (unary notation) in groups of five. [Write down ten tallies.] This is an abstraction of our fingers. There are ten tallies. I have ten fingers. The ten tallies represent ten fingers used; however, because I’m not actually holding up ten fingers, I can go onward to twenty tallies, which can represent two people holding up ten fingers. For even more complicated numbers, I use symbols such as 1, 2, 3, 10. The number 10 is ten tallies. Two different notations for the same idea, connected by layers of abstraction: ten fingers to ten tallies to the number 10. The design of computers relies on this sort of abstraction to project data on a screen.
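To make those layers of abstraction concrete, here is a rough Python sketch that prints the same quantity first as tally marks (unary notation) and then as a decimal symbol. The helper name `tallies` is hypothetical, chosen only for illustration.

```python
# A rough sketch of the abstraction layers described above:
# fingers -> tally marks (unary notation) -> a decimal symbol like "10".
# The helper name "tallies" is hypothetical, used only for illustration.

def tallies(count):
    # One mark per finger, grouped in fives for readability.
    groups, remainder = divmod(count, 5)
    marks = ["|||||"] * groups
    if remainder:
        marks.append("|" * remainder)
    return " ".join(marks)

fingers = 10
print(tallies(fingers))  # ||||| |||||   (ten tally marks)
print(fingers)           # 10            (the same quantity as a decimal symbol)
```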

Computers use and manage electricity as signals. After many decades of technological advances, computer scientists were able to create machines that turn fluctuations in electrical signals into information. An electrical signal has two states: on and off. When we assign values to those two states, we create a binary value system of 1 and 0: 1 is on, and 0 is off.

The limitation of a binary system is that it’s made of only two values, off and on. Alone, communication could only work the way the telegraph worked: a series of short and prolonged on and off signals. Computers combine binary values into sequences. A bit is a single binary value. A byte is 8 bits. Each position in the sequence works like a place value in a decimal number: where decimal has a 1’s place, a 10’s place, and a 100’s place, binary has a 1’s place, a 2’s place, a 4’s place, and so on.
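As a sketch of how those place values work, the snippet below converts an eight-bit byte to a decimal number by hand, assuming the definitions above (a bit is one binary digit, a byte is eight bits).

```python
# A small sketch of binary place values: instead of a 1's, 10's, and 100's
# place, binary has a 1's, 2's, 4's, 8's, ... place.

byte = "00000100"  # eight bits (one byte)

value = 0
for place, bit in enumerate(reversed(byte)):
    value += int(bit) * (2 ** place)  # each position contributes bit * 2^place

print(value)         # 4
print(int(byte, 2))  # Python's built-in base-2 conversion agrees: 4
```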

