Part I
M for the Novice Programmer
1
A Brief Introduction to Computers
The purpose of this chapter is to provide an introduction to computers that will set the stage for writing computer programs. Its goal is to present information that will enable a novice to understand a little more about what goes on inside a computer as it affects one's ability to control the operations of the machine. In other words, instead of describing what a computer can do for you, this chapter helps you to understand how a computer does things for you, and it sets the stage for you, the user, to tell the computer how to do new and different things: things that you yourself will tell it to do.
To make sure that each reader has the same understanding of the basic concepts of computers, this presentation begins at a very primitive level. However, even if you have programmed a computer before, you may do well to read the entire chapter, because the information is presented in a different way from conventional texts.
So relax, be patient, and see if you can find some elements in the next few pages that are new to you, as you learn what we will be concentrating on in the remainder of the book: writing programs to instruct the computer to do our bidding.
Note to Experienced Programmers: This chapter summarizes the principal components of computers and describes the concepts of hardware, software, operating systems, applications packages, and programming languages, both interpreted and compiled. Anyone with experience in programming will already be familiar with most of the material presented in this chapter and should skip directly to Chapter 2.
Basic components of computers
Human beings have used aids to help count and perform arithmetic since the beginning of recorded history. Fingers were used before that, and no doubt pebbles, sticks, or other objects served similar purposes. A fascinating account of how our use of numbers evolved may be found in Karl Menninger's Number Words and Number Symbols (MIT Press, 1969). The abacus, developed in the Orient, appeared some time in the last few hundred years (more recently than most people realize). Mechanical calculating devices used by the ancient Greeks have been recovered from the floor of the Aegean Sea, and a number of interesting calculating aids were developed during the Renaissance and later in our pre-electronic history.
One of the important historical contributions shortly after 1800 was the invention of a loom that could be instructed to weave complex patterns without setting the threads manually. This loom, called the Jacquard loom in honor of its inventor, revolutionized weaving and prompted one of the first dire predictions that automation would drive humans out of useful work. An important concept in that invention was the idea of a set of instructions stored on punched cards that were fed into the loom's insides to program the movement of threads to produce specific patterns. This invention contained the beginnings of what computers do. First, we have some hardware (in this case a loom), which is instructed how to behave by a program, in this case produced by externally stored codes that were fed into the loom to produce the output of a consistent pattern. A few decades later, an Englishman named Babbage conceived of a scheme whereby the same sorts of instructions could be stored inside his "analytical engine" to perform calculations. From his work came the idea of a stored program, a fundamental concept in computers today. Unfortunately, the machine tools for constructing hardware in the mid-1800s were not sufficiently precise to turn Babbage's concepts into reality, and his work languished after several unsuccessful attempts.
Not much happened from the mid-1800s until World War II, by which time electronics had been discovered and advanced to the point where it was possible to perform electronically what Babbage had tried to do mechanically. The old vacuum tubes used in early computers were not very reliable, but they were all that was needed to take the fundamental concepts and implement them in "hardware" that would work, at least some of the time. Since the 1940s, the progress of computing technology has been truly staggering, as all of us know. A Time Magazine article on the "Man of the Year" in 1982 (actually the...