Section 1: Introduction to Computer Programs and Computer Programming
This section explains the relationship between computer programs and programming languages, and how code is executed on a computer.
This section has the following chapters:
- Chapter 1, Introduction to Computer Programs
- Chapter 2, Introduction to Programming Languages
- Chapter 3, Types of Applications
- Chapter 4, Software Projects and How We Organize Our Code
Chapter 1: Introduction to Computer Programs
Programming is the art and science of writing instructions that a computer can follow to accomplish a task. This task can be playing a game, performing a calculation, or browsing the web, for example. However, before we can learn how to write programs, we should understand what a program is and how a computer can understand and execute the instructions we give it. In this chapter, we will study this in more detail, along with the basics of what a computer is, how it works, and its history.
Even a basic level of understanding of these topics will help us later on when we discuss the different aspects of writing programs, as we can then relate to how the computer will treat the code we write.
In this chapter, we will cover the following topics:
- A perspective on the history and origins of the computer
- Background knowledge of the original ideas behind programming
- Understanding what a computer program is
- Learning how a computer program works
- An understanding of what machine code is
A brief history of computing
Humans have always built tools and made innovations to make life more comfortable and to allow us to do more things faster and more efficiently. We need to go back in time a few hundred years in order to see the first attempts at building a tool that could resemble a computer. However, before we do that, we might want to define what a computer is. Wikipedia offers the following definition:
A computer is a machine that can be instructed to carry out sequences of arithmetic or logical operations automatically via computer programming.
So, a computer is a programmable machine that performs arithmetic or logical operations. Let's review a few inventions from the past using this definition to ascertain which of them could be considered a computer.
To begin, we can rule out the Jacquard machine, an automated loom invented in the early years of the 19th century. These looms could be programmed using punch cards, but they produced woven silk, which, of course, is not the result of an arithmetic or logical operation. Programmability using punch cards was an idea that survived well into the computer age, but these looms were not computers.
If we go even further back in time, we find devices such as the abacus that helped us perform arithmetic operations; however, they were not programmable.
In the 1770s, Pierre Jaquet-Droz, a Swiss watchmaker, created some mechanical dolls that he called automata. These dolls could read instructions and could thereby be considered programmable, but they did not perform arithmetic or logical operations. Instead, he created one doll that could play music, one that could make drawings, and one that could write letters (they are referred to as the musician, the draughtsman, and the writer):
Figure 1.1: The Jaquet-Droz automata (photograph by Rama, Wikimedia Commons; Cc-by-sa-2.0-fr)
In order to see something that resembles a computer, we will need to look at Charles Babbage's inventions. He originated the concept of a programmable computer with his ideas for a machine, called the Difference Engine, and later, a more advanced version called the Analytical Engine. Of the two, the Analytical Engine was particularly groundbreaking because it was programmable, which meant it could be used to solve different problems. He presented his work in the first half of the 19th century, and even though the machines were never completed, we can agree that Babbage is a very important figure behind the basic concept of the programmable computer.
During the first half of the 20th century, we witnessed some analog computers, but it was not until the Second World War, and the years following, that we saw the birth of real digital computers. The difference between an analog and a digital computer is that the former is a mechanical machine that works with analog input such as voltage, temperature, or pressure. In comparison, a digital computer works with input that can be represented by numbers.
Many people consider the Electronic Numerical Integrator and Computer (ENIAC), constructed by J. Presper Eckert and John Mauchly between 1943 and 1946, as the first digital computer because it was the first one that was both completed and fully functional:
Figure 1.2: Betty Jean Jennings and Fran Bilas, both programmers, operate ENIAC's main control panel – U.S. Army Photo (Public Domain [PD])
Since then, we have seen tremendous development up until the point we are at today. However, even though our modern computers can do so much more and at a much faster rate than these earlier inventions, the basic principles of how they operate remain the same.
A brief history of programming
A programmable computer needs to be, well, programmed. So, of course, the history of programming goes hand in hand with the evolution of computers.
In 1833, Charles Babbage met Ada Lovelace, daughter of the poet Lord Byron. She was deeply impressed by and interested in Babbage's plans for his programmable machines, and their collaboration began. Among other things, she wrote some notes outlining her ideas for how Babbage's Analytical Engine could be programmed. We can call her the inventor of programming, even if we had to wait over 100 years for a machine that could make her ideas come true. Her status today is summarized in a History Extra article, from 2017, by James Essinger:
Today, Ada is quite rightly seen as an icon of feminist scientific achievement, a heroine of the mind, and one of the earliest visionaries in the early history of the computer.
In her notes, Lovelace did a couple of remarkable things. The first was that she wrote an algorithm for how Bernoulli numbers, a sequence of rational numbers often used in number theory, could be calculated by the Analytical Engine. This algorithm is considered by many to be the first computer program. Second, she outlined the future of what these machines could do, and, in her vision, she saw that they could be used to draw pictures and compose music. When we finally could build such computers, the way they were programmed was heavily influenced by her ideas:
Figure 1.3: Ada Lovelace, aged 17 (portrait by Joan Baum; PD-Art)
The first digital computers were programmed using machine code – the only thing a computer understands. Later in this chapter, we will talk more about machine code and explore what it is. And, as you will discover, it is just a sequence of numbers.
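To get a feel for what "just a sequence of numbers" means, here is a small illustrative sketch. The five byte values below happen to encode the x86 instruction that loads the constant 5 into the EAX register; the exact bytes are given only as an example, and the Python snippet simply prints them to show that, to the machine, a program is nothing more than numbers:

```python
# Machine code is just a sequence of numbers. On an x86 processor,
# these five bytes encode the instruction "mov eax, 5":
#   0xB8 is the opcode (move a 32-bit constant into EAX),
#   and the next four bytes are the constant 5 in little-endian order.
machine_code = bytes([0xB8, 0x05, 0x00, 0x00, 0x00])

# Print the bytes in hexadecimal, as a programmer of an early
# computer might have written them down.
print(machine_code.hex(" "))  # b8 05 00 00 00
```

Early programmers entered such numbers directly, using switches or punched media; everything we cover later, from assembly language upward, exists to spare us from writing programs this way.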
In 1949, John Mauchly proposed something called Brief Code, which was later renamed to Short Code. Short Code can be considered to be one of the first higher-level programming languages. A higher-level programming language is a way for us to write inst...