Lecture 4: Introduction to Programming



Any program is nothing more than a set of instructions for the computer. The computer executes the instructions one after the other, in principle in the order in which they are written (apart from so-called branching instructions (if, if..else, switch), which we will see later). It will do nothing more and nothing less than what we tell it to do.

Moreover, we have to write the instructions with great care. The computer understands only what it has been taught to understand.


Software engineering


Creating programs always consists of the following steps:
 
  1. Think! Study and analyze the problem. Collect information. Come up with a possible solution. Don't touch that keyboard yet! Use pen and paper.
  2. Write the program. Use any editor to enter your program.
  3. Eliminate the errors from the program. This is called "debugging". There are several types of errors:
    • compile-time errors: writing errors (for example, when we write prinntf instead of printf). Easy to eliminate; the compiler is going to help us by telling us something like "type-mismatch error in line 34".
    • programmatic errors: for example, forgetting to initialize our variables (setting them to a certain value). These can also cause so-called "run-time errors", for example a "division by zero error".
    • logical errors: for example, we do not know that the Pythagoras rule is a^2 + b^2 = c^2. The program will run without generating errors, but the result will not be what we wanted.
  4. Analyze the result. Is this what you wanted? Maybe the program wrote "The square-root of 9 is 4". Clearly not what we wanted.
  5. If necessary, go to step 3, 2 or even 1.
Spending some more time on point 1 can often save a lot of time in the other steps.
The First Computer Bug 

Grace Murray Hopper, working in a temporary World War I building at Harvard University on the Mark II computer, found the first computer bug: a moth beaten to death in the jaws of a relay. She glued it into the logbook of the computer, and thereafter, when the machine stopped (which happened frequently), they told Howard Aiken that they were "debugging" the computer. The very first bug still exists in the National Museum of American History of the Smithsonian Institution. Edison had used the word bug, and the concept of debugging, before, but this was probably the first verification that the concept applied to computers. (copied from http://www./firstcomputerbug.html)



 

The C programming language


C was invented at Bell Labs in 1971-1973. It was an evolution of the language B, which in turn was based on BCPL. In 1983 the ANSI committee started standardizing the language, and the resulting ANSI C standard became the official version. It is probably the most used programming language in the world.
The evolution of C went hand-in-hand with the evolution of the UNIX operating system, which we are going to use in our lectures (in the form of Linux, a freely available UNIX-like operating system). In fact, UNIX itself was written in C.

A program is a sequence of instructions, or statements which inform the computer of a specific task we want it to do.
Most modern programming languages have a very readable format, close to English, making it easy for humans to read and write programs. This is in contrast to earlier programming languages, which were closer to what the computer understands. See for example the assembler language (aula 2).

A very simple C program:

#include <stdio.h>

int main(void)
{
  printf("Hello World\n");
  return 0;
}
 

Let's take a look at this program.

Reserved keywords in ANSI C
auto
break
case
char
const
continue
default
do
double
else
enum
extern
float
for
goto
if
int
long
register
return
short
signed
sizeof
static
struct
switch
typedef
union
unsigned
void
volatile
while
Note that the C language itself defines no functions; even sizeof, despite its function-like parentheses, is an operator built into the language. All functions are described in so-called libraries. For example, printf can be found in the library stdio. We therefore have to put the compiler directive #include <stdio.h> at the beginning of our code.

Identifiers


Identifiers, as the name already says, are used for identifying things: names of functions and names of variables, as we will see in later aulas. Like in most languages, identifiers are subject to some restrictions:
  • they may contain only letters, digits and the underscore (_);
  • they may not start with a digit;
  • they may not be one of the reserved keywords listed above;
  • C is case sensitive: Sum and sum are two different identifiers.


Structured programming

The most important thing in programming is to write clear, logical and structured programs.
 
           main(l
      ,a,n,d)char**a;{
  for(d=atoi(a[1])/10*80-
 atoi(a[2])/5-596;n="@NKA\
CLCCGZAAQBEAADAFaISADJABBA^\
SNLGAQABDAXIMBAACTBATAHDBAN\
ZcEMMCCCCAAhEIJFAEAAABAfHJE\
TBdFLDAANEfDNBPHdBcBBBEA_AL\
 H E L L O,    W O R L D! "
   [l++-3];)for(;n-->64;)
      putchar(!d+++33^
           l&1);}
The program above shows an example of how NOT to program. Can you predict what the program does? Don't worry, neither can the specialists. If you want to know the output of this program, click here.
(program copied from http://www.ioccc.org/)

Quick test:

To test your knowledge of what you have learned in this lesson, click here for an on-line test. Note that this is NOT the form the final test takes!


Peter Stallinga. Universidade do Algarve, 14 outubro 2002