When was the last time you put your personal computer to work calculating the infinite digits of an irrational number? Unless you're one of those people who recreationally run programs that calculate millions of digits of π, it's unlikely that any program you use calculates more digits than your favorite calculator utility.

While it is obvious that Alan Turing established many principles and concepts of computer programming in his paper, computing the infinite digits of real numbers is certainly not typical of the activities of computers past, present, or future.

Instead, computers perform complex tasks that programmers have divided into small chunks called functions or procedures or subroutines or methods (depending on the particular programming language). These functions generally perform some specific job in a finite period of time. They begin with some input, crunch that input to create output, and then end, releasing control to some other function.
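The shape of a function described above can be sketched in a few lines of code. This is a minimal illustration, not anything from Turing's paper: a hypothetical function that accepts input, transforms it in a finite number of steps, returns output, and then releases control to its caller.

```python
# A minimal sketch of a function: it begins with some input
# (the argument n), crunches that input to create output,
# and then ends, returning control to whatever called it.
def double_and_add_one(n):
    """Transform the input n into the output 2n + 1."""
    return 2 * n + 1

result = double_and_add_one(20)
print(result)  # prints 41
```

Once `double_and_add_one` returns, it is finished; control passes back to the surrounding program, which is free to call other functions with the result.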

The concept of functions originated in mathematics. In general terms, a function is a mathematical entity that transforms input into output. The input is known as the *argument* to the function, or the *independent* variable; the output is known as the function's *value*, or the *dependent* variable. Often functions are restricted to particular types of numbers or other objects. The allowable input is known as the function's *domain.* The set of possible output values is known as the *range.*
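These terms can be made concrete with a small, hypothetical example (the function name and the domain check are illustrative, not from the text): the square-root function over the real numbers has as its domain the non-negative reals, and its range is likewise the non-negative reals.

```python
import math

# The argument x is the independent variable; the returned
# value is the dependent variable. The domain here is the
# non-negative reals, so input outside the domain is rejected.
def real_sqrt(x):
    if x < 0:
        raise ValueError("x is outside the function's domain")
    return math.sqrt(x)

print(real_sqrt(9.0))  # the argument 9.0 yields the value 3.0
```

Asking for `real_sqrt(-1)` fails precisely because −1 lies outside the domain; no real number in the range corresponds to it.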

Turing mentioned the ...
