What's wrong with Simon?

It was a dark and stormy night when our heroine, the lovely Ada — precocious programmer, arm-chair historian, dyed-in-the-wool bibliophile — first heard about Simon. She had in her hands a 1949 copy of Edmund Callis Berkeley’s classic cookbook, Giant Brains, or Machines That Think. Ada was what Norman E. Steenrod calls a grasshopper reader. She was hopping around in Chapter 3:
We shall now consider how we can design a very simple machine that will think. Let us call it Simon, because of its predecessor, Simple Simon. ...

We can say that Simon has a mentality of 4. We mean not age 4 but the simple fact that Simon knows only 4 numbers and can do only 4 operations with them. But Simon can keep on doing these operations in all sorts of routines as long as Simon has instructions. We decide that Simon will know just four numbers, 0, 1, 2, 3, in order to keep our model mechanical brain very simple. ...

What are the four operations with numbers which Simon can carry out? ...

[W]e could have the operations of addition and negation defined as follows: ...

With only these two operations in Simon, we should probably find him a little too dull to tell us much. Let us, therefore, put into Simon two more operations. Let us choose operations involving both numbers and logic: in particular, (1) finding which of two numbers is greater and (2) selecting. In this way we shall make Simon a little cleverer. ...

Let us turn now to the operation selection. By selecting we mean choosing one number a if some statement P is true and choosing another number b if that statement is not true. As before, let p be the truth value of that statement P, and let it be equal to 1 if P is true and 0 if P is false.

Smitten with the sublime simplicity of Berkeley’s machine, Ada embarked on a quest to simulate Simon’s selection operation. She created a class called Simon. (Java was the language du jour.) Then, using Java's built-in selection operator, she composed a method she called select. It was the epitome of reinventing the wheel:

class Simon
{
    public static int select(boolean P, int a, int b)
    {
        // p is the truth value of P: 1 if P is true, 0 if P is false.
        final int p = P ? 1 : 0;
        // a*p + b*(1 - p) equals a when p is 1 and b when p is 0.
        return a*p + b*(1 - p);
    }
}
Java's selection operator, P ? a : b, returns one value a if some boolean expression P evaluates to true and another value b if that expression evaluates to false.
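
To see the arithmetic at work, one might exercise Simon.select from a small test harness. (The class SimonDemo and its values are illustrative; they are not part of Ada’s program.)

class SimonDemo
{
    public static void main(String[] args)
    {
        // 2 > 1 is true, so p = 1 and the result is a*1 + b*0 = a.
        System.out.println(Simon.select(2 > 1, 10, 20)); // prints 10
        // 2 < 1 is false, so p = 0 and the result is a*0 + b*1 = b.
        System.out.println(Simon.select(2 < 1, 10, 20)); // prints 20
    }
}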

Ada had her hammer. She needed a nail. So she cracked open Knuth. Very gravely, she began at the beginning. Her search was brief. On page 2 she discovered just the sort of thing she was looking for. So she stopped. This is what Ada found:

By 1950, the word algorithm was most frequently associated with “Euclid’s algorithm,” a process for finding the greatest common divisor of two numbers which appears in Euclid’s Elements (Book 7, Propositions 1 and 2). It will be instructive to exhibit Euclid’s algorithm here:

Algorithm E (Euclid’s algorithm).  Given two positive integers m and n, find their greatest common divisor, i.e. the largest positive integer which evenly divides both m and n.

E1. [Find remainder.] Divide m by n and let r be the remainder. (We will have 0 ≤ r < n.)
E2. [Is it zero?] If r = 0, the algorithm terminates; n is the answer.
E3. [Interchange.] Set m ← n, n ← r, and go back to step E1. ▮
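
Knuth’s steps transcribe almost word for word into Java. This iterative sketch (the name gcdE is ours, for illustration; it is neither Knuth’s nor Ada’s) follows the algorithm literally:

    public static int gcdE(int m, int n)
    {
        while (true)
        {
            final int r = m % n; // E1. [Find remainder.] We have 0 <= r < n.
            if (r == 0)          // E2. [Is it zero?]
            {
                return n;        // The algorithm terminates; n is the answer.
            }
            m = n;               // E3. [Interchange.] Set m <- n, n <- r,
            n = r;               // and go back to step E1.
        }
    }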

With Simon.select champing at the bit, Ada drafted another class. She called it Euclid, and she implemented a method called gcd. Initially, Ada’s program looked like this:

public class Euclid
{
    public static int gcd(int m, int n)
    {
        final int r = m % n;
        return Simon.select(r == 0, n, gcd(n, r));
    }
    public static void main(String[] args)
    {
        final int m = Integer.parseInt(args[0]);
        final int n = Integer.parseInt(args[1]);
        System.out.print(gcd(m, n));
    }
}

Ada paused. She thought. Then she jumped. Aha! She refactored. Then gcd looked like this:

    public static int gcd(int m, int n)
    {
        return Simon.select(n == 0, m, gcd(n, m % n));
    }

Finally, Ada tested her code. She ran her program, passing it 8 and 12. She expected 4 in return. Ada stared at the output. This is what her program produced:

Exception in thread "main" java.lang.ArithmeticException: / by zero
 at Euclid.gcd(Euclid.java:13)
 at Euclid.gcd(Euclid.java:13)
 at Euclid.main(Euclid.java:19)
Command exited with non-zero status 1
Stymied, Ada called her friend Grace, an experienced debugger, who leaped at the opportunity to help. (Grace was a hopper too.) Programmers are often troubled by bugs. But Grace looked at Ada’s logic and smiled.

“Bugs that short-circuit are anathema,” Grace said — almost, it seemed, to herself.

“But in your case,” Grace obliquely explained to Ada, “a bug has emerged from the absence of a short circuit of a different sort in your ridiculous reinvention of the wheel.”

Grace gave Ada a tender kiss on the cheek and whispered, “Keep it simple, silly.”

On her way out, Grace turned and commanded, “Tell Simon I said, ‘Goodbye!’”
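
Grace’s hint, unpacked: Java evaluates every argument to a method call before the method itself runs. In return Simon.select(n == 0, m, gcd(n, m % n)), the third argument is therefore computed even when n == 0; when the recursion reaches gcd(4, 0), evaluating gcd(n, m % n) requires computing 4 % 0, which throws the ArithmeticException Ada saw. Java’s conditional operator, by contrast, short-circuits: it evaluates only the branch it selects. A minimal sketch of the repair, assuming Ada takes Grace’s advice and returns to the built-in operator she started with:

    public static int gcd(int m, int n)
    {
        // The ? : operator evaluates the recursive call only when
        // n != 0, so m % n never divides by zero.
        return n == 0 ? m : gcd(n, m % n);
    }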

Dramatis personæ

Edmund Callis Berkeley (22 February 1909 – 7 March 1988)

Berkeley was born in New York City in 1909, graduated with an AB summa cum laude in mathematics and logic from Harvard College in 1930, entered the computer field in 1938 as an actuary using punched-card machines for the Prudential Insurance Company of America, and worked with Howard Aiken during the war as an active-duty naval reserve officer. After demobilization, he returned briefly to Prudential, where he participated in studies that led to the purchase of a Univac I. In 1947, he invited seven friends to a meeting that resulted in the establishment of the ACM (then known as the Eastern Association for Computing Machinery). ...

In 1949 he wrote the first carefully crafted and widely accepted popularization of computers, Giant Brains, or Machines That Think, a book that fastened the “brain” name on computers and presented a somewhat optimistic view of what computers could do or would do soon. Berkeley never recanted and insisted all his life that he had it right in his first book. Giant Brains was followed in 1956 by Computers: Their Operations & Applications, and 13 other books that had total sales in excess of $110,000. The objective of all his writing and editing was not to entertain but to educate, uplift, and improve his readers, although he sometimes despaired of being able to do so. His later books and articles were often concerned with the problems of how to think clearly and act wisely. His language, in writing and speaking, reflected his mind: precise, careful, and crisp; determined to be both correct and clear, not just brief, amusing, or acceptable.

In 1972 ACM honored Berkeley as its singular founder at its 25th anniversary dinner. His acceptance speech was a direct denunciation of those in computing who worked on the killing devices used in the Vietnam War, of the computing companies that made such horrors, and of ACM for ignoring this immorality. He said that it was a “gross neglect of responsibility” that ACM was not investigating whether computer applications were good or evil and how computers could be used to increase the good of society. Several prominent ACM members, employees of the firms and government military agencies that Berkeley had pointed to, ostentatiously walked out of the banquet room while he was speaking. The leaders of ACM were clearly embarrassed by their honoree, and ACM never publicly referred to his speech in any way.

Berkeley's lifetime goal, only partly achieved at his death, was to educate his readers so that they could do as he did: think clearly about important matters, reach wise conclusions, and act bravely in support of their principles. He aspired to be, and was accepted by many as, the conscience of the computer industry because of his devotion to the idea that computers should work for the good of and not the destruction of mankind.

Donald Ervin Knuth (born 10 January 1938)

Donald Ervin Knuth is an American computer scientist, mathematician, and professor emeritus at Stanford University. He is the author of the multi-volume work The Art of Computer Programming. In addition to fundamental contributions in several branches of theoretical computer science, Knuth is the creator of the TeX computer typesetting system, the related METAFONT font definition language and rendering system, and the Computer Modern family of typefaces. Professor Knuth lives on the Stanford campus with his wife, Jill. They have two children, John and Jennifer. Music is his main avocation.

Dramatis inspirare

Ada Lovelace (10 December 1815 – 27 November 1852)

Augusta Ada King-Noel, Countess of Lovelace was an English mathematician and writer, chiefly known for her work on Charles Babbage's proposed mechanical general-purpose computer, the Analytical Engine. She was the first to recognise that the machine had applications beyond pure calculation, and published the first algorithm intended to be carried out by such a machine. As a result, she is often regarded as the first to recognise the full potential of a “computing machine” and the first computer programmer.

Lovelace first met Charles Babbage in June 1833, through their mutual friend Mary Somerville. Later that month Babbage invited Lovelace to see the prototype for his Difference Engine. She became fascinated with the machine and used her relationship with Somerville to visit Babbage as often as she could. Babbage was impressed by Lovelace's intellect and analytic skills. He called her “The Enchantress of Number”.

After her work with Babbage, Lovelace continued to work on other projects. In 1844 she commented to a friend, Woronzow Greig, about her desire to create a mathematical model for how the brain gives rise to thoughts and nerves to feelings (“a calculus of the nervous system”). She never achieved this, however. In part, her interest in the brain came from a long-running preoccupation, inherited from her mother, with her 'potential' madness. As part of her research into this project, she visited the electrical engineer Andrew Crosse in 1844 to learn how to carry out electrical experiments. In the same year, she wrote a review of a paper by Baron Karl von Reichenbach, Researches on Magnetism, but this was not published and does not appear to have progressed past the first draft. In 1851, the year before her cancer struck, she wrote to her mother mentioning “certain productions” she was working on regarding the relation of maths and music.

Grace Hopper (9 December 1906 – 1 January 1992)

Grace Brewster Murray Hopper was an American computer scientist and United States Navy rear admiral. One of the first programmers of the Harvard Mark I computer, she was a pioneer of computer programming who invented one of the first compiler-related tools. She popularized the idea of machine-independent programming languages, which led to the development of COBOL, an early high-level programming language still in use today.

Grace was very curious as a child; this was a lifelong trait. At the age of seven, she decided to determine how an alarm clock worked and dismantled seven alarm clocks before her mother realized what she was doing (she was then limited to one clock). For her preparatory school education, she attended the Hartridge School in Plainfield, New Jersey. Hopper was initially rejected for early admission to Vassar College at age 16 (her test scores in Latin were too low), but she was admitted the following year.

While she was working on a Mark II computer at a US Navy research lab in Dahlgren, Virginia, in 1947, her associates discovered a moth stuck in a relay; the moth impeded the operation of the relay. While neither Hopper nor her crew mentioned the phrase “debugging” in their logs, the case was held up as an instance of literal “debugging.” For many years, the term bug had been in use in engineering. The remains of the moth can be found in the group's log book at the Smithsonian Institution's National Museum of American History in Washington, D.C.