Algorithmic pricing

Michael Eisen found a good example of algorithmic pricing on Amazon, which resulted in two booksellers pricing a book on fly genetics at almost $24 million (via yewknee).

On the day we discovered the million dollar prices, the copy offered by bordeebook was 1.270589 times the price of the copy offered by profnath. And now the bordeebook copy was 1.270589 times profnath again. So clearly at least one of the sellers was setting their price algorithmically in response to changes in the other’s price. I continued to watch carefully and the full pattern emerged.

Once a day profnath set their price to be 0.9983 times bordeebook’s price. The prices would remain close for several hours, until bordeebook “noticed” profnath’s change and elevated their price to 1.270589 times profnath’s higher price. The pattern continued perfectly for the next week.
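
To see how quickly two rules like that compound, here's a minimal sketch of the loop Eisen describes. The two multipliers come from his post; the starting prices below are hypothetical placeholders.

```python
# Minimal sketch of the repricing loop described above. Only the multipliers
# (0.9983 and 1.270589) come from the post; the starting prices are made up.

profnath = 40.00       # hypothetical starting price, in dollars
bordeebook = 50.00     # hypothetical starting price, in dollars

day = 0
while bordeebook < 24_000_000:
    day += 1
    profnath = 0.9983 * bordeebook      # undercut the competitor slightly
    bordeebook = 1.270589 * profnath    # mark up well above the competitor
    print(f"day {day:3d}: profnath ${profnath:,.2f}  bordeebook ${bordeebook:,.2f}")

# Each full cycle multiplies bordeebook's price by 0.9983 * 1.270589 ≈ 1.268,
# so prices grow roughly 27% per day and pass $24 million within a couple of months.
```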

I’m waiting for the algorithmic pricing that messes up in the other direction and nets me a Gutenberg bible for pocket change.


Engelbart’s Chord

At the famed “Mother of All Demos,” Douglas Engelbart presented the mouse and keyboard that we know, as well as an input device called a chorded keyboard.

A computer input device that allows the user to enter characters or commands formed by pressing several keys together, like playing a “chord” on a piano. The large number of combinations available from a small number of keys allows text or commands to be entered with one hand, leaving the other hand free.

Ah, it’s similar to playing piano chords; that’s probably why the thing never caught on… raise your hand if you bombed out of piano lessons in spectacular fashion. I’m sure with years of practice on a chorded keyboard, you could become a machine. For now, I’ll stick with my keyboard. It may be inelegant, but so is marching around the apartment naked while banging pots and pans together. Sometimes you just need to make music.
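
For a rough sense of the combinatorics, here's a tiny sketch: the five-key count matches Engelbart's keyset, but the key labels and chord-to-character mapping are made up for illustration.

```python
import string
from itertools import combinations

# Rough sketch of the combinatorics behind a five-key chorded keyset.
# The chord-to-character assignment below is illustrative, not Engelbart's.

KEYS = ("k1", "k2", "k3", "k4", "k5")

# Every non-empty subset of keys pressed together is a distinct chord:
# 2**5 - 1 = 31 of them, enough for the alphabet with a few left over.
chords = [frozenset(combo)
          for n in range(1, len(KEYS) + 1)
          for combo in combinations(KEYS, n)]

charset = string.ascii_lowercase + " .,!?"   # 31 characters for 31 chords
chord_map = dict(zip(chords, charset))

print(len(chords))                    # 31
print(chord_map[frozenset({"k1"})])   # 'a' -- the chord for pressing k1 alone
```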

If you want to try a simulation of the device, check out Engelbart’s Chord by Paul Tarjan.



Our new overlords?

Ken Jennings’s final answer vs. Watson

The recent set of Jeopardy! matches between Ken Jennings, Brad Rutter and Watson, IBM’s question-answering machine, has been entertaining. Sure, there’s a lot of self-promotion going on, but that’s par for the course with the show. As a trivia nut and computer scientist, this challenge appealed to me on multiple levels. As much as I would have loved to see our carbon-based brethren dominate, I’m not surprised by Watson’s victory. The format of the show presents some challenges. Watson bombed some questions in spectacular fashion, but for straight-up knowledge questions, the machine was dominant.

If you want to read more about the match, there’s lots of commentary, so I’ll leave it at that. However, there is one article in particular that I want to point out. In his answer to the final question, Jennings riffed on a classic Simpsons line in welcoming our new computer overlords. Ben Zimmer takes exception to the comment in his piece about the match for The Atlantic.

Elsewhere, Ferrucci has been more circumspect about Watson’s level of “understanding.” In an interview with IBM’s own magazine ForwardView, he said, “For a computer, there is no connection from words to human experience and human cognition. The words are just symbols to the computer. How does it know what they really mean?” In other words, for all of the impressive NLP programming that has gone into Watson, the computer is unable to penetrate the semantics of language, or comprehend how meanings of words are shot through with allusions to human culture and the experience of daily life.

Stephen Baker makes a similar point: we still have a long way to go before we have computers with true natural language processing.

Baker’s undoubtedly right about that, but we’re still dealing with the limited task of question-answering, not anything even vaguely approaching full-fledged comprehension of natural language, with all of its “nuance, slang, and metaphor.” If Watson had chuckled at that “computer overlords” jab, then I’d be a little worried.

After the Toronto answer to the question about U.S. cities, I remember thinking that Watson must be joking. I actually thought the machines were mocking us on national television. It concerned me. Time to figure out where I put that red pill.



Creating the 6502 processor

The MOS 6502 was created as a cheap alternative to the Motorola 6800, and it powered much of the consumer electronics of the late 1970s and 1980s. Its layout was created by one man in one iteration with no errors.

These days, you can’t design and lay out a computer chip without a computer. An Intel Core 2 chip has hundreds of millions of transistors. The 6502 had 3,510, and an engineer—a person, not a computer—had to draw each one by hand to lay out the chip. Mainly it was a single engineer, Bill Mensch.

The original paper plans are long gone, but a team managed to create a visual simulation of the 6502 at the transistor level.



MacPaint source

Apple has donated the MacPaint source code to the Computer History Museum. Bill Atkinson was responsible for the code, including QuickDraw, which formed a large portion of the original Macintosh operating system.

A reporter asked Steve Jobs, “How many man-years did it take to write QuickDraw?” Steve asked Bill, who said, “Well, I worked on it on and off for four years.” Steve then told the reporter, “Twenty-four man-years.” Obviously Steve figured, with ample justification, that one Atkinson year was the equivalent of six ordinary programmer years.

The main source is written in Pascal, and is quite beautiful to read — you can tell that he took pride in it. The rest of the code is written in assembly language for the 68000 processor.