Daily Log photo archive

COBOL coding form from http://www.csis.ul.ie/cobol/course/COBOLIntro.htm

1959-05-28: A meeting at the Pentagon lays the foundations for the computer language that will later be known as COBOL, which goes on to become a mainstay of business computing for the next four decades.

COBOL, short for Common Business-Oriented Language, was one of the earliest computer languages. It was also, along with Fortran, one of the first programming languages to be based on English words.

It owes its existence to Grace Hopper, one of the earliest computer programmers. Hopper cut her programming teeth in the U.S. Naval Reserve, writing machine code for the Harvard Mark I computer during World War II. In the late 1950s, she came up with the idea that computer languages could be made to resemble human language, making them far more understandable than the assembly language and machine code used for all computer programming up to that point.

Sensing an opportunity to make computer programming more accessible and useful for business, the attendees of the 1959 Pentagon meeting set up several working committees. These included representatives from various computer manufacturers, so that the language would be machine-independent. The most productive of those committees quickly developed the initial language specification, using Hopper’s Flow-Matic language as a starting point and extending it with ideas from IBM’s business-oriented Fortran sibling, Comtran.

By December 1959, the committee had finished its specifications and named the language COBOL. The first COBOL compilers were built shortly thereafter, in 1960. The language evolved somewhat and became an ANSI specification in 1968.

COBOL’s appeal to business programmers was its readability, accessibility and the ease with which it could be used to compute business functions. By 1997, the Gartner Group estimated that 80 percent of the world’s businesses ran on COBOL, with a cumulative total of 200 billion lines of code in existence.

That legacy turned into an enormous burden, as IT administrators made the belated discovery that COBOL’s language constructs had encouraged programmers to store year data with just two digits. That spurred fears of potential system crashes when the year 2000 rolled around, because such software would suddenly start reporting, for instance, the age of someone born in 1959 as -59 (00 – 59 = -59) instead of 41 (2000 – 1959 = 41). Suddenly, thousands of COBOL programmers were pulled out of retirement to comb through stacks of old code, updating programs to ensure their continued viability after the year 2000.
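
The arithmetic behind the failure is easy to demonstrate. Here is a minimal sketch in Python (purely for illustration; the legacy code in question was COBOL, which typically declared such years as two-digit numeric fields) showing how two-digit year storage produces a nonsensical age once the century rolls over:

    # Illustrative sketch only: the Y2K arithmetic, shown in Python rather than COBOL.
    # Years are stored with two digits, as many legacy records were.

    def age_two_digit(birth_yy: int, current_yy: int) -> int:
        """Age computed from two-digit years, as legacy code effectively did."""
        return current_yy - birth_yy

    def age_four_digit(birth_year: int, current_year: int) -> int:
        """Age computed from full four-digit years."""
        return current_year - birth_year

    # Someone born in 1959, checked in 1999 and then again in 2000:
    print(age_two_digit(59, 99))        # 40  -- looks fine in 1999
    print(age_two_digit(59, 0))         # -59 -- the Y2K failure (00 - 59)
    print(age_four_digit(1959, 2000))   # 41  -- the correct answer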

While most of those programs survived Y2K, COBOL itself hasn’t fared so well. To be sure, it’s still in use in many places (particularly old mainframe and minicomputer systems). Programming expert Grady Booch told Wired magazine in 2003 that “even an old COBOL system can end up pushed out onto the web, driving a new site.”

But COBOL itself is no longer a field of active research and study. Nobody goes to college planning to study COBOL programming, and you’d probably be laughed out of the IT department if you suggested your company’s next big programming effort should be based on the language. An effort to modernize and update COBOL standards got started in the early 2000s, but that group doesn’t appear to have updated its website since 2005.

For all intents and purposes, COBOL is on the wane. But its existence spurred the development of many other high-level computer languages that use quasi-English syntax, from BASIC to PHP, and helped put computer programming within reach of a far wider group of people than before. That’s a trend that we hope never goes out of style, by the grace of Grace. (Text from Wired’s This Day In Tech—May 28, 1959: Inventing a New Language for Business, retrieved 2011-05-30. Image from Introduction to COBOL, also retrieved 2011-05-30.)

memo sketch of Ethernet from http://ironbark.bendigo.latrobe.edu.au/subjects/DC/lectures/14/

1973-05-22: Robert Metcalfe devised the Ethernet method of network connection at the Xerox Palo Alto Research Center. He wrote: “On May 22, 1973, using my Selectric typewriter … I wrote … “Ether Acquisition” … heavy with handwritten annotations—one of which was “ETHER!”—and with hand-drawn diagrams—one of which showed ‘boosters’ interconnecting branched cable, telephone, and radio ethers in what we now call an internet…. If Ethernet was invented in any one memo, by any one person, or on any one day, this was it.”—Robert M. Metcalfe, “How Ethernet Was Invented,” IEEE Annals of the History of Computing, Volume 16, No. 4, Winter 1994, p. 84. (Text adapted from the Computer History Museum’s Timeline of Computer History, retrieved 2011-05-22. Image from Data Communications: Lecture 14—Ethernet, also retrieved 2011-05-22.)

page 5 of the original VisiCalc reference card from http://bricklin.com/history/refcard5.htm

1979-05-11: VisiCalc was given its first public demonstration on this date. VisiCalc, the first spreadsheet, was one of the key products that helped bring the microcomputer from the hobbyist’s desk into the office. Before the release of this groundbreaking software, microcomputers were thought of as toys; VisiCalc changed that.

VisiCalc was first released for the Apple II, which it quickly turned into an invaluable tool for businesspeople—at least until IBM moved into the “personal computing” market in 1981.

For more of this story, read VisiCalc and the Rise of the Apple II and Software Arts & VisiCalc: The Idea. (Text adapted from the Low End Mac’s Before the Macintosh and the Computer History Museum’s This Day in History, both retrieved 2011-05-09. Image from VisiCalc progenitor Dan Bricklin’s website, also retrieved 2011-05-09.)

original 2-button Microsoft mouse

1983-05-02: Microsoft Corp. announced the two-button Microsoft Mouse, which it introduced to go along with its new word processor, Microsoft Word. Microsoft built about 100 000 of these fairly primitive units for use with IBM and IBM-compatible personal computers but sold only 5 000 before finding success with a 1985 version that featured, amongst other improvements, near-silent operation on all surfaces. But for a more interesting story—with great photos—read Mouse Design: 1963 to 1983. (Text adapted from the Computer History Museum’s This Day in History, retrieved 2011-05-02. Image from Mouse Design: 1963 to 1983, also retrieved 2011-05-02.)

Shannon with his famous electromechanical mouse Theseus, which he built to solve a maze in one of the first experiments in artificial intelligence—Wikipedia

1916-04-30: Claude Shannon is born in Gaylord, Michigan. Known as the inventor of information theory, Shannon is the first to use the word “bit.” A contemporary of John von Neumann, Howard Aiken, and Alan Turing, Shannon sets the stage for a basic theory of the information processed by the machines those other pioneers developed. He investigates information distortion, redundancy, and noise, and provides a means of measuring information. He identifies the bit as the fundamental unit of both data and computation.

Shannon was inspired by the work of George Boole (2 November 1815 – 8 December 1864), an English mathematician and philosopher. Boole’s work was relatively obscure, except among logicians. At the time, it appeared to have no practical uses. However, approximately seventy years after Boole’s death, Shannon recognized that Boole’s work could form the basis of mechanisms and processes in the real world and that it was therefore highly relevant. He proved that circuits with relays could solve Boolean algebra problems—and employing the properties of electrical switches to process logic is the basic concept that underlies all modern electronic digital computers.
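
Shannon’s insight is easy to illustrate: switches (or relay contacts) wired in series behave like Boolean AND, and switches wired in parallel behave like Boolean OR, so networks of relays can evaluate Boolean expressions. A minimal sketch in Python, purely for illustration:

    # Illustrative sketch of Shannon's observation that switch networks
    # compute Boolean algebra. A closed switch is True, an open one is False.

    def series(a: bool, b: bool) -> bool:
        """Two switches in series conduct only if both are closed (Boolean AND)."""
        return a and b

    def parallel(a: bool, b: bool) -> bool:
        """Two switches in parallel conduct if either is closed (Boolean OR)."""
        return a or b

    # Example circuit: conducts when switch x is closed AND either y or z
    # is closed -- i.e. it evaluates x AND (y OR z).
    def circuit(x: bool, y: bool, z: bool) -> bool:
        return series(x, parallel(y, z))

    print(circuit(True, False, True))   # True
    print(circuit(True, False, False))  # False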

Shannon is shown here with his famous electromechanical mouse Theseus in one of the first experiments in artificial intelligence. (Text liberally adapted from Wikipedia’s article on Boole, retrieved 2010-10-26, the Computer History Museum’s This Day in History, retrieved 2011-04-23, and Wikipedia’s article on Shannon. Image from This Day in History also retrieved 2011-04-23.)

2011-04-13: Day of Pink is the International Day against Bullying, Discrimination, and Homophobia in schools and communities. Day of Pink organizers invite everyone to celebrate diversity by wearing a pink shirt and by organizing activities in their workplaces, organizations, communities, and schools.

Have you ever seen a friend hurt because of discrimination? Have you been hurt yourself? Discrimination comes in many forms including racism, sexism, homophobia, transphobia, ableism, ageism, and anti-Semitism—just to name a few. These social diseases create barriers, bullying, harassment, hate, and violence. No one should have to experience the negativity created by discrimination. Day of Pink is more than just a symbol of a shared belief in celebrating diversity—it’s also a commitment to being open-minded, accepting differences, and learning to respect each other.

The Toronto District School Board supports anti-bullying through a Board-wide call to action, urging every school and every staff member and student to join the anti-bullying campaign by wearing pink on Wednesday, April 13, to show our support against bullying and for tolerance. (Image and abridged text from Day of Pink; additional abridged text from TDSB. All retrieved 2011-04-11.)

1992-04-06: Microsoft Corporation releases Windows 3.1, an operating system that provides IBM and IBM-compatible PCs with a graphical user interface (though Windows was not the first such interface for PCs). The retail price was 149 USD. In replacing the earlier DOS command-line interface with its Windows system, however, Microsoft created an OS similar to the Macintosh operating system and was (unsuccessfully) sued by Apple for copyright infringement. (Image from Guidebook: Graphical User Interface Gallery and text abridged from This Day in History, both retrieved 2011-04-04.)

2006-03-21: The first tweet is sent. Read about the Twitter 5th Anniversary in The Globe and Mail and at geek.com. (Image from Twitter retrieved 2011-03-21.)

In Cupertino, California, high-school friends Steve Wozniak and Steve Jobs produced their first computer, the single-board Apple I, in a garage workshop in 1976. After selling around 200 computers, Jobs attracted the attention of some investors, co-founding Apple Computer, Inc., with Wozniak, and introducing the Apple II computer in 1977.

It was the engineering skill of Wozniak (known affectionately as “Woz”), the marketing ability of Jobs, and the hard work of many of the early employees that contributed to Apple’s early success.

Apple was the first company to mass-market the graphical user interface in a computer: the Macintosh, introduced in 1984, was a product that re-defined personal computing. (PDF, image, and abridged text from Selling the Computer Revolution: Marketing brochures in the collection of the Computer History Museum, all retrieved 2011-02-27.)

A 10.8 × 10.8 cm card of core memory of 64 × 64 bits, as used in a CDC 6600, with inset close-up

1956-02-28: Jay Forrester at MIT is awarded a patent for his coincident current magnetic core memory. Forrester’s invention, given U.S. patent 2 736 880 for a “multicoordinate digital information storage device,” became the standard memory device for digital computers until supplanted by solid state (semiconductor) RAM in the mid-1970s. The photo is of a 10.8 × 10.8 cm card of core memory of 64 × 64 bits, as used in a CDC 6600. (Text abridged from This Day in History: February 28; image from Magnetic-core memory: Wikipedia. Both retrieved 2011-02-27.)

Image from DailyShite.com retrieved 2011-02-04.

Whirlwind computer elements

1944-12-14: The US Navy asks MIT to develop the Airplane Stability and Control Analyzer program, the beginning of project Whirlwind. The Whirlwind computer was the first to operate in real time, the first to use video displays for output, and the first that was not simply an electronic replacement of older mechanical systems. Its development led indirectly to almost all business computers and minicomputers in the 1960s. (Text abridged from This Day in History: December 14 and Whirlwind (computer): Wikipedia; image from Whirlwind (computer): Wikipedia; all retrieved 2010-12-14.)

portrait of Ada Lovelace

10 December 1815 / 27 November 1852: Augusta Ada King, Countess of Lovelace, born Augusta Ada Byron, was an English writer known chiefly for her work on Charles Babbage’s early mechanical general-purpose computer, the Analytical Engine. Her notes on the engine include what is recognized as the first algorithm intended to be processed by a machine; as such, she is regarded as the world’s first computer programmer. (Text abridged from Wikipedia and image from Computing Northern Ireland Degrees and Careers, retrieved 2010-12-06.)

Lady Lovelace’s father was Lord Byron, the famous poet. She was educated by private tutors and undertook advanced study in mathematics with Augustus De Morgan, the equally famous British mathematician and logician. In the 1970s, the Ada programming language was named after her. Based on the language Pascal, Ada was designed for the United States Department of Defense (DoD) to supersede the hundreds of programming languages then used by the DoD. Ada is strongly typed, and its compilers are validated for reliability in mission-critical applications, such as avionics software. Ada is an international standard; the current version (known as Ada 2005) is defined by a joint ISO/ANSI standard. (Text abridged from Wikipedia, retrieved 2010-12-06.)

ARPANET log

21 November 1969: First ARPANET Link Put Into Service—ARPANET was an early computer network developed by J.C.R. Licklider, Robert Taylor, and other researchers for the U.S. Department of Defense’s Advanced Research Projects Agency (ARPA). It connected a computer at UCLA with a computer at the Stanford Research Institute in Menlo Park, California. In 1973, the government commissioned Vinton Cerf and Robert E. Kahn to create a national computer network for military, governmental, and institutional use. The network used the packet-switching, flow-control, and fault-tolerance techniques developed by ARPANET. Historians consider this worldwide network to be the origin of the Internet. (Text retrieved 2010-11-22 from This Day in History: November 21.)

The image above is of the record of the first message ever sent over the ARPANET. It took place at 22h30 on 29 October 1969. This record is an excerpt from the IMP Log that was kept at UCLA. Leonard Kleinrock was supervising the student/programmer Charley Kline (CSK), and they set up a message transmission to go from the UCLA SDS Sigma 7 Host computer to the SRI SDS 940 Host computer. The transmission itself was simply to “login” to SRI from UCLA. They succeeded in transmitting the “l” and the “o”—and then the system crashed! Hence, the first message on the Internet was “Lo!” They were able to do the full login about an hour later. (Image and text retrieved 2010-11-22 from The Day the Infant Internet Uttered its First Words.)

13-yr-old Bill Gates with future Microsoft co-founder Paul Allen in 1968

18 November 1970: Microsoft Corp. co-founder and CEO Bill Gates gets his start in computer programming at the Lakeside School in Seattle. The school owned some early computers, and Gates and his friends spent nearly all their time pushing the machines to their limits. Time on the Lakeside machines and others in the Seattle area was costly, however, so the newly formed Lakeside Programmers Group offered Information Sciences Inc. free programming services on its PDP-10 in return for free time on the computer. The group designed a payroll program for the company. (Image of a 13-year-old Bill Gates with future Microsoft co-founder Paul Allen in 1968 from Bill Gates—Founder of Microsoft and text from This Day in History: November 18, both retrieved 2010-11-16.)

Windows 1.0 screenshot

10 November 1983: Microsoft announces a new product, Windows, to compete with other graphical environments for computers, such as the interface on the Apple Lisa. After several delays, Windows 1.0 finally became available to the public in 1985. Its major features included pull-down menus, tiled windows, mouse support, and cooperative multitasking of Windows applications. Although Windows 1.0 saw some use, the Windows interface did not gain general acceptance until version 3.0. (Image from Galerie d'images : De Windows 1.0 à Windows 7 : 23 ans d’évolutions en images and text from This Day in History: November 10, both retrieved 2010-11-07.)

silver A.M. Turing award on white background

The Turing Award, recognized as the highest distinction in computer science and the “Nobel Prize of computing,” is named after Alan Mathison Turing, a British scientist, mathematician, and Reader in Mathematics at The University of Manchester. Turing is frequently credited as the father of theoretical computer science and artificial intelligence. (Text liberally adapted from Wikipedia’s article on the Turing Award, retrieved 2010-11-01.) ACM’s most prestigious technical award is accompanied by a prize of 250 000 USD. It is given to an individual selected for lasting contributions of major technical importance to the computer field. (Image and text from ACM: Association for Computing Machinery, retrieved 2010-11-01.)

black and white line drawing of George Boole

George Boole (2 November 1815 – 8 December 1864) was an English mathematician and philosopher. Boole’s work was relatively obscure, except among logicians. At the time, it appeared to have no practical uses. However, approximately seventy years after Boole’s death, Claude Shannon (an American mathematician and electronic engineer known as the father of information theory and cryptography) recognized that Boole’s work could form the basis of mechanisms and processes in the real world and that it was therefore highly relevant. He proved that circuits with relays could solve Boolean algebra problems—and employing the properties of electrical switches to process logic is the basic concept that underlies all modern electronic digital computers. Hence, Boolean algebra became the foundation of practical digital circuit design, and Boole provided the theoretical grounding for the Digital Age. (Image from, and text liberally adapted from, Wikipedia’s article on Boole, retrieved 2010-10-26.)

Black and white image of a woman standing behind a DEC word processor and line printer.

Publicity photo for a DEC word processor and line printer (Retrieved 2010-10-13 from DEC word processor and line printer.)

IBM 350 brochure photo

The 350 Disk Storage Unit consisted of the magnetic disk memory unit with its access mechanism, the electronic and pneumatic controls for the access mechanism, and a small air compressor. Assembled with covers, the 350 was 60 inches long, 68 inches high and 29 inches deep. It was configured with 50 magnetic disks containing 50 000 sectors, each of which held 100 alphanumeric characters, for a capacity of 5 million characters. (Quotation and photograph retrieved 2010-10-04 from IBM Archives: IBM 350 disk storage unit.)

Cray-1 supercomputer at the Computer History Museum in Mountain View, California. Photograph by Ed Toton, 2007-04-15

The Cray-1 was a supercomputer designed, manufactured, and marketed by Cray Research. The first Cray-1 system was installed at the U.S. Los Alamos National Laboratory in 1976, and it went on to become one of the best known and most successful supercomputers in history. The Cray-1’s architect was Seymour Cray and the chief engineer was Cray Research co-founder Lester Davis. (Quotation retrieved 2010-09-21 from Cray-1.)

Students operating IBM 029 keypunches

Students operating IBM 029 keypunches in the keypunch room of the University of Waterloo Mathematics and Computer Building during the ’67–’68 school year. The newer 029 keypunches were used by graduate students while undergraduates were limited to using the older 026 keypunch terminals. (Quotation and photo retrieved 2010-09-08 from A Chronology of Computing at The University of Waterloo.)

women programming ENIAC

ENIAC, [perhaps] the world’s first electronic computer, was not programmed but wired up to solve a problem. It became operational just in time for Stan Frankel and Nick Metropolis to run the calculations for “the Los Alamos problem,” which sought to find out if a hydrogen bomb [were] feasible. (Quotation and photo retrieved 2010-09-05 from The 9100 Project.)