Who Really Invented the Internet?


Contrary to legend, it wasn't the federal government, and the Internet had nothing to do with maintaining communications during a war.

A telling moment in the presidential race came recently when Barack Obama said: "If you've got a business, you didn't build that. Somebody else made that happen." He justified elevating bureaucrats over entrepreneurs by referring to bridges and roads, adding: "The Internet didn't get invented on its own. Government research created the Internet so that all companies could make money off the Internet."

It's an urban legend that the government launched the Internet. The myth is that the Pentagon created the Internet to keep its communications lines up even in a nuclear strike. The truth is a more interesting story about how innovation happens—and about how hard it is to build successful technology companies even once the government gets out of the way.

For many technologists, the idea of the Internet traces to Vannevar Bush, the presidential science adviser during World War II who oversaw the development of radar and the Manhattan Project. In a 1945 article in The Atlantic titled "As We May Think," Bush defined an ambitious peacetime goal for technologists: Build what he called a "memex" through which "wholly new forms of encyclopedias will appear, ready made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified."

That fired imaginations, and by the 1960s technologists were trying to connect separate physical communications networks into one global network—a "world-wide web." The federal government was involved, modestly, via the Pentagon's Advanced Research Projects Agency Network. Its goal was not maintaining communications during a nuclear attack, and it didn't build the Internet. Robert Taylor, who ran the ARPA program in the 1960s, sent an email to fellow technologists in 2004 setting the record straight: "The creation of the Arpanet was not motivated by considerations of war. The Arpanet was not an Internet. An Internet is a connection between two or more computer networks."

Full story.

A response piece from the LA Times.

Artificial Intelligence Pioneer John McCarthy Dies at 84

by Stan Schroeder

John McCarthy, the inventor of programming language Lisp and the man who coined the term “artificial intelligence,” has died at the age of 84.

Born in 1927, McCarthy had a PhD in mathematics and was a long-standing professor at Stanford University. He was the first to use the term “artificial intelligence” at a conference at Dartmouth College in 1956 and is one of the founders of the field of A.I. research.

His programming language, Lisp, together with its dialects, is often the language of choice for artificial intelligence applications.

Full story.

Big Blue: 100 years making things compute

By Michael Hill and Jordan Robertson, Associated Press

Endicott, N.Y. - Google, Apple and Facebook get all the attention. But the forgettable everyday tasks of technology - saving a file on your laptop, swiping your ATM card to get 40 bucks, scanning a gallon of milk at the checkout line - that's all IBM.

International Business Machines turns 100 on Thursday without much fanfare. But its much younger competitors owe a lot to Big Blue.

After all, where would Groupon be without the supermarket bar code? Or Google without the mainframe computer?

"They were kind of like a cornerstone of that whole enterprise that has become the heart of the computer industry in the U.S.," says Bob Djurdjevic, a former IBM employee and president of Annex Research.

IBM dates to June 16, 1911, when three companies that made scales, punch-clocks for work and other machines merged to form the Computing Tabulating Recording Co. The modern-day name followed in 1924.

Full story.

Inquisitiveness of Milwaukee native leads to a Nobel prize

By Mark Johnson of the Journal Sentinel Posted: Oct. 8, 2009

Many miles and years removed from the competitive dinner-table debates of his childhood in Milwaukee and Wauwatosa, Yale chemist Thomas A. Steitz awoke at 5:20 Wednesday morning to the sound of a ringing phone, long distance from Sweden.

Steitz, the caller said, had won the 2009 Nobel Prize in chemistry. One by one, members of the Nobel Committee then got on the phone to offer personal congratulations.

"They wanted to be sure I knew this was not a hoax," Steitz said in an interview with the Journal Sentinel. "Since I knew some of the members of the committee, I could recognize their voices."

Sharing the prize and the $1.4 million with Steitz, 69, were Venkatraman Ramakrishnan of the MRC Laboratory of Molecular Biology in England and Ada E. Yonath of the Weizmann Institute of Science in Israel. The three scientists were honored for fundamental work that revealed the structure and function of ribosomes, which translate the genetic instructions carried from DNA into the proteins necessary for virtually every human action from breathing to thinking.

Full story.

Physics Legends

Critical Point: November 2006

The history of science is full of mythical stories that we repeat, even when we suspect that they are probably wrong. Robert P Crease recounts several and asks for yours

Richard Feynman starts his book QED: The Strange Theory of Light and Matter with a remarkable confession. He tells a brief story about the origins of his subject - quantum electrodynamics - and then says that the "physicist's history of physics" that he has just related is probably wrong. "What I am telling you," Feynman says, "is a sort of conventionalized myth-story that the physicists tell to their students and those students tell to their students, and is not necessarily related to the actual historical development, which I do not really know!"

. . .

False legends

In contrast, many other common legends are entirely unfounded. Sometimes they persist because they conveniently reinforce established dogma, such as the story that the Catholic Church condemned the use of zero and Arabic numerals. Naturalists are also said to have convincingly proved evolution in action by showing in the 1950s that the increased abundance of industrial soot in the environment led to more melanic (darker, mutant) peppered moths. This experiment is now known to be badly flawed.

Other false stories are popular simply because they are fun. An example is the one about physicist Donald Glaser coming up with the idea of the bubble chamber one night at a bar after popping open a beer. A few years ago, after hearing this story one too many times, I called Glaser to ask if it were true. He assured me that it was false – he came up with the idea behind the bubble chamber via the application of cold, hard reason. However, Glaser admitted that, for sheer amusement, he once tried to see charged particle tracks in soda bottles.

Full story.

Invention on the Decline

Science: Wanna be an inventor? Don't bother


Posted 7/7/05
By Thomas Hayden

It's a feeling all too common to anyone who has ever dreamed of being a great innovator: All the really good stuff has already been invented. Pressing on regardless, surely, is what separates the Benjamin Franklins and Thomas Edisons from the rest of us. Or is it?

Sitting there reading this on a computer screen, listening to your iPod, and taking calls on your cellphone, it's hard to believe that we're not living in the golden age of invention. But a pair of new reports suggests that coming up with new ideas is getting harder every year.

In an analysis to be published in Technological Forecasting and Social Change, Jonathan Huebner, a physicist working at the Naval Air Warfare Center in China Lake, Calif., tracks the rate of innovation through history. Plotting a timeline of 7,200 major technological advances dating to the Renaissance against world population, he found that the number of key inventions per person actually peaked in 1873 and has been on the decline ever since. In a similar analysis of U.S. patent records dating back to 1790, Huebner found that Americans reached their peak inventiveness in 1915. Despite ever greater education and research funding, Huebner told the British science magazine New Scientist, he expects per capita technological advance to hit medieval rates by 2024.

Full story.

25 years later, 1980 Bayh-Dole act honored as foundation of an industry

The building of biotech
Bernadette Tansey, Chronicle Staff Writer

Tuesday, June 21, 2005

Philadelphia -- In 1980, Birch Bayh, a veteran Indiana senator, was defeated after serving 18 years in a job he loved. But in the final hour of a lame-duck session held after the election he lost, he managed to squeak a last bill through Congress.

Twenty-five years later, Bayh is being hailed as a visionary whose hard-won legislation helped create the biotechnology industry by spawning a whole generation of scientist-entrepreneurs.

The Bayh-Dole act allowed universities and their faculty members to stake patent claims on discoveries they made through research funded by such federal agencies as the National Institutes of Health, instead of leaving ownership of the intellectual property with the government.

That change accelerated the use of academic breakthroughs like gene splicing to develop biotech drugs and other products, giving rise to a three-way partnership of government, universities and startup firms that is "the envy of every nation," said Biogen Idec Inc. Chief Executive Officer James Mullen.

Full story.

Jack Kilby, whose 1958 invention led to today's ubiquitous microchip, dies at 81


AP Technology Writer
DALLAS — Nobel laureate Jack Kilby, whose 1958 invention of the integrated circuit ushered in the modern electronics age and made possible the microprocessor, has died after a battle with cancer.

Kilby died Monday at age 81 at his Dallas home, said Texas Instruments Inc., where he worked for many years.

Before the integrated circuit, electronic devices relied on bulky and fragile circuitry, including glass vacuum tubes. Afterward, electronics could become increasingly complex, reliable and efficient, powering everything from the iPod to the Internet.

During his first year at Texas Instruments, using borrowed equipment, Kilby built the first integrated circuit into a single piece of semiconducting material half the size of a paper clip. Four years later in 1962, Texas Instruments won its first major integrated circuit contract, for the Minuteman missile.

. . .

He earned degrees in electrical engineering from the universities of Illinois and Wisconsin, and began his career in 1947 with the Centralab Division of Globe Union Inc. in Milwaukee, developing ceramic-based, silk-screened circuits for electronic products.

Full story.

Engineer who worked on Apollo space program dies


The Associated Press
BARNSTABLE, Mass. -- Edward Schwarm, an electrical engineer whose work on the Apollo space program helped NASA land the first man on the moon, died of skin cancer last month at his home on Cape Cod. He was 82.

Schwarm was working at the Massachusetts Institute of Technology when the school teamed up with NASA on the Apollo missions.

He developed some of the technology used in the Apollo 11 mission, the first lunar landing, and was part of the team that helped the Apollo 13 astronauts return safely to Earth.

Schwarm also was an accomplished inventor who owned 11 patents for innovations in space aviation and electronic power systems.

"He was an inventor, and he always looked at problems from a practical view," said his daughter, Shutesbury resident Claudia Gere.

During World War II, the Milwaukee native left the University of Wisconsin at Madison to join the Army.

Full story.

Harold Wooster, 86; Computer Pioneer


By Yvonne Shinhoster Lamb
Washington Post Staff Writer
Friday, June 3, 2005; Page B06

Harold Wooster, whose decades-long career in information science influenced the development of computer technology and medical television, died of a heart attack May 20 at the Carlisle (Pa.) Regional Medical Center. He was 86.

As chief of the information sciences division of the Air Force Office of Scientific Research in the 1960s, Dr. Wooster awarded crucial early grants to many of the scientists and engineers whose research spurred the development of the Internet and personal computer. In the 1970s, while working at the National Library of Medicine, he supervised pioneering experiments in telemedicine, including studies on how residents of remote Alaskan islands could receive advice from doctors on the mainland.

Full story.