History of the Internet: Over 50 Years Ago


On October 29, 1969, at 10:30 PM, a computer grad student at U.C.L.A. named Charley Kline sent a message to S.R.I. (Stanford Research Institute). It was the first connection between two networked computers. The Internet began!

 

Charley Kline

We set up a telephone connection between us and the guys at SRI…
We typed the L and we asked on the phone,

“Do you see the L?”
“Yes, we see the L,” came the response.

We typed the O, and we asked,
“Do you see the O?”
“Yes, we see the O.”

Then we typed the G, and the system crashed…
Yet a revolution had begun…

 

 

“Talked to SRI” entry in the October 29, 1969 log book

 

Q: What is the Internet?

Early ARPANET console

It’s the “network of interconnected networks.” By that definition, October 29, 1969, was the beginning. It was funded by ARPA (the Advanced Research Projects Agency of the U.S. Department of Defense) and became known as ARPANET. Four sites were initially connected: U.C.L.A., the Stanford Research Institute (S.R.I.), U.C. Santa Barbara, and the University of Utah.

 

Q: Is that all there is to it?

Vint Cerf

There is quite a bit more to the history, both before and after. Vint Cerf, “the father of the Internet” and now a VP at Google, developed TCP/IP (Transmission Control Protocol/Internet Protocol) with Robert Kahn as the universal language that enabled computers to talk to each other across interconnected networks, whatever hardware they ran on. They published the design in an IEEE (Institute of Electrical and Electronics Engineers) journal in 1974, and in 1981 a complete TCP/IP protocol specification was published.

Vint explained it as “an engineering problem intended to be a resource for sharing time on different brands of computers” — and there were many brands back then.
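TCP is still how two machines hold that kind of conversation today. Here is a minimal sketch (mine, not the article’s) of two programs exchanging a message over TCP using Python’s standard socket library; the loopback address, port, and message are arbitrary illustrations.

```python
import socket

HOST, PORT = "127.0.0.1", 50007   # arbitrary loopback address and port for this sketch

def serve_once():
    """Accept a single TCP connection and echo back whatever the client sends."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _addr = srv.accept()
        with conn:
            data = conn.recv(1024)            # read up to 1 KB from the client
            conn.sendall(b"echo: " + data)    # TCP delivers it reliably and in order

def send_message(text):
    """Open a TCP connection, send a message, and return the server's reply."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(text.encode())
        return cli.recv(1024)
```

Run serve_once() in one process and send_message("LOGIN") in another, and you replay, in miniature, the kind of host-to-host exchange Kline attempted in 1969, with the reliability now handled by TCP rather than by a phone call.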

ARPANET

December 1969—the U.S. Department of Defense’s Advanced Research Projects Agency (ARPA) connected four computer network nodes at the University of California, Los Angeles (U.C.L.A.), the Stanford Research Institute (S.R.I.) in Menlo Park, Calif., U.C. Santa Barbara (U.C.S.B.), and the University of Utah. The “Sigma 7” note next to the circle depicting the U.C.L.A. node refers to the Sigma 7 computer at U.C.L.A.’s Network Measurement Center that Vint Cerf connected to ARPANET.

 

Q: Is the Internet synonymous with the World Wide Web?

No, though confusingly, many people use the terms interchangeably. They’re not the same thing.

 

Q: How is the W.W.W. different from the Internet?

At the risk of oversimplifying, I like to say that the World Wide Web is a “personality” of the Internet. In other words, people used to work on the Internet with a command-line interface, typing commands, usually into a UNIX console, using a variety of protocols. This was classical Geek. You had to know these commands by heart.

Over the next several years, esoteric programs like WAIS, search engines like Archie, and protocols like Gopher allowed the I.T. high priesthood to navigate this nascent platform. I first got on the Internet in 1985, when the startup I was working for got Internet access through Sun Microsystems, part of the main backbone of the Internet at that time.

 

Then, beginning at the end of the ’80s, two things happened that were like moving from Morse code to a graphical presentation of information:

Tim Berners-Lee

1) On March 12, 1989, Tim Berners-Lee, an Oxford-trained physicist and researcher at the high-energy physics lab CERN (European Organization for Nuclear Research) in Geneva, proposed a hypertext method of connecting one page of information to another with “links.”

He called this protocol the HyperText Transfer Protocol, or HTTP; you see it at the beginning of a web link. His team also developed URLs (Uniform Resource Locators) and HTML (HyperText Markup Language) to create websites. An early version of the World Wide Web Project is described here.
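To see how the three pieces fit together, here is a minimal sketch (mine, not the article’s) in Python: the URL names the resource, HTTP carries the request and response, and HTML is the markup that comes back. The address used is CERN’s restored copy of the original World Wide Web project page.

```python
from urllib.request import urlopen

# The URL names the resource; CERN still serves a restored copy of the first project page.
url = "http://info.cern.ch/hypertext/WWW/TheProject.html"

with urlopen(url) as response:                 # issues an HTTP GET request for that URL
    print(response.status)                     # HTTP status code, e.g. 200 ("OK")
    html = response.read().decode("utf-8", errors="replace")

print(html[:200])                              # the first few characters of the HTML markup
```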

Now Sir Tim Berners-Lee is the director of the World Wide Web Consortium (W3C). He was awarded the renowned Turing Award for inventing the World Wide Web and the protocols and algorithms that allowed it to scale. The first-ever website from 1991 can be found here.

 

Q: When did the W.W.W. start?

Marc Andreessen

2) In 1993, Marc Andreessen was working at the National Center for Supercomputing Applications (NCSA) at the University of Illinois, where he met Tim Berners-Lee. Andreessen developed a user-friendly computer program called a “web browser,” which he named Mosaic.

It allowed people to see a graphical representation of a “page” on their computer. Pages were displayed with text and pictures, and objects or words could be clicked with a mouse to jump to another page elsewhere on the Internet.
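Under the hood, what a browser does with a page is conceptually simple: parse the HTML, find the anchor tags, and treat each href as another page it can fetch when clicked. A rough sketch of that step, using Python’s built-in HTML parser (my illustration, not how Mosaic itself was written):

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collect the href target of every <a> tag, the way a browser finds links to follow."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A toy page: clicking "the next page" would jump elsewhere on the Internet.
page = '<p>See <a href="http://example.com/next.html">the next page</a>.</p>'

finder = LinkFinder()
finder.feed(page)
print(finder.links)   # ['http://example.com/next.html']
```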

 

The following year, I got my hands on the browser and created my first internal Product Marketing website while working at Sun Microsystems. Back then, you could read the Web’s “What’s New” page that discussed all the new websites that had appeared overnight. You can’t do that anymore.

Andreessen came to Silicon Valley, just down the road from where I worked at Sun Microsystems, and started Netscape, the commercialization of the Mosaic browser. (Mosaic is also the basis for Internet Explorer, but that’s a long story.)

He licensed a technology called Java from Sun (I was a Java evangelist for Sun) to include in his browser. His company had developed a scripting language called LiveScript, which Sun convinced him to rename to JavaScript.

 

 

Q: What happened next?

In 1989, commercial dial-up Internet Service Providers (I.S.P.s) became available. With a modem (MOdulation/DEModulation, a device that converts a computer signal to a phone signal and back again), non-technical users could now access the Internet. Popular options included CompuServe and America Online (AOL). Email had been developed a decade and a half earlier, but now average users could log in to AOL and hear “You’ve Got Mail.”

We called this phenomenon, in retrospect, Web 1.0. Instead of saying “the Net,” we now say “the Web.” Although the “www” was available to the public, few knew about it or had access to it. Those of us at high-tech companies like Sun, at universities, at weapons labs (like the Department of Energy’s Lawrence Livermore, Sandia, and Los Alamos National Laboratories), or at major defense contractors had known about the Internet and used it regularly through the ’70s and ’80s.

Most people didn’t see a standard, non-proprietary computer browser until sometime in 1995 or 1996. For example, I created billpetro.com in 1995, as more people had access to “the Web” by then. I could now host the history articles I had previously posted to USENET newsgroups on my own website.

In 1995, Amazon.com, Craigslist, and eBay came online. Starting in 1998, the Google search engine began to eclipse all others, including AltaVista and Yahoo’s search.

Between 1996 and 1998, I traveled the world for Sun Microsystems, talking about the Internet, intranets, and Java. In 1998-99, the dot-com world expanded. In 2000 came the dot-bomb, as many overvalued “web companies” crashed.

 

Q: What about Web 2.0?

Tim O’Reilly coined the term Web 2.0 in 2004. This second generation of the Web was an attempt to move beyond static web pages, or “brochureware,” to interactive content: not just “read-only” but read-write.

The term Web 2.0 was overhyped and overused at the time, but it did give rise to the explosive, infectious use of social media and social networks: early examples like GeoCities and MySpace, and later ones like Facebook, Twitter, and LinkedIn. Weblogs, or blogs, could be created using Blogger, LiveJournal, and WordPress.

 

Q: What is, or will be, Web 3.0?

Web 2.0 uses centralized, cloud-based technologies in the “web as a platform.” Many decry the concentration of all this social-graph capital in the hands of “Big Tech” companies. Amid Web 2.0, the term Web 3.0, or Web3, was coined by Gavin Wood of Ethereum; it gained traction more recently because of greater interest in cryptocurrencies. The meaning of Web 3.0 remains in the eye of the beholder: sometimes the terms Web3 and Web 3.0 are used synonymously, sometimes very differently.

 

Initially, Berners-Lee described the “semantic web” as a component of Web 3.0, but more recently, Web3 has been described as involving decentralized blockchain-based technologies, cryptocurrency, and security. Other technologies mentioned alongside Web3 are AI (Artificial Intelligence), ML (Machine Learning), and non-fungible tokens (NFTs). Going beyond the cloud technologies of Web 2.0, Web 3.0 suggests decentralized “edge computing.”

 

If Web 1.0 was “read” and Web 2.0 was “read-write,” Web 3.0 is “read-write-execute.”


 

Each wave of the Web takes ten to fifteen years to materialize fully:

  • Web 1.0: begun in 1989 by Tim Berners-Lee, but not popularized among end-users until browsers became widely available after 1995.
  • Web 2.0: coined by Tim O’Reilly in 2004 and popularized by social networks.
  • Web 3.0: coined by Gavin Wood in 2014 and popularized in 2021 by cryptocurrency.

Web 3.0 may be a marketing buzzword, but it is also a subject of academic study and high-tech innovation.

 

Q: Is the Internet revolution over?

Not by a long shot. We are not even at the beginning of the end; it is more like the end of the beginning. With the advent of smartphones in the early 2000s and the popularity brought by Apple’s iPhone in 2007, data networks and WiFi technology took the Internet mobile. More people now access information online via “apps” (applications) on mobile devices than on traditional computers.

And not just mobile phones: “wearables” like smartwatches and exercise trackers also communicate data wirelessly to and from the Internet. We now refer to these interconnected devices as the Internet of Things.

 

Q: How many devices are there in the Internet of Things (IoT)?

There are an estimated 14-18 billion IoT devices today, and the number keeps growing: by 2025, some 152,000 new devices are expected to connect to the Internet every minute, and more than 25 billion are expected by 2030. Network-connected devices include sensors, cameras, and telemetry units that “stream” vast amounts of “unstructured” data from the “edge” of the Internet into enormous data lakes, where Big Data analytics engines process it to help companies and individuals gain insights and make critical decisions.
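As a rough sketch of that edge-to-data-lake flow (my illustration; the ingestion URL and field names are made up), a device at the edge might package each reading as JSON and stream it over HTTP to a collection service:

```python
import json
import time
import urllib.request

INGEST_URL = "https://example.com/ingest"   # hypothetical data-lake ingestion endpoint

def read_sensor():
    """Stand-in for a real sensor read; returns one unstructured telemetry record."""
    return {"device_id": "thermostat-42", "temperature_c": 21.5, "timestamp": time.time()}

def stream_reading(record):
    """POST one JSON record from the 'edge' toward the data lake; returns the HTTP status."""
    req = urllib.request.Request(
        INGEST_URL,
        data=json.dumps(record).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# In practice, a device loops forever, sending a reading every few seconds:
# while True:
#     stream_reading(read_sensor())
#     time.sleep(5)
```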

WiFi-, cellular-, and now Bluetooth-connected devices include smartphones, headphones, hearing aids, pacemakers, house security cameras and alarms, thermostats, robotic vacuum cleaners, house lights, and coffee mugs.

Ember coffee mug

I’m not kidding; each morning, I pour my cappuccino into a mug with an embedded heating coil and battery. Via Bluetooth, it connects to an app on my iPhone that will keep the coffee at a set temperature until the battery runs out.

 

 

The future of the Internet is calling… you can answer it on your Apple Watch.

 

 

Bill Petro, your friendly neighborhood historian
billpetro.com
