Encyclopedia Britannica
proportions of World Wide Web content constituting the surface web, deep web, and dark web

World Wide Web


World Wide Web (WWW), the leading information retrieval service of the Internet (the worldwide computer network). The Web gives users access to a vast array of media and content, spanning the commonly accessible surface web as well as the deep web and the dark web, all connected by hypertext or hypermedia links: hyperlinks, electronic connections that tie related pieces of information together so that a user can move easily between them. Hypertext allows the user to select a word or phrase in a text and thereby retrieve other documents containing additional information about it; hypermedia documents add links to images, sounds, animations, and movies. The Web operates within the Internet's basic client-server format: servers are programs that store documents and transmit them to other computers on the network on request, while clients are programs that request documents from a server on the user's behalf. Browser software lets users view the retrieved documents, and special browsers and platforms such as Tor allow them to do so anonymously.
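The client-server exchange described above can be sketched with Python's standard library. This is a minimal illustration, not any real site: the handler class, the loopback address, and the page content are all invented for the example.

```python
# A minimal sketch of the Web's client-server format: a server that stores a
# document and transmits it on request, and a client that requests it.
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The server stores a document and transmits it when asked.
        body = b"<html><body><h1>Hello, Web</h1></body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), HelloHandler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client requests the document on the user's behalf, as a browser would.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/")
response = conn.getresponse()
html = response.read().decode()
print(response.status, html)
server.shutdown()
```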

A hypertext document, with its corresponding text and hyperlinks, is written in HyperText Markup Language (HTML) and is assigned an online address called a Uniform Resource Locator (URL).
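As a small illustration of hypertext, the snippet below (a sketch using Python's standard library; the document and URL are invented for the example) shows an HTML page whose anchor tag links a phrase to another document's address, and extracts that link:

```python
# Hypertext in miniature: an <a> tag links a phrase to another document's URL.
from html.parser import HTMLParser

document = """
<html><body>
  <p>See the <a href="https://example.org/history.html">history</a> page.</p>
</body></html>
"""

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":  # an anchor tag marks a hyperlink
            self.links.extend(value for name, value in attrs if name == "href")

collector = LinkCollector()
collector.feed(document)
print(collector.links)
```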


Development of the World Wide Web began in 1989 with Tim Berners-Lee and his colleagues at CERN, an international scientific organization based in Geneva, Switzerland. They created a protocol, HyperText Transfer Protocol (HTTP), which standardized communication between servers and clients. Their text-based Web browser was made available for general release in January 1992.

The World Wide Web gained rapid acceptance with the creation of a Web browser called Mosaic, developed in the United States by Marc Andreessen and others at the National Center for Supercomputing Applications at the University of Illinois and released in September 1993. Mosaic let Web users employ the same kind of "point-and-click" graphical manipulations that had been available on personal computers for some years. In April 1994 Andreessen cofounded Netscape Communications Corporation, whose Netscape Navigator became the dominant Web browser soon after its release in December 1994. BookLink Technologies' InternetWorks, the first browser with tabs, which let a user visit another Web site without opening an entirely new window, debuted that same year. By the mid-1990s the World Wide Web had millions of active users.

The software giant Microsoft Corporation became interested in supporting Internet applications on personal computers and developed its own Web browser (based initially on Mosaic), Internet Explorer (IE), in 1995 as an add-on to the Windows 95 operating system . IE was integrated into the Windows operating system in 1996 (that is, it came “bundled” ready-to-use within the operating system of personal computers), which had the effect of reducing competition from other Internet browser manufacturers, such as Netscape. IE soon became the most popular Web browser.

Apple's Safari was released in 2003 as the default browser on Macintosh personal computers and later on iPhones (2007) and iPads (2010). Safari 2.0 (2005) was the first browser with a privacy mode, Private Browsing, in which the application would not save websites in its history, downloaded files in its cache, or personal information entered on Web pages.


The first serious challenger to IE's dominance was Mozilla's Firefox, released in 2004 and designed to address the speed and security problems that had plagued IE. In 2008 Google launched Chrome, the first browser with isolated tabs: when one tab crashed, the other tabs and the browser itself kept working. By 2013 Chrome had become the dominant browser, surpassing IE and Firefox in popularity. Microsoft discontinued IE and replaced it with Edge in 2015.

In the early 21st century, smartphones became more computer-like, and more-advanced services, such as Internet access, became possible. Web usage on smartphones steadily increased, and in 2016 it accounted for more than half of Web browsing.

History of the World Wide Web Information Technology Essay

UKEssays.com sample essay, 1,551 words, 1 January 2015
The World Wide Web: The Invention That Connected The World

Editorial feature.

By Google Arts & Culture

CDC 6600 Super Computer (1968) by Control Data Limited Science Museum

As the web reaches its 30th birthday, we reflect on its history, from its hardware foundations to the five-billion-person network we see today.

The internet is a huge network of computers all connected together, but it was the world wide web that turned that technology into something that linked information together and made it accessible to everyone. In essence, the world wide web is a collection of webpages found on this network of computers; your browser uses the internet to access the world wide web. The world wide web was invented by Sir Tim Berners-Lee in 1989, originally as a new way for scientists to easily share the data from their experiments. Hypertext (text displayed on a computer display that links to other text the reader can immediately access) and the internet already existed, but no one had thought of a way to use the internet to link one document directly to another.

CDC 6600 Super Computer (From the collection of Science Museum)

Tim Berners-Lee, pioneer of the World Wide Web (1990) by CERN Science Museum

Tim Berners-Lee, c. 1990s (From the collection of CERN)

Berners-Lee created the world wide web while working at CERN, the European Organization for Nuclear Research in Switzerland. His vision soon went beyond a network for scientists to share information: he wanted it to be a universal, free "information space" for sharing knowledge, communicating, and collaborating. Three main ingredients make up the world wide web: the URL (uniform resource locator), the addressing scheme used to find a document; HTTP (hypertext transfer protocol), which connects computers together; and HTML (hypertext markup language), which formats pages containing hypertext links.
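The URL ingredient can be seen by splitting an address into its parts with Python's standard library; the address used here follows the style of CERN's early site, and the comments are informal glosses:

```python
# Splitting a URL into the parts described above.
from urllib.parse import urlparse

parts = urlparse("http://info.cern.ch/hypertext/WWW/TheProject.html")
print(parts.scheme)  # which protocol to speak (HTTP)
print(parts.netloc)  # which server to contact
print(parts.path)    # which document on that server
```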

CERN Mundaneum

Data Center of CERN (From the collection of Mundaneum)

Berners-Lee also made the world's first web browser and web server. During the 1990s the number of web browsers in production multiplied rapidly, and many more web-based technologies began to appear.

Original NeXT computer used by Sir Tim Berners-Lee to design the World Wide Web (1990) by NeXT Science Museum

Original NeXT computer used by Tim Berners-Lee to design the world wide web (From the collection of Science Museum)

The world wide web opened up the internet to everyone, not just scientists. It connected the world in a way that made it much easier for people to get information, share, and communicate. It has since allowed people to share their work and thoughts through social networking sites, blogs, video sharing, and more.

An image of the first page of Tim Berners-Lee's proposal for the World Wide Web in March 1989. (1989-03-01) by CERN / Tim Berners-Lee CERN

An image of the first page of Tim Berners-Lee's proposal for the world wide web in March 1989 (From the collection of CERN)

A screenshot showing the NeXT world wide web browser (1990-01-01) by Tim Berners-Lee CERN

A screenshot showing the NeXT world wide web browser by Tim Berners-Lee (From the collection of CERN)


History of the Internet and World Wide Web (WWW)


In its short history, the Internet has had a revolutionizing effect, not only on communications and computing, but also on broader areas of life such as economics, culture, language, and social relations. In that same time, however, the Internet and, subsequently, the World Wide Web have undergone a number of permutations, and the intentions of its developers have not always coincided with the ways in which the technology has been realized. As the technology and its influence spread, of course, the designs of the original planners were diluted. From its origins as a military-based, Pentagon-funded networking architecture for experimental communications, the Internet flowered into perhaps the most sweeping revolution in the history of communications technology. The World Wide Web , meanwhile, grew from a vehicle designed to universalize the Internet and democratize electronically based information to a commercial juggernaut that transformed the way business is conducted.

THE PREHISTORY OF THE INTERNET

Although in the popular imagination the Internet is a feature of the 1990s, the earliest inklings of the possibilities of networked computers can be traced to the early 1960s. In 1962, J.C.R. Licklider at the Massachusetts Institute of Technology (MIT) first elucidated his dream of a "Galactic Network" connecting computers across the globe for the distribution and access of data and programs. Licklider went on to become the first director of the Defense Advanced Research Projects Agency (DARPA), an arm of the U.S. Department of Defense and the body that funded and coordinated the original research into what became the Internet.

Licklider's MIT colleagues Leonard Kleinrock and Lawrence G. Roberts performed the groundbreaking work on the Internet's architecture. First, Kleinrock published a revolutionary paper arguing for the plausibility of using packet switching rather than circuits for communications, thereby paving the way for the necessary computer networking. Roberts built on Kleinrock's theories to devise the first wide-area computing network, using an ordinary circuit-based telephone line to let computers in Massachusetts and California communicate directly. While the computers were indeed able to run programs and exchange data, the circuit-switched line performed poorly, convincing Roberts that Kleinrock's insistence on the superiority of packet switching was correct.

Having joined DARPA, Roberts in 1967 presented a paper outlining his vision for the original version of the Internet, known as ARPANET, the specifications of which were set by the following fall. Roberts's main position was that the network DARPA was building could be expanded and put to greater use once it was completed. Kleinrock, meanwhile, had relocated to UCLA, just in time to receive DARPA's request for proposals to further develop his packet-switching ideas for the network DARPA was constructing. Kleinrock and a handful of other interested scholars at UCLA established the Network Measurement Center for the ARPANET project.

ARPANET's first host computer was set up at Kleinrock's Network Measurement Center at the University of California, Los Angeles (UCLA) in 1969, and other nodes, at the Stanford Research Institute (SRI), UC Santa Barbara (UCSB), and the University of Utah in Salt Lake City, were connected shortly afterward. As computers were added to ARPANET, the Network Working Group worked to devise a communication protocol that would enable the different host computers to talk to each other, resulting in the host-to-host Network Control Protocol (NCP), which was rolled out in 1970. Thus the Internet as we know it today began to bloom.

OPENING THE INTERNET

Still, for the first few years of its existence, ARPANET was largely unknown outside the relatively esoteric group of technologists developing it. That changed in 1972, when Robert Kahn of Bolt Beranek and Newman (BBN), one of the chief figures in the development of the ARPANET architecture, organized the first public demonstration of ARPANET at the International Computer Communication Conference (ICCC). That same year, the first major Internet application, electronic mail, or e-mail, was introduced. For more than a decade afterward, e-mail remained the most widely used network application in existence.

The early years of ARPANET saw the network grow slowly, as nodes were gradually added and the vast array of computers plugged into it required software and interface hardware to interact adequately with ARPANET. As ARPANET expanded into what is now referred to as the Internet, it was grounded on what is known as open-architecture networking. In such an environment, other networks could connect to and interact with the Internet and all the other networks connected to it, but the technology used to build each network could be decided by that network's provider and needn't be dictated by any particular architecture. Packet switching, pioneered by Kleinrock, allowed for such architectural freedom, connecting networks on a peer rather than hierarchical basis. In fact, open-architecture networking was originally referred to as "Internetting" when it was introduced to DARPA in 1972.
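The core idea of packet switching can be illustrated with a toy sketch (this is an illustration of the principle, not any real protocol): a message is cut into numbered packets, the network may deliver them in any order, and the receiver reassembles them by sequence number.

```python
# Toy packet switching: split, (possibly) reorder in transit, reassemble.
import random

def to_packets(message, size=4):
    # Each packet carries a sequence number and a chunk of the message.
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    # Sorting by sequence number restores the original order.
    return "".join(chunk for _, chunk in sorted(packets))

packets = to_packets("LO AND BEHOLD")
random.shuffle(packets)      # the network may deliver packets out of order
print(reassemble(packets))
```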

While this greatly expanded the uses of the Internet in its limited environment of the day, enabling network designers to tailor their architectures to the specific needs of their users while still linking their networks to the overall Internet, it also left the Internet without a common user interface. In fact, most of the early networks connected to the Internet were designed for a closed community of researchers and scholars, so cross-network usability was a very low priority. For academics, military officials, and scientists this was satisfactory on the whole, as the Internet was geared toward very specialized users. It limited the overall availability of the Internet, however, in a manner that wouldn't be remedied until the 1990s and the introduction of the World Wide Web.

For several years, the bulk of the research involving Internet communications, including work on the various networking and transmission logistics, was funded primarily by the United States Department of Defense, and thus was designed around and translated into military concerns. For instance, the first demonstration of an Internet transmission linking three different kinds of gateways (a mobile packet radio in California, the Atlantic Packet Satellite Network (SATNET), and several ground-level ARPANET systems across the eastern United States and Europe) was designed to mimic military scenarios that depended on linking mobile units to central command stations across an intercontinental network.

Network Control Protocol, however, proved limited in an open-architecture environment, since it depended on the ARPANET network design for end-to-end reliability, and any compromised transmission packets could bring the protocol to an abrupt stop. To get multiple packet networks to communicate with each other regardless of the underlying networking technology, a common communication protocol was needed. The first efforts toward this end were the work on the Transmission Control Protocol (TCP) by Vinton Cerf at Stanford University and Robert Kahn at DARPA. TCP was designed specifically to sidestep any centralized global control at the level of internetworking operations. The design called for gateways, or routers, to connect networks to the Internet without requiring any network reconfiguration. After several years of research and design, the first TCP specification was published in December 1974. Just a few months later, DARPA transferred ARPANET as a fully operational Internet to the Defense Communications Agency (later renamed the Defense Information Systems Agency).

By the late 1970s, the U.S. military became interested in Internet technology not just as an experimental and theoretical tool, but as an actually existing military communications system. As a result, the military began to use Internet communications protocols in packet radio systems and various ground-satellite stations in Europe. The transfer of voice messages highlighted complications in these radio-based networks and led to the development of a complementary Internet Protocol (IP), which was combined with TCP to produce the TCP/IP protocol suite. TCP/IP quickly emerged as the standard for all military Internet systems, and, by extension, the Internet itself.
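The reliable byte stream that the TCP/IP suite provides can be seen in miniature over a loopback connection in Python; the loopback address, the OS-chosen port, and the message are illustrative choices for the example.

```python
# A minimal TCP exchange over loopback: one socket listens, one connects,
# and bytes sent on one end arrive intact and in order on the other.
import socket

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))   # port 0: let the OS choose a free port
listener.listen(1)

client = socket.create_connection(listener.getsockname())
server, _ = listener.accept()

client.sendall(b"hello over TCP/IP")
received = server.recv(1024)
print(received.decode())

for s in (client, server, listener):
    s.close()
```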

Through the early 1980s, Internet products consolidated around the TCP/IP protocol suite, setting the stage for the opening of commercial applications. Sure enough, according to Vinton Cerf, a substantial market for Internet-based products began to flower in the mid-1980s. In large part this was due to the NSFNET initiative. This program, born of a network designed to link supercomputers using software by David Mills of the University of Delaware, and led by Dennis Jennings at the National Science Foundation (NSF), quickly generated supporting software and systems from IBM, MCI, and Merit to accommodate rapidly escalating networking demand. Thanks to the outgrowth of technologies stemming from NSFNET, the number of computers connected to the Internet jumped from only a few hundred in 1983 to over 1.3 million in 1993, while the number of networks leapt from a tiny handful to over 10,000. By 1990 NSFNET had so transformed the Internet's backbone and reach that ARPANET itself was decommissioned. Soon commercial e-mail carriers, already devising systems and software for use in intranets, began exploiting the possibilities of Internet-based e-mail; commercial Internet service providers came along in their wake, sprouting up from the original handful of networks brought to life under NSFNET. For several years, however, these services were still geared primarily toward researchers and businesses, the few groups that already had a need for and access to the Internet. The Internet as a household resource was still largely unheard of.

The rapid expansion of the Internet in the 1980s necessitated new management methods such as the Domain Name System (DNS). In the network's earliest incarnations, users had to memorize numerical addresses to access the fairly limited number of hosts, but that became infeasible as the number of connected networks took off. With the proliferation of local area networks (LANs), Internet managers designed the DNS to create easily identifiable hierarchies of host names and so facilitate easy Internet navigation.
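The hierarchy the DNS introduced can be modeled with a toy resolver. This sketches only the naming hierarchy, not the real DNS protocol; the zone layout, names, and addresses below are invented for illustration.

```python
# Toy model of DNS's hierarchy: names are resolved label by label,
# from the rightmost (top-level) label down through nested zones.
zones = {
    "edu": {
        "mit": {"www": "18.0.0.1"},
        "ucla": {"www": "128.97.0.1"},
    }
}

def resolve(name):
    node = zones
    for label in reversed(name.split(".")):  # 'www.mit.edu' -> edu, mit, www
        node = node[label]
    return node

print(resolve("www.mit.edu"))
```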

In the late 1980s and early 1990s, a series of policy initiatives, including a forum at the Harvard Kennedy School of Government on "The Commercialization and Privatization of the Internet" and a National Research Council committee report titled "Towards a National Research Network," paved the way for the next steps of Internet evolution, including U.S. government sponsorship of the high-speed computer networks that would serve as the backbone for the explosion of the information superhighway and e-commerce in the 1990s.

THE WORLD WIDE WEB

Perhaps the invention that most facilitated the growth of the Internet as a global information-sharing system is the World Wide Web. Unlike the Internet, however, the early design and development of the World Wide Web was primarily the doing of just one person: Tim Berners-Lee. Working as a contract programmer at CERN (the Centre Européen de Recherche Nucléaire, or European Laboratory for Particle Physics) in Geneva, Switzerland, Berners-Lee repeatedly proposed developing a global interactive interface for the Internet, to turn the fragmented and relatively exclusive Internet into a popular and seamless whole. After several rejections, Berners-Lee simply built a prototype in 1989 using the laboratory's phone-book entries. Called Enquire, after the Victorian handbook Enquire Within Upon Everything, the prototype was designed to link and connect elements much as the brain makes random connections and associations. Unlike the average database system, according to Berners-Lee, the Web was to be designed to make random associations between arbitrary objects in its files.

Just as the Internet evolved to ensure the greatest possible flexibility and interoperability, so the Web's original architectural design minimized the degree of specification so as to minimize constraints on the user. In this way, the design could be modified and updated while leaving the basic architecture undisturbed. Thus, for instance, users could enter the existing File Transfer Protocol (FTP) in the address space and it would be as workable as the new Hypertext Transfer Protocol (HTTP). HTTP was the communications protocol that allowed the Web to transfer data to and from any computer connected to the Internet, and it improved on the FTP standard by taking advantage of the Web's capacity to read and translate intricate features. The intermixing of these protocols and file formats was the key, for Berners-Lee, to ensuring not only the widest proliferation but also the greatest durability of his creation. Not only would the Web in this way be able to evolve with changing systems and protocols, but early adoption would be made smoother in that users could adopt the Web from whatever systems they were currently using, as a parallel or supplementary system. Shortly after the successful demonstration of the phone-book prototype, the Internet community, still relatively esoteric, began experimenting with browser platforms for viewing the Web. One of the early successes was the Mosaic program written by Marc Andreessen, later a cofounder of Netscape.

Taking advantage of the Internet's gateways and bypassing centralized registries, Berners-Lee devised the universal resource locators (URLs) that are the basis for Web addresses under the DNS. URLs were built to highlight the central power of the Web: that any link can connect to any other document or resource anywhere on the Internet, or in the "universe of information," as Berners-Lee puts it. URLs are structured to identify the kind of space that is being accessed (for instance, by the prefixes "http:" or "ftp:") followed by the specific address within that information space.

The last piece of the WWW puzzle was the medium's lingua franca : Hypertext Markup Language (HTML), a language of codes, built on hypermedia principles dating back to the 1940s, that informs the browser how to interpret the files for the Web. By 1991, all the elements were in place, and the World Wide Web was released from Berners-Lee's laboratory to the public free of any charge.

Perhaps the biggest story in the development of the Web through the early and mid-1990s was the fight to stave off the fragmentation of Web standards that could potentially undermine the Web's ability to fulfill its original function: to create a seamless universe of information. The World Wide Web Consortium (W3C), founded by Berners-Lee, was born in 1994, just as the Web was beginning to hit critical mass. The organization, though not a governing body, was founded to guide and oversee the Web's development and to minimize proprietary battles over standards and protocols, in an effort to keep the Web nonproprietary and freely accessible. Based at MIT, the W3C is a neutral organization that brings together technicians, researchers, policy advocates, software vendors, and business interests to agree on technical standards and specifications and so ensure that the Web remains undivided.

COMMERCIALIZATION

Beginning in the mid-1990s, the World Wide Web helped propel the Internet to a new stage of mass consumption, and in the process both were radically transformed, as was the society that used them. The Internet and World Wide Web opened new fields of debate over social and cultural concerns, including the right to privacy, the protection of children from harmful or inappropriate materials, freedom of speech as it pertains to electronic networks, intellectual property, issues of social equality, the security of financial and personal data online, and a host of other issues.

As businesses grew increasingly interested in the Internet and the Web for their own strategies, the race to take advantage of the emerging e-commerce markets highlighted the needs of commercial interests in the Internet architecture, in Web- and e-mail-based security measures, and in business models structured on Internet communications and technology. In turn, businesses used these technologies as tools to enter and take advantage of new markets throughout the world, in the process furthering the proliferation of the Internet and the globalization of the world's economies. In the process, the social and cultural concerns connected to the Web and the Internet intensified.

It is clear that, far from being the special province of technicians, computer scientists, and scholarly researchers, the Internet and the World Wide Web had by the mid-1990s evolved into critical components of the national, and increasingly the international, infrastructure, components with which the rest of economic and social life was increasingly intertwined. As a result, the spate of questions, concerns, cautions, and enthusiasm about these technologies required careful negotiation to ensure that they served the good of everyone they affected. Several organizations sprouted up for just that purpose, including the W3C and the Internet Society, which brought together diverse interests to attempt to oversee the development of these technologies within the context of the overall common good. While these debates remained contentious as competing groups wrangled to assert their positions, and consensus over the future direction of these technologies was far from realization, there was little doubt that the Internet and the World Wide Web were thoroughly enough integrated into the fabric of society that they would both affect and be affected by the social forces that attempt to guide them.

FURTHER READING:

Berners-Lee, Tim, and Mark Fischetti. Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web by Its Inventor. San Francisco, CA: HarperCollins, 1999.

———. "The World Wide Web: Past, Present, and Future." Cambridge, MA: World Wide Web Consortium, August 1996. Available from www.w3.org/People/Berners-Lee.

Cerf, Vinton. "How the Internet Came to Be." In Bernard Aboba, The Online User's Encyclopedia. Boston, MA: Addison-Wesley, 1993.

Internet Society (ISOC). "All About the Internet: History of the Internet." Reston, VA: Internet Society, May 2001. Available from www.isoc.org.

SEE ALSO: ARPAnet; Berners-Lee, Tim; BITNET; Communications Protocol; Internet; Internet Society (ISOC); Local Area Network (LAN); Three Protocols, The; URL (Uniform Resource Locator); World Wide Web (WWW); World Wide Web Consortium (W3C)

"History of the Internet and World Wide Web (WWW)." Gale Encyclopedia of E-Commerce. Encyclopedia.com. Retrieved August 15, 2024, from https://www.encyclopedia.com/economics/encyclopedias-almanacs-transcripts-and-maps/history-internet-and-world-wide-web-www


National Academies Press: OpenBook

Funding a Revolution: Government Support for Computing Research (1999)

7 Development of the Internet and the World Wide Web

The recent growth of the Internet and the World Wide Web makes it appear that the world is witnessing the arrival of a completely new technology. In fact, the Web—now considered to be a major driver of the way society accesses and views information—is the result of numerous projects in computer networking, mostly funded by the federal government, carried out over the last 40 years. The projects produced communications protocols that define the format of network messages, prototype networks, and application programs such as browsers. This research capitalized on the ubiquity of the nation's telephone network, which provided the underlying physical infrastructure upon which the Internet was built.

This chapter traces the development of the Internet,1 one aspect of the broader field of data networking. The chapter is not intended to be comprehensive; rather, it focuses on the federal role in both funding research and supporting the deployment of networking infrastructure. This history is divided into four distinct periods. Before 1970, individual researchers developed the underlying technologies, including queuing theory, packet switching, and routing. During the 1970s, experimental networks, notably the ARPANET, were constructed. These networks were primarily research tools, not service providers. Most were federally funded, because, with a few exceptions, industry had not yet realized the potential of the technology. During the 1980s, networks were widely deployed, initially to support scientific research. As their potential to improve personal communications and collaboration became apparent, additional academic disciplines and industry began to use the technology. In this era, the National Science Foundation (NSF) was the major supporter of networking, primarily through the NSFNET, which evolved into the Internet. Most recently, in the early 1990s, the invention of the Web made it much easier for users to publish and access information, thereby setting off the rapid growth of the Internet. The final section of the chapter summarizes the lessons to be learned from history.

By focusing on the Internet, this chapter does not address the full scope of computer networking activities that were under way between 1960 and 1995. It specifically ignores other networking activities of a more proprietary nature. In the mid-1980s, for example, hundreds of thousands of workers at IBM were using electronic networks (such as the VNET) for worldwide e-mail and file transfers; banks were performing electronic funds transfer; CompuServe had a worldwide network; Digital Equipment Corporation (DEC) had value-added networking services; and a VNET-based academic network known as BITNET had been established. These were proprietary systems that, for the most part, owed little to academic research and indeed were to a large extent invisible to the academic computer networking community. By the late 1980s, IBM's proprietary SNA data networking business unit already had several billion dollars of annual revenue for networking hardware, software, and services. The success of such networks in many ways limited the interest of companies like IBM and CompuServe in the Internet. The success of the Internet can therefore, in many ways, be seen as the success of an open system and open architecture in the face of proprietary competition.

Early Steps: 1960-1970

Approximately 15 years after the first computers became operational, researchers began to realize that an interconnected network of computers could provide services that transcended the capabilities of a single system. At this time, computers were becoming increasingly powerful, and a number of scientists were beginning to consider applications that went far beyond simple numerical calculation. Perhaps the most compelling early description of these opportunities was presented by J.C.R. Licklider (1960), who argued that, within a few years, computers would become sufficiently powerful to cooperate with humans in solving scientific and technical problems. Licklider, a psychologist at the Massachusetts Institute of Technology (MIT), would begin realizing his vision when he became director of the Information Processing Techniques Office (IPTO) at the Advanced Research Projects Agency (ARPA) in 1962. Licklider remained at ARPA until 1964 (and returned for a second tour in 1974-1975), and he convinced his successors, Ivan Sutherland and Robert Taylor, of the importance of attacking difficult, long-term problems.

Taylor, who became IPTO director in 1966, worried about the duplication of expensive computing resources at the various sites with ARPA contracts. He proposed a networking experiment in which users at one site accessed computers at another site, and he co-authored, with Licklider, a paper describing both how this might be done and some of the potential consequences (Licklider and Taylor, 1968). Taylor was a psychologist, not a computer scientist, and so he recruited Larry Roberts of MIT's Lincoln Laboratory to move to ARPA and oversee the development of the new network. As a result of these efforts, ARPA became the primary supporter of projects in networking during this period.

In contrast to the NSF, which awarded grants to individual researchers, ARPA issued research contracts. The IPTO program managers, typically recruited from academia for 2-year tours, had considerable latitude in defining projects and identifying academic and industrial groups to carry them out. In many cases, they worked closely with the researchers they sponsored, providing intellectual leadership as well as financial support. A strength of the ARPA style was that it not only produced artifacts that furthered its missions but also built and trained a community of researchers. In addition to holding regular meetings of principal investigators, Taylor started the "ARPA games," meetings that brought together the graduate students involved in programs. This innovation helped build the community that would lead the expansion of the field and growth of the Internet during the 1980s.

During the 1960s, a number of researchers began to investigate the technologies that would form the basis for computer networking. Most of this early networking research concentrated on packet switching, a technique of breaking up a conversation into small, independent units, each of which carries the address of its destination and is routed through the network independently. Specialized computers at the branching points in the network can vary the route taken by packets on a moment-to-moment basis in response to network congestion or link failure.
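The mechanics just described can be sketched in a few lines of Python. This is an illustrative toy rather than any real protocol: `packetize` and `reassemble` are invented names, and real packet headers carry far more than a destination and a sequence number.

```python
import random

def packetize(message: str, dest: str, size: int = 8):
    """Break a message into small, independently routable units, each
    carrying its destination address and a sequence number."""
    return [
        {"dest": dest, "seq": i, "data": message[i * size:(i + 1) * size]}
        for i in range((len(message) + size - 1) // size)
    ]

def reassemble(packets):
    """Reorder received packets by sequence number and rebuild the message."""
    return "".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))

pkts = packetize("independent units of a conversation", "10.0.0.7")
random.shuffle(pkts)  # the network may deliver packets in any order
assert reassemble(pkts) == "independent units of a conversation"
```

The sequence numbers are what let each packet take its own route through the network: the receiver can restore the original order no matter how the packets arrive.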

One of the earliest pioneers of packet switching was Paul Baran of the RAND Corporation, who was interested in methods of organizing networks to withstand nuclear attack. (His research interest is the likely source of a widespread myth concerning the ARPANET's original purpose [Hafner and Lyon, 1996]). Baran proposed a richly interconnected set of network nodes, with no centralized control system—both properties of today's Internet. Similar work was under way in the United Kingdom, where Donald Davies and Roger Scantlebury of the National Physical Laboratory (NPL) coined the term "packet."

Of course, the United States already had an extensive communications network, the public switched telephone network (PSTN), in which digital switches and transmission lines were deployed as early as 1962.

But the telephone network did not figure prominently in early computer networking. Computer scientists working to interconnect their systems spoke a different language than did the engineers and scientists working in traditional voice telecommunications. They read different journals, attended different conferences, and used different terminology. Moreover, data traffic was (and is) substantially different from voice traffic. In the PSTN, a continuous connection, or circuit, is set up at the beginning of a call and maintained for the duration. Computers, on the other hand, communicate in bursts, and unless a number of "calls" can be combined on a single transmission path, line and switching capacity is wasted. Telecommunications engineers were primarily interested in improving the voice network and were skeptical of alternative technologies. As a result, although telephone lines were used to provide point-to-point communication in the ARPANET, the switching infrastructure of the PSTN was not used. According to Taylor, some Bell Laboratories engineers stated flatly in 1967 that "packet switching wouldn't work." 2

At the first Association for Computing Machinery (ACM) Symposium on Operating System Principles in 1967, Lawrence Roberts, then an IPTO program manager, presented an initial design for the packet-switched network that was to become the ARPANET (Roberts, 1967). In addition, Roger Scantlebury presented the NPL work (Davies et al., 1967), citing Baran's earlier RAND report. The reaction was positive, and Roberts issued a request for quotation (RFQ) for the construction of a four-node network.

From the more than 100 respondents to the RFQ, Roberts selected Bolt, Beranek, and Newman (BBN) of Cambridge, Massachusetts; familiar names such as IBM Corporation and Control Data Corporation chose not to bid. The contract to produce the hardware and software was issued in December 1968. The BBN group was led by Frank Heart, and many of the scientists and engineers who would make major contributions to networking in future years participated. Robert Kahn, who with Vinton Cerf would later develop the Transmission Control Protocol/Internet Protocol (TCP/IP) suite used to control the transmission of packets in the network, helped develop the network architecture. The network hardware consisted of a rugged military version of a Honeywell Corporation minicomputer that connected a site's computers to the communication lines. These interface message processors (IMPs)—each the size of a large refrigerator and painted battleship gray—were highly sought after by DARPA-sponsored researchers, who viewed possession of an IMP as evidence they had joined the inner circle of networking research.

The first ARPANET node was installed in September 1969 at Leonard Kleinrock's Network Measurement Center at the University of California at Los Angeles (UCLA). Kleinrock (1964) had published some of the earliest theoretical work on packet switching, and so this site was an appropriate choice. The second node was installed a month later at Stanford Research Institute (SRI) in Menlo Park, California, using Douglas Engelbart's On Line System (known as NLS) as the host. SRI also operated the Network Information Center (NIC), which maintained operational and standards information for the network. Two more nodes were soon installed at the University of California at Santa Barbara, where Glen Culler and Burton Fried had developed an interactive system for mathematics education, and the University of Utah, which had one of the first computer graphics groups.

Initially, the ARPANET was primarily a vehicle for experimentation rather than a service, because the protocols for host-to-host communication were still being developed. The first such protocol, the Network Control Protocol (NCP), was completed by the Network Working Group (NWG) led by Stephen Crocker in December 1970 and remained in use until 1983, when it was replaced by TCP/IP.

Expansion of the ARPANET: 1970-1980

Initially conceived as a means of sharing expensive computing resources among ARPA research contractors, the ARPANET evolved in a number of unanticipated directions during the 1970s. Although a few experiments in resource sharing were carried out, and the Telnet protocol was developed to allow a user on one machine to log onto another machine over the network, other applications became more popular.

The first of these applications was enabled by the File Transfer Protocol (FTP), developed in 1971 by a group led by Abhay Bhushan of MIT (Bhushan, 1972). This protocol enabled a user on one system to connect to another system for the purpose of either sending or retrieving a particular file. The concept of an anonymous user was quickly added, with constrained access privileges, to allow users to connect to a system and browse the available files. Using Telnet, a user could read the remote files but could not do anything with them. With FTP, users could now move files to their own machines and work with them as local files. This capability spawned several new areas of activity, including distributed client-server computing and network-connected file systems.

Occasionally in computing, a "killer application" appears that becomes far more popular than its developers expected. When personal computers (PCs) became available in the 1980s, the spreadsheet (initially VisiCalc) was the application that accelerated the adoption of the new hardware by businesses. For the newly minted ARPANET, the killer application was electronic mail, or e-mail. The first e-mail program was developed in 1972 by Ray Tomlinson of BBN. Tomlinson had built an earlier e-mail system for communication between users on BBN's Tenex time-sharing system, and it was a simple exercise to modify this system to work over the network. By combining the immediacy of the telephone with the precision of written communication, e-mail became an instant hit. Tomlinson's syntax (user@domain) remains in use today.
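Tomlinson's convention splits mechanically at the "@": everything before it names the mailbox, everything after it names the host (and, later, the DNS domain) where that mailbox lives. A sketch, with an invented helper name and an arbitrary example address:

```python
def split_address(addr: str):
    """Split user@domain at the last '@' into (user, domain)."""
    user, _, domain = addr.rpartition("@")
    return user, domain

assert split_address("alice@example.edu") == ("alice", "example.edu")
```

Splitting at the *last* "@" matters because the local part may itself contain the character in some addressing schemes.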

Telnet, FTP, and e-mail were examples of the leverage that research typically provided in early network development. As each new capability was added, the efficiency and speed with which knowledge could be disseminated improved. E-mail and FTP made it possible for geographically distributed researchers to collaborate and share results much more effectively. These programs were also among the first networking applications that were valuable not only to computer scientists, but also to scholars in other disciplines.

From ARPANET to Internet

Although the ARPANET was ARPA's largest networking effort, it was by no means the only one. The agency also supported research on terrestrial packet radio and packet satellite networks. In 1973, Robert Kahn and Vinton Cerf began to consider ways to interconnect these networks, which had quite different bandwidth, delay, and error properties than did the telephone lines of the ARPANET. The result was TCP/IP, first described in 1973 at an International Network Working Group meeting in England. Unlike NCP, which enabled the hosts of a single network to communicate, TCP/IP was designed to interconnect multiple networks to form an Internet. This protocol suite defined the packet format and a flow-control and error-recovery mechanism to allow the hosts to recover gracefully from network errors. It also specified an addressing mechanism that could support an Internet comprising up to 4 billion hosts.
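The "up to 4 billion hosts" figure follows from IP's 32-bit address field: 2^32 distinct addresses, conventionally written as four 8-bit decimal fields (the "dotted quad"). A small sketch of the arithmetic; the helper names here are mine, not part of any standard API:

```python
def to_dotted_quad(addr: int) -> str:
    """Render a 32-bit address as a.b.c.d."""
    return ".".join(str((addr >> shift) & 0xFF) for shift in (24, 16, 8, 0))

def from_dotted_quad(s: str) -> int:
    """Pack a.b.c.d back into a 32-bit integer."""
    a, b, c, d = (int(x) for x in s.split("."))
    return (a << 24) | (b << 16) | (c << 8) | d

assert 2 ** 32 == 4294967296          # roughly 4 billion addresses
assert to_dotted_quad(0xC0A80001) == "192.168.0.1"
assert from_dotted_quad("192.168.0.1") == 0xC0A80001
```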

The work necessary to transform TCP/IP from a concept into a useful system was performed under ARPA contract by groups at Stanford University, BBN, and University College London. Although TCP/IP has evolved over the years, it is still in use today as the Internet's basic packet transport protocol.

By 1975, the ARPANET had grown from its original four nodes to nearly 100 nodes. Around this time, two phenomena—the development of local area networks (LANs) and the integration of networking into operating systems—contributed to a rapid increase in the size of the network.

Local Area Networks

While ARPANET researchers were experimenting with dedicated telephone lines for packet transmission, researchers at the University of Hawaii, led by Norman Abramson, were trying a different approach, also with ARPA funding. Like the ARPANET group, they wanted to provide remote access to their main computer system, but instead of a network of telephone lines, they used a shared radio network. It was shared in the sense that all stations used the same channel to reach the central station. This approach had a potential drawback: if two stations attempted to transmit at the same time, then their transmissions would interfere with each other, and neither one would be received. But such interruptions were unlikely because the data were typed on keyboards, which sent very short pulses to the computer, leaving ample time between pulses during which the channel was clear to receive keystrokes from a different user.

Abramson's system, known as Aloha, generated considerable interest in using a shared transmission medium, and several projects were initiated to build on the idea. Two of the best-known projects were the Atlantic Packet Satellite Experiment and Ethernet. The packet satellite network demonstrated that the protocols developed in Aloha for handling contention between simultaneous users, combined with more traditional reservation schemes, resulted in efficient use of the available bandwidth. However, the long latency inherent in satellite communications limited the usefulness of this approach.

Ethernet, developed by a group led by Robert Metcalfe at Xerox Corporation's Palo Alto Research Center (PARC), is one of the few examples of a networking technology that was not directly funded by the government. This experiment demonstrated that using coaxial cable as a shared medium resulted in an efficient network. Unlike the Aloha system, in which transmitters could not receive any signals, Ethernet stations could detect that collisions had occurred, stop transmitting immediately, and retry a short time later (at random). This approach improved the efficiency of the Aloha technique and made it practical for actual use. Shared-media LANs became the dominant form of computer-to-computer communication within a building or local area, although variations from IBM (Token Ring) and others also captured part of this emerging market.
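The "retry a short time later (at random)" step became, in standardized Ethernet, truncated binary exponential backoff: after the n-th consecutive collision, a station waits a random number of slot times drawn from 0 to 2^min(n, 10) - 1, so repeated colliders spread out over an exponentially wider window. A sketch of that rule (`backoff_slots` is an invented helper name, and real hardware also gives up after 16 attempts):

```python
import random

def backoff_slots(attempt: int) -> int:
    """Random delay, in slot times, after `attempt` consecutive collisions,
    per truncated binary exponential backoff."""
    return random.randrange(2 ** min(attempt, 10))

# The window doubles with each collision, up to 1024 slots.
for n in (1, 2, 3, 16):
    assert 0 <= backoff_slots(n) < 2 ** min(n, 10)
```

Randomizing the delay is the crucial design choice: if two colliding stations waited a fixed interval, they would simply collide again.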

Ethernet was initially used to connect a network of approximately 100 of PARC's Alto PCs, using the center's time-sharing system as a gateway to the ARPANET. Initially, many believed that the small size and limited performance of PCs would preclude their use as network hosts, but, with DARPA funding, David Clark's group at MIT, which had received several Altos from PARC, built an efficient TCP implementation for that system and, later, for the IBM PC. The proliferation of PCs connected by LANs in the 1980s dramatically increased the size of the Internet.

Integrated Networking

Until the 1970s, academic computer science research groups used a variety of computers and operating systems, many of them constructed by the researchers themselves. Most were time-sharing systems that supported a number of simultaneous users. By 1970, many groups had settled on the Digital Equipment Corporation (DEC) PDP-10 computer and the Tenex operating system developed at BBN. This standardization enabled researchers at different sites to share software, including networking software.

By the late 1970s, the Unix operating system, originally developed at Bell Labs, had become the system of choice for researchers, because it ran on DEC's inexpensive (relative to other systems) VAX line of computers. During the late 1970s and early 1980s, an ARPA-funded project at the University of California at Berkeley (UC-Berkeley) produced a version of Unix (the Berkeley System Distribution, or BSD) that included tightly integrated networking capabilities. The BSD was rapidly adopted by the research community because the availability of source code made it a useful experimental tool. In addition, it ran on both VAX machines and the personal workstations provided by the fledgling Sun Microsystems, Inc., several of whose founders came from the Berkeley group. The TCP/IP suite was now available on most of the computing platforms used by the research community.

Standards and Management

Unlike the various telecommunications networks, the Internet has no owner. It is a federation of commercial service providers, local educational networks, and private corporate networks, exchanging packets using TCP/IP and other, more specialized protocols. To become part of the Internet, a user need only connect a computer to a port on a service provider's router, obtain an IP address, and begin communicating. To add an entire network to the Internet is a bit trickier, but not extraordinarily so, as demonstrated by the tens of thousands of networks with tens of millions of hosts that constitute the Internet today.

The primary technical problem in the Internet is the standardization of its protocols. Today, this is accomplished by the Internet Engineering Task Force (IETF), a voluntary group interested in maintaining and expanding the scope of the Internet. Although this group has undergone many changes in name and makeup over the years, it traces its roots directly to Stephen Crocker's NWG, which defined the first ARPANET protocol in 1969. The NWG defined the system of requests for comments (RFCs) that are still used to specify protocols and discuss other engineering issues. Today's RFCs are still formatted as they were in 1969, eschewing the decorative fonts and styles that pervade today's Web.

Joining the IETF is a simple matter of asking to be placed on its mailing list, attending thrice-yearly meetings, and participating in the work. This grassroots group is far less formal than organizations such as the International Telecommunications Union, which defines telephony standards through the work of members who are essentially representatives of various governments. The open approach to Internet standards reflects the academic roots of the network.

Closing the Decade

The 1970s were a time of intensive research in networking. Much of the technology used today was developed during this period. Several networks other than ARPANET were assembled, primarily for use by computer scientists in support of their own research. Most of the work was funded by ARPA, although the NSF provided educational support for many researchers and was beginning to consider establishing a large-scale academic network.

During this period, ARPA pursued high-risk research with the potential for high payoffs. Its work was largely ignored by AT&T, and the major computer companies, notably IBM and DEC, began to offer proprietary networking solutions that competed with, rather than applied, the ARPA-developed technologies. 3 Yet the technologies developed under ARPA contract ultimately resulted in today's Internet. It is debatable whether a more risk-averse organization lacking the hands-on program management style of ARPA could have produced the same result.

Operation of the ARPANET was transferred to the Defense Communication Agency in 1975. By the end of the decade, the ARPANET had matured sufficiently to provide services. It remained in operation until 1989, when it was superseded by subsequent networks. The stage was now set for the Internet, which was first used by scientists, then by academics in many disciplines, and finally by the world at large.

The NSFNET Years: 1980-1990

During the late 1970s, several networks were constructed to serve the needs of particular research communities. These networks—typically funded by the federal agency that was the primary supporter of the research area—included MFENet, which the Department of Energy established to give its magnetic fusion energy researchers access to supercomputers, and NASA's Space Physics Analysis Network (SPAN). The NSF began supporting network infrastructure with the establishment of CSNET, which was intended to link university computer science departments with the ARPANET. The CSNET had one notable property that the ARPANET lacked: it was open to all computer science researchers, whereas only ARPA contractors could use the ARPANET. An NSF grant to plan the CSNET was issued to Larry Landweber at the University of Wisconsin in 1980.

The CSNET was used throughout the 1980s, but as it and other regional networks began to demonstrate their usefulness, the NSF launched a much more ambitious effort, the NSFNET. From the start, the NSFNET was designed to be a network of networks—an "internet"—with a high-speed backbone connecting NSF's five supercomputer centers and the National Center for Atmospheric Research. To oversee the new network, the NSF hired Dennis Jennings from Trinity College, Dublin. In the early 1980s, Jennings had been responsible for the Irish Higher Education Authority network (HEANet), and so he was well qualified for the task. One of Jennings' first decisions was to select TCP/IP as the primary protocol suite for the NSFNET.

Because the NSFNET was to be an internet (the beginning of today's Internet), specialized computers called routers were needed to pass traffic between networks at the points where the networks met. Today, routers are the primary products of multibillion-dollar companies (e.g., Cisco Systems Incorporated, Bay Networks), but in 1985, few commercial products were available. The NSF chose the "Fuzzball" router designed by David Mills at the University of Delaware (Mills, 1988). Working with ARPA support, Mills improved the protocols used by the routers to communicate the network topology among themselves, a critical function in a large-scale network.

Another technology required for the rapidly growing Internet was the Domain Name Service (DNS). Developed by Paul Mockapetris at the University of Southern California's Information Sciences Institute, the DNS provides for hierarchical naming of hosts. An administrative entity, such as a university department, can assign host names as it wishes. It also has a domain name, issued by the higher-level authority of which it is a part. (Thus, a host named xyz in the computer science department at UC-Berkeley would be named xyz.cs.berkeley.edu. ) Servers located throughout the Internet provide translation between the host names used by human users and the IP addresses used by the Internet protocols. The name-distribution scheme has allowed the Internet to grow much more rapidly than would be possible with centralized administration.
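The delegation chain in a name like xyz.cs.berkeley.edu is read right to left: .edu delegates berkeley, berkeley delegates cs, and the cs department names its own hosts. A toy resolver can walk the labels against a nested table of delegations; the table and the address below are invented for illustration, and a real resolver queries a chain of name servers rather than a local dictionary.

```python
# Hypothetical delegation table, mirroring DNS's right-to-left hierarchy.
ZONES = {
    "edu": {"berkeley": {"cs": {"xyz": "128.32.0.99"}}},  # made-up address
}

def resolve(name: str):
    """Walk the labels of `name` from the top-level domain down."""
    node = ZONES
    for label in reversed(name.split(".")):
        node = node[label]
    return node

assert resolve("xyz.cs.berkeley.edu") == "128.32.0.99"
```

Because each level only has to know its immediate children, new departments and hosts can be added without touching any central registry, which is exactly the property that let the Internet's namespace scale.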

Jennings left the NSF in 1986. He was succeeded by Stephen Wolff, who oversaw the deployment and growth of the NSFNET. During Wolff's tenure, the speed of the backbone, originally 56 kilobits per second, was increased 1,000-fold, and a large number of academic and regional networks were connected to the NSFNET. The NSF also began to expand the reach of the NSFNET beyond its supercomputing centers through its Connections program, which targeted the research and education community. In response to the Connections solicitation, the NSF received innovative proposals from what would become two of the major regional networks: SURANET and NYSERNET. These groups proposed to develop regional networks with a single connection to the NSFNET, instead of connecting each institution independently.

Hence, the NSFNET evolved into a three-tiered structure in which individual institutions connected to regional networks that were, in turn, connected to the backbone of the NSFNET. The NSF agreed to provide seed funding for connecting regional networks to the NSFNET, with the expectation that, as a critical mass was reached, the private sector would take over the management and operating costs of the Internet. This decision helped guide the Internet toward self-sufficiency and eventual commercialization (Computer Science and Telecommunications Board, 1994).

As the NSFNET expanded, opportunities for privatization grew. Wolff saw that commercial interests had to participate and provide financial support if the network were to continue to expand and evolve into a large, single internet. The NSF had already (in 1987) contracted with Merit Computer Network Incorporated at the University of Michigan to manage the backbone. Merit later formed a consortium with IBM and MCI Communications Corporation called Advanced Network and Services (ANS) to oversee upgrades to the NSFNET. Instead of reworking the existing backbone, ANS added a new, privately owned backbone for commercial services in 1991. 4

Emergence of the Web: 1990 to the Present

By the early 1990s, the Internet was international in scope, and its operation had largely been transferred from the NSF to commercial providers. Public access to the Internet expanded rapidly thanks to the ubiquitous nature of the analog telephone network and the availability of modems for connecting computers to this network. Digital transmission became possible throughout the telephone network with the deployment of optical fiber, and the telephone companies leased their broadband digital facilities for connecting routers and regional networks to the developers of the computer network. In April 1995, all commercialization restrictions on the Internet were lifted. Although still primarily used by academics and businesses, the Internet was growing, with the number of hosts reaching 250,000. Then the invention of the Web catapulted the Internet to mass popularity almost overnight.

The idea for the Web was simple: provide a common format for documents stored on server computers, and give each document a unique name that can be used by a browser program to locate and retrieve the document. Because the unique names (called universal resource locators, or URLs) are long, including the DNS name of the host on which they are stored, URLs would be represented as shorter hypertext links in other documents. When the user of a browser clicks a mouse on a link, the browser retrieves and displays the document named by the URL.
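Python's standard urllib.parse module shows the pieces a browser extracts from a URL: the protocol to speak, the DNS name of the host to contact, and the path of the document on that host. The URL below is an arbitrary example, not a real address.

```python
from urllib.parse import urlparse

parts = urlparse("http://www.example.edu/reports/1993/mosaic.html")
assert parts.scheme == "http"                      # protocol to use
assert parts.netloc == "www.example.edu"           # host, resolved via DNS
assert parts.path == "/reports/1993/mosaic.html"   # document on that host
```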

This idea was implemented by Timothy Berners-Lee and Robert Cailliau at CERN, the high-energy physics laboratory in Geneva, Switzerland, funded by the governments of participating European nations. Berners-Lee and Cailliau proposed to develop a system of links between different sources of information. Certain parts of a file would be made into nodes, which, when called up, would link the user to other, related files. The pair devised a document format called Hypertext Markup Language (HTML), a variant of the Standard Generalized Markup Language long used in the publishing industry. It was released at CERN in May 1991. In July 1992, a new Internet protocol, the Hypertext Transfer Protocol (HTTP), was introduced to improve the efficiency of document retrieval. Although the Web was originally intended to improve communications within the physics community at CERN, it—like e-mail 20 years earlier—rapidly became the new killer application for the Internet.
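Once DNS has resolved the host in a URL, what a browser sends over the connection is a short text request naming the document's path. The sketch below only builds the request string rather than opening a network connection; the path is a historically flavored example, and `http_get_request` is an invented helper name.

```python
def http_get_request(path: str) -> str:
    """Build a minimal HTTP/1.0-style GET request for a document path.
    The blank line (bare CRLF) marks the end of the request."""
    return f"GET {path} HTTP/1.0\r\n\r\n"

req = http_get_request("/hypertext/WWW/TheProject.html")
assert req.startswith("GET /hypertext/WWW/TheProject.html HTTP/1.0")
assert req.endswith("\r\n\r\n")
```

The server's reply is equally plain: a status line, some headers, a blank line, and then the HTML document itself, which is what made the protocol easy to implement on the many platforms of the early 1990s.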

The idea of hypertext was not new. One of the first demonstrations of a hypertext system, in which a user could click a mouse on a highlighted word in a document and immediately access a different part of the document (or, in fact, another document entirely), occurred at the 1968 Fall Joint Computer Conference in San Francisco. At this conference, Douglas Engelbart of SRI gave a stunning demonstration of his oN-Line System, or NLS (Engelbart, 1986), which provided many of the capabilities of today's Web browsers, albeit limited to a single computer. Engelbart's Augment project was supported by funding from NASA and ARPA. Engelbart was awarded the Association for Computing Machinery's 1997 A. M. Turing Award for this work. Although NLS never became commercially successful, its mouse-driven user interface inspired researchers at Xerox PARC, who were developing personal computing technology.

Widespread use of the Web, which now accounts for the largest volume of Internet traffic, was accelerated by the development in 1993 of the Mosaic graphical browser. This innovation, by Marc Andreessen at the NSF-funded National Center for Supercomputing Applications, enabled the use of hyperlinks to video, audio, and graphics, as well as text. More important, it provided an effective interface that allowed users to point and click on a menu or fill in a blank to search for information.

The development of the Internet and the World Wide Web has had a tremendous impact on the U.S. economy and society more broadly. By January 1998, almost 30 million host computers were connected to the Internet (Zakon, 1998), and more than 58 million users in the United States and Canada were estimated to be online (Nielsen Media Research, 1997). Numerous companies now sell Internet products worth billions of dollars. Cisco Systems, a leader in network routing technology, for example, reported sales of $8.5 billion in 1998. Netscape Communications Corporation, which commercialized the Mosaic browser, had sales exceeding $530 million in 1997.[5] Microsoft Corporation also entered the market for Web browsers and now competes head-to-head with Netscape. A multitude of other companies offer hardware and software for Internet-based systems.

The Internet has also paved the way for a host of services. Companies like Yahoo! and InfoSeek provide portals to the Internet and have attracted considerable attention from Wall Street investors. Other companies, like Amazon.com and Barnes & Noble, have established online stores. Amazon had online sales of almost $150 million for books in 1997.[6] Electronic commerce, more broadly, is taking hold in many types of organizations, from PC manufacturers to retailers to travel agencies. Although estimates of the value of these services vary widely, they all reflect a growing sector of the economy that is wholly dependent on the Internet. Internet retailing could reach $7 billion by the year 2000, and online sales of travel services are expected to approach $8 billion around the turn of the century. Forrester Research estimates that businesses will buy and sell $327 billion worth of goods over the Internet by the year 2002 (Blane, 1997).

The Web has been likened to the world's largest library—with the books piled in the middle of the floor. Search engines, which are programs that follow the Web's hypertext links and index the material they discover, have improved the organization somewhat but are difficult to use, frequently deluging the user with irrelevant information. Although developments in computing and networking over the last 40 years have realized some of the potential described by visionaries such as Licklider and Engelbart, the field continues to offer many opportunities for innovation.

Lessons from History

The development of the Internet demonstrates that federal support for research, applied at the right place and right time, can be extremely effective. DARPA's support gave visibility to the work of individual researchers on packet switching and resulted in the development of the first large-scale packet-switched network. Continued support for experimentation led to the development of networking protocols and applications, such as e-mail, that were used on the ARPANET and, subsequently, the Internet.

By bringing together a diverse mix of researchers from different institutions, such federal programs helped the Internet gain widespread acceptance and established it as a dominant mode of internetworking. Government programs such as ARPANET and NSFNET created a large enough base of users to make the Internet more attractive in many applications than proprietary networking systems being offered by a number of vendors. Though a number of companies continue to sell proprietary systems for wide area networking, some of which are based on packet-switched technology, these systems have not achieved the ubiquity of the Internet and are used mainly within private industry.

Research in packet switching evolved in unexpected directions and had unanticipated consequences. It was originally pursued to make more-efficient use of limited computing capabilities and later seen as a means of linking the research and education communities. The most notable result, however, was the Internet, which has dramatically improved communication across society, changing the way people work, play, and shop. Although DARPA and the NSF were successful in creating an expansive packet-switched network to facilitate communication among researchers, it took the invention of the Web and its browsers to make the Internet more broadly accessible and useful to society.

The widespread adoption of Internet technology has created a number of new companies in industries that did not exist 20 years ago, and most companies that did exist 20 years ago are incorporating Internet technology into their business operations. Companies such as Cisco Systems, Netscape Communications, Yahoo!, and Amazon.com are built on Internet technologies and their applications and generate billions of dollars annually in combined sales revenues. Electronic commerce is also maturing into an established means of conducting business.

The complementary missions and operating styles of federal agencies are important to the development and implementation of new technologies. Whereas DARPA supported early research on packet switching and development of the ARPANET, it was not prepared to support an operational network, nor did it expand its network beyond DARPA-supported research institutions. With its charter to support research and education, the NSF both supported an operational network and greatly expanded its reach, effectively building the infrastructure for the Internet.

1. Several other case studies of the Internet have also been written in recent years. In addition to the references cited in the text, see Leiner et al. (1998) and SRI International (1997).

2. Personal communication from Robert W. Taylor, former director of the Information Processing Techniques Office, Defense Advanced Research Projects Agency, August 1988.

3. IBM and AT&T did support some in-house research on packet switching, but at the level of individual researchers. This work did not figure prominently in AT&T's plans for network deployment, nor did it receive significant attention at IBM, though researchers in both organizations published important papers.

4. Ferreiro, Mirna. 1996. "The Past and Future History of the Internet," research paper for International 610. George Mason University, Fairfax, Va., November.

5. Sales figures in this paragraph derive from annual reports filed by the companies cited.

6. Sales revenues as reported in Amazon.com's 1997 Annual Report, available online.

The past 50 years have witnessed a revolution in computing and related communications technologies. The contributions of industry and university researchers to this revolution are manifest; less widely recognized is the major role the federal government played in launching the computing revolution and sustaining its momentum. Funding a Revolution examines the history of computing since World War II to elucidate the federal government's role in funding computing research, supporting the education of computer scientists and engineers, and equipping university research labs. It reviews the economic rationale for government support of research, characterizes federal support for computing research, and summarizes key historical advances in which government-sponsored research played an important role.

Funding a Revolution contains a series of case studies in relational databases, the Internet, theoretical computer science, artificial intelligence, and virtual reality that demonstrate the complex interactions among government, universities, and industry that have driven the field. It offers a series of lessons that identify factors contributing to the success of the nation's computing enterprise and the government's role within it.


World Wide Web Foundation


Sir Tim Berners-Lee invented the World Wide Web in 1989.


Sir Tim Berners-Lee is a British computer scientist. He was born in London, and his parents were early computer scientists, working on one of the earliest computers.

Growing up, Sir Tim was interested in trains and had a model railway in his bedroom. He recalls:

“I made some electronic gadgets to control the trains. Then I ended up getting more interested in electronics than trains. Later on, when I was in college I made a computer out of an old television set.”

After graduating from Oxford University, Berners-Lee became a software engineer at CERN, the large particle physics laboratory near Geneva, Switzerland. Scientists come from all over the world to use its accelerators, but Sir Tim noticed that they were having difficulty sharing information.

“In those days, there was different information on different computers, but you had to log on to different computers to get at it. Also, sometimes you had to learn a different program on each computer. Often it was just easier to go and ask people when they were having coffee…”, Tim says.

Tim thought he saw a way to solve this problem – one that, he could see, could also have much broader applications. Already, millions of computers were being connected together through the fast-developing internet, and Berners-Lee realised they could share information by exploiting an emerging technology called hypertext.

In March 1989, Tim laid out his vision for what would become the web in a document called “Information Management: A Proposal”. Believe it or not, Tim’s initial proposal was not immediately accepted. In fact, his boss at the time, Mike Sendall, noted the words “Vague but exciting” on the cover. The web was never an official CERN project, but Mike managed to give Tim time to work on it in September 1990. He began work using a NeXT computer, one of Steve Jobs’ early products.

Tim's original proposal. Image: CERN

By October of 1990, Tim had written the three fundamental technologies that remain the foundation of today’s web (and which you may have seen appear on parts of your web browser):

  • HTML: HyperText Markup Language. The markup (formatting) language for the web.
  • URI: Uniform Resource Identifier. A kind of “address” that is unique and used to identify each resource on the web. It is also commonly called a URL.
  • HTTP: Hypertext Transfer Protocol. Allows for the retrieval of linked resources from across the web.
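A rough sketch of how the three pieces fit together: a URI names a document, HTTP is the request the browser sends for it, and HTML is what comes back. The address and markup below are illustrative only, not real CERN resources:

```python
from urllib.parse import urlsplit

# URI: a unique name for the resource (a made-up address).
uri = "http://info.example.org/hypertext/WWW/TheProject.html"
parts = urlsplit(uri)

# HTTP: the request a browser would send to the host named in the URI.
request = (
    f"GET {parts.path} HTTP/1.0\r\n"
    f"Host: {parts.netloc}\r\n"
    "\r\n"
)

# HTML: the kind of document the server returns; the <a href> tag is the
# hyperlink tying this document to another.
html = '<p>See the <a href="Summary.html">project summary</a>.</p>'

print(request.splitlines()[0])  # GET /hypertext/WWW/TheProject.html HTTP/1.0
```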

Tim also wrote the first web page editor/browser (“WorldWideWeb.app”) and the first web server (“httpd”). By the end of 1990, the first web page was served on the open internet, and in 1991, people outside of CERN were invited to join this new web community.

As the web began to grow, Tim realised that its true potential would only be unleashed if anyone, anywhere could use it without paying a fee or having to ask for permission.

He explains: “Had the technology been proprietary, and in my total control, it would probably not have taken off. You can’t propose that something be a universal space and at the same time keep control of it.”

So, Tim and others advocated to ensure that CERN would agree to make the underlying code available on a royalty-free basis, forever. This decision was announced in April 1993, and sparked a global wave of creativity, collaboration and innovation never seen before. In 2003, the companies developing new web standards committed to a Royalty Free Policy for their work. In 2014, the year we celebrated the web’s 25th birthday, almost two in five people around the world were using it.

Tim moved from CERN to the Massachusetts Institute of Technology in 1994 to found the World Wide Web Consortium (W3C), an international community devoted to developing open web standards. He remains the Director of W3C to this day.

The early web community produced some revolutionary ideas that are now spreading far beyond the technology sector:

  • Decentralisation: No permission is needed from a central authority to post anything on the web, there is no central controlling node, and so no single point of failure … and no “kill switch”! This also implies freedom from indiscriminate censorship and surveillance.
  • Non-discrimination: If I pay to connect to the internet with a certain quality of service, and you pay to connect with that or a greater quality of service, then we can both communicate at the same level. This principle of equity is also known as Net Neutrality.
  • Bottom-up design: Instead of code being written and controlled by a small group of experts, it was developed in full view of everyone, encouraging maximum participation and experimentation.
  • Universality: For anyone to be able to publish anything on the web, all the computers involved have to speak the same languages to each other, no matter what different hardware people are using; where they live; or what cultural and political beliefs they have. In this way, the web breaks down silos while still allowing diversity to flourish.
  • Consensus: For universal standards to work, everyone had to agree to use them. Tim and others achieved this consensus by giving everyone a say in creating the standards, through a transparent, participatory process at W3C.

New permutations of these ideas are giving rise to exciting new approaches in fields as diverse as information (Open Data), politics (Open Government), scientific research (Open Access), education, and culture (Free Culture). But to date we have only scratched the surface of how these principles could change society and politics for the better.

In 2009, Sir Tim co-founded the World Wide Web Foundation with Rosemary Leith. The Web Foundation is fighting for the web we want: a web that is safe, empowering and for everyone.

Please do explore our site and our work. We hope you’ll be inspired by our vision and decide to take action. Remember, as Tim tweeted during the Olympics Opening Ceremony in 2012, “This is for Everyone”.

This is for everyone #london2012 #oneweb #openingceremony @webfoundation @w3c — Tim Berners-Lee (@timberners_lee) July 27, 2012

Important Note: This text is intended as a brief introduction to the history of the web. For a more detailed account, you might want to consider reading:

A Little History of the World Wide Web

  • W3C’s 10th Anniversary (timeline)
  • “Weaving the Web” by Tim Berners-Lee
  • Frequently Asked Questions, and Answers for Young People, by Sir Tim Berners-Lee on the W3C website.


The History and Purpose of the World Wide Web

Historical Background

The World Wide Web is an exciting area for discovery and innovation. It is built around the concept of hypertext. This involves linking one document to another document so that the second can explain a term or concept in the first. For instance, let's say you're reading a document on salmon, and you reach a section where the word "roe" is marked in some way to make it stand out from the rest of the text (usually underlined and/or in color). If you move your cursor to the word "roe" and click once (not twice, as in some operating systems!), you'll view a document which explains what roe are and may add some characteristics of salmon roe. Exciting, eh? It's a way of building in definitions and additional explanatory material without disrupting the flow of the text for those who may not need that material.

Many electronic encyclopedias have used this same concept, some with quite interesting audiovisual items in place of text explanations. (Actually, the concept of hypertext pre-dates the WWW.) This is the other exciting part of the World Wide Web: the graphics and sound capabilities. You can link documents to images and pictures and sound clips for a more exciting package. Just click on the word "lion," for instance, and you'll hear it roar. Many folks who use World Wide Web home pages as business cards will have a photograph of themselves or their family included on the page (see my home page for an example). A home page is the first page or screen at a World Wide Web site.

Beyond the hypertext and audiovisuals, the World Wide Web is really a web of information and connections. Each item you choose in a Web site will either take you to another spot in the site you were in or will lead you to another part of the Web. You can always get back to where you were by retracing your steps.

What's In A Name?

Just to mention this so that you won't be confused later, the Web has a number of different acronyms and abbreviations. It can be referred to in any of the following ways:

  • World Wide Web (three syllables)
  • WWW (nine syllables)
  • W3 (four syllables)
  • the Web (two syllables)

Who Invented the WWW? (this section of the lesson was adapted from CERN's web site)

In late 1990, Tim Berners-Lee, a computer scientist at the European Laboratory for Particle Physics (CERN), invented the World Wide Web (which you are currently using; you may recall a reference to this in Dan Brown's Angels and Demons). The "Web" was originally conceived and developed for the large high-energy physics collaborations, which demand instantaneous information sharing between physicists working in different universities and institutes all over the world. Tim, together with Robert Cailliau, wrote the first WWW client (a browser-editor running under NeXTStep, a cousin of the Macintosh) and the first WWW server, along with most of the communications software, defining URLs, HTTP and HTML.

The World-Wide Web was first developed as a tool for collaboration in the high energy physics community. From there it spread rapidly to other fields, and grew to its present impressive size. As an easy way to access information, it has been a great success. But there is another side to the Web, its potential as a tool for collaboration between people. Here is some background to the early development of the World-Wide Web, a brief overview of its present state and an introduction to the concepts on which it is based.

In spite of all this enthusiasm for electronic communication, there were many obstacles in the 1980s to the effective exchange of information. There was a great variety of computer and network systems, with hardly any common features. Users needed to understand many inconsistent and complicated systems. Different types of information had to be accessed in different ways, involving a big investment of effort by users. The result was frustration and inefficiency.

This was fertile soil for the invention of the World-Wide Web by Berners-Lee. Using the WWW, scientists could at last access information from any source in a consistent and simple way. The launching of this revolutionary idea was made possible by the widespread adoption of the Internet around that time. This provided a de facto standard for communication between computers, on which WWW could be built. It also brought into being a "virtual community" of enthusiastic computer and communications experts, whose attitude fostered progress via the exchange of information over the Internet.

The first proposal for such a system was made at CERN by Berners-Lee in 1989, and was further refined by him and Cailliau in 1990. By the end of that year, prototype software for a basic system was already being demonstrated. To encourage the adoption of the system, it was essential to offer access to existing information without having to convert it to an unfamiliar format. This was done by providing an interface to the CERN Computer Centre's documentation and help service, and also to the then-dominant Usenet newsgroups. All this information immediately became accessible via a simple WWW browser, which could be run on any system.

The early system included this browser, along with an information server and a library implementing the essential functions for developers to build their own software. This was released in 1991 to the high energy physics community via the CERN program library (via FTP), so that a whole range of universities and research laboratories could start to use it. A little later it was made generally available via the Internet, especially to the community of people working on hypertext systems. By the beginning of 1993 there were around 50 known information servers.

At this stage, there were essentially only two kinds of browsers. One was the original development version, very sophisticated but only available on NeXT computers. The other was the "line-mode" browser, which was easy to install and run on any computer platform but limited in power and user-friendliness. It was clear that the small team at CERN could not do all the work needed to develop the system further, so Berners-Lee launched a plea via the Internet for other developers to join in.

Early in 1993, the National Center for Supercomputing Applications (NCSA) at the University of Illinois released a first version of their Mosaic graphical web browser. This software ran in the X Window System environment (an interface for UNIX computers), popular in the research community. It could thus offer friendly window-based interaction on a platform in widespread use. Shortly afterwards NCSA also released versions for the Windows and Macintosh environments. By late 1993 there were over 500 known servers, and WWW accounted for 1% of Internet traffic, which seemed a lot in those days. And the rest is history!

Tim Berners-Lee now works at the Laboratory for Computer Science at the Massachusetts Institute of Technology (MIT), where he has taken up a research appointment. By the way, he receives no royalties for his invention of the WWW!

More about URLs (pronounced "Earls")

The naming convention used to accomplish this trick is called a URL (Uniform Resource Locator). URLs are the standard naming format for the Internet. This is one of the several technologies developed by Berners-Lee at CERN. They allow web browsers to take you to many different kinds of Internet resources. You can tell which kind of resource you're about to visit by looking at its URL (pronounced "Earl"). Here is a table of some common URL prefixes:

  • http:// - a hypertext (Web) document
  • ftp:// - a file on an FTP server
  • gopher:// - a Gopher menu or document
  • telnet:// - a remote login session on another computer
  • mailto: - an email address
  • news: - a Usenet newsgroup
  • file:// - a file on your own computer
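You can check a prefix programmatically by pulling the scheme off the front of a URL. Here's a minimal sketch using Python's standard library; the URLs and the kind_of helper are hypothetical, for illustration only:

```python
from urllib.parse import urlsplit

# Hypothetical mapping from URL prefix (scheme) to the kind of resource.
KIND = {
    "http": "a hypertext (Web) document",
    "ftp": "a file on an FTP server",
    "mailto": "an email address",
    "telnet": "a remote login session",
}

def kind_of(url: str) -> str:
    """Describe the kind of resource a URL points to."""
    return KIND.get(urlsplit(url).scheme, "an unknown resource type")

print(kind_of("ftp://ftp.example.edu/pub/readme.txt"))  # a file on an FTP server
print(kind_of("mailto:student@example.edu"))            # an email address
```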

Now that we've discussed the history of the WWW, it's time to talk about a few rules of the road. Understanding netiquette, or Internet etiquette, will keep you from making an electronic faux pas and will enable you to tell when others do. Netiquette is as close as the Internet comes to having rules. Like every society in history, the Net has to have a few mutually agreed upon guidelines to efficiently manage resources and maintain group harmony. Chaos reigns in some corners of the Net, but it's in your best interest to be a solid, upstanding netizen.

One List of Rules

Below I have listed one set of netiquette rules that illustrate the major issues and common errors made in Internet discourse. This section is taken from a classic online document prepared by Arlene Rinaldi for basic netiquette instruction. In her essay, Rinaldi discusses many forms of internet communication, including email, forum postings, web pages, etc. Her suggestions continue to provide the common sense "rules of the road" for communicating efficiently and professionally on the Internet and thus should be something with which every professional should be familiar:

Electronic Communications

(Email, ListServ mailing lists, and other discussion groups)

  • Keep paragraphs and messages short and to the point.
  • Focus on one subject per message and always include a pertinent subject title for the message, that way the user can locate the message quickly.
  • Don't use academic networks (like Augsburg's) for commercial or proprietary work.

  • Your signature footer should include your name, position, affiliation, and Internet addresses, and should not exceed 10 lines. Optional information could include your address, phone number, and URL for a WWW homepage.

  • Capitalize words only to highlight an important point or to distinguish a title or heading. *Asterisks* surrounding a word also can be used to make a stronger point. Capitalizing whole words that are not titles is generally termed SHOUTING!
  • Limit line length and avoid control (special non-language) characters.
  • Follow chain of command procedures for corresponding with superiors. For example, don't send a complaint via Email directly to the "top" just because you can.
  • Be professional and careful what you say about others. Email is easily forwarded.
  • Cite all quotes, references and sources and respect copyright and license agreements.
  • It is considered extremely rude to forward personal email to mailing lists or Usenet without the original author's permission.
  • Be careful when using sarcasm and humor. Without face to face communications your joke may be viewed as criticism.

Examples:

  • IMHO = in my humble/honest opinion
  • FYI = for your information
  • BTW = by the way
  • Flame = antagonistic criticism
  • :-) = happy face for humor

However, messages that are filled with acronyms can be confusing and annoying to the reader, i.e. FYI IMHO I FLAME newbies who don't RTFM.

Rinaldi's list gives you the gist of what you should and should not do with email in general. She also has sections in her document that cover other Internet applications we'll discuss later. The full article is available here.

What follows is an expansion on certain of her rules that I would like to stress.

Capitalization

As Rinaldi mentions, capitalization is viewed as shouting except in the case of acronyms or product names. More often than not, a person typing their entire message in caps is not shouting, but is merely less than knowledgeable about the rules. These folks are newbies or new users and should be treated gently. Don't forget--everyone has been a newbie at some point. Subtle reminders should do the job here.

There are cases in the Internet where you may need to use caps (or refrain from them) to use an application that resides on a Unix computer. Unix machines are extremely case-sensitive, and if you attempt to type a command in lower case that is intended to be in upper case you will be stymied. So, if you read about a site you'd like to look at and all the commands are given in upper case, be sure to enter them this way. Otherwise, just type along in mostly lower case and enjoy the laid-back land of the Internet! 8-)

Emoticons, Emoji, or Smileys

Of course, I was being sarcastic in that last line, as I have been from time to time in the lessons. However, it's very difficult to tell what the true intent of a line of text is in email. In this virtual world, there are no facial movements or changes in tone that can make a derogatory comment into a friendly joke. One academic study has shown that people correctly interpret the intended tone of an email only about 50 percent of the time (Kruger, Gordan, and Kuban, 2006). Because of this, we need to make some changes in our standard typing to convey emotions. One way is to emphasize words or phrases using *asterisks* or _underline symbols_. Another is the use of emoticons (for emotion icons)--symbols that stand for emotive content.

The little 8-) character following my last sentence about capitalization is just one example of an emoticon, which are also called smileys or emoji. If you turn your head to the left, it looks a little like a guy with glasses on smiling. Feel free to create your own, but you can choose from the selection below.

  • :-) basic smiley face
  • ;-) sarcasm
  • :-( user is unhappy
  • 8-) user wears glasses
  • B-) user wears horn-rimmed glasses
  • (-: user is left handed
  • :*) user is drunk
  • :-@ user is screaming
  • d8= your pet beaver is wearing goggles and a hard hat

You can see from this list how quickly the whole communicative aspect deteriorates and the fun begins. 8-) There are lists and dictionaries of smileys available on the Internet (see the assignment for this lesson for a few addresses).

Now we'll return to that capitalization-filled message that isn't a mistake: a flame. Flaming is an often-angry, mean-spirited attack on another person via email. It is a major breach of netiquette to flame someone. It's rather counterproductive and usually the result of either a quick move to judgment or a sadistic temperament. Unfortunately, there is little you can do when you have been flamed. Responding in kind brings only joy to the flamer and provides you with only momentary satisfaction. My advice is to contact a systems administrator at the flamer's institution and register a complaint. Those who flame repeatedly can have their accounts shut down. You may be lucky enough to never be flamed personally (I haven't, unless I'm just too naive 8-) ). I just wanted you to be familiar with the term since it often shows up in heated discussions (i.e., "I'm not trying to flame here, but I think your ideas are full of . . . ").

Subject Headings

I cannot emphasize enough here the importance of a good subject heading for your communications. If a person or a listserv sends me a message without a subject heading, I will generally delete it. The same goes for email attachments and shared documents. Everything you share on the Internet should be properly labeled. Subject headings with non-descriptive titles like "Internet" or "Hi!" can be a bother to deal with.

Now don't get me wrong. I read every piece of email I receive from folks involved in this class. However, when I'm reading messages from other correspondents I need a coherent, meaningful subject heading to help me decide when (or if) I should read the message. Try to be as descriptive as possible in the space provided. Other users out there (who are even pickier than I am) will thank you. Most important, your message will get read.

One issue that crops up periodically (we discussed it in the previous lesson, but it bears repeating) is recycling old messages to start a new email conversation. This happens when someone uses an old message left over in their inbox to begin a correspondence on a new topic but leaves the old subject line intact. Because replies inherit the original message's subject and labels, as assigned by your email correspondent, your mis-labeled new message may be tagged or filed inappropriately (e.g., flagged as spam or filtered to some other folder). The moral of the story? Always start a new conversation with a fresh email, and always label your subject accurately!
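Under the hood, a recycled message misbehaves because most mail clients group messages into conversations using headers that every reply inherits from the original. Here is a minimal sketch using Python's standard email library (the message ID and subjects are hypothetical, and exact threading behavior varies from client to client):

```python
from email.message import EmailMessage

# An old message sitting in your inbox (hypothetical ID and subject).
old = EmailMessage()
old["Message-ID"] = "<abc123@example.org>"
old["Subject"] = "Re: Report on NAFTA - *Long*"

# Replying to it to raise a brand-new topic: the reply inherits the old
# thread's identity via In-Reply-To/References, so many clients will file
# it under the old conversation -- along with any filters applied to it.
recycled = EmailMessage()
recycled["In-Reply-To"] = old["Message-ID"]
recycled["References"] = old["Message-ID"]
recycled["Subject"] = "Softball signup this Friday"

# A fresh message carries no such baggage: its own subject, and no
# threading headers pointing at an unrelated conversation.
fresh = EmailMessage()
fresh["Subject"] = "Softball signup this Friday"

print(recycled["In-Reply-To"])  # <abc123@example.org>
print(fresh["In-Reply-To"])     # None
```

In other words, "start fresh" is not just good manners; it is the only way to guarantee your new topic gets its own conversation thread.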

Message and Signature File Length

Rinaldi mentions keeping messages and signature files short, and I shout, "Hear, hear!" There's nothing wrong with sending a long message per se, but you want to be sure of two things: one, that your recipients will actually want to see the whole message, and two, that you let them know in the subject line that it is a long message (e.g., "Subj: Report on NAFTA - *Long*"). Long messages aren't much of a mechanical or financial difficulty, but they can be annoying. So keep your messages manageable in size and, if possible, warn your recipients.

With signature files, you generally want to stick with the ten-line rule to keep your messages shorter (see above for reasons). Adding a couple of extra lines is no great sin, but more than that drives some people bonkers. The line limit comes out of an early prevalence of _long_, _long_ signature files (we're talking 20-30+ lines) that ate up space and served only artistic (and egotistic) purposes. You will occasionally see longer ones, but please try to be courteous.

A Movement to Take Back Email

Some people believe that email and associated net junk is a problem worth confronting with political action. Hence a group of concerned netizens is attempting to provoke a discussion about better and more efficient ways to use email. They call their movement The Email Charter. Their goal is to reduce and simplify the rules of when and how to send email (the "netiquette of email") by proposing 10 rules. They claim "(t)he average time taken to respond to an email is greater, in aggregate, than the time it took to create it." Take a look at their proposed rules and vote for the idea if you wish at: < http://emailcharter.org/ >.

Conclusion

After letting this lesson soak in, you should now be able to confidently dine at the Internet table and use the right fork. With this knowledge in your cranium, I now pronounce you ready to professionally participate on the Internet wherever you wish.

References and links to more information

Kruger, J., Gordon, C., & Kuban, J. (2006). Intentions in teasing: When "just kidding" just isn't good enough. Journal of Personality and Social Psychology, 90, 412-425.

Rinaldi, A. (1998). Electronic communications (email, LISTSERV groups, mailing lists, and Usenet). In The Net: User Guidelines and Netiquette [Online]. Available via WWW at: http://courses.cs.vt.edu/~cs3604/lib/Netiquette/Rinaldi/

History of the Internet : This up-to-date interactive website provides a comprehensive illustrated history of the Internet including some possibilities for future directions and links to more information.

Internet Lessons version 2.2. Copyright of lessons (C) 2017 by Joseph A. Erickson, All Rights Reserved. Permission Granted for Individual Usage.

If you plan to distribute multiple copies of this work, please contact the author.


World Wide Web Timeline

Since its founding in 1989, the World Wide Web has touched the lives of billions of people around the world and fundamentally changed how we connect with others, the nature of our work, how we discover and share news and new ideas, how we entertain ourselves and how communities form and function.

The timeline below is the beginning of an effort to capture both the major milestones and small moments that have shaped the Web since 1989. It is a living document that we will update with your contributions. To suggest an item to add to the timeline, please  message us .

The World Wide Web begins as a CERN (European Organization for Nuclear Research) project called ENQUIRE, initiated by British scientist Tim Berners-Lee . Other names considered for the project include “The Information Mesh” and “The Mine of Information.”

  • AOL launches its Instant Messenger chat service and begins welcoming users with the iconic greeting “You’ve got mail!”


  • 42% of American adults have used a computer.
  • World’s first website and server  go live at CERN , running on Tim Berners-Lee’s NeXT computer, which bears the message “This machine is a server. DO NOT POWER DOWN!”
  • Tim Berners-Lee develops the first Web browser WorldWideWeb .
  • Archie, the first tool for searching the internet, is developed by McGill University student Alan Emtage.

Researchers rig up a live shot of a coffee pot so they can tell from their computer screens when a fresh pot has been brewed. Later connected to the World Wide Web, it becomes the first webcam.

  • The term “ surfing the internet ” is coined and popularized.


Tim Berners-Lee posts the first photo , of the band “Les Horribles Cernettes,” on the Web.

  • The line-mode browser launches . It is the first readily accessible browser for the World Wide Web.
  • CERN places its World Wide Web technology in the public domain , donating it to the world.
  • The National Center for Supercomputing Applications (NCSA) releases Mosaic 1.0 , the first web browser to become popular with the general public. “The web as we know it begins to flourish,” Wired later writes .
  • The New York Times writes about the Web browser Mosaic and the World Wide Web for the first time. “Think of it as a map to the buried treasures of the Information Age.”
  • Marc Andreessen  proposes  the IMG HTML tag to allow the display of images on the Web.

  • 11 million American households are “ equipped to ride the information superhighway .”


One of the first known Web purchases takes place: a pepperoni pizza with mushrooms and extra cheese from Pizza Hut .

  • President Bill Clinton’s White House comes online .
  • Yahoo! is created  by Stanford University graduate students Jerry Yang and David Filo. They originally named the site “Jerry and David’s Guide to the World Wide Web.”

The first banner ad for hotwired.com appears, with the text “Have you ever clicked your mouse right HERE? —> YOU WILL.”

  • Two lawyers post the first massive, commercial spam message  with the subject “Green Card Lottery -Final One?”
  • 18 million American homes are now online, but only 3% of online users have ever signed on to the World Wide Web .


Amazon.com opens for business , billing itself as the “Earth’s Biggest Bookstore.”

  • Craig Newmark starts craigslist , originally an email list of San Francisco events.
  • Match.com, the first online dating site, launches .
  • Entrepreneur Pierre Omidyar launches eBay , originally named “AuctionWeb.” He lists the first item for sale: a broken laser pointer . A collector purchases it for $14.83.
  • Chris Lamprecht becomes the first person to be banned from the internet by judicial decree. “I told the judge computers were my life,” Lamprecht later recalled.
  • Netscape IPO starts the gold rush mentality for Web startups.
  • Microsoft releases Windows 95  and the first version of Internet Explorer .
  • Web hosting service GeoCities launches.
  • 77% of online users send or receive e-mail at least once every few weeks, up from 65% in 1995.


Nokia releases the Nokia 9000 Communicator , the first cellphone with internet capabilities.

  • HoTMaiL launches as one of the world's first Webmail services, its name a reference to HTML, the markup language used to build webpages.
  • The Dancing Baby , a 3D animation, becomes one of the first viral videos.


  • Netflix  launches as a company  that sends DVDs to homes via mail.
  • Go Daddy launches as Jomax Technologies.
  • Google.com  registers as a domain .
  • Jorn Barger  becomes the first person to use the term “Weblog”  to describe the list of links on his website.


20% of Americans get news from the internet at least once a week, up from 4% in 1995.

  • AOL launches AOL 4.0 and inundates American homes with CD-ROM mailers . AOL membership jumps from 8 million to 16 million members.
  • The Internet Corporation for Assigned Names and Numbers (ICANN) takes over responsibility for the coordination of the global internet's systems of unique identifiers.
  • Oxford Dictionary adds “spam” and “digerati.”
  • Pew Research Center tests online polling  with mixed results.
  • MP3 downloading service Napster launches, overloading high-speed networks in college dormitories. Many colleges ban the service and it is later shut down for enabling the illegal sharing of music files.
  • Yahoo! acquires GeoCities for $3.6 billion.
  • 43% of internet users say they would miss going online “a lot,” up from 32% in 1995.
  • 78% of internet users who download music  don’t think it’s stealing  to save music files to their computer hard drives.

40 million Americans – or 48% of internet users – have purchased a product online .

  • 32% of internet users (over 30 million people)  sent e-greeting cards to loved ones and friends.


The NASDAQ hits a record high of 5,048 before plunging by 78% during the dot-com bust. A 2001 survey finds 71% of Americans who had heard about the dot-com troubles believe a major cause of the dot-com woes was that investors were eager to make a lot of money and took a lot of risks.

  • AOL acquires Time Warner for $165 billion. The New York Times says "it could be the internet companies that do the buying and the old media that sell out."
  • Only 3% of internet users say they got most of their information about the 9/11 attacks and the aftermath from the internet.

The average internet user spends  83 minutes  online.

  • Jimmy Wales launches  Wikipedia . Users write over 20,000 encyclopedia entries in the first year.
  • 55 million people now go online from work  and 44% of those who have internet access at work say their use of the internet helps them do their jobs.


Social networking site Friendster.com launches, but is quickly  overtaken by Facebook .

  • Microsoft launches Xbox Live, its online multiplayer gaming service. "Critics scoffed at the idea, noting how uncommon broadband connections were at the time."

Apple  launches the iTunes Music Store  with 200,000 songs at 99¢ each. The store sells one million songs in its first week.

  • Skype, a voice-over-IP calling and instant messaging service, launches and quickly becomes a verb, as in “Skype me.”
  • Professional networking site LinkedIn launches.
  • MySpace.com  is founded  and quickly adopted by musicians seeking to share music and build their fan bases.
  • President George W. Bush signs the CAN-SPAM Act into law , establishing the first national standards for the sending of commercial email.
  • WordPress blog publishing system created.
  • 11% of American internet users follow the returns on election night online. One-in-ten internet users sign up for political email newsletters and news alerts during the campaign.


Harvard student Mark Zuckerberg launches thefacebook.com. 1,200 Harvard students sign up within the first 24 hours . Facebook goes on to become the world’s biggest social networking site, with over a billion users worldwide.

  • Google starts trading on the NASDAQ at $85 a share .
  • Social news website Digg launches. Digg users vote to "digg up" links that they like and "bury" those they don't.
  • Mozilla releases Firefox 1.0 .
  • Massively multiplayer online role-playing game (MMORPG) World of Warcraft launches.
  • 8% of adult American internet users say they participate in sports fantasy leagues online.
  • 9% of internet users (13 million Americans)  went online to donate money  to the victims of Gulf Coast hurricanes Katrina and Rita.

About one-in-six online adults – 25 million people – have sold something online .

  • Broadband connections surpass dial-up connections.
  • Community news site  Reddit  is founded. It is bought by Conde Nast a year later for $20 million.
  • Rupert Murdoch’s News Corp. buys MySpace for $580 million and  sells it  in 2011 for $35 million.
  • YouTube is founded on Valentine’s Day. The first video , an explanation of what’s cool about elephants, is uploaded by co-founder Jawed Karim on April 23. Google acquires the company a year later.

  • The late Senator Ted Stevens describes the internet as “a series of tubes,” during a 2006 speech on net neutrality. His quote is  mocked by Boing Boing  and the  Daily Show  and inspires YouTube remixes.
  • Google acquires YouTube for $1.65 billion . YouTube founders Chad and Steve announce the Google acquisition in a  video recorded in a parking lot : “The king of search and the king of video have gotten together.”
  • Twitter launches. Founder Jack Dorsey sends the first tweet: “just setting up my twttr”
  • 36% of American online adults consult Wikipedia .
  • 32% of Americans have at least heard about Hillary and Bill Clinton’s video parody of the final episode of “The Sopranos” and 19% have actually seen it.
  • 36% of Americans say they would have a hard time giving up their Blackberry or other wireless email device, up from 6% in 2002.


  • Estonia becomes the world’s first country to use internet voting in a parliamentary election .
  • Three-quarters (74%) of internet users – or 55% of the entire U.S. adult population — say they went online during the presidential election to take part in or get news and information about the campaign.
  • 19% of cellphone owners say they have gone online with their phones .
  • Google releases the Chrome Web browser .
  • HTML5  is introduced.
  • Deal-of-the-day website Groupon launches.
  • Apple launches its App Store  with 552 applications.
  • Microsoft offers to buy Yahoo! for $44.6 billion , but the two companies  cannot agree on a purchase price .
  • World of Warcraft hits 11.5 million subscribers worldwide. Guinness Book of World Records names it the most popular MMORPG .
  • 69% of Americans turn to the internet to cope with and understand the recession .
  • Microsoft’s  Bing  search engine launches.
  • Twitter raises $98 million from investors,  valuing the company at a whopping $1 billion .
  • The Web is transfixed by the tale of a six-year-old boy  flying over Colorado  in a weather balloon. The story later proves to be a hoax .
  • Kanye West’s VMA outburst sparks an internet meme .
  • Viral videos like  David After Dentist , Susan Boyle , Baby Dancing to Beyonce , and the  JK Wedding Entrance Dance  launch ordinary people into newfound Web stardom.

  • 35% of adults have cell phones with apps, but only two-thirds of those owners actually use them.
  • Social photo-sharing sites Pinterest and Instagram launch.
  • Wikileaks collaborates with major media organizations to release U.S. diplomatic cables .
  • Ex-Facebook employees launch user-based question and answer site Quora .
  • 15% of social media-using teens say they have been the target of online meanness .
  • 68% of all Americans say the internet has had a major impact on the ability of groups to communicate with members .
  • LinkedIn reaches 100 million users and debuts on NYSE .
  • Microsoft buys Skype  for $8.5 billion.
  • Google+ launches .
  • Young Egyptians use the hashtags #Egypt and #Jan25 on Twitter to spread the word about the Egyptian Revolution.  The government responds by shutting down the internet .
  • Rebecca Black’s “ Friday ” becomes a YouTube sensation .
  • 66% of internet users use Facebook and 12% use Instagram .
  • Among the 13% of US adults who made a financial contribution to a presidential candidate, 50% donated online or via email .
  • Facebook reaches 1 billion monthly active users, making it the dominant social network worldwide. Some analysts start calling it “ Facebookistan .” The company buys Instagram  for $1 billion and  debuts on NASDAQ at $38 a share.
  • South Korean music star PSY’s “ Gangnam Style ” video surpasses Justin Bieber’s “ Baby ” as the most viewed video ever, with over 800 million views.
  • Ecommerce sales top $1 trillion worldwide .
  • The Internet Society founds the Internet Hall of Fame  to “celebrate people who bring the internet to life.”
  • A majority (56%) of Americans  now own a smartphone of some kind .
  • 51% of U.S. adults bank online .


Former CIA employee and NSA contractor Edward Snowden turns over thousands of classified documents to media organizations, exposing a top-secret government data surveillance program.

  • Apple says app store downloads  top 40 billion , with 20 billion in 2012 alone.
  • Twitter files for its long-awaited IPO . Shares soar 73% above their IPO price of $26 a share on the first day of trading.


  • 45% of internet users ages 18-29 in serious relationships say the internet has had an impact on their relationship .
  • Facebook buys messaging app Whatsapp for $19 billion.


ABOUT PEW RESEARCH CENTER  Pew Research Center is a nonpartisan fact tank that informs the public about the issues, attitudes and trends shaping the world. It conducts public opinion polling, demographic research, media content analysis and other empirical social science research. Pew Research Center does not take policy positions. It is a subsidiary of  The Pew Charitable Trusts .

© 2024 Pew Research Center

Tim Berners-Lee

In response to a request, a one-page look back on the development of the Web from my point of view. Written 1998/05/07

The World Wide Web: A very short personal history

There have always been things which people are good at, and things computers have been good at, and little overlap between the two. I was brought up to understand this distinction in the 50s and 60s: that intuition and understanding were human characteristics, and that computers worked mechanically in tables and hierarchies.

One of the things computers have not done for an organization is to be able to store random associations between disparate things, although this is something the brain has always done relatively well. In 1980 I played with programs to store information with random links, and in 1989, while working at the European Particle Physics Laboratory, I proposed that a global hypertext space be created in which any network-accessible information could be referred to by a single "Universal Document Identifier". Given the go-ahead to experiment by my boss, Mike Sendall, I wrote in 1990 a program called "WorldWideWeb", a point-and-click hypertext editor which ran on the "NeXT" machine. This, together with the first Web server, I released to the High Energy Physics community at first, and to the hypertext and NeXT communities in the summer of 1991. Also available was a "line mode" browser by student Nicola Pellow, which could be run on almost any computer. The specifications of UDIs (now URIs), HyperText Markup Language (HTML) and HyperText Transfer Protocol (HTTP) were published on the first server in order to promote wide adoption and discussion.

The dream behind the Web is of a common information space in which we communicate by sharing information. Its universality is essential: the fact that a hypertext link can point to anything, be it personal, local or global, be it draft or highly polished. There was a second part of the dream, too, dependent on the Web being so generally used that it became a realistic mirror (or in fact the primary embodiment) of the ways in which we work and play and socialize. That was that once the state of our interactions was on line, we could then use computers to help us analyse it, make sense of what we are doing, where we individually fit in, and how we can better work together.

The first three years were a phase of persuasion, aided by my colleague and first convert Robert Cailliau, to get the Web adopted. We needed Web clients for other platforms (as the NeXT was not ubiquitous) and browsers Erwise, Viola, Cello and Mosaic eventually came on the scene. We needed seed servers to provide incentive and examples, and all over the world inspired people put up all kinds of things.

Between the summers of 1991 and 1994, the load on the first Web server ("info.cern.ch") rose steadily by a factor of 10 every year. In 1992 academia, and in 1993 industry, was taking notice. I was under pressure to define the future evolution. After much discussion I decided to form the World Wide Web Consortium in September 1994, with a base at MIT in the USA, INRIA in France, and now also at Keio University in Japan. The Consortium is a neutral open forum where companies and organizations to whom the future of the Web is important come to discuss and to agree on new common computer protocols. It has been a center for issue raising, design, and decision by consensus, and also a fascinating vantage point from which to view that evolution.

With the dramatic flood of rich material of all kinds onto the Web in the 1990s, the first part of the dream is largely realized, although still very few people in practice have access to intuitive hypertext creation tools. The second part has yet to happen, but there are signs and plans which make us confident. The great need for information about information, to help us categorize, sort, pay for, own information is driving the design of languages for the web designed for processing by machines, rather than people. The web of human-readable documents is being merged with a web of machine-understandable data. The potential of the mixture of humans and machines working together and communicating through the web could be immense.

Back to main Bio

Internet history timeline: ARPANET to the World Wide Web

The internet history timeline shows how today's vast network evolved from the initial concept


In internet history, credit for the initial concept that developed into the World Wide Web is typically given to Leonard Kleinrock. In 1961, he published a paper entitled "Information Flow in Large Communication Nets," laying out the packet-switching theory that would later underpin ARPANET, the predecessor of the internet.

According to the journal Management and Business Review (MBR), Kleinrock, along with other innovators such as J.C.R. Licklider, the first director of the Information Processing Technology Office (IPTO), provided the backbone for the ubiquitous stream of emails, media, Facebook postings and tweets that are now shared online every day.


The precursor to the internet was jumpstarted in 1969, in the early days of the history of computers , with the U.S. Defense Department's Advanced Research Projects Agency Network (ARPANET), according to the journal American Scientist . ARPA-funded researchers developed many of the protocols used for internet communication today. This timeline offers a brief history of the internet's evolution:

Internet timeline: 1960s

1965: Two computers at MIT Lincoln Lab communicate with one another using packet-switching technology.

1968: Bolt Beranek and Newman, Inc. (BBN) unveils the final version of the Interface Message Processor (IMP) specifications and wins the ARPANET contract.

1969: On Oct. 29, UCLA's Network Measurement Center, Stanford Research Institute (SRI), the University of California, Santa Barbara and the University of Utah install nodes. The first message is "LO," an attempt by student Charles Kline to type "LOGIN" into the SRI computer from UCLA; the message could not be completed because the SRI system crashed.


1970–1980

1972: BBN’s Ray Tomlinson introduces network email. The Internet Working Group (INWG) forms to address need for establishing standard protocols.

1973: Global networking becomes a reality as University College London (England) and Norway's NORSAR research facility connect to ARPANET. The term internet is born.

1974: The first Internet Service Provider (ISP) is born with the introduction of a commercial version of ARPANET, known as Telenet.

1974: Vinton Cerf and Bob Kahn (the duo said by many to be the Fathers of the Internet) publish "A Protocol for Packet Network Interconnection," which details the design of TCP .

1976: Queen Elizabeth II hits the “send button” on her first email.

1979: USENET forms to host news and discussion groups.

1980–1990

1981: The National Science Foundation (NSF) provides a grant to establish the Computer Science Network (CSNET), offering networking services to university computer scientists.

1982: Transmission Control Protocol (TCP) and Internet Protocol (IP), commonly known together as the TCP/IP protocol suite, emerge as the protocols for ARPANET. This results in the fledgling definition of the internet as a set of connected TCP/IP networks. TCP/IP remains the standard protocol suite for the internet.

1983: The Domain Name System (DNS) establishes the familiar .edu, .gov, .com, .mil, .org, .net, and .int system for naming hosts on the internet. Names are easier to remember than numeric addresses such as 192.0.2.10.
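To make the contrast concrete: a dotted-quad IPv4 address is just a human-friendly rendering of a single 32-bit number, and each of its four octets must fit in the range 0-255. A quick sketch using Python's standard ipaddress module (192.0.2.10 is from a reserved documentation block, chosen here purely as an example):

```python
import ipaddress

# The dotted-quad form is a rendering of one 32-bit integer.
addr = ipaddress.IPv4Address("192.0.2.10")
print(int(addr))                          # 3221225994
print(ipaddress.IPv4Address(3221225994))  # 192.0.2.10, and back again

# Each octet must fit in 8 bits (0-255), so a string like
# "123.456.789.10" is not a valid IPv4 address at all:
try:
    ipaddress.IPv4Address("123.456.789.10")
except ipaddress.AddressValueError as err:
    print("invalid:", err)
```

DNS's job is to let people type the memorable name while machines resolve it to a number like this behind the scenes.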

1984: William Gibson, author of "Neuromancer," is the first to use the term "cyberspace."

1985: Symbolics.com, the website for Symbolics Computer Corp. in Massachusetts, becomes the first registered domain.

1986: The National Science Foundation's NSFNET goes online to connect supercomputer centers at 56,000 bits per second, the speed of a typical dial-up modem. Over time the network speeds up, and regional research and education networks, supported in part by NSF, are connected to the NSFNET backbone, effectively expanding the internet throughout the United States. The NSFNET is essentially a network of networks that connects academic users along with the ARPANET.

1987: The number of hosts on the internet exceeds 20,000. Cisco ships its first router.

1989: World.std.com becomes the first commercial provider of dial-up access to the internet.


1990–2000

1990: Tim Berners-Lee, a scientist at CERN, the European Organization for Nuclear Research, develops HyperText Markup Language (HTML). This technology continues to have a large impact on how we navigate and view the internet today.

1991: CERN introduces the World Wide Web to the public.

1992: The first audio and video are distributed over the internet. The phrase "surfing the internet" is popularized.

1993: The number of websites reaches 600, and the White House and United Nations go online. Marc Andreessen develops the Mosaic Web browser at the University of Illinois, Urbana-Champaign. The number of computers connected to NSFNET grows from 2,000 in 1985 to more than 2 million in 1993. The National Science Foundation leads an effort to outline a new internet architecture to support the burgeoning commercial use of the network.

1994: Netscape Communications is born. Microsoft creates a Web browser for Windows 95.

1994: Yahoo! is created by Jerry Yang and David Filo, two electrical engineering graduate students at Stanford University. The site was originally called "Jerry and David's Guide to the World Wide Web." The company was later incorporated in March 1995.

1995: Compuserve, America Online and Prodigy begin to provide internet access. Amazon.com, Craigslist and eBay go live. The original NSFNET backbone is decommissioned as the internet’s transformation to a commercial enterprise is largely completed.

1995: The first online dating site, Match.com, launches.

1996: The browser war, primarily between the two major players Microsoft and Netscape, heats up. CNET buys tv.com for $15,000.

1996: A 3D animation dubbed " The Dancing Baby " becomes one of the first viral videos.

1997: Netflix is founded by Reed Hastings and Marc Randolph as a company that sends users DVDs by mail.


1997: PC makers can remove or hide Microsoft’s internet software on new versions of Windows 95, thanks to a settlement with the Justice Department. Netscape announces that its browser will be free.

1998: The Google search engine is born, changing the way users engage with the internet.

1998: Internet Protocol version 6 (IPv6) is introduced to allow for future growth of internet addresses. The currently most widely used protocol is version 4. IPv4 uses 32-bit addresses, allowing for 4.3 billion unique addresses; IPv6, with 128-bit addresses, will allow about 3.4 x 10^38 unique addresses, or 340 trillion trillion trillion.
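The address-space figures above follow directly from the address widths and can be checked with a couple of lines of arithmetic:

```python
# IPv4 addresses are 32 bits wide and IPv6 addresses are 128 bits wide,
# so the number of distinct addresses is 2 raised to those widths.
ipv4_space = 2 ** 32
ipv6_space = 2 ** 128

print(ipv4_space)            # 4294967296 -- about 4.3 billion
print(f"{ipv6_space:.1e}")   # 3.4e+38 -- i.e. 340 trillion trillion trillion
```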

1999: AOL buys Netscape. Peer-to-peer file sharing becomes a reality as Napster arrives on the Internet, much to the displeasure of the music industry.

2000–2010

2000: The dot-com bubble bursts. Websites such as Yahoo! and eBay are hit by a large-scale denial-of-service attack, highlighting the vulnerability of the internet. AOL merges with Time Warner.

2001: A federal judge shuts down Napster, ruling that it must find a way to stop users from sharing copyrighted material before it can go back online.

2003: The SQL Slammer worm spreads worldwide in just 10 minutes. Myspace, Skype and the Safari Web browser debut.

2003: The blog publishing platform WordPress is launched.

2004: Facebook goes online and the era of social networking begins. Mozilla unveils the Mozilla Firefox browser.

2005: YouTube.com launches. The social news site Reddit is also founded. 

2006: AOL changes its business model, offering most services for free and relying on advertising to generate revenue. The Internet Governance Forum meets for the first time.

2006: Twitter launches. The company's founder, Jack Dorsey, sends out the very first tweet: "just setting up my twttr."

2009: The internet marks its 40th anniversary.

2010–2020

2010: Facebook reaches 400 million active users.

2010: The social media sites Pinterest and Instagram are launched.

2011: Twitter and Facebook play a large role in the Middle East revolts.

2012: President Barack Obama's administration announces its opposition to major parts of the Stop Online Piracy Act and the Protect Intellectual Property Act, which would have enacted broad new rules requiring internet service providers to police copyrighted content. The successful push to stop the bill, involving technology companies such as Google and nonprofit organizations including Wikipedia and the Electronic Frontier Foundation, is considered a victory for sites such as YouTube that depend on user-generated content, as well as "fair use" on the internet.

2013: Edward Snowden, a former CIA employee and National Security Agency (NSA) contractor, reveals that the NSA had in place a monitoring program capable of tapping the communications of thousands of people, including U.S. citizens.

2013: Fifty-one percent of U.S. adults report that they bank online, according to a survey conducted by the Pew Research Center.


2015: Instagram, the photo-sharing site, reaches 400 million users, surpassing Twitter, which reaches 316 million users by the middle of the same year.

2016: Google unveils Google Assistant, a voice-activated personal assistant program, marking the entry of the internet giant into the "smart" computerized assistant marketplace. Google joins Amazon's Alexa, Siri from Apple, and Cortana from Microsoft.

2018: Internet-enabled devices proliferate; the Internet of Things (IoT) grows to around seven billion devices by the end of the year.

2019: Fifth-generation (5G) networks launch, enabling faster internet connections on some wireless devices.

2020–2022

2021: By January 2021, 4.66 billion people are connected to the internet, more than half of the global population.

2022: Low-Earth-orbit satellite internet comes closer to reality. By early January 2022, SpaceX has launched more than 1,900 Starlink satellites, and the constellation now provides broadband service in select areas around the world.

To find out more about the SpaceX satellite internet project, you can watch this video about the mission. Additionally, to read an interview with Leonard Kleinrock, visit the Communications of the ACM website.

  • " Leonard Kleinrock Internet Pioneer ". Management and Business Review (2022). 
  • " The Science of Computing: The ARPANET after Twenty Years ". American Scientist (1989). 
  • " A brief history of the internet ". Association for Computing Machinery (AGM) (2009). 
  • " Internet Protocol, Version 6 (IPv6) Specification ". S. Deering, R. Hinden (1998). 
  • " Distributed denial of service attacks ". IEEE International Conference on Systems, Man and Cybernetics (2000). 
  • " Statistics and Social Network of YouTube Videos ". 2008 16th Interntional Workshop on Quality of Service (2008). 
  • " Social Media and Crisis Communication ".  (Routledge, 2017). 


Kim Ann Zimmermann is a contributor to Live Science and sister site Space.com, writing mainly evergreen reference articles that provide background on myriad scientific topics, from astronauts to climate, and from culture to medicine. Her work can also be found in Business News Daily and KM World. She holds a bachelor’s degree in communications from Glassboro State College (now known as Rowan University) in New Jersey. 


A Brief History of the Internet

Introduction, published 1997.

Barry M. Leiner, Vinton G. Cerf, David D. Clark, Robert E. Kahn, Leonard Kleinrock, Daniel C. Lynch, Jon Postel, Larry G. Roberts, Stephen Wolff

The Internet has revolutionized the computer and communications world like nothing before. The invention of the telegraph, telephone, radio, and computer set the stage for this unprecedented integration of capabilities. The Internet is at once a world-wide broadcasting capability, a mechanism for information dissemination, and a medium for collaboration and interaction between individuals and their computers without regard for geographic location. The Internet represents one of the most successful examples of the benefits of sustained investment and commitment to research and development of information infrastructure. Beginning with the early research in packet switching, the government, industry and academia have been partners in evolving and deploying this exciting new technology. Today, terms like "[email protected]" and "http://www.acm.org" trip lightly off the tongue of the random person on the street. 1

This is intended to be a brief, necessarily cursory and incomplete history. Much material currently exists about the Internet, covering history, technology, and usage. A trip to almost any bookstore will find shelves of material written about the Internet.  2


In this paper, 3  several of us involved in the development and evolution of the Internet share our views of its origins and history. This history revolves around four distinct aspects. There is the technological evolution that began with early research on packet switching and the ARPANET (and related technologies), and where current research continues to expand the horizons of the infrastructure along several dimensions, such as scale, performance, and higher-level functionality. There is the operations and management aspect of a global and complex operational infrastructure. There is the social aspect, which resulted in a broad community of Internauts working together to create and evolve the technology. And there is the commercialization aspect, resulting in an extremely effective transition of research results into a broadly deployed and available information infrastructure.

The Internet today is a widespread information infrastructure, the initial prototype of what is often called the National (or Global or Galactic) Information Infrastructure. Its history is complex and involves many aspects – technological, organizational, and community. And its influence reaches not only to the technical fields of computer communications but throughout society as we move toward increasing use of online tools to accomplish electronic commerce, information acquisition, and community operations.

Origins of the Internet

The first recorded description of the social interactions that could be enabled through networking was a  series of memos  written by J.C.R. Licklider of MIT in August 1962 discussing his “Galactic Network” concept. He envisioned a globally interconnected set of computers through which everyone could quickly access data and programs from any site. In spirit, the concept was very much like the Internet of today. Licklider was the first head of the computer research program at DARPA, 4  starting in October 1962. While at DARPA he convinced his successors at DARPA, Ivan Sutherland, Bob Taylor, and MIT researcher Lawrence G. Roberts, of the importance of this networking concept.

Leonard Kleinrock at MIT published the  first paper on packet switching theory  in July 1961 and the  first book on the subject  in 1964. Kleinrock convinced Roberts of the theoretical feasibility of communications using packets rather than circuits, which was a major step along the path towards computer networking. The other key step was to make the computers talk together. To explore this, in 1965 working with Thomas Merrill, Roberts connected the TX-2 computer in Mass. to the Q-32 in California with a low speed dial-up telephone line creating the  first (however small) wide-area computer network ever built . The result of this experiment was the realization that the time-shared computers could work well together, running programs and retrieving data as necessary on the remote machine, but that the circuit switched telephone system was totally inadequate for the job. Kleinrock’s conviction of the need for packet switching was confirmed.

In late 1966 Roberts went to DARPA to develop the computer network concept and quickly put together his  plan for the “ARPANET” , publishing it in 1967. At the conference where he presented the paper, there was also a paper on a packet network concept from the UK by Donald Davies and Roger Scantlebury of NPL. Scantlebury told Roberts about the NPL work as well as that of Paul Baran and others at RAND. The RAND group had written a paper on packet switching networks for secure voice  in the military in 1964. It happened that the work at MIT (1961-1967), at RAND (1962-1965), and at NPL (1964-1967) had all proceeded in parallel without any of the researchers knowing about the other work. The word “packet” was adopted from the work at NPL and the proposed line speed to be used in the ARPANET design was upgraded from 2.4 kbps to 50 kbps.  5

In August 1968, after Roberts and the DARPA funded community had refined the overall structure and specifications for the ARPANET, an RFQ was released by DARPA for the development of one of the key components, the packet switches called Interface Message Processors (IMP’s). The RFQ was won in December 1968 by a group headed by Frank Heart at Bolt Beranek and Newman (BBN). As the BBN team worked on the IMP’s with Bob Kahn playing a major role in the overall ARPANET architectural design, the network topology and economics were designed and optimized by Roberts working with Howard Frank and his team at Network Analysis Corporation, and the network measurement system was prepared by Kleinrock’s team at UCLA.  6

Due to Kleinrock’s early development of packet switching theory and his focus on analysis, design and measurement, his Network Measurement Center at UCLA was selected to be the first node on the ARPANET. All this came together in September 1969 when BBN installed the first IMP at UCLA and the first host computer was connected. Doug Engelbart’s project on “Augmentation of Human Intellect” (which included NLS, an early hypertext system) at Stanford Research Institute (SRI) provided a second node. SRI supported the Network Information Center, led by Elizabeth (Jake) Feinler and including functions such as maintaining tables of host name to address mapping as well as a directory of the RFC’s.

One month later, when SRI was connected to the ARPANET, the first host-to-host message was sent from Kleinrock’s laboratory to SRI. Two more nodes were added at UC Santa Barbara and University of Utah. These last two nodes incorporated application visualization projects, with Glen Culler and Burton Fried at UCSB investigating methods for display of mathematical functions using storage displays to deal with the problem of refresh over the net, and Robert Taylor and Ivan Sutherland at Utah investigating methods of 3-D representations over the net. Thus, by the end of 1969, four host computers were connected together into the initial ARPANET, and the budding Internet was off the ground. Even at this early stage, it should be noted that the networking research incorporated both work on the underlying network and work on how to utilize the network. This tradition continues to this day.

Computers were added quickly to the ARPANET during the following years, and work proceeded on completing a functionally complete Host-to-Host protocol and other network software. In December 1970 the Network Working Group (NWG) working under S. Crocker finished the initial ARPANET Host-to-Host protocol, called the Network Control Protocol (NCP). As the ARPANET sites completed implementing NCP during the period 1971-1972, the network users finally could begin to develop applications.

In October 1972, Kahn organized a large, very successful demonstration of the ARPANET at the International Computer Communication Conference (ICCC). This was the first public demonstration of this new network technology to the public. It was also in 1972 that the initial “hot” application, electronic mail, was introduced. In March Ray Tomlinson at BBN wrote the basic email message send and read software, motivated by the need of the ARPANET developers for an easy coordination mechanism. In July, Roberts expanded its utility by writing the first email utility program to list, selectively read, file, forward, and respond to messages. From there email took off as the largest network application for over a decade. This was a harbinger of the kind of activity we see on the World Wide Web today, namely, the enormous growth of all kinds of “people-to-people” traffic.

The Initial Internetting Concepts

The original ARPANET grew into the Internet. Internet was based on the idea that there would be multiple independent networks of rather arbitrary design, beginning with the ARPANET as the pioneering packet switching network, but soon to include packet satellite networks, ground-based packet radio networks and other networks. The Internet as we now know it embodies a key underlying technical idea, namely that of open architecture networking. In this approach, the choice of any individual network technology was not dictated by a particular network architecture but rather could be selected freely by a provider and made to interwork with the other networks through a meta-level “Internetworking Architecture”. Up until that time there was only one general method for federating networks. This was the traditional circuit switching method where networks would interconnect at the circuit level, passing individual bits on a synchronous basis along a portion of an end-to-end circuit between a pair of end locations. Recall that Kleinrock had shown in 1961 that packet switching was a more efficient switching method. Along with packet switching, special purpose interconnection arrangements between networks were another possibility. While there were other limited ways to interconnect different networks, they required that one be used as a component of the other, rather than acting as a peer of the other in offering end-to-end service.

In an open-architecture network, the individual networks may be separately designed and developed and each may have its own unique interface which it may offer to users and/or other providers, including other Internet providers. Each network can be designed in accordance with the specific environment and user requirements of that network. There are generally no constraints on the types of network that can be included or on their geographic scope, although certain pragmatic considerations will dictate what makes sense to offer.

The idea of open-architecture networking was first introduced by Kahn shortly after having arrived at DARPA in 1972. This work was originally part of the packet radio program, but subsequently became a separate program in its own right. At the time, the program was called “Internetting”. Key to making the packet radio system work was a reliable end-end protocol that could maintain effective communication in the face of jamming and other radio interference, or withstand intermittent blackout such as caused by being in a tunnel or blocked by the local terrain. Kahn first contemplated developing a protocol local only to the packet radio network, since that would avoid having to deal with the multitude of different operating systems, and continuing to use NCP.

However, NCP did not have the ability to address networks (and machines) further downstream than a destination IMP on the ARPANET and thus some change to NCP would also be required. (The assumption was that the ARPANET was not changeable in this regard). NCP relied on ARPANET to provide end-to-end reliability. If any packets were lost, the protocol (and presumably any applications it supported) would come to a grinding halt. In this model NCP had no end-end host error control, since the ARPANET was to be the only network in existence and it would be so reliable that no error control would be required on the part of the hosts. Thus, Kahn decided to develop a new version of the protocol which could meet the needs of an open-architecture network environment. This protocol would eventually be called the Transmission Control Protocol/Internet Protocol (TCP/IP). While NCP tended to act like a device driver, the new protocol would be more like a communications protocol.

Four ground rules were critical to Kahn’s early thinking:

  • Each distinct network would have to stand on its own and no internal changes could be required to any such network to connect it to the Internet.
  • Communications would be on a best effort basis. If a packet didn’t make it to the final destination, it would shortly be retransmitted from the source.
  • Black boxes would be used to connect the networks; these would later be called gateways and routers. There would be no information retained by the gateways about the individual flows of packets passing through them, thereby keeping them simple and avoiding complicated adaptation and recovery from various failure modes.
  • There would be no global control at the operations level.
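The second ground rule, best-effort delivery with recovery at the source, can be sketched as a toy simulation. This Python snippet is not from the source; the function name, loss rate, and retry limit are illustrative assumptions:

```python
import random

def send_best_effort(packet, loss_rate=0.3, max_tries=10, rng=None):
    """Toy model of Kahn's 'best effort' rule: the network may drop a
    packet, and recovery is the source's job, via retransmission.
    The gateways keep no per-flow state, so they play no part here."""
    rng = rng or random.Random(0)           # seeded for reproducibility
    for attempt in range(1, max_tries + 1):
        delivered = rng.random() >= loss_rate
        if delivered:
            return attempt                  # number of tries that were needed
    raise TimeoutError("gave up after max_tries retransmissions")

print(send_best_effort("hello"))
```

With a loss rate of zero the first attempt always succeeds; with higher loss rates the source simply keeps retransmitting, which is exactly the division of labor the ground rules prescribe.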

Other key issues that needed to be addressed were:

  • Algorithms to prevent lost packets from permanently disabling communications and to enable them to be successfully retransmitted from the source.
  • Providing for host-to-host “pipelining” so that multiple packets could be enroute from source to destination at the discretion of the participating hosts, if the intermediate networks allowed it.
  • Gateway functions to allow them to forward packets appropriately. This included interpreting IP headers for routing, handling interfaces, breaking packets into smaller pieces if necessary, etc.
  • The need for end-end checksums, reassembly of packets from fragments and detection of duplicates, if any.
  • The need for global addressing
  • Techniques for host-to-host flow control.
  • Interfacing with the various operating systems
  • There were also other concerns, such as implementation efficiency and internetwork performance, but these were secondary considerations at first.
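The end-end checksum requirement in the list above eventually took the form of the 16-bit ones'-complement checksum used by IP, TCP, and UDP (specified later in RFC 1071). A minimal Python sketch of that style of checksum, for illustration only:

```python
def internet_checksum(data: bytes) -> int:
    """16-bit ones'-complement checksum in the style of RFC 1071, the
    form the end-end checksum requirement eventually took in IP/TCP/UDP."""
    if len(data) % 2:
        data += b"\x00"                           # pad odd-length input
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]     # sum 16-bit words
        total = (total & 0xFFFF) + (total >> 16)  # fold any carry back in
    return ~total & 0xFFFF

packet = bytes([0x00, 0x01, 0xF2, 0x03, 0xF4, 0xF5, 0xF6, 0xF7])
checksum = internet_checksum(packet)
# A receiver verifies by checksumming the packet plus the checksum field;
# an undamaged packet yields zero:
assert internet_checksum(packet + checksum.to_bytes(2, "big")) == 0
```

The fold step keeps the running sum within 16 bits, which is what makes the receiver-side check come out to zero when the data arrives intact.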

Kahn began work on a communications-oriented set of operating system principles while at BBN and documented some of his early thoughts in an internal BBN memorandum entitled “ Communications Principles for Operating Systems “. At this point he realized it would be necessary to learn the implementation details of each operating system to have a chance to embed any new protocols in an efficient way. Thus, in the spring of 1973, after starting the internetting effort, he asked Vint Cerf (then at Stanford) to work with him on the detailed design of the protocol. Cerf had been intimately involved in the original NCP design and development and already had the knowledge about interfacing to existing operating systems. So armed with Kahn’s architectural approach to the communications side and with Cerf’s NCP experience, they teamed up to spell out the details of what became TCP/IP.

The give and take was highly productive and the first written version of the resulting approach was distributed as INWG#39 at a special meeting of the International Network Working Group (INWG) at Sussex University in September 1973. Subsequently a refined version was published in 1974 7 . The INWG was created at the October 1972 International Computer Communications Conference organized by Bob Kahn, et al, and Cerf was invited to chair this group.

Some basic approaches emerged from this collaboration between Kahn and Cerf:

  • Communication between two processes would logically consist of a very long stream of bytes (they called them octets). The position of any octet in the stream would be used to identify it.
  • Flow control would be done by using sliding windows and acknowledgments (acks). The destination could select when to acknowledge and each ack returned would be cumulative for all packets received to that point.
  • It was left open as to exactly how the source and destination would agree on the parameters of the windowing to be used. Defaults were used initially.
  • Although Ethernet was under development at Xerox PARC at that time, the proliferation of LANs was not envisioned at the time, much less PCs and workstations. The original model was national level networks like ARPANET of which only a relatively small number were expected to exist. Thus a 32 bit IP address was used of which the first 8 bits signified the network and the remaining 24 bits designated the host on that network. This assumption, that 256 networks would be sufficient for the foreseeable future, was clearly in need of reconsideration when LANs began to appear in the late 1970s.
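The original 8/24-bit split can be made concrete with a few lines of Python; the function name and sample address are illustrative, not from the source:

```python
def split_original_address(addr: str):
    """Split a dotted-quad address the way the original TCP/IP design
    did: the first 8 bits name the network, the last 24 bits the host."""
    octets = [int(o) for o in addr.split(".")]
    value = (octets[0] << 24) | (octets[1] << 16) | (octets[2] << 8) | octets[3]
    network = value >> 24       # 8-bit network number: at most 256 networks
    host = value & 0xFFFFFF     # 24-bit host number on that network
    return network, host

print(split_original_address("10.0.37.5"))   # (10, 9477)
```

With only 8 bits for the network number, the whole Internet could name at most 256 networks, which is the assumption the last sentence above says had to be revisited once LANs appeared.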

The original Cerf/Kahn paper on the Internet described one protocol, called TCP, which provided all the transport and forwarding services in the Internet. Kahn had intended that the TCP protocol support a range of transport services, from the totally reliable sequenced delivery of data (virtual circuit model) to a datagram service in which the application made direct use of the underlying network service, which might imply occasional lost, corrupted or reordered packets. However, the initial effort to implement TCP resulted in a version that only allowed for virtual circuits. This model worked fine for file transfer and remote login applications, but some of the early work on advanced network applications, in particular packet voice in the 1970s, made clear that in some cases packet losses should not be corrected by TCP, but should be left to the application to deal with. This led to a reorganization of the original TCP into two protocols, the simple IP which provided only for addressing and forwarding of individual packets, and the separate TCP, which was concerned with service features such as flow control and recovery from lost packets. For those applications that did not want the services of TCP, an alternative called the User Datagram Protocol (UDP) was added in order to provide direct access to the basic service of IP.
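The TCP/UDP split described above survives directly in today's socket API: SOCK_STREAM gives the reliable TCP byte stream, while SOCK_DGRAM gives the bare UDP datagram service on top of IP. A minimal loopback UDP exchange in Python (addresses and message text are illustrative):

```python
import socket

# SOCK_DGRAM is the UDP service: direct access to IP's basic delivery,
# with no flow control or loss recovery, as described above.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))               # OS picks a free port
port = receiver.getsockname()[1]

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"one datagram", ("127.0.0.1", port))

data, _ = receiver.recvfrom(1024)
print(data)                                    # b'one datagram'

sender.close()
receiver.close()
```

An application wanting the virtual-circuit behavior would instead create SOCK_STREAM sockets and let TCP handle sequencing and retransmission.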

A major initial motivation for both the ARPANET and the Internet was resource sharing – for example allowing users on the packet radio networks to access the time sharing systems attached to the ARPANET. Connecting the two together was far more economical than duplicating these very expensive computers. However, while file transfer and remote login (Telnet) were very important applications, electronic mail has probably had the most significant impact of the innovations from that era. Email provided a new model of how people could communicate with each other, and changed the nature of collaboration, first in the building of the Internet itself (as is discussed below) and later for much of society.

There were other applications proposed in the early days of the Internet, including packet based voice communication (the precursor of Internet telephony), various models of file and disk sharing, and early “worm” programs that showed the concept of agents (and, of course, viruses). A key concept of the Internet is that it was not designed for just one application, but as a general infrastructure on which new applications could be conceived, as illustrated later by the emergence of the World Wide Web. It is the general purpose nature of the service provided by TCP and IP that makes this possible.

Proving the Ideas

DARPA let three contracts to Stanford (Cerf), BBN (Ray Tomlinson) and UCL (Peter Kirstein) to implement TCP/IP (it was simply called TCP in the Cerf/Kahn paper but contained both components). The Stanford team, led by Cerf, produced the detailed specification and within about a year there were three independent implementations of TCP that could interoperate.

This was the beginning of long term experimentation and development to evolve and mature the Internet concepts and technology. Beginning with the first three networks (ARPANET, Packet Radio, and Packet Satellite) and their initial research communities, the experimental environment has grown to incorporate essentially every form of network and a very broad-based research and development community.  [REK78]  With each expansion has come new challenges.

The early implementations of TCP were done for large time sharing systems such as Tenex and TOPS 20. When desktop computers first appeared, it was thought by some that TCP was too big and complex to run on a personal computer. David Clark and his research group at MIT set out to show that a compact and simple implementation of TCP was possible. They produced an implementation, first for the Xerox Alto (the early personal workstation developed at Xerox PARC) and then for the IBM PC. That implementation was fully interoperable with other TCPs, but was tailored to the application suite and performance objectives of the personal computer, and showed that workstations, as well as large time-sharing systems, could be a part of the Internet. In 1976, Kleinrock published the  first book on the ARPANET . It included an emphasis on the complexity of protocols and the pitfalls they often introduce. This book was influential in spreading the lore of packet switching networks to a very wide community.

Widespread development of LANs, PCs and workstations in the 1980s allowed the nascent Internet to flourish. Ethernet technology, developed by Bob Metcalfe at Xerox PARC in 1973, is now probably the dominant network technology in the Internet and PCs and workstations the dominant computers. This change from having a few networks with a modest number of time-shared hosts (the original ARPANET model) to having many networks has resulted in a number of new concepts and changes to the underlying technology. First, it resulted in the definition of three network classes (A, B, and C) to accommodate the range of networks. Class A represented large national scale networks (small number of networks with large numbers of hosts); Class B represented regional scale networks; and Class C represented local area networks (large number of networks with relatively few hosts).
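The class of an address can be read off the leading bits of its first octet. A small Python sketch (the function name is mine; classes D and E, for multicast and reserved space, came later and are lumped together here):

```python
def address_class(first_octet: int) -> str:
    """Classify an IPv4 address by its leading bits, per the original
    class A/B/C scheme described above."""
    if first_octet < 128:
        return "A"    # 0xxxxxxx: few networks, up to ~16.7M hosts each
    if first_octet < 192:
        return "B"    # 10xxxxxx: regional-scale networks
    if first_octet < 224:
        return "C"    # 110xxxxx: many networks, up to 254 hosts each
    return "other"    # multicast/reserved ranges, added after this era

print(address_class(18))    # A
print(address_class(128))   # B
print(address_class(192))   # C
```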

A major shift occurred as a result of the increase in scale of the Internet and its associated management issues. To make it easy for people to use the network, hosts were assigned names, so that it was not necessary to remember the numeric addresses. Originally, there were a fairly limited number of hosts, so it was feasible to maintain a single table of all the hosts and their associated names and addresses. The shift to having a large number of independently managed networks (e.g., LANs) meant that having a single table of hosts was no longer feasible, and the Domain Name System (DNS) was invented by Paul Mockapetris of USC/ISI. The DNS permitted a scalable distributed mechanism for resolving hierarchical host names (e.g.  www.acm.org ) into an Internet address.
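The hierarchy the DNS exploits can be seen by walking a name from the root down, one zone per label. This Python sketch shows only the structure of that walk; a real resolver would query a name server at each step, which is omitted here:

```python
def resolution_path(name: str):
    """List the zones a resolver walks through when resolving a
    hierarchical host name: from the top-level domain down to the
    full name. Structural sketch only; no network queries are made."""
    labels = name.rstrip(".").split(".")
    path, zone = [], ""
    for label in reversed(labels):
        zone = label + ("." + zone if zone else ".")
        path.append(zone)
    return path

print(resolution_path("www.acm.org"))
# ['org.', 'acm.org.', 'www.acm.org.']
```

Because each zone in the path can be managed and served independently, no single table of all hosts is needed, which is precisely the scaling problem the DNS solved.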

The increase in the size of the Internet also challenged the capabilities of the routers. Originally, there was a single distributed algorithm for routing that was implemented uniformly by all the routers in the Internet. As the number of networks in the Internet exploded, this initial design could not expand as necessary, so it was replaced by a hierarchical model of routing, with an Interior Gateway Protocol (IGP) used inside each region of the Internet, and an Exterior Gateway Protocol (EGP) used to tie the regions together. This design permitted different regions to use a different IGP, so that different requirements for cost, rapid reconfiguration, robustness and scale could be accommodated. Not only the routing algorithm, but the size of the addressing tables, stressed the capacity of the routers. New approaches for address aggregation, in particular classless inter-domain routing (CIDR), have recently been introduced to control the size of router tables.
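The effect of CIDR-style aggregation on router tables can be demonstrated with Python's standard library; the `ipaddress` module is a modern convenience, and the network prefixes below are illustrative:

```python
import ipaddress

# CIDR lets adjacent networks be advertised as one aggregate route,
# shrinking router tables. Four contiguous /24s collapse into one /22:
routes = [
    ipaddress.ip_network("192.168.0.0/24"),
    ipaddress.ip_network("192.168.1.0/24"),
    ipaddress.ip_network("192.168.2.0/24"),
    ipaddress.ip_network("192.168.3.0/24"),
]
aggregated = list(ipaddress.collapse_addresses(routes))
print(aggregated)    # [IPv4Network('192.168.0.0/22')]
```

A router announcing the single /22 carries one table entry where it previously carried four, which is exactly the control on table size the text describes.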

As the Internet evolved, one of the major challenges was how to propagate the changes to the software, particularly the host software. DARPA supported UC Berkeley to investigate modifications to the Unix operating system, including incorporating TCP/IP developed at BBN. Although Berkeley later rewrote the BBN code to more efficiently fit into the Unix system and kernel, the incorporation of TCP/IP into the Unix BSD system releases proved to be a critical element in dispersion of the protocols to the research community. Much of the CS research community began to use Unix BSD for their day-to-day computing environment. Looking back, the strategy of incorporating Internet protocols into a supported operating system for the research community was one of the key elements in the successful widespread adoption of the Internet.

One of the more interesting challenges was the transition of the ARPANET host protocol from NCP to TCP/IP as of January 1, 1983. This was a “flag-day” style transition, requiring all hosts to convert simultaneously or be left having to communicate via rather ad-hoc mechanisms. This transition was carefully planned within the community over several years before it actually took place and went surprisingly smoothly (but resulted in a distribution of buttons saying “I survived the TCP/IP transition”).

TCP/IP had been adopted as a defense standard three years earlier in 1980. This enabled defense to begin sharing in the DARPA Internet technology base and led directly to the eventual partitioning of the military and non-military communities. By 1983, ARPANET was being used by a significant number of defense R&D and operational organizations. The transition of ARPANET from NCP to TCP/IP permitted it to be split into a MILNET supporting operational requirements and an ARPANET supporting research needs.

Thus, by 1985, Internet was already well established as a technology supporting a broad community of researchers and developers, and was beginning to be used by other communities for daily computer communications. Electronic mail was being used broadly across several communities, often with different systems, but interconnection between different mail systems was demonstrating the utility of broad based electronic communications between people.

Transition to Widespread Infrastructure

At the same time that the Internet technology was being experimentally validated and widely used amongst a subset of computer science researchers, other networks and networking technologies were being pursued. The usefulness of computer networking – especially electronic mail – demonstrated by DARPA and Department of Defense contractors on the ARPANET was not lost on other communities and disciplines, so that by the mid-1970s computer networks had begun to spring up wherever funding could be found for the purpose. The U.S. Department of Energy (DoE) established MFENet for its researchers in Magnetic Fusion Energy, whereupon DoE’s High Energy Physicists responded by building HEPNet. NASA Space Physicists followed with SPAN, and Rick Adrion, David Farber, and Larry Landweber established CSNET for the (academic and industrial) Computer Science community with an initial grant from the U.S. National Science Foundation (NSF). AT&T’s free-wheeling dissemination of the UNIX computer operating system spawned USENET, based on UNIX’ built-in UUCP communication protocols, and in 1981 Ira Fuchs and Greydon Freeman devised BITNET, which linked academic mainframe computers in an “email as card images” paradigm.

With the exception of BITNET and USENET, these early networks (including ARPANET) were purpose-built – i.e., they were intended for, and largely restricted to, closed communities of scholars; there was hence little pressure for the individual networks to be compatible and, indeed, they largely were not. In addition, alternate technologies were being pursued in the commercial sector, including XNS from Xerox, DECNet, and IBM’s SNA. 8  It remained for the British JANET (1984) and U.S. NSFNET (1985) programs to explicitly announce their intent to serve the entire higher education community, regardless of discipline. Indeed, a condition for a U.S. university to receive NSF funding for an Internet connection was that “… the connection must be made available to ALL qualified users on campus.”

In 1985, Dennis Jennings came from Ireland to spend a year at NSF leading the NSFNET program. He worked with the community to help NSF make a critical decision – that TCP/IP would be mandatory for the NSFNET program. When Steve Wolff took over the NSFNET program in 1986, he recognized the need for a wide area networking infrastructure to support the general academic and research community, along with the need to develop a strategy for establishing such infrastructure on a basis ultimately independent of direct federal funding. Policies and strategies were adopted (see below) to achieve that end.

NSF also elected to support DARPA’s existing Internet organizational infrastructure, hierarchically arranged under the (then) Internet Activities Board (IAB). The public declaration of this choice was the joint authorship, by the IAB’s Internet Engineering and Architecture Task Forces and by NSF’s Network Technical Advisory Group, of RFC 985 (Requirements for Internet Gateways), which formally ensured interoperability of DARPA’s and NSF’s pieces of the Internet.

In addition to the selection of TCP/IP for the NSFNET program, Federal agencies made and implemented several other policy decisions which shaped the Internet of today.

  • Federal agencies shared the cost of common infrastructure, such as trans-oceanic circuits. They also jointly supported “managed interconnection points” for interagency traffic; the Federal Internet Exchanges (FIX-E and FIX-W) built for this purpose served as models for the Network Access Points and “*IX” facilities that are prominent features of today’s Internet architecture.
  • To coordinate this sharing, the Federal Networking Council 9  was formed. The FNC also cooperated with other international organizations, such as RARE in Europe, through the Coordinating Committee on Intercontinental Research Networking, CCIRN, to coordinate Internet support of the research community worldwide.
  • This sharing and cooperation between agencies on Internet-related issues had a long history. An unprecedented 1981 agreement between Farber, acting for CSNET and the NSF, and DARPA’s Kahn, permitted CSNET traffic to share ARPANET infrastructure on a statistical and no-metered-settlements basis.
  • Subsequently, in a similar mode, the NSF encouraged its regional (initially academic) networks of the NSFNET to seek commercial, non-academic customers, expand their facilities to serve them, and exploit the resulting economies of scale to lower subscription costs for all.
  • On the NSFNET Backbone – the national-scale segment of the NSFNET – NSF enforced an “Acceptable Use Policy” (AUP) which prohibited Backbone usage for purposes “not in support of Research and Education.” The predictable (and intended) result of encouraging commercial network traffic at the local and regional level, while denying its access to national-scale transport, was to stimulate the emergence and/or growth of “private”, competitive, long-haul networks such as PSI, UUNET, ANS CO+RE, and (later) others. This process of privately-financed augmentation for commercial uses was thrashed out starting in 1988 in a series of NSF-initiated conferences at Harvard’s Kennedy School of Government on “The Commercialization and Privatization of the Internet” – and on the “com-priv” list on the net itself.
  • In 1988, a National Research Council committee, chaired by Kleinrock and with Kahn and Clark as members, produced a report commissioned by NSF titled “Towards a National Research Network”. The report influenced then-Senator Al Gore and ushered in the high-speed networks that laid the networking foundation for the future information superhighway.
  • In 1994, a National Research Council report, again chaired by Kleinrock (and with Kahn and Clark again as members), entitled “Realizing The Information Future: The Internet and Beyond”, was released. This report, commissioned by NSF, articulated a blueprint for the evolution of the information superhighway and has had a lasting effect on thinking about that evolution. It anticipated the critical issues of intellectual property rights, ethics, pricing, education, architecture, and regulation for the Internet.
  • NSF’s privatization policy culminated in April 1995 with the defunding of the NSFNET Backbone. The funds thereby recovered were (competitively) redistributed to regional networks to buy national-scale Internet connectivity from the now numerous, private, long-haul networks.

The backbone had made the transition from a network built from routers out of the research community (the “Fuzzball” routers from David Mills) to commercial equipment. In its 8 1/2 year lifetime, the Backbone had grown from six nodes with 56 kbps links to 21 nodes with multiple 45 Mbps links. It had seen the Internet grow to over 50,000 networks on all seven continents and outer space, with approximately 29,000 networks in the United States.

Such was the weight of the NSFNET program’s ecumenism and funding ($200 million from 1986 to 1995) – and the quality of the protocols themselves – that by 1990 when the ARPANET itself was finally decommissioned 10 , TCP/IP had supplanted or marginalized most other wide-area computer network protocols worldwide, and IP was well on its way to becoming THE bearer service for the Global Information Infrastructure.

The Role of Documentation

A key to the rapid growth of the Internet has been the free and open access to the basic documents, especially the specifications of the protocols.

The beginnings of the ARPANET and the Internet in the university research community promoted the academic tradition of open publication of ideas and results. However, the normal cycle of traditional academic publication was too formal and too slow for the dynamic exchange of ideas essential to creating networks.

In 1969 a key step was taken by S. Crocker (then at UCLA) in establishing the  Request for Comments  (or RFC) series of notes. These memos were intended to be an informal, fast way to distribute ideas to other network researchers. At first the RFCs were printed on paper and distributed via snail mail. As the File Transfer Protocol (FTP) came into use, the RFCs were prepared as online files and accessed via FTP. Now, of course, the RFCs are easily accessed via the World Wide Web at dozens of sites around the world. SRI, in its role as Network Information Center, maintained the online directories. Jon Postel acted as RFC Editor as well as managing the centralized administration of required protocol number assignments, roles that he continued to play until his death on October 16, 1998.

The effect of the RFCs was to create a positive feedback loop, with ideas or proposals presented in one RFC triggering another RFC with additional ideas, and so on. When some consensus (or at least a consistent set of ideas) had come together, a specification document would be prepared. Such a specification would then be used as the basis for implementations by the various research teams.

Over time, the RFCs have become more focused on protocol standards (the “official” specifications), though there are still informational RFCs that describe alternate approaches, or provide background information on protocols and engineering issues. The RFCs are now viewed as the “documents of record” in the Internet engineering and standards community.

The open access to the RFCs (for free, if you have any kind of connection to the Internet) promotes the growth of the Internet because it allows the actual specifications to be used as examples in college classes and by entrepreneurs developing new systems.

Email has been a significant factor in all areas of the Internet, and that is certainly true in the development of protocol specifications, technical standards, and Internet engineering. The very early RFCs often presented a set of ideas developed by the researchers at one location to the rest of the community. After email came into use, the authorship pattern changed – RFCs were presented by joint authors with a common view, independent of their locations.

Specialized email mailing lists have long been used in the development of protocol specifications, and they continue to be an important tool. The IETF now has in excess of 75 working groups, each working on a different aspect of Internet engineering. Each of these working groups has a mailing list to discuss one or more draft documents under development. When consensus is reached on a draft document it may be distributed as an RFC.

As the current rapid expansion of the Internet is fueled by the realization of its capability to promote information sharing, we should understand that the network’s first role in information sharing was sharing the information about its own design and operation through the RFC documents. This unique method for evolving new capabilities in the network will continue to be critical to future evolution of the Internet.

Formation of the Broad Community

The Internet is as much a collection of communities as a collection of technologies, and its success is largely attributable both to satisfying basic community needs and to utilizing the community effectively to push the infrastructure forward. This community spirit has a long history, beginning with the early ARPANET. The early ARPANET researchers worked as a close-knit community to accomplish the initial demonstrations of packet switching technology described earlier. Likewise, the Packet Satellite, Packet Radio and several other DARPA computer science research programs were multi-contractor collaborative activities that heavily used whatever mechanisms were available to coordinate their efforts, starting with electronic mail and adding file sharing, remote access, and eventually World Wide Web capabilities. Each of these programs formed a working group, starting with the ARPANET Network Working Group. Because of the unique role that ARPANET played as an infrastructure supporting the various research programs, as the Internet started to evolve, the Network Working Group evolved into the Internet Working Group.

In the late 1970s, recognizing that the growth of the Internet was accompanied by a growth in the size of the interested research community and therefore an increased need for coordination mechanisms, Vint Cerf, then manager of the Internet Program at DARPA, formed several coordination bodies – an International Cooperation Board (ICB), chaired by Peter Kirstein of UCL, to coordinate activities with some cooperating European countries centered on Packet Satellite research, an Internet Research Group which was an inclusive group providing an environment for general exchange of information, and an Internet Configuration Control Board (ICCB), chaired by Clark. The ICCB was an invitational body to assist Cerf in managing the burgeoning Internet activity.

In 1983, when Barry Leiner took over management of the Internet research program at DARPA, he and Clark recognized that the continuing growth of the Internet community demanded a restructuring of the coordination mechanisms. The ICCB was disbanded and in its place a structure of Task Forces was formed, each focused on a particular area of the technology (e.g. routers, end-to-end protocols, etc.). The Internet Activities Board (IAB) was formed from the chairs of the Task Forces.

It of course was only a coincidence that the chairs of the Task Forces were the same people as the members of the old ICCB, and Dave Clark continued to act as chair. After some changing membership on the IAB, Phill Gross became chair of a revitalized Internet Engineering Task Force (IETF), at the time merely one of the IAB Task Forces. As we saw above, by 1985 there was a tremendous growth in the more practical/engineering side of the Internet. This growth resulted in an explosion in the attendance at the IETF meetings, and Gross was compelled to create substructure to the IETF in the form of working groups.

This growth was complemented by a major expansion in the community. No longer was DARPA the only major player in the funding of the Internet. In addition to NSFNet and the various US and international government-funded activities, interest in the commercial sector was beginning to grow. Also in 1985, both Kahn and Leiner left DARPA and there was a significant decrease in Internet activity at DARPA. As a result, the IAB was left without a primary sponsor and increasingly assumed the mantle of leadership.

The growth continued, resulting in even further substructure within both the IAB and IETF. The IETF combined Working Groups into Areas, and designated Area Directors. An Internet Engineering Steering Group (IESG) was formed of the Area Directors. The IAB recognized the increasing importance of the IETF, and restructured the standards process to explicitly recognize the IESG as the major review body for standards. The IAB also restructured so that the rest of the Task Forces (other than the IETF) were combined into an Internet Research Task Force (IRTF) chaired by Postel, with the old task forces renamed as research groups.

The growth in the commercial sector brought with it increased concern regarding the standards process itself. Starting in the early 1980s and continuing to this day, the Internet grew beyond its primarily research roots to include both a broad user community and increased commercial activity. Increased attention was paid to making the process open and fair. This, coupled with a recognized need for community support of the Internet, eventually led to the formation of the Internet Society in 1991, under the auspices of Kahn’s Corporation for National Research Initiatives (CNRI) and the leadership of Cerf, then with CNRI.

In 1992, yet another reorganization took place: the Internet Activities Board was re-organized and re-named the Internet Architecture Board, operating under the auspices of the Internet Society. A more “peer” relationship was defined between the new IAB and IESG, with the IETF and IESG taking a larger responsibility for the approval of standards. Ultimately, a cooperative and mutually supportive relationship was formed between the IAB, IETF, and Internet Society, with the Internet Society taking on as a goal the provision of service and other measures which would facilitate the work of the IETF.

The recent development and widespread deployment of the World Wide Web has brought with it a new community, as many of the people working on the WWW have not thought of themselves as primarily network researchers and developers. A new coordination organization was formed, the World Wide Web Consortium (W3C). Initially led from MIT’s Laboratory for Computer Science by Tim Berners-Lee (the inventor of the WWW) and Al Vezza, W3C has taken on the responsibility for evolving the various protocols and standards associated with the Web.

Thus, through the over two decades of Internet activity, we have seen a steady evolution of organizational structures designed to support and facilitate an ever-increasing community working collaboratively on Internet issues.

Commercialization of the Technology

Commercialization of the Internet involved not only the development of competitive, private network services, but also the development of commercial products implementing the Internet technology. In the early 1980s, dozens of vendors were incorporating TCP/IP into their products because they saw buyers for that approach to networking. Unfortunately, they lacked both real information about how the technology was supposed to work and information about how customers planned to use this approach to networking. Many saw it as a nuisance add-on that had to be glued onto their own proprietary networking solutions: SNA, DECNet, Netware, NetBios. The DoD had mandated the use of TCP/IP in many of its purchases but gave little help to the vendors regarding how to build useful TCP/IP products.

In 1985, recognizing this lack of information availability and appropriate training, Dan Lynch, in cooperation with the IAB, arranged to hold a three-day workshop for ALL vendors to come learn how TCP/IP worked and what it still could not do well. The speakers came mostly from the DARPA research community, who had both developed these protocols and used them in day-to-day work. About 250 vendor personnel came to listen to 50 inventors and experimenters. The results were surprises on both sides: the vendors were amazed to find that the inventors were so open about the way things worked (and what still did not work), and the inventors were pleased to listen to new problems they had not considered but that were being discovered by the vendors in the field. Thus a two-way discussion was formed that has lasted for over a decade.

After two years of conferences, tutorials, design meetings and workshops, a special event was organized that invited those vendors whose products ran TCP/IP well enough to come together in one room for three days to show off how well they all worked together and also ran over the Internet. In September of 1988 the first Interop trade show was born. 50 companies made the cut. 5,000 engineers from potential customer organizations came to see if it all did work as was promised. It did. Why? Because the vendors worked extremely hard to ensure that everyone’s products interoperated with all of the other products – even with those of their competitors. The Interop trade show has grown immensely since then and today it is held in 7 locations around the world each year to an audience of over 250,000 people who come to learn which products work with each other in a seamless manner, learn about the latest products, and discuss the latest technology.

In parallel with the commercialization efforts that were highlighted by the Interop activities, the vendors began to attend the IETF meetings that were held 3 or 4 times a year to discuss new ideas for extensions of the TCP/IP protocol suite. Starting with a few hundred attendees mostly from academia and paid for by the government, these meetings now often exceed a thousand attendees, mostly from the vendor community and paid for by the attendees themselves. This self-selected group evolves the TCP/IP suite in a mutually cooperative manner. The reason it is so useful is that it is composed of all stakeholders: researchers, end users and vendors.

Network management provides an example of the interplay between the research and commercial communities. In the beginning of the Internet, the emphasis was on defining and implementing protocols that achieved interoperation.

As the network grew larger, it became clear that the sometimes ad hoc procedures used to manage the network would not scale. Manual configuration of tables was replaced by distributed automated algorithms, and better tools were devised to isolate faults. In 1987 it became clear that a protocol was needed that would permit the elements of the network, such as the routers, to be remotely managed in a uniform way. Several protocols for this purpose were proposed, including the Simple Network Management Protocol or SNMP (designed, as its name would suggest, for simplicity, and derived from an earlier proposal called SGMP), HEMS (a more complex design from the research community), and CMIP (from the OSI community). A series of meetings led to the decision that HEMS would be withdrawn as a candidate for standardization, in order to help resolve the contention, but that work on both SNMP and CMIP would go forward, with the idea that SNMP could be a more near-term solution and CMIP a longer-term approach. The market could choose the one it found more suitable. SNMP is now used almost universally for network-based management.
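The management model SNMP standardized can be sketched in miniature: each managed element exposes a tree of variables named by numeric object identifiers (OIDs), and a manager reads them with simple GET requests. In the sketch below, the OIDs are the standard MIB-II system group, but the dict-based "agent" and its values are hypothetical stand-ins for a real router.

```python
# Toy illustration of SNMP's data model: managed variables live in a tree
# named by numeric OIDs, and a manager polls them with GET-style requests.
# The OIDs are the real MIB-II system group; the agent contents are made up.
AGENT_MIB = {
    "1.3.6.1.2.1.1.1.0": "Backbone router (illustrative)",  # sysDescr.0
    "1.3.6.1.2.1.1.3.0": 123456,                            # sysUpTime.0
    "1.3.6.1.2.1.1.5.0": "nsfnet-gw-1",                     # sysName.0
}

def snmp_get(mib, oid):
    """Return the value bound to an OID, or None if absent
    (mirroring, in spirit, SNMP's noSuchName error)."""
    return mib.get(oid)

print(snmp_get(AGENT_MIB, "1.3.6.1.2.1.1.5.0"))  # the device's name
```

A real manager would of course encode such requests in ASN.1 and send them over UDP, but the uniform name-tree-plus-GET structure is the essence of what made remote management scale.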

In the last few years, we have seen a new phase of commercialization. Originally, commercial efforts mainly comprised vendors providing the basic networking products, and service providers offering the connectivity and basic Internet services. The Internet has now become almost a “commodity” service, and much of the latest attention has been on the use of this global information infrastructure for support of other commercial services. This has been tremendously accelerated by the widespread and rapid adoption of browsers and the World Wide Web technology, allowing users easy access to information linked throughout the globe. Products are available to facilitate the provisioning of that information and many of the latest developments in technology have been aimed at providing increasingly sophisticated information services on top of the basic Internet data communications.

History of the Future

On October 24, 1995, the FNC unanimously passed a resolution defining the term Internet. This definition was developed in consultation with members of the Internet and intellectual property rights communities.

RESOLUTION: The Federal Networking Council (FNC) agrees that the following language reflects our definition of the term “Internet”. “Internet” refers to the global information system that — (i) is logically linked together by a globally unique address space based on the Internet Protocol (IP) or its subsequent extensions/follow-ons; (ii) is able to support communications using the Transmission Control Protocol/Internet Protocol (TCP/IP) suite or its subsequent extensions/follow-ons, and/or other IP-compatible protocols; and (iii) provides, uses or makes accessible, either publicly or privately, high level services layered on the communications and related infrastructure described herein.
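Clause (i) of this definition — the globally unique address space based on IP — can be illustrated with Python's standard ipaddress module. The prefix used below is a reserved documentation block, not a real network:

```python
import ipaddress

# Clause (i): every Internet host is named by a globally unique IP address,
# and routing aggregates those addresses into prefixes (blocks).
# 192.0.2.0/24 is reserved for documentation, standing in for a real network.
net = ipaddress.ip_network("192.0.2.0/24")
host = ipaddress.ip_address("192.0.2.7")

print(host in net)        # membership test: does this address fall in the block?
print(net.num_addresses)  # a /24 prefix covers 256 addresses
```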

The Internet has changed much in the two decades since it came into existence. It was conceived in the era of time-sharing, but has survived into the era of personal computers, client-server and peer-to-peer computing, and the network computer. It was designed before LANs existed, but has accommodated that new network technology, as well as the more recent ATM and frame switched services. It was envisioned as supporting a range of functions from file sharing and remote login to resource sharing and collaboration, and has spawned electronic mail and more recently the World Wide Web. But most important, it started as the creation of a small band of dedicated researchers, and has grown to be a commercial success with billions of dollars of annual investment.

One should not conclude that the Internet has now finished changing. The Internet, although a network in name and geography, is a creature of the computer, not the traditional network of the telephone or television industry. It will, indeed it must, continue to change and evolve at the speed of the computer industry if it is to remain relevant. It is now changing to provide new services such as real time transport, in order to support, for example, audio and video streams.

The availability of pervasive networking (i.e., the Internet) along with powerful affordable computing and communications in portable form (i.e., laptop computers, two-way pagers, PDAs, cellular phones), is making possible a new paradigm of nomadic computing and communications. This evolution will bring us new applications – Internet telephone and, slightly further out, Internet television. It is evolving to permit more sophisticated forms of pricing and cost recovery, a perhaps painful requirement in this commercial world. It is changing to accommodate yet another generation of underlying network technologies with different characteristics and requirements, e.g. broadband residential access and satellites. New modes of access and new forms of service will spawn new applications, which in turn will drive further evolution of the net itself.

The most pressing question for the future of the Internet is not how the technology will change, but how the process of change and evolution itself will be managed. As this paper describes, the architecture of the Internet has always been driven by a core group of designers, but the form of that group has changed as the number of interested parties has grown. With the success of the Internet has come a proliferation of stakeholders – stakeholders now with an economic as well as an intellectual investment in the network.

We now see, in the debates over control of the domain name space and the form of the next generation IP addresses, a struggle to find the next social structure that will guide the Internet in the future. The form of that structure will be harder to find, given the large number of concerned stakeholders. At the same time, the industry struggles to find the economic rationale for the large investment needed for the future growth, for example to upgrade residential access to a more suitable technology. If the Internet stumbles, it will not be because we lack for technology, vision, or motivation. It will be because we cannot set a direction and march collectively into the future.


1 Perhaps this is an exaggeration based on the lead author’s residence in Silicon Valley.
2 On a recent trip to a Tokyo bookstore, one of the authors counted 14 English language magazines devoted to the Internet.
3 An abbreviated version of this article appears in the 50th anniversary issue of the CACM, Feb. 97. The authors would like to express their appreciation to Andy Rosenbloom, CACM Senior Editor, for both instigating the writing of this article and his invaluable assistance in editing both this and the abbreviated version.
4 The Advanced Research Projects Agency (ARPA) changed its name to Defense Advanced Research Projects Agency (DARPA) in 1971, then back to ARPA in 1993, and back to DARPA in 1996. We refer throughout to DARPA, the current name.
5 It was from the RAND study that the false rumor started claiming that the ARPANET was somehow related to building a network resistant to nuclear war. This was never true of the ARPANET; only the unrelated RAND study on secure voice considered nuclear war. However, the later work on Internetting did emphasize robustness and survivability, including the capability to withstand losses of large portions of the underlying networks.
6 Including among others Vint Cerf, Steve Crocker, and Jon Postel. Joining them later were David Crocker, who was to play an important role in documentation of electronic mail protocols, and Robert Braden, who developed the first NCP and then TCP for IBM mainframes and also was to play a long-term role in the ICCB and IAB.
7 This was subsequently published as V. G. Cerf and R. E. Kahn, “A protocol for packet network intercommunication”, IEEE Trans. Comm. Tech., vol. COM-22, no. 5, pp. 627-641, May 1974.
8 The desirability of email interchange, however, led to one of the first “Internet books”: !%@:: A Directory of Electronic Mail Addressing and Networks, by Frey and Adams, on email address translation and forwarding.
9 Originally named the Federal Research Internet Coordinating Committee (FRICC). The FRICC was originally formed to coordinate U.S. research network activities in support of the international coordination provided by the CCIRN.
10 The decommissioning of the ARPANET was commemorated on its 20th anniversary by a UCLA symposium in 1989.

P. Baran, “On Distributed Communications Networks”, IEEE Trans. Comm. Systems, March 1964.
V. G. Cerf and R. E. Kahn, “A protocol for packet network intercommunication”, IEEE Trans. Comm. Tech., vol. COM-22, no. 5, pp. 627-641, May 1974.
S. Crocker, RFC 001, “Host software”, April 7, 1969.
R. Kahn, “Communications Principles for Operating Systems”, internal BBN memorandum, January 1972.
Proceedings of the IEEE, Special Issue on Packet Communication Networks, vol. 66, no. 11, November 1978. (Guest editor: Robert Kahn; associate guest editors: Keith Uncapher and Harry van Trees.)
L. Kleinrock, “Information Flow in Large Communication Nets”, RLE Quarterly Progress Report, July 1961.
L. Kleinrock, Communication Nets: Stochastic Message Flow and Delay, McGraw-Hill (New York), 1964.
L. Kleinrock, Queueing Systems: Vol. II, Computer Applications, John Wiley and Sons (New York), 1976.
J. C. R. Licklider and W. Clark, “On-Line Man-Computer Communication”, August 1962.
L. Roberts and T. Merrill, “Toward a Cooperative Network of Time-Shared Computers”, Fall AFIPS Conf., October 1966.
L. Roberts, “Multiple Computer Networks and Intercomputer Communication”, ACM Gatlinburg Conf., October 1967.

Barry M. Leiner was Director of the Research Institute for Advanced Computer Science. He passed away in April 2003. Vinton G. Cerf is Vice President and Chief Internet Evangelist at Google. David D. Clark is Senior Research Scientist at the MIT Laboratory for Computer Science. Robert E. Kahn is President of the Corporation for National Research Initiatives. Leonard Kleinrock is a Distinguished Professor of Computer Science at the University of California, Los Angeles, and is a founder of Linkabit Corp., TTI/Vanguard, Nomadix Inc., and Platformation Inc. Daniel C. Lynch is a founder of the Interop networking trade show and conferences. Jon Postel served as Director of the Computer Networks Division of the Information Sciences Institute of the University of Southern California until his untimely death on October 16, 1998. Dr. Lawrence G. Roberts was CEO, President, and Chairman of Anagran, Inc. He passed away in December 2019. Stephen Wolff is Principal Scientist of Internet2.

History of the World Wide Web

An information system running on the Internet, and its history. From Wikipedia, the free encyclopedia.

The World Wide Web ("WWW", "W3" or simply "the Web") is a global information medium that users can access via computers connected to the Internet . The term is often mistakenly used as a synonym for the Internet, but the Web is a service that operates over the Internet, just as email and Usenet do. The history of the Internet and the history of hypertext date back significantly further than that of the World Wide Web.

World Wide Web
Inventor: Tim Berners-Lee
Inception: 12 March 1989

Tim Berners-Lee invented the World Wide Web while working at CERN in 1989. He proposed a "universal linked information system" using several concepts and technologies, the most fundamental of which was the connections that existed between information. [1] [2] He developed the first web server, the first web browser, and a document formatting language called Hypertext Markup Language (HTML). After he published the markup language in 1991 and released the browser source code for public use in 1993, many other web browsers were soon developed, with Marc Andreessen's Mosaic (later Netscape Navigator) being particularly easy to use and install, and often credited with sparking the Internet boom of the 1990s. It was a graphical browser which ran on several popular office and home computers, bringing multimedia content to non-technical users by including images and text on the same page.
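The hypertext idea at the core of HTML can be shown concretely: a page is ordinary text plus anchor (`<a>`) tags whose `href` attributes name other documents, and a browser's most basic job is to find and follow those links. The sketch below, using only Python's standard html.parser, extracts the links from a small made-up page (the page content and its second URL are hypothetical; the first URL is the historical address of the CERN project page):

```python
from html.parser import HTMLParser

# A tiny, made-up HTML document: plain text plus <a> anchors whose
# href attributes point at other documents -- the essence of hypertext.
PAGE = """
<html><body>
<h1>Welcome</h1>
<p>See the <a href="http://info.cern.ch/hypertext/WWW/TheProject.html">project page</a>
and the local <a href="news.html">news</a>.</p>
</body></html>
"""

class LinkExtractor(HTMLParser):
    """Collects the href attribute of every anchor tag, in document order."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

parser = LinkExtractor()
parser.feed(PAGE)
print(parser.links)
```

Note that links may be absolute (naming a host on the Internet) or relative (naming a document next to the current one); resolving the latter against the page's own URL is the browser's job.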

Websites for use by the general public began to emerge in 1993–94. This spurred competition in server and browser software, highlighted in the browser wars, which were initially dominated by Netscape Navigator and Internet Explorer. Following the complete removal of commercial restrictions on Internet use by 1995, commercialization of the Web amidst macroeconomic factors led to the dot-com boom and bust in the late 1990s and early 2000s.

The features of HTML evolved over time, leading to HTML version 2 in 1995, HTML3 and HTML4 in 1997, and HTML5 in 2014. The language was extended with advanced formatting in Cascading Style Sheets (CSS) and with programming capability by JavaScript. AJAX programming delivered dynamic content to users, which sparked a new era in Web design, styled Web 2.0. The use of social media, becoming commonplace in the 2010s, allowed users to compose multimedia content without programming skills, making the Web ubiquitous in everyday life.

The underlying concept of hypertext as a user interface paradigm originated in projects of the 1960s, such as the Hypertext Editing System (HES) by Andries van Dam at Brown University, IBM's Generalized Markup Language, Ted Nelson's Project Xanadu, and Douglas Engelbart's oN-Line System (NLS). [3] Both Nelson and Engelbart were in turn inspired by Vannevar Bush's microfilm-based memex, described in the 1945 essay "As We May Think". [4] [5] Other precursors were FRESS and Intermedia. Paul Otlet's Mundaneum project has also been named as an early 20th-century precursor of the Web.

In 1980, Tim Berners-Lee , at the European Organization for Nuclear Research (CERN) in Switzerland, built ENQUIRE , as a personal database of people and software models, but also as a way to experiment with hypertext; each new page of information in ENQUIRE had to be linked to another page. [6] [7]

When Berners-Lee built ENQUIRE, the ideas developed by Bush, Engelbart, and Nelson did not influence his work, since he was not aware of them. However, as Berners-Lee began to refine his ideas, the work of these predecessors would later help to confirm the legitimacy of his concept. [6] [8]

During the 1980s, many packet-switched data networks emerged based on various communication protocols (see Protocol Wars). One of these standards was the Internet protocol suite, often referred to as TCP/IP. As the Internet grew through the 1980s, the need to find and organize files and information became increasingly apparent. By 1985, the Domain Name System (upon which the Uniform Resource Locator is built) had come into being. [9] Many small, self-contained hypertext systems were also created, such as Apple Computer's HyperCard (1987).

Berners-Lee's 1980 contract ran from June to December, but in 1984 he returned to CERN in a permanent role and considered its problems of information management: physicists from around the world needed to share data, yet they lacked common machines and shared presentation software. Shortly after Berners-Lee's return to CERN, TCP/IP protocols were installed on Unix machines at the institution, turning it into the largest Internet site in Europe. In 1988, the first direct IP connection between Europe and North America was established, and Berners-Lee began to openly discuss the possibility of a web-like system at CERN. [10] He was inspired by a book, Enquire Within upon Everything. Many online services existed before the creation of the World Wide Web, such as CompuServe, Usenet [11] and bulletin board systems. [12]

While working at CERN , Tim Berners-Lee became frustrated with the inefficiencies and difficulties posed by finding information stored on different computers. [13] On 12 March 1989, he submitted a memorandum, titled "Information Management: A Proposal", [1] [14] to the management at CERN. The proposal used the term "web" and was based on "a large hypertext database with typed links". It described a system called "Mesh" that referenced ENQUIRE , the database and software project he had built in 1980, with a more elaborate information management system based on links embedded as text: "Imagine, then, the references in this document all being associated with the network address of the thing to which they referred, so that while reading this document, you could skip to them with a click of the mouse." Such a system, he explained, could be referred to using one of the existing meanings of the word hypertext , a term that he says was coined in the 1950s. Berners-Lee notes the possibility of multimedia documents that include graphics, speech and video, which he terms hypermedia . [1] [2]

Although the proposal attracted little interest, Berners-Lee was encouraged by his manager, Mike Sendall, to begin implementing his system on a newly acquired NeXT workstation. He considered several names, including Information Mesh , The Information Mine or Mine of Information , but settled on World Wide Web . Berners-Lee found an enthusiastic supporter in his colleague and fellow hypertext enthusiast Robert Cailliau who began to promote the proposed system throughout CERN. Berners-Lee and Cailliau pitched Berners-Lee's ideas to the European Conference on Hypertext Technology in September 1990, but found no vendors who could appreciate his vision.

Berners-Lee's breakthrough was to marry hypertext to the Internet. In his book Weaving The Web , he explains that he had repeatedly suggested to members of both technical communities that a marriage between the two technologies was possible. But, when no one took up his invitation, he finally assumed the project himself. In the process, he developed three essential technologies:

  • a system of globally unique identifiers for resources on the Web and elsewhere, the universal document identifier (UDI), later known as uniform resource locator (URL);
  • the publishing language Hypertext Markup Language (HTML);
  • the Hypertext Transfer Protocol (HTTP). [15]
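The division of labour among these three technologies can be sketched as follows (an illustration, not part of the original proposal): a URL names a resource, HTTP fetches it, and HTML describes it. The example uses the address of the first website, mentioned later in this article; the request string is what an early HTTP/1.0-style browser would send.

```python
from urllib.parse import urlparse

# Split a URL into the parts the naming scheme defines.
url = urlparse("http://info.cern.ch/hypertext/WWW/TheProject.html")
print(url.scheme)   # protocol to use: "http"
print(url.netloc)   # server to contact: "info.cern.ch"
print(url.path)     # document to ask for: "/hypertext/WWW/TheProject.html"

# The corresponding HTTP request a browser would send to that server;
# the HTML document comes back in the response body.
request = f"GET {url.path} HTTP/1.0\r\nHost: {url.netloc}\r\n\r\n"
```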

With help from Cailliau he published a more formal proposal on 12 November 1990 to build a "hypertext project" called World Wide Web (abbreviated "W3") as a "web" of "hypertext documents" to be viewed by "browsers" using a client–server architecture. [16] [17] The proposal was modelled on the Standard Generalized Markup Language (SGML) reader Dynatext by Electronic Book Technology, a spin-off from the Institute for Research in Information and Scholarship at Brown University. The Dynatext system, licensed by CERN, was considered too expensive and to have an inappropriate licensing policy for use in the general high energy physics community, namely a fee for each document and each document alteration.

At this point HTML and HTTP had already been in development for about two months and the first web server was about a month from completing its first successful test. Berners-Lee's proposal estimated that a read-only Web would be developed within three months and that it would take six months to achieve "the creation of new links and new material by readers, [so that] authorship becomes universal" as well as "the automatic notification of a reader when new material of interest to him/her has become available".

By December 1990, Berners-Lee and his team had built all the tools necessary for a working Web: the HyperText Transfer Protocol (HTTP), the HyperText Markup Language (HTML), the first web browser (named WorldWideWeb, which was also a web editor), the first web server (later known as CERN httpd), and the first website (http://info.cern.ch), whose pages, describing the project itself, were published on 20 December 1990. [18] [19] The browser could also access Usenet newsgroups and FTP files. Berners-Lee used a NeXT Computer both as the web server and to write the web browser. [20]

Working with Berners-Lee at CERN, Nicola Pellow developed the first cross-platform web browser, the Line Mode Browser . [21]

Initial launch

In January 1991, the first web servers outside CERN were switched on. On 6 August 1991, Berners-Lee published a short summary of the World Wide Web project on the newsgroup alt.hypertext , inviting collaborators. [22]

Paul Kunz from the Stanford Linear Accelerator Center (SLAC) visited CERN in September 1991, and was captivated by the Web. He brought the NeXT software back to SLAC, where librarian Louise Addis adapted it for the VM/CMS operating system on the IBM mainframe as a way to host the SPIRES -HEP database and display SLAC's catalog of online documents. [23] [24] [25] [26] This was the first web server outside of Europe and the first in North America. [27]

The World Wide Web had several differences from other hypertext systems available at the time. The Web required only unidirectional links rather than bidirectional ones, making it possible for someone to link to another resource without action by the owner of that resource. It also significantly reduced the difficulty of implementing web servers and browsers (in comparison to earlier systems), but in turn, presented the chronic problem of link rot .

Early browsers

The WorldWideWeb browser ran only on the NeXTSTEP operating system. This shortcoming was discussed in January 1992, [28] and alleviated in April 1992 by the release of Erwise, an application developed at the Helsinki University of Technology, and in May by ViolaWWW, created by Pei-Yuan Wei, which included advanced features such as embedded graphics, scripting, and animation. ViolaWWW was originally an application for HyperCard. [29] Both programs ran on the X Window System for Unix. In 1992, the first tests between browsers on different platforms were successfully concluded between buildings 513 and 31 at CERN, between browsers on the NeXT workstation and the X11-ported Mosaic browser. ViolaWWW became the recommended browser at CERN. To encourage use within CERN, Bernd Pollermann put the CERN telephone directory on the web; previously, users had to log onto the mainframe to look up phone numbers. The Web was successful at CERN and spread to other scientific and academic institutions.

Students at the University of Kansas adapted an existing text-only hypertext browser, Lynx , to access the web in 1992. Lynx was available on Unix and DOS, and some web designers, unimpressed with glossy graphical websites, held that a website not accessible through Lynx was not worth visiting.

In these earliest browsers, images opened in a separate "helper" application.

From Gopher to the WWW

In the early 1990s, Internet-based projects such as Archie, Gopher, Wide Area Information Servers (WAIS), and the FTP Archive list attempted to create ways to organize distributed data. Gopher was a document browsing system for the Internet, released in 1991 by the University of Minnesota. Invented by Mark P. McCahill, it became the first commonly used hypertext interface to the Internet. While Gopher menu items were examples of hypertext, they were not commonly perceived that way. In less than a year, there were hundreds of Gopher servers. [30] It offered a viable alternative to the World Wide Web in the early 1990s, and the consensus was that Gopher would be the primary way that people would interact with the Internet. [31] [32] However, in 1993, the University of Minnesota declared that Gopher was proprietary and would have to be licensed. [30]

In response, on 30 April 1993, CERN announced that the World Wide Web would be free to anyone, with no fees due, and released its code into the public domain. [33] This made it possible to develop servers and clients independently and to add extensions without licensing restrictions. Coming two months after the announcement that the server implementation of the Gopher protocol was no longer free to use, this spurred the development of various browsers and precipitated a rapid shift away from Gopher. [34] By releasing Berners-Lee's invention for public use, CERN encouraged and enabled its widespread adoption. [35]

Early websites intermingled links for both the HTTP web protocol and the Gopher protocol , which provided access to content through hypertext menus presented as a file system rather than through HTML files. Early Web users would navigate either by bookmarking popular directory pages or by consulting updated lists such as the NCSA "What's New" page. Some sites were also indexed by WAIS, enabling users to submit full-text searches similar to the capability later provided by search engines .

After 1993 the World Wide Web saw many advances to indexing and ease of access through search engines, which often neglected Gopher and Gopherspace. As its popularity increased through ease of use, incentives for commercial investment in the Web also grew. By the middle of 1994, the Web was outcompeting Gopher and the other browsing systems for the Internet. [36]

The National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana–Champaign (UIUC) established a website in November 1992. After Marc Andreessen , a student at UIUC, was shown ViolaWWW in late 1992, [29] he began work on Mosaic with another UIUC student Eric Bina , using funding from the High-Performance Computing and Communications Initiative , a US-federal research and development program initiated by US Senator Al Gore . [37] [38] [39] Andreessen and Bina released a Unix version of the browser in February 1993; Mac and Windows versions followed in August 1993. The browser gained popularity due to its strong support of integrated multimedia , and the authors' rapid response to user bug reports and recommendations for new features. [29] Historians generally agree that the 1993 introduction of the Mosaic web browser was a turning point for the World Wide Web. [40] [41] [42]

Before the release of Mosaic in 1993, graphics were not commonly mixed with text in web pages, and the Web was less popular than older protocols such as Gopher and WAIS. Mosaic could display inline images [43] and submit forms [44] [45] on Windows, Macintosh and X-Windows. NCSA also developed HTTPd, a Unix web server that used the Common Gateway Interface to process forms and Server Side Includes for dynamic content. Both the client and server were free to use with no restrictions. [46] Mosaic was an immediate hit; [47] its graphical user interface allowed the Web to become by far the most popular protocol on the Internet. Within a year, web traffic surpassed Gopher's. [30] Wired declared that Mosaic made non-Internet online services obsolete, [48] and the Web became the preferred interface for accessing the Internet.
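The Common Gateway Interface mechanism can be sketched in a few lines (an illustrative toy, not NCSA's actual code; the handler and its greeting are invented): a CGI program is simply an executable the server runs per request, receiving request data through environment variables and writing a header block, a blank line, and a document to standard output, which the server relays to the browser.

```python
def handle_request(environ: dict) -> str:
    """A toy CGI-style handler: the server passes request data in
    environment variables and relays whatever the program prints."""
    name = environ.get("QUERY_STRING", "world")
    body = f"<html><body><h1>Hello, {name}!</h1></body></html>"
    # Header block, blank line, then the document itself.
    return f"Content-Type: text/html\r\n\r\n{body}"

# Simulate one request, as a CGI server would for GET /hello.cgi?CERN
response = handle_request({"QUERY_STRING": "CERN"})
```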

Early growth

The World Wide Web enabled the spread of information over the Internet through an easy-to-use and flexible format. It thus played an important role in popularising use of the Internet. [49] Although the two terms are sometimes conflated in popular use, World Wide Web is not synonymous with Internet . [50] The Web is an information space containing hyperlinked documents and other resources , identified by their URIs. [51] It is implemented as both client and server software using Internet protocols such as TCP/IP and HTTP .

In keeping with its origins at CERN, early adopters of the Web were primarily university-based scientific departments or physics laboratories such as SLAC and Fermilab . By January 1993 there were fifty web servers across the world. [52] By October 1993 there were over five hundred servers online, including some notable websites . [53]

Practical media distribution and streaming over the Web required advances in data compression, owing to the impractically high bandwidth requirements of uncompressed media. Following the introduction of the Web, several media formats based on the discrete cosine transform (DCT) were introduced, including the MPEG video format in 1991 and the JPEG image format in 1992. JPEG's high level of compression made it well suited to the slow Internet access speeds typical of the dial-up era, and it became the most widely used image format on the World Wide Web. A DCT variant, the modified discrete cosine transform (MDCT) algorithm, led to the development of MP3, which became the first popular audio format on the Web.
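The idea behind DCT-based compression can be sketched in a few lines (an illustrative toy, not the JPEG codec itself): the transform concentrates a smooth signal's energy into a few low-frequency coefficients, so the remainder can be coarsely quantized or discarded with little visible loss.

```python
import math

def dct_ii(x):
    """Naive DCT-II of a sequence, the transform at the heart of JPEG
    (applied there to 8x8 pixel blocks in two dimensions)."""
    N = len(x)
    return [sum(x[n] * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                for n in range(N))
            for k in range(N)]

# A perfectly smooth (constant) block: all energy lands in coefficient 0,
# and the higher-frequency coefficients vanish.
coeffs = dct_ii([1.0, 1.0, 1.0, 1.0])
```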

In 1992 the Computing and Networking Department of CERN, headed by David Williams, withdrew support of Berners-Lee's work. A two-page email sent by Williams stated that the work of Berners-Lee, with the goal of creating a facility to exchange information such as results and comments from CERN experiments with the scientific community, was not the core activity of CERN and was a misallocation of CERN's IT resources. Following this decision, Tim Berners-Lee left CERN for the Massachusetts Institute of Technology (MIT), where he continued to develop HTTP.

The first Microsoft Windows browser was Cello , written by Thomas R. Bruce for the Legal Information Institute at Cornell Law School to provide legal information, since access to Windows was more widespread amongst lawyers than access to Unix. Cello was released in June 1993.

The rate of web site deployment increased sharply around the world, and fostered development of international standards for protocols and content formatting. [54] Berners-Lee continued to stay involved in guiding web standards, such as the markup languages to compose web pages, and he advocated his vision of a Semantic Web (sometimes known as Web 3.0) based around machine-readability and interoperability standards.

World Wide Web Conference

In May 1994, the first International WWW Conference , organized by Robert Cailliau , was held at CERN; the conference has been held every year since.

World Wide Web Consortium

The World Wide Web Consortium (W3C) was founded by Tim Berners-Lee after he left the European Organization for Nuclear Research (CERN) in September/October 1994 in order to create open standards for the Web. [55] It was founded at the Massachusetts Institute of Technology Laboratory for Computer Science (MIT/LCS) with support from the Defense Advanced Research Projects Agency (DARPA), which had pioneered the Internet. A year later, a second site was founded at INRIA (a French national computer research lab) with support from the European Commission ; and in 1996, a third continental site was created in Japan at Keio University .

W3C comprised various companies willing to create standards and recommendations to improve the quality of the Web. Berners-Lee made the Web available freely, with no patent and no royalties due. The W3C decided that its standards must be based on royalty-free technology, so that they could be easily adopted by anyone. Netscape and Microsoft, in the midst of the browser war, ignored the W3C and added elements to HTML ad hoc (e.g., blink and marquee). Finally, in 1995, Netscape and Microsoft agreed to abide by the W3C's standards. [56]

The W3C published the standard for HTML 4 in 1997, which included Cascading Style Sheets (CSS), giving designers more control over the appearance of web pages without the need for additional HTML tags. Because the W3C could not enforce compliance, none of the browsers were fully compliant. This frustrated web designers, who formed the Web Standards Project (WaSP) in 1998 with the goal of cajoling browser makers into compliance. [57] A List Apart and CSS Zen Garden were influential websites that promoted good design and adherence to standards. [58] Nevertheless, AOL halted development of Netscape [59] and Microsoft was slow to update IE. [60] Mozilla and Apple both released browsers that aimed to be more standards-compliant (Firefox and Safari), but were unable to dislodge IE as the dominant browser.

Commercialization, dot-com boom and bust, aftermath

As the Web grew in the mid-1990s, web directories and primitive search engines were created to index pages and allow people to find things. Commercial use restrictions on the Internet were lifted in 1995 when NSFNET was shut down.

In the US, the online service America Online (AOL) offered their users a connection to the Internet via their own internal browser, using a dial-up Internet connection. In January 1994, Yahoo! was founded by Jerry Yang and David Filo , then students at Stanford University . Yahoo! Directory became the first popular web directory . Yahoo! Search , launched the same year, was the first popular search engine on the World Wide Web. Yahoo! became the quintessential example of a first mover on the Web.

Online shopping began to emerge with the launch of Amazon 's shopping site by Jeff Bezos in 1995 and eBay by Pierre Omidyar the same year.

By 1994, Marc Andreessen's Netscape Navigator had superseded Mosaic in popularity, holding the position for some time. Bill Gates outlined Microsoft's strategy to dominate the Internet in his Tidal Wave memo in 1995. [61] With the release of Windows 95 and the popular Internet Explorer browser, many public companies began to develop a Web presence. At first, people mainly anticipated the possibilities of free publishing and instant worldwide information. By the late 1990s, the directory model had given way to search engines, corresponding with the rise of Google Search, which developed new approaches to relevancy ranking. Directory features, while still commonly available, became afterthoughts to search engines.

Netscape's highly successful IPO valued the company at $2.9 billion despite its lack of profits, helping to trigger the dot-com bubble. [62] Increasing familiarity with the Web led to the growth of direct Web-based commerce (e-commerce) and instantaneous group communication worldwide. Many dot-com companies, displaying products on hypertext webpages, joined the Web. Over the next five years, more than a trillion dollars was raised to fund thousands of startups that consisted of little more than a website.

During the dot-com boom , many companies vied to create a dominant web portal in the belief that such a website would best be able to attract a large audience that in turn would attract online advertising revenue. While most of these portals offered a search engine, they were not interested in encouraging users to find other websites and leave the portal and instead concentrated on "sticky" content. [63] In contrast, Google was a stripped-down search engine that delivered superior results. [64] It was a hit with users who switched from portals to Google. Furthermore, with AdWords , Google had an effective business model. [65] [66]

AOL bought Netscape in 1998. [67] In spite of its early success, Netscape was unable to fend off Microsoft, [68] and Internet Explorer and a variety of other browsers almost completely replaced it.

Faster broadband internet connections replaced many dial-up connections from the beginning of the 2000s.

With the bursting of the dot-com bubble, many web portals either scaled back operations, floundered, [69] or shut down entirely. [70] [71] [72] AOL disbanded Netscape in 2003. [73]

Web server software

Web server software was developed to allow computers to act as web servers . The first web servers supported only static files, such as HTML (and images), but now they commonly allow embedding of server side applications. Web framework software enabled building and deploying web applications. Content management systems (CMS) were developed to organize and facilitate collaborative content creation. Many of them were built on top of separate content management frameworks .

After Robert McCool joined Netscape, development on the NCSA HTTPd server languished. In 1995, Brian Behlendorf and Cliff Skolnick created a mailing list to coordinate efforts to fix bugs and make improvements to HTTPd. [74] They called their version of HTTPd Apache. [75] Apache quickly became the dominant server on the Web. [76] After adding support for modules, Apache allowed developers to handle web requests with a variety of languages, including Perl, PHP and Python. Together with Linux and MySQL, it became known as the LAMP platform.

Following the success of Apache, the Apache Software Foundation was founded in 1999 and produced many open source web software projects in the same collaborative spirit.

Browser wars

After graduating from UIUC, Andreessen and Jim Clark, former CEO of Silicon Graphics, met and formed Mosaic Communications Corporation in April 1994 to develop the Mosaic Netscape browser commercially. The company later changed its name to Netscape, and the browser was developed further as Netscape Navigator, which soon became the dominant web client. They also released the Netsite Commerce web server, which could handle SSL requests, thus enabling e-commerce on the Web. [77] SSL became the standard method to encrypt web traffic. Navigator 1.0 also introduced cookies, but Netscape did not publicize this feature. Netscape followed up with Navigator 2 in 1995, introducing frames, Java applets and JavaScript. In 1998, Netscape made Navigator open source and launched Mozilla. [78]

Microsoft licensed Mosaic from Spyglass and released Internet Explorer 1.0 in 1995, with IE2 following later the same year. IE2 added features pioneered at Netscape, such as cookies, SSL, and JavaScript. The browser wars became a competition for dominance when Explorer was bundled with Windows. [79] [80] This led to the United States v. Microsoft Corporation antitrust lawsuit.

IE3, released in 1996, added support for Java applets, ActiveX, and CSS. At this point, Microsoft began bundling IE with Windows. IE3 increased Microsoft's share of the browser market from under 10% to over 20%. [81] IE4, released the following year, introduced Dynamic HTML, setting the stage for the Web 2.0 revolution. By 1998, IE had captured the majority of the desktop browser market. [68] It would remain the dominant browser for the next fourteen years.

Google released its Chrome browser in 2008 with the first JIT JavaScript engine, V8. Chrome overtook IE to become the dominant desktop browser within four years, [82] and overtook Safari to become the dominant mobile browser within two. [83] At the same time, Google open-sourced Chrome's codebase as Chromium. [84]

Ryan Dahl used Chromium's V8 engine in 2009 to power an event-driven runtime system, Node.js, which allowed JavaScript code to be used on servers as well as in browsers. This led to the development of new software stacks such as MEAN. Thanks to frameworks such as Electron, developers can bundle Node applications as standalone desktop applications such as Slack.

Acer and Samsung began selling Chromebooks, inexpensive laptops running ChromeOS and capable of running web apps, in 2011. Over the next decade, more companies offered Chromebooks. Chromebooks outsold Apple's macOS devices in 2020, making ChromeOS the second most popular desktop OS in the world. [85]

Other notable web browsers emerged including Mozilla's Firefox , Opera's Opera browser and Apple's Safari .

Web 1.0

Web 1.0 is a retronym referring to the first stage of the World Wide Web's evolution, from roughly 1989 to 2004. According to Graham Cormode and Balachander Krishnamurthy, "content creators were few in Web 1.0 with the vast majority of users simply acting as consumers of content". [86] Personal web pages were common, consisting mainly of static pages hosted on ISP-run web servers or on free web hosting services such as Tripod and the now-defunct GeoCities. [87] [88]

Some common design elements of a Web 1.0 site include: [89]

  • Static pages rather than dynamic HTML . [90]
  • Content provided from the server's filesystem rather than a relational database management system ( RDBMS ).
  • Pages built using Server Side Includes or the Common Gateway Interface (CGI) instead of a web application written in a dynamic programming language such as Perl, PHP, Python or Ruby.
  • The use of HTML 3.2-era elements such as frames and tables to position and align elements on a page, often in combination with spacer GIFs.
  • Proprietary HTML extensions, such as the <blink> and <marquee> tags, introduced during the first browser war.
  • Online guestbooks .
  • GIF buttons, graphics (typically 88×31 pixels in size) promoting web browsers , operating systems , text editors and various other products.
  • HTML forms sent via email . Support for server side scripting was rare on shared servers during this period. To provide a feedback mechanism for web site visitors, mailto forms were used. A user would fill in a form, and upon clicking the form's submit button, their email client would launch and attempt to send an email containing the form's details. The popularity and complications of the mailto protocol led browser developers to incorporate email clients into their browsers. [91]
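The mailto mechanism described above can be sketched as follows (a hypothetical illustration; the address and field values are invented): submitting such a form amounts to percent-encoding the field values into a mailto: URL, which the browser hands off to the user's email client.

```python
from urllib.parse import quote

def mailto_url(to: str, subject: str, body: str) -> str:
    """Build the mailto: URL a Web 1.0 feedback form would produce
    when the visitor clicks its submit button."""
    return f"mailto:{to}?subject={quote(subject)}&body={quote(body)}"

# A visitor fills in the form; the browser launches the email client
# with these fields pre-filled.
url = mailto_url("webmaster@example.org", "Site feedback", "Nice guestbook!")
```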

Terry Flew, in his third edition of New Media, described the differences between Web 1.0 and Web 2.0 as a

"move from personal websites to blogs and blog site aggregation, from publishing to participation, from web content as the outcome of large up-front investment to an ongoing and interactive process, and from content management systems to links based on 'tagging' website content using keywords (folksonomy)."

Flew believed these factors formed the trends that resulted in the onset of the Web 2.0 "craze". [92]

Web 2.0

Web pages were initially conceived as structured documents based upon HTML. They could include images, video, and other content, although the use of media was initially relatively limited and the content was mainly static. By the mid-2000s, new approaches to sharing and exchanging content, such as blogs and RSS, rapidly gained acceptance on the Web. The video-sharing website YouTube popularized the concept of user-generated content. [93] As new technologies made it easier to create websites that behaved dynamically, the Web attained greater ease of use and gained a sense of interactivity, ushering in a period of rapid popularization.

This new era also brought into existence social networking websites, such as Friendster, MySpace, Facebook, and Twitter, and photo- and video-sharing websites such as Flickr and, later, Instagram, which gained users rapidly and became a central part of youth culture. Wikipedia's user-edited content quickly displaced the professionally written Microsoft Encarta. [94] The popularity of these sites, combined with developments in the technology that enabled them and the increasing availability and affordability of high-speed connections, made video content far more common on all kinds of websites.

This new media-rich model for information exchange, featuring user-generated and user-edited websites, was dubbed Web 2.0, a term coined in 1999 by Darcy DiNucci [95] and popularized in 2004 at the Web 2.0 Conference. The Web 2.0 boom drew investment from companies worldwide and saw many new service-oriented startups catering to a newly "democratized" Web. [96] [97] [98] [99] [100] [101]

JavaScript made the development of interactive web applications possible. Web pages could run JavaScript and respond to user input, but they could not interact with the network. Browsers could submit data to servers via forms and receive new pages, but this was slow compared to traditional desktop applications. Developers who wanted to offer sophisticated applications over the Web used Java or nonstandard solutions such as Adobe Flash or Microsoft's ActiveX.

Microsoft added a little-noticed feature called XMLHttpRequest to Internet Explorer in 1999, which enabled a web page to communicate with the server while remaining visible. Developers at Oddpost used this feature in 2002 to create the first Ajax application, a webmail client that performed as well as a desktop application. [102] Ajax apps were revolutionary. Web pages evolved beyond static documents to full-blown applications. Websites began offering APIs in addition to webpages. Developers created a plethora of Ajax apps including widgets , mashups and new types of social apps . Analysts called it Web 2.0 . [103]
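The server side of the Ajax pattern can be sketched with Python's standard library (a hypothetical endpoint and payload, not Oddpost's code): instead of returning a full HTML page for every interaction, the server exposes an API that returns a small JSON payload, which the page's script fetches in the background and renders in place.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class ApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Return data, not a page: the page's script updates itself.
        payload = json.dumps({"unread": 3}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):  # silence per-request logging
        pass

# Serve on an ephemeral port and make one background-style request,
# as an XMLHttpRequest call from a page would.
server = HTTPServer(("127.0.0.1", 0), ApiHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
data = json.loads(urlopen(f"http://127.0.0.1:{port}/inbox").read())
server.shutdown()
```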

Browser vendors improved the performance of their JavaScript engines [104] and dropped support for Flash and Java. [105] [106] Traditional client server applications were replaced by cloud apps . Amazon reinvented itself as a cloud service provider .

The use of social media on the Web has become ubiquitous in everyday life. [107] [108] The 2010s also saw the rise of streaming services, such as Netflix .

In spite of the success of Web 2.0 applications, the W3C forged ahead with their plan to replace HTML with XHTML and represent all data in XML . In 2004, representatives from Mozilla, Opera , and Apple formed an opposing group, the Web Hypertext Application Technology Working Group (WHATWG), dedicated to improving HTML while maintaining backward compatibility. [109] For the next several years, websites did not transition their content to XHTML; browser vendors did not adopt XHTML2; and developers eschewed XML in favor of JSON . [110] By 2007, the W3C conceded and announced they were restarting work on HTML [111] and in 2009, they officially abandoned XHTML. [112] In 2019, the W3C ceded control of the HTML specification, now called the HTML Living Standard, to WHATWG. [113]

Microsoft rewrote its Edge browser to use Chromium as its code base in order to be more compatible with Chrome, releasing the rebuilt version in 2020. [114]

Security, censorship and cybercrime

The increasing use of encrypted connections (HTTPS) enabled e-commerce and online banking. Nonetheless, the 2010s saw the emergence of various controversial trends, such as internet censorship and the growth of cybercrime, including web-based cyberattacks and ransomware. [115] [116]

Early attempts to allow wireless devices to access the Web used simplified formats such as i-mode and WAP. In 2007, Apple introduced the iPhone, the first smartphone with a full-featured browser. Other companies followed suit, and in 2011 smartphone sales overtook those of PCs. [117] Since 2016, most visitors have accessed websites with mobile devices, [118] which led to the adoption of responsive web design.

Apple, Mozilla, and Google have taken different approaches to integrating smartphones with modern web apps. Apple initially promoted web apps for the iPhone but then encouraged developers to make native apps. [119] Mozilla announced Web APIs in 2011 to allow web apps to access hardware features such as audio, the camera, or GPS. [120] Frameworks such as Cordova and Ionic allow developers to build hybrid apps. Mozilla released a mobile OS designed to run web apps in 2012 [121] but discontinued it in 2015. [122]

Google announced specifications for Accelerated Mobile Pages (AMP) [123] and progressive web applications (PWAs) in 2015. [124] AMP uses a combination of HTML, JavaScript, and Web Components to optimize web pages for mobile devices, while PWAs are web pages that, through a combination of web workers and manifest files, can be saved to a mobile device and opened like a native app.
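The manifest file at the heart of a PWA is a small JSON document the browser reads to install the page like an app. The values below are illustrative, but the member names (name, short_name, start_url, display, icons) are the standard ones:

```json
{
  "name": "Example Mail",
  "short_name": "Mail",
  "start_url": "/",
  "display": "standalone",
  "icons": [
    { "src": "/icon-192.png", "sizes": "192x192", "type": "image/png" }
  ]
}
```

With `"display": "standalone"`, the saved page opens in its own window without browser chrome, which is much of what makes a PWA feel like a native app.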

Web 3.0 and Web3

The extension of the Web to facilitate data exchange was explored as an approach to create a Semantic Web (sometimes called Web 3.0). This involved using machine-readable information and interoperability standards to enable context-understanding programs to intelligently select information for users. [125] Continued extension of the Web has focused on connecting devices to the Internet, an approach coined intelligent device management. As Internet connectivity becomes ubiquitous, manufacturers have started to leverage the expanded computing power of their devices to enhance their usability and capability. Through Internet connectivity, manufacturers can now interact with the devices they have sold and shipped to their customers, and customers can interact with the manufacturer (and other providers) to access new content. [126]
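The machine-readable information underpinning the Semantic Web is typically embedded alongside the human-readable page. A minimal sketch using JSON-LD with the schema.org vocabulary (the particular book is just an example) might look like this:

```json
{
  "@context": "https://schema.org",
  "@type": "Book",
  "name": "Weaving the Web",
  "author": { "@type": "Person", "name": "Tim Berners-Lee" },
  "datePublished": "1999"
}
```

A crawler that understands the schema.org vocabulary can determine that the page describes a book and who wrote it, rather than merely matching keywords in the surrounding text.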

This phenomenon has led to the rise of the Internet of Things (IoT), [127] where modern devices are connected through sensors, software, and other technologies that exchange information with other devices and systems on the Internet. This creates an environment where data can be collected and analyzed instantly, providing better insights and improving the decision-making process. Additionally, the integration of AI with IoT devices continues to improve their capabilities, allowing them to predict customer needs and perform tasks, increasing efficiency and user satisfaction.

Web3 (sometimes also referred to as Web 3.0) is an idea for a decentralized Web based on public blockchains, smart contracts, digital tokens, and digital wallets. [128]

Beyond Web 3.0

The next generation of the Web is often termed Web 4.0, but its definition is not clear. According to some sources, it is a Web that involves artificial intelligence, [129] the Internet of things, pervasive and ubiquitous computing, and the Web of Things, among other concepts. [130] According to the European Union, Web 4.0 is "the expected fourth generation of the World Wide Web. Using advanced artificial and ambient intelligence, the internet of things, trusted blockchain transactions, virtual worlds and XR capabilities, digital and real objects and environments are fully integrated and communicate with each other, enabling truly intuitive, immersive experiences, seamlessly blending the physical and digital worlds". [131]

Historiography of the Web poses specific challenges, including disposable data, missing links, lost content, and archived websites, all of which have consequences for web historians. Sites such as the Internet Archive aim to preserve content. [132] [133]


  • History of email
  • History of hypertext
  • History of the Internet
  • History of telecommunication
  • History of web syndication technology
  • List of websites founded before 1995

Online services before the World Wide Web

  • NABU Network
  • Quantum Link / AOL
  • Bulletin board system
  • Category:Pre–World Wide Web online services
  • [2] Berners-Lee, T.; Cailliau, R.; Groff, J.-F.; Pollermann, B. (1992). "World-Wide Web: The Information Universe" . Electron. Netw. Res. Appl. Policy . 2 : 52–58. doi : 10.1108/eb047254 . Archived from the original on 27 December 2022 . Retrieved 27 December 2022 .
  • [3] Engelbart, Douglas (1962). Augmenting Human Intellect: A Conceptual Framework (Report). Archived from the original on 24 November 2005 . Retrieved 25 November 2005 .
  • [4] Conklin, Jeff (1987), IEEE Computer , vol.   20, pp.   17–41
  • [5] Bush, Vannevar (July 1945). "As We May Think" . The Atlantic . Retrieved 28 May 2009 .
  • [6] Tim Berners-Lee (1999). Weaving the Web . Internet Archive. HarperSanFrancisco. pp.   5–6. ISBN   978-0-06-251586-5 .
  • [7] "Sir Tim Berners-Lee" . Queen Elizabeth Prize for Engineering . Archived from the original on 16 November 2022 . Retrieved 16 November 2022 .
  • [8] Rutter, Dorian (2005). From Diversity to Convergence: British Computer Networks and the Internet, 1970–1995 (PDF) (Computer Science thesis). The University of Warwick. Archived (PDF) from the original on 10 October 2022 . Retrieved 27 December 2022 .
  • [9] Enzer, Larry (31 August 2018). "The Evolution of the World Wide Web" . Monmouth Web Developers . Archived from the original on 18 November 2018 . Retrieved 31 August 2018 .
  • [10] "Enquire Within upon Everything" (PDF) . Archived from the original (PDF) on 17 November 2015 . Retrieved 26 August 2015 .
  • [11] "First post: A history of online public messaging" . 29 April 2024.
  • [12] "Before the Web: Online services of yesteryear" . ZDNET . Retrieved 4 June 2024 .
  • [13] May, Ashley (12 March 2019). "Happy 30th birthday, World Wide Web. Inventor outlines plan to combat hacking, hate speech" . USA Today . Archived from the original on 6 October 2021 . Retrieved 12 March 2019 .
  • [14] Romano, Aja (12 March 2019). "The World Wide Web – not the Internet – turns 30 years old" . Vox.com . Archived from the original on 12 March 2019 . Retrieved 15 April 2022 .
  • [15] "Tim Berners" , Lemelson Foundation , archived from the original on 16 October 2022 , retrieved 16 October 2022
  • [16] Berners-Lee, Tim ; Cailliau, Robert (12 November 1990). "WorldWideWeb: Proposal for a HyperText Project" . Archived from the original on 2 May 2015 . Retrieved 12 May 2015 .
  • [17] He Created the Web. Now He’s Out to Remake the Digital World Archived 11 October 2021 at the Wayback Machine , New York Times , by Steve Lohr, 10 January 2021.
  • [18] "The birth of the web" . CERN. Archived from the original on 24 December 2015 . Retrieved 23 December 2015 .
  • [19] "First Web pages" . W3.org. Archived from the original on 31 January 2010 . Retrieved 27 July 2009 .
  • [20] "Tim Berners-Lee: client" . W3.org. Archived from the original on 21 July 2009 . Retrieved 27 July 2009 .
  • [21] Hempstead, C.; Worthington, W., eds. (2005). Encyclopedia of 20th-Century Technology . Routledge . p.   905. ISBN   9781135455514 . Retrieved 15 August 2015 .
  • [22] "Short summary of the World Wide Web project" . 6 August 1991. Archived from the original on 29 May 2013 . Retrieved 27 July 2009 .
  • [23] "The Early World Wide Web at SLAC" . Archived from the original on 24 November 2005.
  • [24] "About SPIRES" . Archived from the original on 12 February 2010 . Retrieved 30 March 2010 .
  • [25] "A Little History of the World Wide Web" . Archived from the original on 6 May 2013.
  • [26] "W3C10 Timeline Graphic" . Archived from the original on 9 October 2021 . Retrieved 29 January 2020 .
  • [27] "A short history of the Web" . CERN . Archived from the original on 17 April 2022 . Retrieved 15 April 2022 .
  • [28] Raggett, Dave; Jenny Lam; Ian Alexander (April 1996). HTML 3: Electronic Publishing on the World Wide Web . Harlow, England; Reading, Mass: Addison-Wesley. p.   21. ISBN   9780201876932 .
  • [29] "Frequently asked questions by the Press – Tim BL" . W3.org. Archived from the original on 3 October 2018 . Retrieved 15 April 2022 .
  • [30] Gihring, Tim (11 August 2016). "The rise and fall of the Gopher protocol" . MinnPost. Archived from the original on 10 February 2022 . Retrieved 12 February 2022 .
  • [31] Campbell-Kelly, Martin; Garcia-Swartz, Daniel D (2013). "The History of the Internet: The Missing Narratives" . Journal of Information Technology . 28 (1): 46–53. doi : 10.1057/jit.2013.4 . ISSN   0268-3962 . S2CID   41013 . SSRN   867087 .
  • [32] Hoffman, Jay (April 1991). "What the Web Could Have Been" . The History of the Web . Jay Hoffman. Archived from the original on 22 February 2022 . Retrieved 22 February 2022 .
  • [33] "Ten Years Public Domain for the Original Web Software" . Tenyears-www.web.cern.ch. 30 April 2003. Archived from the original on 13 August 2009 . Retrieved 27 July 2009 .
  • [34] "Software release of WWW into public domain" . CERN Document Server . CERN. 2 February 1993. Archived from the original on 17 February 2022 . Retrieved 17 February 2022 .
  • [35] "The Early World Wide Web at SLAC" . The Early World Wide Web at SLAC: Documentation of the Early Web at SLAC . Archived from the original on 24 November 2005 . Retrieved 25 November 2005 .
  • [36] "Where Have all the Gophers Gone? Why the Web beat Gopher in the Battle for Protocol Mind Share" . Ils.unc.edu. Archived from the original on 17 March 2019 . Retrieved 17 October 2015 .
  • [37] "Mosaic Web Browser History   – NCSA, Marc Andreessen, Eric Bina" . Livinginternet.com. Archived from the original on 18 May 2010 . Retrieved 27 July 2009 .
  • [38] "NCSA Mosaic   – 10 September 1993 Demo" . Totic.org. Archived from the original on 14 May 2019 . Retrieved 27 July 2009 .
  • [39] "Vice President Al Gore's ENIAC Anniversary Speech" . Cs.washington.edu. 14 February 1996. Archived from the original on 20 February 2009 . Retrieved 27 July 2009 .
  • [40] "Bloomberg Game Changers: Marc Andreessen" . Bloomberg.com. 17 March 2011. Archived from the original on 16 May 2012 . Retrieved 15 April 2022 .
  • [41] Vetter, Ronald J. (October 1994). "Mosaic and the World-Wide Web" (PDF) . North Dakota State University . Archived from the original (PDF) on August 24, 2014 . Retrieved November 20, 2010 .
  • [42] Berners-Lee, Tim. "What were the first WWW browsers?" . World Wide Web Consortium . Archived from the original on 3 October 2018 . Retrieved 15 June 2010 .
  • [43] Hoffman, Jay (21 April 1993). "The Origin of the IMG Tag" . The History of the Web . Archived from the original on 13 February 2022 . Retrieved 13 February 2022 .
  • [44] Wilson, Brian. "Mosaic" . Index D O T Html . Brian Wilson. Archived from the original on 1 February 2022 . Retrieved 15 February 2022 .
  • [45] Clarke, Roger. "The Birth of Web Commerce" . Roger Clarke's Web-Site . XAMAX. Archived from the original on 15 February 2022 . Retrieved 15 February 2022 .
  • [46] Calore, Michael (22 April 2010). "22 April 1993: Mosaic Browser Lights Up Web With Color, Creativity" . Wired . Retrieved 12 February 2022 .
  • [47] Kline, Greg (20 April 2003). "Mosaic started Web rush, Internet boom" . The News-Gazette (Champaign–Urbana). Archived from the original on 13 June 2020 . Retrieved 27 February 2022 .
  • [48] Wolfe, Gary (1 October 1994). "The (Second Phase of the) Revolution Has Begun" . Wired . Retrieved 15 February 2022 .
  • [49] Catalano, Charles S. (15 October 2007). "Megaphones to the Internet and the World: The Role of Blogs in Corporate Communications". International Journal of Strategic Communication . 1 (4): 247–262. doi : 10.1080/15531180701623627 . S2CID   143156963 .
  • [50] "WWW (World Wide Web) Definition" . TechDictionary . Retrieved 12 April 2024 .
  • [51] Jacobs, Ian; Walsh, Norman (15 December 2004). "Architecture of the World Wide Web, Volume One" . Introduction: W3C. Archived from the original on 9 February 2015 . Retrieved 11 February 2015 .
  • [52] Hopgood, Bob. "History of the Web" . w3.org . The World Wide Web Consortium. Archived from the original on 21 March 2022 . Retrieved 12 February 2022 .
  • [53] Couldry, Nick (2012). Media, Society, World: Social Theory and Digital Media Practice . London: Polity Press. p.   2. ISBN   9780745639208 .
  • [54] Hey, Anthony J. G.; Pápay, Gyuri (2015). The Computing Universe: A Journey through a Revolution . Cambridge University Press. p.   228. ISBN   978-0-521-76645-6 .
  • [55] "LCS announces Web industry consortium" . MIT News. 19 October 1994. Archived from the original on 12 February 2022 . Retrieved 15 February 2022 .
  • [56] Hoffman, Jay (10 January 1997). "The HTML Tags Everybody Hated" . The History of the Web . Jay Hoffman. Archived from the original on 9 February 2022 . Retrieved 15 February 2022 .
  • [57] Oakes, Chris (18 August 1998). "Group Out to Set A New Standard" . Wired .
  • [58] Hoffman, Jay (23 May 2003). "Year of A List Apart" . The History of the Web . Jay Hoffman. Archived from the original on 19 February 2022 . Retrieved 19 February 2022 .
  • [59] "AOL to End Support of Netscape Navigator" . New York Times . 29 December 2007. Archived from the original on 27 February 2022 . Retrieved 27 February 2022 .
  • [60] Conlon, Tom (2 March 2010). "Inside the Excruciatingly Slow Death of Internet Explorer 6" . Popular Science. Archived from the original on 19 February 2022 . Retrieved 19 February 2022 .
  • [61] Wired Staff (26 May 2010). "Gates, Microsoft Jump on 'Internet Tidal Wave' " . Wired . Retrieved 12 February 2022 .
  • [62] McCullough, Brian. "20 YEARS ON: WHY NETSCAPE'S IPO WAS THE "BIG BANG" OF THE INTERNET ERA" . www.internethistorypodcast.com . INTERNET HISTORY PODCAST. Archived from the original on 12 February 2022 . Retrieved 12 February 2022 .
  • [63] Wingfield, Nick (7 December 1998). "Portal Sites Reap the Rewards Of Strategies for Getting 'Sticky' " . Wall Street Journal . Archived from the original on 15 February 2022 . Retrieved 15 February 2022 .
  • [64] Heitzman, Adam (5 June 2017). "How Google Came To Dominate Search And What The Future Holds" . Fortune . Archived from the original on 15 February 2022 . Retrieved 15 February 2022 .
  • [65] Bayers, Chip. "I'm Feeling Lucky" . Wired . Retrieved 20 February 2022 .
  • [66] "The Evolution of Google AdWords – A $38 Billion Advertising Platform" . WordStream . LOCALiQ. Archived from the original on 15 February 2022 . Retrieved 15 February 2022 .
  • [67] "AOL, Netscape tie knot" . CNNMoney . CNN. Archived from the original on 28 January 2022 . Retrieved 15 February 2022 .
  • [68] Calore, Michael (28 September 2009). "28 September 1998: Internet Explorer Leaves Netscape in Its Wake" . Wired . Retrieved 14 February 2022 .
  • [69] Greenberg, Julia (23 November 2015). "Once Upon a Time, Yahoo Was the Most Important Internet Company" . Wired . Retrieved 17 February 2022 .
  • [70] Hu, Jim (2 January 2002). "Time Warner to shutter Pathfinder" . CNet. Archived from the original on 15 February 2022 . Retrieved 15 February 2022 .
  • [71] Hansell, Saul (30 January 2001). "Disney, in Retreat From Internet, to Abandon Go.com Portal Site" . New York Times . Archived from the original on 17 February 2022 . Retrieved 17 February 2022 .
  • [72] "NBC to Shut Down NBCi" . PBS. 9 April 2001. Archived from the original on 17 February 2022 . Retrieved 17 February 2022 .
  • [73] Higgins, Chris (15 July 2017). "On This Day in 2003, Netscape Went Offline Forever" . Mental Floss. Archived from the original on 12 February 2022 . Retrieved 15 February 2022 .
  • [74] "How Apache Came to Be" . httpd.apache.org . Apache. Archived from the original on 7 June 2008 . Retrieved 13 February 2022 .
  • [75] Moschovitis, Christos J. P (1999). History of the Internet   : a chronology, 1843 to the present . Internet Archive. ABC-CLIO. ISBN   978-1-57607-118-2 .
  • [76] "December 1996 Web Server Survey" . Netcraft.co.uk . Netcraft. December 1996. Archived from the original on 15 February 2022 . Retrieved 15 February 2022 .
  • [77] "NETSCAPE COMMUNICATIONS SHIPS RELEASE 1.0 OF NETSCAPE NAVIGATOR AND NETSCAPE SERVERS" . Netscape. Archived from the original on 27 October 1996 . Retrieved 13 February 2022 .
  • [78] "History of the Mozilla Project" . mozilla.org . Mozilla. Archived from the original on 15 February 2022 . Retrieved 15 February 2022 .
  • [79] Campbell-Kelly, Martin; Garcia-Swartz, Daniel D (2013). "The History of the Internet: The Missing Narratives" . Journal of Information Technology . 28 (1): 54–57. doi : 10.1057/jit.2013.4 . ISSN   0268-3962 . S2CID   41013 . SSRN   867087 .
  • [80] "Browser" . Mashable . Archived from the original on 2 September 2011 . Retrieved 2 September 2011 .
  • [81] "Microsoft Internet Explorer 3.0 Is World's Fastest-Growing Browser" . Microsoft. 29 January 1997. Archived from the original on 15 August 2022 . Retrieved 11 April 2022 .
  • [82] Vaughan-Nichols, Steven (21 May 2012). "Chrome beats Internet Explorer in global Web browser race" . ZDNet. Archived from the original on 14 February 2022 . Retrieved 14 February 2022 .
  • [83] Ellis, Megan (2 April 2019). "5 Reasons Why Android Is So Much More Popular Than iPhone" . MUO. Archived from the original on 27 February 2022 . Retrieved 27 February 2022 .
  • [84] "Welcome to Chromium" . 2 September 2008. Archived from the original on 12 January 2018 . Retrieved 27 February 2022 .
  • [85] Hachman, Mark (17 February 2021). "Chromebooks continued to outsell Macs in 2020" . PC World. Archived from the original on 20 February 2022 . Retrieved 20 February 2022 .
  • [86] Balachander Krishnamurthy, Graham Cormode (2 June 2008). "Key differences between Web 1.0 and Web 2.0" . First Monday . 13 (6). Archived from the original on 25 October 2012 . Retrieved 23 September 2014 .
  • [87] "Geocities – Dead Media Archive" . cultureandcommunication.org . Archived from the original on 24 May 2014 . Retrieved 23 September 2014 .
  • [88] "So Long, GeoCities: We Forgot You Still Existed" . 23 April 2009. Archived from the original on 17 October 2014 . Retrieved 23 September 2014 .
  • [89] Viswanathan, Ganesh; Dutt Mathur, Punit; Yammiyavar, Pradeep (March 2010). "From Web 1.0 to Web 2.0 and beyond: Reviewing usability heuristic criteria taking music sites as case studies" . IndiaHCI Conference. Mumbai. Archived from the original on 21 March 2022 . Retrieved 20 February 2015 . {{ cite journal }} : Cite journal requires | journal= ( help )
  • [90] "Is there a Web 1.0?" . HowStuffWorks . January 28, 2008. Archived from the original on February 22, 2019 . Retrieved February 15, 2019 .
  • [91] "The Right Size of Software" . www.catb.org . Archived from the original on 17 June 2015 . Retrieved 20 February 2015 .
  • [92] Flew, Terry (2008). New Media: An Introduction (3rd   ed.). Melbourne: Oxford University Press. p.   19.
  • [93] Susarla, Anjana; Oh, Jeong-Ha; Tan, Yong (2012). "Social Networks and the Diffusion of User-Generated Content: Evidence from YouTube" . Information Systems Research . 23 (1): 23–41. doi : 10.1287/isre.1100.0339 . ISSN   1047-7047 . JSTOR   23207870 . Archived from the original on 12 July 2022 . Retrieved 12 July 2022 .
  • [94] "Victim Of Wikipedia: Microsoft To Shut Down Encarta" . Forbes . Archived from the original on 12 July 2022 . Retrieved 12 July 2022 .
  • [95] "What is Web 2.0? | Definition from TechTarget" . WhatIs . Retrieved 13 April 2024 .
  • [96] "What Is Web 2.0?" . CBS News . 1 May 2008. Archived from the original on 16 April 2022 . Retrieved 16 April 2022 .
  • [97] "The Good, the Bad, And the 'Web 2.0' " . Wall Street Journal . 19 July 2007. ISSN   0099-9660 . Archived from the original on 16 April 2022 . Retrieved 16 April 2022 .
  • [98] Anderson, Paul (2016). "14.2.1 AJAX: The Key to Web 2.0" . Web 2.0 and Beyond: Principles and Technologies . CRC Press. p.   257. ISBN   978-1-4398-2868-7 .
  • [99] Han, Sam (2012). Web 2.0 . Routledge. p.   35. ISBN   978-1-136-99606-1 .
  • [100] "How companies are benefiting from Web 2.0" . McKinsey . Archived from the original on 27 March 2022 . Retrieved 16 April 2022 .
  • [101] "Tim Berners-Lee's original World Wide Web browser" . Archived from the original on 17 July 2011. With recent phenomena like blogs and wikis, the Web is beginning to develop the kind of collaborative nature that its inventor envisaged from the start.
  • [102] Gibbs, Mark (12 April 2004). "There's nothing odd about the slickness of Oddpost" . Network Word. Archived from the original on 15 February 2022 . Retrieved 15 February 2022 .
  • [103] Singel, Ryan (6 October 2005). "Are You Ready for Web 2.0?" . Wired . Retrieved 16 February 2022 .
  • [104] Shankland, Stephen (20 March 2009). "Browser war centers on once-obscure JavaScript" . CNet. Archived from the original on 20 February 2022 . Retrieved 20 February 2022 .
  • [105] Skuse, Cole (12 January 2021). "Gone in a flash: Adobe Flash removed from online browsers" . The Tartan. Archived from the original on 27 January 2022 . Retrieved 16 February 2022 .
  • [106] Hughes, Matthew (11 September 2015). "The Web Just Became More Secure: Google Drops Support for Java" . makeuseof.com . MUO. Archived from the original on 16 February 2022 . Retrieved 16 February 2022 .
  • [107] Deo, Prakash Vidyarthi (2012). Technologies and Protocols for the Future of Internet Design: Reinventing the Web: Reinventing the Web . IGI Global. p.   3. ISBN   978-1-4666-0204-5 .
  • [108] Schuster, Jenna (10 June 2016). "A brief history of internet service providers" . Archived from the original on 28 April 2019 . Retrieved 15 January 2020 .
  • [109] Hickson, Ian. "WHAT open mailing list announcement" . whatwg.org . WHATWG. Archived from the original on 8 March 2022 . Retrieved 16 February 2022 .
  • [110] Target, Sinclair. "The Rise and Rise of JSON" . twobithistory.org . Sinclair Target. Archived from the original on 19 January 2022 . Retrieved 16 February 2022 .
  • [111] Daly, Janet (7 March 2007). "W3C Relaunches HTML Activity" . W3C. Archived from the original on 16 February 2022 . Retrieved 17 February 2022 .
  • [112] Shankland, Stephen (9 July 2009). "An epitaph for the Web standard, XHTML 2" . CNet. Archived from the original on 16 February 2022 . Retrieved 17 February 2022 .
  • [113] "Memorandum of Understanding Between W3C and WHATWG" . w3.org . W3C. Archived from the original on 29 May 2019 . Retrieved 16 February 2022 .
  • [114] Bradshaw, Kyle (6 December 2018). "Microsoft confirms Edge rewrite based on Google's Chromium for 'improved compatibility' " . 9to5Google . 925. Archived from the original on 14 February 2022 . Retrieved 14 February 2022 .
  • [115] Kortti, Jukka (17 April 2019). Media in History: An Introduction to the Meanings and Transformations of Communication Over Time . Macmillan International Higher Education. p.   142. ISBN   978-1-352-00596-7 .
  • [116] Gragido, Will; Pirc, John (7 January 2011). Cybercrime and Espionage: An Analysis of Subversive Multi-Vector Threats . Newnes. p.   14. ISBN   978-1-59749-614-8 .
  • [117] Goldman, David (9 February 2011). "Smartphones have conquered PCs" . CNN. Archived from the original on 9 December 2021 . Retrieved 18 February 2022 .
  • [118] Murphy, Mike (1 November 2016). "More websites were viewed on mobile devices and tablets than desktops for the first time ever this month" . Quartz. Archived from the original on 18 February 2022 . Retrieved 18 February 2022 .
  • [119] Ortolani, Parker (3 June 2021). "Remembering Apple's 'sweet solution' for iPhone apps before the App Store" . 9to5Mac. Archived from the original on 18 February 2022 . Retrieved 18 February 2022 .
  • [120] "Web APIs" . MDN Web Docs . Mozilla. Archived from the original on 13 February 2022 . Retrieved 16 February 2022 .
  • [121] Velazco, Chris (2 July 2012). "Mozilla's Boot To Gecko Becomes Firefox OS, Scores Support From Sprint, Deutsche Telekom, ZTE, And More" . TechCrunch. Archived from the original on 18 February 2022 . Retrieved 18 February 2022 .
  • [122] Lunden, Ingrid (8 December 2015). "Mozilla Will Stop Developing And Selling Firefox OS Smartphones" . TechCrunch. Archived from the original on 31 January 2017 . Retrieved 18 February 2022 .
  • [123] Besbris, David (7 October 2015). "Introducing the Accelerated Mobile Pages Project, for a faster, open mobile web" . Google. Archived from the original on 17 June 2021 . Retrieved 22 February 2022 .
  • [124] Osmani, Addy (December 2015). "Getting Started with Progressive Web Apps" . Google Inc. Archived from the original on 22 February 2022 . Retrieved 22 February 2022 .
  • [125] Virgilio, Roberto de; Giunchiglia, Fausto; Tanca, Letizia (2010). Semantic Web Information Management: A Model-Based Perspective . Springer Science & Business Media. p.   481. ISBN   978-3-642-04329-1 .
  • [126] Gottinger, Hans W. (2017). Internet Economics: Models, Mechanisms and Management . Bentham Science Publishers. p.   126. ISBN   978-1-68108-546-3 .
  • [127] "What is Internet of Things? Internet of Things Definition" . amazingalgorithms.com . Retrieved 13 April 2024 .
  • [128] Ragnedda, Massimo; Destefanis, Giuseppe (2019). Blockchain and Web 3.0: Social, Economic, and Technological Challenges . Routledge. ISBN   978-0-429-63920-3 .
  • [129] https://www.rsisinternational.org/IJRSI/Issue31/75-78.pdf
  • [130] Almeida, F. (2017). Concept and dimensions of web 4.0. International journal of computers and technology, 16(7).
  • [131] "The Commission wants the EU to lead on 'Web 4.0' — whatever that is" . 11 July 2023.
  • [132] Brügger, Niels (2013). "Web historiography and Internet Studies: Challenges and perspectives" . New Media & Society . 15 (5): 752–764. doi : 10.1177/1461444812462852 . ISSN   1461-4448 . S2CID   32892005 . Archived from the original on 14 December 2022 . Retrieved 14 December 2022 .
  • [133] Craig, William. "The Importance of Historiography on the Web" . WebFX . Archived from the original on 14 December 2022 . Retrieved 14 December 2022 .
  • Berners-Lee, Tim; Fischetti, Mark (1999). Weaving the Web   : the original design and ultimate destiny of the World Wide Web by its inventor . San Francisco: HarperSanFrancisco. ISBN   0-06-251586-1 . OCLC   41238513 .
  • Brügger, Niels (2017). Web 25   : histories from the first 25 years of the World Wide Web . New York, NY. ISBN   978-1-4331-3269-8 . OCLC   976036138 . {{ cite book }} : CS1 maint: location missing publisher ( link )
  • Gillies, James; Cailliau, Robert (2000). How the Web was born   : the story of the World Wide Web . Oxford: Oxford University Press. ISBN   0-19-286207-3 . OCLC   43377073 .
  • Herman, Andrew; Swiss, Thomas (2000). The World Wide Web and contemporary cultural theory . New York: Routledge. ISBN   0-415-92501-0 . OCLC   44446371 .
  • Web History: first 30 years
  • "A Little History of the World Wide Web: from 1945 to 1995" , Dan Connolly, W3C , 2000
  • "The World Wide Web: Past, Present and Future" , Tim Berners-Lee, August 1996
  • The History of the Web
  • Web Development History
  • A Brief(ish) History of the Web Universe , Brian Kardell
  • Web History Community Group , W3C
  • The history of the Web , W3C
  • info.cern.ch , the first website

Wikiwand in your browser!

Seamless Wikipedia browsing. On steroids.

Every time you click a link to Wikipedia, Wiktionary or Wikiquote in your browser's search results, it will show the modern Wikiwand interface.

Wikiwand extension is a five stars, simple, with minimum permission required to keep your browsing private, safe and transparent.

Wikiwand for Chrome

Wikiwand for Firefox

1989–1991: Origins

1991–1994: the web goes public, early growth, 1994–2004: open standards, going global, 2004–present: the web as platform, ubiquity, historiography, further reading, external links.

preview

Essay on The World Wide Web

  • 3 Works Cited

The World Wide Web Communication--it is a fundamental part of our everyday lives. It characterizes who we are, what we do, and how we relate to others in society. It is a very powerful tool that holds many different uses for our basic needs and survival. At a very simplistic level, it is key in attaining our very basic needs for survival. In that respect, it is key in achieving all needs in Maslows hierarchy. Its uses and possibilities endless. Over time, the discoveries that have been made in relation to communication have been revolutionary in that they have changed the way we live and act dramatically. For example, the writing on walls, pencils, pens, ink, paper, the printing press , telegraph, telephone, television, …show more content…

This lack of understanding causes fear and a personal defense from utilizing "the web" and all the possibilities that lie within it. In essence, the concept behind the World Wide Web is really quite simple. It is nothing more than "a community of information sites on the Internet computer network" (Pasadena Public Library). So basically, the World Wide Web is a small part of a greater entity—the Internet. The sites, or web pages, are all connected by this network and can be considered a community because of their common location-the World Wide Web. If you can visualize it, imagine a large network, the Internet, which connects people and information all over the world. Then imagine within that network, a community of its own. The World Wide Web has an endless number of sites, made of web pages, and all of which are connected through the Internet. So how does one use or "navigate" on the Web? To understand that, one must first understand the language (that which makes it run, understand and execute commands) the web is based upon. This language is known as Hypertext Markup Language, which will later be discussed in more detail. Essentially, it the hypertext language uses "hot words to link pages to each other" (Halonen). The hot words, which have been termed links, consist of an underlying code, HTML, which is read and takes the user to the desired destination. Also, the web uses URL’s, Uniform Resource Locators, to locate information. Also

Essay about Webspective

* Technical solutions of webspective is not up to the mark and many of the potential customers perceive webspective to be crippled with problems such as inefficient traffic handling,brand name inconsistency and network problems.

Cja 304 Week 1

Communication enables human beings to interact in a meaningful way. It is hence a vital component of coming up with the meanings of situations so as to derive the intended conclusions.

Unit 9 P2 Questions And Answers

Communication is the process in which people share information and ideas with each other and create shared meanings.

Whole Life Project Part 2 Analysis

Communication is an ability that dates back to prehistoric times. I am dividing this in to two parts, written and oral. The skill of communicating effectively through means of writing is important because it allows a person to express their ideas thoughts in an effective manner. It is helpful for documenting which prevent others from benefiting from the idea and allows an easy way to recall what was expressed. This

Assignment Four: A Sociological Analysis Of Generation Like

how the World Wide Web

On The 4 December 2012 A Strategy And Vision Was Created

The main focus of this essay will be how communication is important and why it will always remain to

The Takeover: New Media’s Role on Civic Engagement

  • 11 Works Cited

The Internet is a broad informational tool: search engines such as Google and Bing, accessed through browsers such as Firefox and Safari, point users to websites, databases, organizations, and other sources of information on almost anything imaginable. And when it

Basic Needs to Design a Student Login Portal

The Internet is an essential medium for communicating and interacting with people worldwide. Web browsers are software applications that allow millions of users to access web content for business, entertainment, or personal use. The "WWW" portion of a web address stands for World Wide Web, which is made up of hyperlinked documents written in "XHTML & Interactive Media".
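How a browser or crawler discovers the hyperlinks inside one of those documents can be sketched with Python's built-in HTML parser; the page content below is invented purely for illustration:

```python
from html.parser import HTMLParser

# Minimal link extractor: collects the target of every <a href="..."> tag,
# which is how hyperlinked documents point at one another on the web.
class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

page = ('<p>See the <a href="/history.html">history</a> and '
        '<a href="https://www.w3.org/">W3C</a> pages.</p>')

collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['/history.html', 'https://www.w3.org/']
```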

The Wide Internet Of The Internet Essay

The wide adoption of the Internet has fundamentally altered the ways in which users communicate, gather information, conduct business, and make purchases. As use of the World Wide Web increases, so does the amount of data on the internet. A few sites consist of millions of pages, but millions of sites contain only a handful of pages. A few sites contain millions of links, but many sites have only one or two. Millions of users flock to a few select sites, giving little attention to millions of others. The expansion of the World Wide Web (Web for short) has resulted in a large amount of data that is now, in general, freely available for user access. These different types of data have to be managed and organized so that different users can access them efficiently through a search engine.
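The skew described above, with a few pages attracting most of the links, can be illustrated with a toy link graph; the pages and links here are made up for the example:

```python
from collections import Counter

# Each pair (source, target) is one hyperlink in a tiny, invented web
# in which most links point at a single popular "hub" page.
links = [
    ("a", "hub"), ("b", "hub"), ("c", "hub"), ("d", "hub"),
    ("hub", "a"), ("c", "d"),
]

# Count inbound links per page; search engines use signals like this
# (e.g. link-based ranking) to decide which pages to surface first.
inbound = Counter(target for _, target in links)
print(inbound.most_common(1))  # [('hub', 4)]
```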

Web Design And The Internet

The World Wide Web, which is referred to simply as the Web, is a collection of HTML documents, images, videos and sounds files that can be linked to each other. In 1991, the

Essay On Web 1. 0

In March of 1989, British engineer Tim Berners-Lee, known as TimBL, invented the World Wide Web. Also known as Web 1.0, this first version of the web offered read-only content. TimBL, who is currently a Professor of Computer Science at the University of Oxford in the United Kingdom, created a database interaction between web users and sites. Initially, Web 1.0 had limited options and did not allow users to provide comments, feedback, or reviews. Therefore, it was time to update Web 1.0 to Web 2.0. The new format was the first editable expression of the World Wide Web, with interactive data: Web 2.0 allowed users to connect with websites, make comments, and give feedback to the sites.

Essay On New Web

If you loved the look of Windows Phone's Live Tiles, then Awesome HQ is for you. Its tile-like interface also makes it one of only a handful of New Tab pages that you can use on a mobile browser as well.

The World Wide Web as Social Hypertext

Initially, I didn't pay much attention to the Web. After all, it was just a new take on distributed information server systems such as WAIS[8] and Gopher[1]. True, it was easier to use than WAIS, and the ability of Web browsers to display formatted text and graphics with embedded links made it more attractive and engaging than either WAIS or Gopher. But there was nothing really new; it was an incremental advance, a new combination of well known functionality. So I mentally categorized the Web as just the latest fashion to sweep the internet.

The Internet : The World Wide Web

The internet, which hosts the world wide web, was developed and popularized beginning in 1969, bringing with it a new way of thinking. The internet has become our TV, maps, clocks, radios, and typewriters, revolutionizing the technology world. Nowadays, people can go onto their personal computers and find anything and everything they are looking for at the click of a button. Although some critique the internet for making our population dumb, the internet is full of accessible and efficient resources for those with the patience to find them. Students growing up in the 21st century know the ongoing and rapid changes of today's technology, but one way to become engrossed in writing and reading is to go deeper into the internet: not only scratching the surface with Wikipedia and Google, but looking into the primary resources and studies contained in the web. By doing this, students will ultimately concentrate more on the given task, be less apt to skim-read, and use the internet as a helpful resource rather than a harmful one.

The Impact Of Internet On The World Wide Web

Globally, an estimated 46.1 percent of people used the Internet in 2016, a three percent increase from 2015 ("Internet Users in the World," Internet Live Stats). In a world where almost half of the population has Internet access, most people cannot imagine their lives without constant access to news sources. Platforms for sharing information include physical newspapers, online news sources, Facebook, and other social media. The sharing of news stories across these platforms allows larger communities to stay up-to-date with local, national, and even global events. Being knowledgeable about current events through the World Wide Web is easier through the advances made



The 100 Best Books of the 21st Century: A Printable List

By The New York Times Books Staff Aug. 26, 2024

  • Share full article

Print this version to keep track of what you’ve read and what you’d like to read. See the full project, including commentary about the books, here.


The New York Times Book Review: The 100 Best Books of the 21st Century. (The printed version pairs each title with "I've read it" and "I want to read it" checkboxes.)

1. My Brilliant Friend, by Elena Ferrante
2. The Warmth of Other Suns, by Isabel Wilkerson
3. Wolf Hall, by Hilary Mantel
4. The Known World, by Edward P. Jones
5. The Corrections, by Jonathan Franzen
6. 2666, by Roberto Bolaño
7. The Underground Railroad, by Colson Whitehead
8. Austerlitz, by W.G. Sebald
9. Never Let Me Go, by Kazuo Ishiguro
10. Gilead, by Marilynne Robinson
11. The Brief Wondrous Life of Oscar Wao, by Junot Díaz
12. The Year of Magical Thinking, by Joan Didion
13. The Road, by Cormac McCarthy
14. Outline, by Rachel Cusk
15. Pachinko, by Min Jin Lee
16. The Amazing Adventures of Kavalier & Clay, by Michael Chabon
17. The Sellout, by Paul Beatty
18. Lincoln in the Bardo, by George Saunders
19. Say Nothing, by Patrick Radden Keefe
20. Erasure, by Percival Everett
21. Evicted, by Matthew Desmond
22. Behind the Beautiful Forevers, by Katherine Boo
23. Hateship, Friendship, Courtship, Loveship, Marriage, by Alice Munro
24. The Overstory, by Richard Powers
25. Random Family, by Adrian Nicole LeBlanc
26. Atonement, by Ian McEwan
27. Americanah, by Chimamanda Ngozi Adichie
28. Cloud Atlas, by David Mitchell
29. The Last Samurai, by Helen DeWitt
30. Sing, Unburied, Sing, by Jesmyn Ward
31. White Teeth, by Zadie Smith
32. The Line of Beauty, by Alan Hollinghurst
33. Salvage the Bones, by Jesmyn Ward
34. Citizen, by Claudia Rankine
35. Fun Home, by Alison Bechdel
36. Between the World and Me, by Ta-Nehisi Coates
37. The Years, by Annie Ernaux
38. The Savage Detectives, by Roberto Bolaño
39. A Visit From the Goon Squad, by Jennifer Egan
40. H Is for Hawk, by Helen Macdonald
41. Small Things Like These, by Claire Keegan
42. A Brief History of Seven Killings, by Marlon James
43. Postwar, by Tony Judt
44. The Fifth Season, by N.K. Jemisin
45. The Argonauts, by Maggie Nelson
46. The Goldfinch, by Donna Tartt
47. A Mercy, by Toni Morrison
48. Persepolis, by Marjane Satrapi
49. The Vegetarian, by Han Kang
50. Trust, by Hernan Diaz
51. Life After Life, by Kate Atkinson
52. Train Dreams, by Denis Johnson
53. Runaway, by Alice Munro
54. Tenth of December, by George Saunders
55. The Looming Tower, by Lawrence Wright
56. The Flamethrowers, by Rachel Kushner
57. Nickel and Dimed, by Barbara Ehrenreich
58. Stay True, by Hua Hsu
59. Middlesex, by Jeffrey Eugenides
60. Heavy, by Kiese Laymon
61. Demon Copperhead, by Barbara Kingsolver
62. 10:04, by Ben Lerner
63. Veronica, by Mary Gaitskill
64. The Great Believers, by Rebecca Makkai
65. The Plot Against America, by Philip Roth
66. We the Animals, by Justin Torres
67. Far From the Tree, by Andrew Solomon
68. The Friend, by Sigrid Nunez
69. The New Jim Crow, by Michelle Alexander
70. All Aunt Hagar's Children, by Edward P. Jones
71. The Copenhagen Trilogy, by Tove Ditlevsen
72. Secondhand Time, by Svetlana Alexievich
73. The Passage of Power, by Robert A. Caro
74. Olive Kitteridge, by Elizabeth Strout
75. Exit West, by Mohsin Hamid
76. Tomorrow, and Tomorrow, and Tomorrow, by Gabrielle Zevin
77. An American Marriage, by Tayari Jones
78. Septology, by Jon Fosse
79. A Manual for Cleaning Women, by Lucia Berlin
80. The Story of the Lost Child, by Elena Ferrante
81. Pulphead, by John Jeremiah Sullivan
82. Hurricane Season, by Fernanda Melchor
83. When We Cease to Understand the World, by Benjamín Labatut
84. The Emperor of All Maladies, by Siddhartha Mukherjee
85. Pastoralia, by George Saunders
86. Frederick Douglass, by David W. Blight
87. Detransition, Baby, by Torrey Peters
88. The Collected Stories of Lydia Davis
89. The Return, by Hisham Matar
90. The Sympathizer, by Viet Thanh Nguyen
91. The Human Stain, by Philip Roth
92. The Days of Abandonment, by Elena Ferrante
93. Station Eleven, by Emily St. John Mandel
94. On Beauty, by Zadie Smith
95. Bring Up the Bodies, by Hilary Mantel
96. Wayward Lives, Beautiful Experiments, by Saidiya Hartman
97. Men We Reaped, by Jesmyn Ward
98. Bel Canto, by Ann Patchett
99. How to Be Both, by Ali Smith
100. Tree of Smoke, by Denis Johnson


World of 1898: International Perspectives on the Spanish American War

Introduction.

  • Overview Essay
  • Cuba in 1898
  • Chronology of Cuba in the Spanish-American War
  • Philippine Perspective
  • The Changing of the Guard: Puerto Rico in 1898
  • The Spanish-American War of 1898: a Spanish View
  • American Perspective
  • Emilio Aguinaldo y Famy
  • Russell Alexander Alger
  • Thomas McArthur Anderson
  • Basilio Augustin y Dávila
  • Ramón Auñón y Villalón
  • Román Baldorioty de Castro
  • José Celso Barbosa
  • Clara Barton
  • Segismundo Bermejo
  • Ramón Emeterio Betances
  • Ramón Blanco y Erenas
  • Andrés Bonifacio
  • John Rutter Brooke
  • Jules-Martin Cambon
  • Pascual Cervera y Topete
  • Grover Cleveland
  • Stephen Crane
  • George W. Davis
  • Federico Degetau y González
  • George Dewey
  • José de Diego
  • Manuel V. Domenech
  • Enrique Dupuy de Lôme
  • Oswald Herbert Ernst
  • Maximo Gómez Baez
  • John Milton Hay
  • Guy Vernon Henry
  • Eugenio María de Hostos y Bonilla
  • Tulio Larrinaga
  • Fitzhugh Lee
  • William Ludlow
  • Antonio Maceo
  • Manuel Macías
  • William McKinley
  • Nelson Appleton Miles
  • Luis Muñoz Rivera
  • Whitelaw Reid
  • Lola Rodríguez de Tió
  • Manuel Rojas
  • Theodore Roosevelt
  • Práxedes Mateo Sagasta
  • William T. Sampson
  • Juan Manuel Sánchez y Gutiérrez de Castro
  • Theodore Schwan
  • William Shafter
  • Martín Travieso
  • Joaquín Vara de Rey y Rubio
  • James Franklin Wade
  • Richard Wainwright
  • Valeriano Weyler
  • Walt Whitman
  • Henry H. Whitney
  • James Harrison Wilson
  • Coamo and Aibonito
  • Mayagüez, Hormigueros, and Arecibo
  • Cienfuegos Bay
  • Abolition of Slavery in Puerto Rico
  • American Ships in the Spanish-American War
  • Balzac v. Porto Rico
  • Foraker Act (Organic Act of 1900)
  • Grito de Balintawak
  • Grito de Lares
  • Hurricane San Ciriaco
  • Anti-Imperialist League
  • Military Government in Puerto Rico
  • Olmsted Amendment
  • Peace Agreement in Puerto Rico
  • Reconcentration Policy
  • Rough Riders
  • Spanish Ships in the Spanish-American War
  • Teller and Platt Amendments
  • Treaty of Paris of 1898
  • U.S.S. Gloucester
  • Additional Resources
  • Acknowledgements

Guide Editor: María Daniela Thurber, Reference Librarian, Hispanic Reading Room, Latin American, Caribbean, and European Division

Content Authors: Please visit the Acknowledgement page for information on all authors and contributors to the original The World of 1898: The Spanish-American War web project.

Note: This guide is adapted from The World of 1898: The Spanish-American War , the first online collection mounted on the web by the Hispanic Reading Room.

Created: Spring 2022

Last Updated: February 28, 2023

Caribbean, Iberian & Latin American Studies : Ask a Librarian

Have a question? Need assistance? Use our online form to ask a librarian for help.

Haga su pregunta .

Faça a sua pergunta .

The war of the United States with Spain was very brief. Its results were many, startling, and of world-wide meaning. --Henry Cabot Lodge


On April 25, 1898 the United States declared war on Spain following the sinking of the Battleship Maine in Havana harbor on February 15, 1898. The war ended with the signing of the Treaty of Paris on December 10, 1898. As a result, Spain lost control over the remnants of its overseas empire -- Cuba, Puerto Rico, the Philippine Islands, Guam, and other islands.

Beginning in 1492, Spain was the first European nation to sail westward across the Atlantic Ocean, explore, and colonize the Amerindian nations of the Western Hemisphere. At its greatest extent, the empire that resulted from this exploration extended from Virginia on the eastern coast of the United States south to Tierra del Fuego at the tip of South America, excluding Brazil, and westward to California and Alaska. Across the Pacific, it included the Philippines and other island groups. By 1825 much of this empire had fallen into other hands, and in that year Spain acknowledged the independence of its possessions in the present-day United States (then under Mexican control) and south to the tip of South America. The only remnants of the empire were Cuba and Puerto Rico in the Western Hemisphere and, across the Pacific, the Philippine Islands and the Carolina, Marshall, and Mariana Islands (including Guam) in Micronesia.


Kurz & Allison. Destruction of the U.S. battleship Maine in Havana Harbor Feby 15th. Havana, Cuba, ca. 1898. Library of Congress Prints and Photographs Division.


A view of our battleship MAINE as she appears today. Havana Harbor, ca. 1900. Library of Congress Prints and Photographs Division.


Raising of battleship Maine. Havana, Cuba. 1911. Library of Congress Prints and Photographs Division.

Following its declaration of war against Spain issued on April 25, 1898, the United States added the Teller Amendment asserting that it would not attempt to exercise hegemony over Cuba. Two days later Commodore George Dewey sailed from Hong Kong with Emilio Aguinaldo on board. Fighting began in the Philippine Islands at the Battle of Manila Bay on May 1, where Commodore George Dewey reportedly exclaimed, "You may fire when ready, Gridley," and the Spanish fleet under Rear Admiral Patricio Montojo was destroyed. However, Dewey did not have enough manpower to capture Manila, so Aguinaldo's guerrillas maintained their operations until 15,000 U.S. troops arrived at the end of July. On the way, the cruiser Charleston stopped at Guam and accepted its surrender from its Spanish governor, who was unaware his nation was at war. Although a peace protocol was signed by the two belligerents on August 12, Commodore Dewey and Maj. Gen. Wesley Merritt, leader of the army troops, assaulted Manila the very next day, unaware that peace had been declared.

In late April, Andrew Summers Rowan made contact with Cuban General Calixto García who supplied him with maps, intelligence, and a core of rebel officers to coordinate U.S. efforts on the island. The U.S. North Atlantic Squadron left Key West for Cuba on April 22 following the frightening news that the Spanish home fleet commanded by Admiral Pascual Cervera had left Cadiz and entered Santiago, having slipped by U.S. ships commanded by William T. Sampson and Winfield Scott Schley. They arrived in Cuba in late May.

War actually began for the U.S. in Cuba in June when the Marines captured Guantánamo Bay and 17,000 troops landed at Siboney and Daiquirí, east of Santiago de Cuba, the second largest city on the island. At that time Spanish troops stationed on the island included 150,000 regulars and 40,000 irregulars and volunteers while rebels inside Cuba numbered as many as 50,000. Total U.S. army strength at the time totalled 26,000, requiring the passage of the Mobilization Act of April 22 that allowed for an army of at first 125,000 volunteers (later increased to 200,000) and a regular army of 65,000. On June 22, U.S. troops landed at Daiquiri where they were joined by Calixto García and about 5,000 revolutionaries.

U.S. troops attacked the San Juan heights on July 1, 1898. Dismounted troopers, including the African-American Ninth and Tenth cavalries and the Rough Riders commanded by Lt. Col. Theodore Roosevelt went up against Kettle Hill while the forces led by Brigadier General Jacob Kent charged up San Juan Hill and pushed Spanish troops further inland while inflicting 1,700 casualties. While U.S. commanders were deciding on a further course of action, Admiral Cervera left port only to be defeated by Schley. On July 16, the Spaniards agreed to the unconditional surrender of the 23,500 troops around the city. A few days later, Major General Nelson Miles sailed from Guantánamo to Puerto Rico. His forces landed near Ponce and marched to San Juan with virtually no opposition.

Representatives of Spain and the United States signed a peace treaty in Paris on December 10, 1898, which established the independence of Cuba, ceded Puerto Rico and Guam to the United States, and allowed the victorious power to purchase the Philippine Islands from Spain for $20 million. The war had cost the United States $250 million and 3,000 lives, 90% of whom had perished from infectious diseases.

What's included in this guide

This presentation provides resources and documents about the Spanish-American War, the period before the war, and some of the fascinating people who participated in the fighting or commented about it. Information about Cuba, Guam, the Philippines, Puerto Rico, Spain, and the United States is provided in chronologies, bibliographies, and a variety of pictorial and textual material from bilingual sources, supplemented by an overview essay about the war and the period. Among the participants and authors featured are such well-known figures as Presidents Grover Cleveland, William McKinley, and Theodore Roosevelt, as well as Admiral George Dewey and author Mark Twain (United States), together with other important figures such as Antonio Maceo and José Martí (Cuba), Román Baldorioty de Castro and Lola Rodríguez de Tió (Puerto Rico), José Rizal and Emilio Aguinaldo (Philippines), and Antonio Cánovas del Castillo and Ramón Blanco (Spain).

Related Research Guides by the Library of Congress


Spanish-American War: A Resource Guide

The Spanish-American War (1898) was a conflict between the U.S. and Spain, ending with the loss of Spain’s overseas empire and the U.S. emerging as a world power. This guide compiles digital material, external websites, and a selected print bibliography.


Spanish American War: Topics in Chronicling America

A guide for researching the topic of the "Spanish American War," which took place from April 25 until December 10, 1898, in the Chronicling America digital collection of historic newspapers.


Spain: Hispanic Reading Room Country Guide

This guide provides curated Library of Congress resources for the study of Spain, including digitized primary source materials in a wide variety of formats, books and periodicals, online databases, and tips for searching.


Cuba: Hispanic Reading Room Country Guide

This guide provides curated Library of Congress resources for researching Cuba, including digitized primary source materials in a wide variety of formats, books and periodicals, online databases, and tips for searching.


Philippine-American War: Topics in Chronicling America

After the Treaty of Paris, the Philippine-American War was fought from February 1899 to July 1902. This guide provides access to materials related to the "Philippine-American War" in the Chronicling America digital collection of historic newspapers.

  • Next: Overview Essay >>
  • Last Updated: Jan 12, 2024 2:02 AM
  • URL: https://guides.loc.gov/world-of-1898


World Wide Web born at CERN 25 years ago

In March 1989 Tim Berners-Lee wrote a proposal to develop a radical new way of linking and sharing information: the World Wide Web

12 March, 2014


The image on the cover page of Tim Berners-Lee's proposal for the World Wide Web in March 1989 (Image: CERN)

In March 1989 Tim Berners-Lee, a scientist working at CERN, submitted a proposal to develop a radical new way of linking and sharing information over the internet. The document was entitled Information Management: A Proposal . And so the web was born.

The first website at CERN – and in the world – was dedicated to the World Wide Web project itself. Last April CERN initiated a project to restore the first website , and to bring back the spirit of that time through its technical innovation and the founding principles of openness and freedom.

In 1993 CERN put the World Wide Web software in the public domain. CERN made the next release available with an open licence, a surer way to maximise its dissemination. Through these actions, making freely available the software required to run a web server, along with a basic browser and a library of code, CERN allowed the web to flourish.

"Beyond CERN's role in helping us understand the universe, it was a great place to work in 1989," said Tim Berners-Lee. "CERN was an early adopter of Internet protocols, and their support for a Royalty-Free Web has been a key to its widespread adoption today."

Now Tim Berners-Lee, the World Wide Web Consortium (W3C) and the World Wide Web Foundation  are launching a series of initiatives  to mark the 25th anniversary of the original proposal.

In addition, Berners-Lee and the Web Foundation are launching " The web we want " campaign to promote a global dialogue and change in public policy to ensure that the web remains an open, free, accessible medium – so that everyone on the planet can participate in the free flow of knowledge, ideas and creativity online.

Opinion pieces on this topic:

" Minimising the muddle " – Peggie Rimmer, Tim Berners-Lee's supervisor from 1984 to 1990

" Good old Bitnet, and the rise of the World Wide Web " – Richard Jacobsson, senior physicist on the LHCb experiment

" On the open internet and the free web " – David Foster, deputy head of CERN's IT department

" Not at all vague and much more than exciting " – Maria Dimou, CERN computer scientist and early web contributor



As a Teenager in Europe, I Went to Nudist Beaches All the Time. 30 Years Later, Would the Experience Be the Same?


In July 2017, I wrote an article about toplessness for Vogue Italia. The director, actor, and political activist Lina Esco had emerged from the world of show business to question public nudity laws in the United States with 2014’s Free the Nipple . Her film took on a life of its own and, thanks to the endorsement from the likes of Miley Cyrus, Cara Delevingne, and Willow Smith, eventually developed into a whole political movement, particularly on social media where the hashtag #FreeTheNipple spread at lightning speed. The same year as that piece, actor Alyssa Milano tweeted “me too” and encouraged others who had been sexually assaulted to do the same, building on the movement activist Tarana Burke had created more than a decade earlier. The rest is history.

In that Vogue article, I chatted with designer Alessandro Michele about a shared memory of our favorite topless beaches of our youth. Anywhere in Italy where water appeared—be it the hard-partying Riviera Romagnola, the traditionally chic Amalfi coast and Sorrento peninsula, the vertiginous cliffs and inlets of Italy’s continuation of the French Côte d’Azur or the towering volcanic rocks of Sicily’s mythological Riviera dei Ciclopi—one was bound to find bodies of all shapes and forms, naturally topless.

In the ’90s, growing up in Italy, naked breasts were everywhere and nobody thought anything about it. “When we look at our childhood photos we recognize those imperfect breasts and those bodies, each with their own story. I think of the ‘un-beauty’ of that time and feel it is actually the ultimate beauty,” Michele told me.

Indeed, I felt the same way. My relationship with toplessness was part of a very democratic cultural status quo. If every woman on the beaches of the Mediterranean—from the sexy girls tanning on the shoreline to the grandmothers eating spaghetti al pomodoro out of Tupperware containers under sun umbrellas—bore equally naked body parts, then somehow we were all on the same team. No hierarchies were established. In general, there was very little naked breast censorship. Free nipples appeared on magazine covers at newsstands, whether tabloids or art and fashion magazines. Breasts were so naturally part of the national conversation and aesthetic that Ilona Staller (also known as Cicciolina) and Moana Pozzi, two porn stars, cofounded a political party called the Love Party. I have a clear memory of my neighbor hanging their party’s banner out his window, featuring a topless Cicciolina winking.

A lot has changed since those days, but also since that initial 2017 piece. There’s been a feminist revolution, a transformation of women’s fashion and gender politics, the absurd overturning of Harvey Weinstein’s 2020 rape conviction in New York, the intensely disturbing overturning of Roe v. Wade and the current political battle over reproductive rights radiating from America and far beyond. One way or another, the female body is very much the site of political battles as much as it is of style and fashion tastes. And maybe for this reason naked breasts seem to populate runways and street style a lot more than they do beaches—it’s likely that being naked at a dinner party leaves more of a permanent mark than being naked on a glamorous shore. Naked “dressing” seems to be much more popular than naked “being.” It’s no coincidence that this year Saint Laurent, Chloé, Ferragamo, Tom Ford, Gucci, Ludovic de Saint Sernin, and Valentino all paid homage to sheer dressing in their collections, with lacy dresses, see-through tops, sheer silk hosiery fabric, and close-fitting silk dresses. Anthony Vaccarello’s fall 2024 collection was mostly transparent. And even off the runway, guests at the Saint Laurent show matched the mood. Olivia Wilde appeared in a stunning see-through dark bodysuit, Georgia May Jagger wore a sheer black halter top, Ebony Riley wore a breathtaking V-neck, and Elsa Hosk went for translucent polka dots.

In some strange way, it feels as if the trends of the ’90s have swapped seats with those of today. When, in 1993, a 19-year-old Kate Moss wore her (now iconic) transparent, bronze-hued Liza Bruce lamé slip dress to Elite Model Agency’s Look of the Year Awards in London, I remember seeing her picture everywhere and feeling in awe of her daring and grace. I loved her simple sexy style, with her otherworldly smile, the hair tied back in a bun. That very slip has remained in the collective unconscious for decades, populating thousands of internet pages, but in remembering that night Moss admitted that the nude look was totally unintentional: “I had no idea why everyone was so excited—in the darkness of Corinne [Day’s] Soho flat, the dress was not see-through!” That’s to say that nude dressing was usually mostly casual and not intellectualized in the context of a larger movement.


But today nudity feels loaded in different ways. In April, actor and author Julia Fox appeared in Los Angeles in a flesh-colored bra that featured hairy hyper-realist prints of breasts and nipples, and matching panties with a print of a sewn-up vagina and the words “closed” on it, as a form of feminist performance art. Breasts , an exhibition curated by Carolina Pasti, recently opened as part of the 60th Venice Biennale at Palazzo Franchetti and showcases works that span from painting and sculpture to photography and film, reflecting on themes of motherhood, empowerment, sexuality, body image, and illness. The show features work by Cindy Sherman, Robert Mapplethorpe, Louise Bourgeois, and an incredible painting by Bernardino Del Signoraccio of Madonna dell’Umiltà, circa 1460-1540. “It was fundamental for me to include a Madonna Lactans from a historical perspective. In this intimate representation, the Virgin reveals one breast while nurturing the child, the organic gesture emphasizing the profound bond between mother and child,” Pasti said when we spoke.

Through her portrayal of breasts, she delves into the delicate balance of strength and vulnerability within the female form. I spoke to Pasti about my recent musings on naked breasts, which she shared in a deep way. I asked her whether she too noticed a disparity between nudity on beaches as opposed to the one on streets and runways, and she agreed. Her main concern today is around censorship. To Pasti, social media is still far too rigid around breast exposure and she plans to discuss this issue through a podcast that she will be launching in September, together with other topics such as motherhood, breastfeeding, sexuality, and breast cancer awareness.

With summer at the door, it was my turn to see just how much of the new reread on transparency would apply to beach life. In the last few years, I noticed those beaches Michele and I reminisced about have grown more conservative and, despite being the daughter of unrepentant nudists and having a long track record of militant topless bathing, I myself have felt a bit more shy lately. Perhaps a woman in her 40s with two children is simply less prone to taking her top off, but my memories of youth are populated by visions of bare-chested mothers surveilling the coasts and shouting after their kids in the water. So when did we stop? And why? When did Michele’s era of “un-beauty” end?

In order to get back in touch with my own naked breasts I decided to revisit the nudist beaches of my youth to see what had changed. On a warm day in May, I researched some local topless beaches around Rome and asked a friend to come with me. Two moms, plus our four children, two girls and two boys of the same ages. “Let’s make an experiment of this and see what happens,” I proposed.

The kids all yawned, but my friend was up for it. These days to go topless, especially on urban beaches, you must visit properties that have an unspoken nudist tradition. One of these in Rome is the natural reserve beach at Capocotta, south of Ostia, but I felt a bit unsure revisiting those sands. In my memory, the Roman nudist beaches often equated to encounters with promiscuous strangers behind the dunes. I didn’t want to expose the kids, so, being that I am now a wise adult, I went ahead and picked a compromise. I found a nude-friendly beach on the banks of the Farfa River, in the rolling Sabina hills.

We piled into my friend’s car and drove out. The kids were all whining about the experiment. “We don’t want to see naked mums!” they complained. “Can’t you just lie and say you went to a nudist beach?”

We parked the car and walked through the medieval fairy-tale woods until we reached the path that ran along the river. All around us were huge trees and gigantic leaves. It had rained a lot recently and the vegetation had grown incredibly. We walked past the remains of a Roman road. The colors all around were bright green, the sky almost fluorescent blue. The kids got sidetracked by the presence of frogs. According to the signs, the beach was about a mile up the river. Halfway down the path, we bumped into a couple of young guys in fanny packs. I scanned them for signs of quintessential nudist attitude, but realized I actually had no idea what that was. I asked if we were headed in the right direction for “the beach.” They nodded and gave us a sly smile, which I immediately interpreted as a judgment about us as mothers, and more generally about our age, but I was ready to vindicate bare breasts against ageism.

We reached a small pebbled beach, secluded and bordered by a huge trunk that separated it from the path. A group of girls was there, sharing headphones and listening to music. To my dismay they were all wearing the tops and bottoms of their bikinis. One of them was in a full-piece bathing suit and shorts. “See, they are all wearing bathing suits. Please don’t be the weird mums who don’t.”

At this point, it was a matter of principle. My friend and I decided to take our bathing suits off completely, if only for a moment, and jumped into the river. The boys stayed on the beach with full clothes and shoes on, horrified. The girls went in behind us with their bathing suits. “Are you happy now?” my son asked. “Did you prove your point?”

I didn’t really know what my point actually was. I think a part of me wanted to feel entitled to those long-gone decades of naturalism. Whether this was an instinct or, as Pasti said, “an act that was simply tied to the individual freedom of each woman,” it was hard to tell. At this point in history, the two things didn’t seem to cancel each other out; in fact, quite the opposite. Taking off a bathing suit, at least for my generation, who never had to fight for it, had unexpectedly turned into a radical move, and maybe I wanted to be part of the new discourse. The chances of me going out in a fully sheer top were slim these days, but on the beach it was different. I would always fight for an authentic topless experience.

After our picnic on the river, we left determined to make our way, without children this time, to the beaches of Capocotta. In truth, no part of me actually felt very subversive doing something I had been doing my whole life, but it still felt good. Once a free breast, always a free breast.

This article was originally published on British Vogue.
