
Encyclopedia Britannica


World Wide Web


World Wide Web (WWW) , the leading information retrieval service of the Internet (the worldwide computer network ). The Web gives users access to a vast array of mass media and content—via the deep web , the dark web , and the commonly accessible surface web—that is connected by means of hypertext or hypermedia links—i.e., hyperlinks , electronic connections that link related pieces of information in order to allow a user easy access to them. Hypertext allows the user to select a word or phrase from text and thereby access other documents that contain additional information pertaining to that word or phrase. Hypermedia documents feature links to images, sounds, animations, and movies. The Web operates within the Internet’s basic client-server format; servers are computer programs that store and transmit documents to other computers on the network when asked to, while clients are programs that request documents from a server as the user asks for them. Browser software allows users to view the retrieved documents. Special browsers and platforms such as Tor allow users to do so anonymously.
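The client-server exchange described here can be sketched with Python's standard library alone. Everything in this snippet is a made-up stand-in for a real web server and browser: a one-document "server" that transmits its page when asked, and a "client" that requests it.

```python
# A minimal sketch of the Web's client-server format: the server stores a
# document and transmits it on request; the client requests and reads it.
import http.client
import threading
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

DOCUMENT = b"<html><body><h1>Hello, Web</h1></body></html>"

class DocumentHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The server's role: transmit the stored document when asked to.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(DOCUMENT)))
        self.end_headers()
        self.wfile.write(DOCUMENT)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Bind to port 0 so the operating system picks a free port.
server = ThreadingHTTPServer(("127.0.0.1", 0), DocumentHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client's role: request the document as the user asks for it.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/")
response = conn.getresponse()
page = response.read()
conn.close()
server.shutdown()
print(page.decode())
```

A browser plays the client role and additionally renders the returned HTML for viewing.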

A hypertext document with its corresponding text and hyperlinks is written in HyperText Markup Language ( HTML ) and is assigned an online address called a Uniform Resource Locator ( URL ).
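As a rough illustration (not part of the article), Python's standard urllib.parse module can split such a URL into its named parts; the URL below is a made-up example.

```python
# Splitting a Uniform Resource Locator into the pieces a browser uses.
from urllib.parse import urlparse

url = "http://www.example.com/articles/web.html?lang=en#history"
parts = urlparse(url)

print(parts.scheme)    # protocol used to fetch the document ("http")
print(parts.netloc)    # server (host) that stores the document
print(parts.path)      # which document on that server
print(parts.query)     # extra parameters passed to the server
print(parts.fragment)  # a position within the document
```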


The development of the World Wide Web was begun in 1989 by Tim Berners-Lee and his colleagues at CERN , an international scientific organization based in Geneva, Switzerland. They created a protocol , HyperText Transfer Protocol ( HTTP ), which standardized communication between servers and clients. Their text-based Web browser was made available for general release in January 1992.

The World Wide Web gained rapid acceptance with the creation of a Web browser called Mosaic , which was developed in the United States by Marc Andreessen and others at the National Center for Supercomputing Applications at the University of Illinois and was released in September 1993. Mosaic allowed people using the Web to use the same sort of “point-and-click” graphical manipulations that had been available in personal computers for some years. In April 1994 Andreessen cofounded Netscape Communications Corporation , whose Netscape Navigator became the dominant Web browser soon after its release in December 1994. BookLink Technologies’ InternetWorks, the first browser with tabs, in which a user could visit another Web site without opening an entirely new window, debuted that same year. By the mid-1990s the World Wide Web had millions of active users.

The software giant Microsoft Corporation became interested in supporting Internet applications on personal computers and developed its own Web browser (based initially on Mosaic), Internet Explorer (IE), in 1995 as an add-on to the Windows 95 operating system . IE was integrated into the Windows operating system in 1996 (that is, it came “bundled” ready-to-use within the operating system of personal computers), which had the effect of reducing competition from other Internet browser manufacturers, such as Netscape. IE soon became the most popular Web browser.

Apple ’s Safari was released in 2003 as the default browser on Macintosh personal computers and later on iPhones (2007) and iPads (2010). Safari 2.0 (2005) was the first browser with a privacy mode, Private Browsing, in which the application would not save websites in its history, downloaded files in its cache , or personal information entered on Web pages.


The first serious challenger to IE’s dominance was Mozilla’s Firefox , released in 2004 and designed to address issues with speed and security that had plagued IE. In 2008 Google launched Chrome , the first browser with isolated tabs, which meant that when one tab crashed, other tabs and the whole browser would still function. By 2013 Chrome had become the dominant browser, surpassing IE and Firefox in popularity. Microsoft discontinued IE and replaced it with Edge in 2015.

In the early 21st century, smartphones became more computer-like, and more-advanced services, such as Internet access, became possible. Web usage on smartphones steadily increased, and in 2016 it accounted for more than half of Web browsing.


Assignment on World Wide Web


Web Site:

A web site is a collection of web pages, images, videos or other digital assets that is hosted on one or several web servers, usually accessible via the Internet, a cell phone or a LAN. A web page is a document, typically written in HTML, that is almost always accessible via HTTP, a protocol that transfers information from the web server for display in the user's web browser. All publicly accessible websites are seen collectively as constituting the "World Wide Web".

The pages of websites can usually be accessed from a common root URL called the homepage, and usually reside on the same physical server. The URLs of the pages organize them into a hierarchy, although the hyperlinks between them control how the reader perceives the overall structure and how traffic flows between the different parts of the site. Some websites require a subscription to access some or all of their content. Examples of subscription sites include many business sites, parts of many news sites, academic journal sites, gaming sites, message boards, Web-based e-mail services, social networking websites and sites providing real-time stock market data.

Organized by function, a website may be:

          »  A personal website

          »  A commercial website

          »  A government website

          »  A non-profit organization website

It could be the work of an individual, a business or other organization and is typically dedicated to some particular topic or purpose.

Web, The New Arena:

Life was just going on fine, when along came the Internet and just about everything changed. You just have to look back over the past five or six years and think of how different things were before that. Today, everyone's young, smart and online. Oh yes, we are well into the Net Age. Whether you are working in a high-tech corporation or setting up your home office, learning a thing or two at college or having a whale of a time at school, life's on the Internet.

Content Is King:

No matter how great a site looks, no amount of design ever makes up for poor content. This is a fact that many web authors lose sight of. That's why we have so many sites around us that offer the visitor the same old thing: a bit of this, a bit of that. A really good site must have solid, unique content. That's why, as experts recommend, we start with strategy and purpose first, not with design. First off, you must be quite clear about the purpose of your site. This holds true for any type of web site, whether it's a personal web site, a small business setup, a hobbyist's page, an e-commerce venture or anything else. A web site without purpose just takes up space and pleases no one but its own author. So unless you are just using your site for storage, start by putting down your purpose, your objectives and your message.

Types Of Web Sites:

There are many varieties of web sites, each specializing in a particular type of content or use, and they may be arbitrarily classified in any number of ways. A few such classifications might include:

Affiliate Sites: An enabled portal that renders not only its custom CMS but also syndicated content from other content providers for an agreed fee. There are usually three relationship tiers: affiliate agencies (e.g. Commission Junction), advertisers (e.g. eBay) and consumers (e.g. Yahoo!).

Archive Site: Used to preserve valuable electronic content threatened with extinction. Two examples are the Internet Archive, which since 1996 has preserved billions of old (and new) web pages, and Google Groups, which in early 2005 was archiving over 845,000,000 messages posted to Usenet news/discussion groups.

Blog Site: Sites generally used to post online diaries, which may include discussion forums (e.g. Blogger, Xanga).

Content Site: Sites whose business is the creation and distribution of original content (e.g. Slant, About.com).

Corporate Site: Used to provide background information about a business, organization or service.

E-Commerce Site: For purchasing goods, such as Amazon.com.

Community Site: A site where persons with similar interests communicate with each other, usually by chat or message boards, such as MySpace.

Database Site: A site whose main use is the search and display of a specific database's content, such as the Internet Movie Database or the Political Graveyard.

Development Site: A site whose purpose is to provide information and resources related to software development, web design and the like.

Directory Site: A site that contains varied content divided into categories and subcategories, such as the Yahoo! directory, the Google directory and the Open Directory Project.

Download Site: Strictly used for downloading electronic content such as software, game demos or computer wallpaper.

Employment Site: Allows employers to post job requirements for a position or positions, and prospective employees to fill in an application.

Erotica Website: Shows sexual videos and images.

Fan Site: A web site created and maintained by fans of and for a particular celebrity, as opposed to a web site created, maintained and controlled by a celebrity through their own paid webmaster. May also be known as a shrine in the case of certain subjects such as anime and manga characters.

Game Site: A site that is itself a game or "playground" where many people come to play, such as MSN Games, Pogo.com and Newgrounds.com.

Gripe Site: A site devoted to the critique of a person, place, corporation, government or institution.

Humor Site: Satirizes, parodies or otherwise exists solely to amuse.

Information Site: Contains content that is intended to inform visitors, but not necessarily for commercial purposes, such as RateMyProfessors.com or a free Internet lexicon and encyclopedia. Most government, educational and non-profit institutions have an informational site.

Java Applet Site: Contains software to run over the web as a web application.

Mirror (Computing) Site: A complete reproduction of a website.

News Site: Similar to an information site, but dedicated to dispensing news and commentary.

Personal Homepage: Run by an individual or a small group (such as a family) and contains information or any content that the individual wishes to include.

Political Site: A site on which people may voice political views.

Pornography (porn) Site: A site that shows pornographic images and videos.

Rating Site: A site on which people can praise or disparage what is featured.

Review Site: A site on which people can post reviews for a product or service.

Search Engine Site: A site that provides general information and is intended as a gateway or lookup for other sites. A pure example is Google, and the most widely known extended type is Yahoo!.

Shock Site: Includes images or other material that is intended to be offensive to most viewers (e.g. Rotten.com).

Phishing Site: A website created to fraudulently acquire sensitive information, such as passwords and credit card details, by masquerading as a trustworthy person or business (such as the Social Security Administration or PayPal) in an electronic communication.

Warez Site: A site filled with illegal downloads.

Web Portal: A site that provides a starting point or a gateway to other resources on the Internet or an intranet.

Wiki Site: A site which users collaboratively edit (such as Wikipedia).

World Wide Web:

The letters "www" are commonly found at the beginning of web addresses because of the long-standing practice of naming Internet hosts (servers) according to the services they provide. So, for example, the host name for a web server is often "www"; for an FTP server, "ftp"; and for a USENET news server, "news" or "nntp" (after the news protocol NNTP). These host names appear as DNS subdomain names, as in www.example.com. The use of such a prefix is not required by any technical standard; indeed, the first web server was at "nxoc01.cern.ch", and even today many web sites exist without a "www" prefix. The prefix has no bearing on how the main web site is shown; it is simply one choice for a web site's subdomain name. Some web browsers will automatically try adding "www" to the beginning, and possibly ".com" to the end, of typed URLs if no host is found without them. Internet Explorer, Mozilla Firefox, Safari and Opera will also prefix "http://www." and append ".com" to the address bar contents if the Control and Enter keys are pressed simultaneously. For example, entering "example" in the address bar and then pressing either just Enter or Control+Enter will usually resolve to "http://www.example.com", depending on the exact browser version and its settings.
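The address-bar completion behaviour described above can be sketched as a small pure function. This is an illustrative approximation only, not any browser's actual code; the exact rules differ between browsers and versions.

```python
# A sketch of the "www." / ".com" completion rule some browsers apply to
# bare names typed into the address bar (illustrative, not real browser code).
def complete_address(typed: str) -> str:
    if "://" in typed:
        return typed            # already a full URL; leave it alone
    host = typed
    if "." not in host:
        # A bare word like "example": wrap it in "www." and ".com".
        host = "www." + host + ".com"
    return "http://" + host

print(complete_address("example"))      # -> http://www.example.com
print(complete_address("example.org"))  # -> http://example.org
```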

The World Wide Web (commonly shortened to the web) is a system of interlinked hypertext documents accessed via the Internet. With a web browser, a user views web pages that contain text, images, videos and other multimedia, and navigates between them using hyperlinks. The World Wide Web was created in 1989 by Sir Tim Berners-Lee, working at CERN in Geneva, Switzerland. Since then, Berners-Lee has played an active role in guiding the development of web standards (such as the markup languages in which web pages are composed), and in recent years has advocated his vision of a Semantic Web. Robert Cailliau, also at CERN, was an early evangelist for the project.

How The Web Works:

Viewing a web page on the World Wide Web normally begins either by typing the URL of the page into a web browser or by following a hyperlink to that page or resource. The web browser then initiates a series of communication messages behind the scenes in order to fetch and display it.

First, the server-name portion of the URL is resolved into an IP address using the global, distributed Internet database known as the Domain Name System, or DNS. This IP address is necessary to contact and send data packets to the web server.

The browser then requests the resource by sending an HTTP request to the web server at that particular address. In the case of a typical web page, the HTML text of the page is requested first and parsed immediately by the web browser, which will then make additional requests for images and any other files that form part of the page. Statistics measuring a website's popularity are usually based on the number of 'page views' or associated server 'hits', or requests, which take place.
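The HTTP request mentioned in this step is plain text. Below is a small sketch of building such a request message by hand; the host and path are made-up examples, and real browsers send many more headers.

```python
# Constructing the text of an HTTP GET request, as a browser would send it.
def build_get_request(host: str, path: str) -> bytes:
    lines = [
        f"GET {path} HTTP/1.1",  # method, resource, protocol version
        f"Host: {host}",         # required header in HTTP/1.1
        "Connection: close",
        "",                      # blank line ends the header section
        "",
    ]
    return "\r\n".join(lines).encode("ascii")

request = build_get_request("www.example.com", "/index.html")
print(request.decode())
```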

Having received the required files from the web server, the browser renders the page onto the screen as specified by its HTML, CSS and other web languages. Any images and other resources are incorporated to produce the on-screen web page that the user sees.
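The parse-then-fetch-more behaviour above can be sketched with Python's standard html.parser module standing in for a browser's parser; the sample page and the choice of tags to scan are illustrative only.

```python
# After fetching the HTML, a browser scans it for further resources
# (images, stylesheets) and requests those too. A toy version of that scan:
from html.parser import HTMLParser

class ResourceFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "src" in attrs:
            self.resources.append(attrs["src"])   # images to fetch next
        elif tag == "link" and "href" in attrs:
            self.resources.append(attrs["href"])  # e.g. stylesheets

page = """<html><head><link rel="stylesheet" href="style.css"></head>
<body><img src="logo.png"><p>Hello</p></body></html>"""

finder = ResourceFinder()
finder.feed(page)
print(finder.resources)  # -> ['style.css', 'logo.png']
```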


Writing For The Web:

Writing for the web is in many ways different from writing for print. For one, the reader's purpose in reading may be different. His attention span is different. The reading experience online and the way the reader's eye moves across a page are different. With a printed page, there is only one sort of navigation: turn the page. But on a web page, there can be dozens and dozens of options all visible at once. And there's your reader, finger poised over the mouse button, ready to be interactive. With a web page, interactivity is important because readers want to do something. All this means that information has to be tailored and arranged specially for online reading. Writing for the web skillfully involves learning to keep in mind new online reading habits and patterns. It means being able to put forward information in a way that draws the reader in quickly and keeps him at the website, or at least gives him what he wants so that he comes back again and again.

Web Site Styles:

A static web site is one that has web pages stored on the server in the same form as the user will view them. They are edited using three broad categories of software:

Text editors, such as Notepad or TextEdit, where the HTML is manipulated directly within the editor program.

GUI editors, such as Microsoft FrontPage and Macromedia Dreamweaver, where the site is edited through a graphical interface and the underlying HTML is generated automatically by the editor software.

Template-based editors, such as RapidWeaver and iWeb, which allow users to quickly create and upload websites to a web server without having to know anything about HTML: they just pick a suitable template from a palette and add pictures and text to it in a DTP-like fashion, without ever having to see any HTML code.

A dynamic website is one that has frequently changing information or collates information on the fly each time a page is requested. For example, it would call various bits of information from a database and put them together in a pre-defined format to present the reader with a coherent page. It interacts with users in a variety of ways, including by reading cookies to recognize a user's previous history, session variables, server-side variables and so on, or by using direct interaction (form elements, mouseovers, etc.). A site can display the current state of a dialog between users, monitor a changing situation or provide information in some way personalized to the requirements of the individual user.
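The cookie-based recognition of a returning visitor mentioned above can be sketched with the standard http.cookies module; the cookie name "visits" and the greeting text are made-up examples, and a real server would also guard against cookie tampering.

```python
# One request's worth of cookie-driven personalisation: read the visit
# count from the cookie the browser sent back, and vary the page output.
from http.cookies import SimpleCookie

def personalise(cookie_header: str) -> tuple[str, str]:
    """Return (greeting, Set-Cookie value) for a single request."""
    cookie = SimpleCookie(cookie_header)
    visits = int(cookie["visits"].value) if "visits" in cookie else 0
    visits += 1
    if visits == 1:
        greeting = "Welcome, new visitor!"
    else:
        greeting = f"Welcome back - visit number {visits}."
    return greeting, f"visits={visits}"

greeting, set_cookie = personalise("")          # first visit: no cookie yet
print(greeting)
greeting, set_cookie = personalise(set_cookie)  # browser sends cookie back
print(greeting)
```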

What makes the World Wide Web so exciting is the limitless ways in which information and content can be put up. Using color, pictures, sounds, movie clips, animation and interactivity, you can make sure your site is compelling enough to draw visitors again and again, which is what every website wants. Naturally, this makes the site's design, its layout, navigation and general look and feel an important aspect to work on. What usually tends to happen is that people over-design a website, filling it with bright, startling colors, a feast of different fonts and too many pictures for its own good. Design is a term that is used rather loosely; sometimes it can include usability issues, navigation, browser compatibility and so on. If your page is about a region-specific topic, then make sure you include that, preferably right in the title of the page. Put the region in the keywords and page description as well. Remember too that even if your topic is regional, it has value to global viewers. What if someone from Germany is visiting your home town and needs information there? You also might want to expand your site to give more generic information that would appeal to a more global audience.

Language On The Web:

Right now, most of the pages on the web are in English, but just because you're writing your page in English in Australia does not mean that a Canadian would understand it or find it useful. Make sure that you avoid slang on your site, as that is the most non-translatable element of a page. When you list a price, indicate what currency you're using. And when you list sizes or measurements, it helps if you list conversions or link to a conversion web site.

Static Versus Dynamic: Static HTML sites have not changed much since their development and the advent of the web. Essentially, websites are presented using a wide array of tags that offer a means of laying out a site. Search engines have become very good at recognizing static websites; in general, a search engine can navigate through a static website very easily and thus locate information. However, there is one significant disadvantage of static sites: you need a separate page (file) for every page on your site. For example, if you want to make a design change that affects the entire site, you may need to adjust all pages. For small sites this is not a problem, but for large content or e-commerce sites, creating new pages or updating existing pages can be time-consuming and expensive. Certainly there is web development software that makes this a little easier, but in the end static sites take time to manage.

Interaction with visitors is a key feature of the best sites on the web. After all, the most popular computer operating systems in the world may be the ones used for game-playing machines. It seems that people hate the pickiness and precision of computers, even though these allow them to do the things they do; on the other hand, they love the illusion of the computer as another person, and with dynamism comes interaction. A dynamic website means that different actions by the visitor cause different behaviors, i.e. outputs, by the site. That means pages are created as the user views the site. In most cases this requires the use of a database, which contains the site's information, and some kind of scripting setup that is programmed to retrieve the information from the database.
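The database-plus-script arrangement described above can be sketched with Python and an in-memory SQLite database. The table, template and story text here are made-up examples, not any real site's schema.

```python
# A dynamic page is assembled from a database on each request, rather than
# stored as a finished HTML file on disk.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE stories (title TEXT, body TEXT)")
db.execute("INSERT INTO stories VALUES (?, ?)",
           ("Site news", "We have redesigned the site."))
db.commit()

TEMPLATE = "<html><body>{items}</body></html>"

def render_page() -> str:
    # Each request pulls fresh rows, so updating the database updates the
    # page: there is no HTML file to edit.
    rows = db.execute("SELECT title, body FROM stories").fetchall()
    items = "".join(f"<h2>{t}</h2><p>{b}</p>" for t, b in rows)
    return TEMPLATE.format(items=items)

print(render_page())
```

Inserting another row into the stories table would change the next rendered page with no change to the template, which is the maintenance advantage the text describes.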

What Can A Dynamic Web Site Do?:

Building a database-driven web site is one of the best ways to ensure that your site will grow into the future. Here are some of the reasons why:

Manage Your Own Content: A database-backed website brings unprecedented flexibility to how information is stored and displayed on the web. That means you can add and manage stories, information, schedules and photographs without having to call a webmaster. It's a great way to take control of your site while saving money on maintenance.

Keep Your Visitors Coming Back: With fresh content that you can update, your site will always be relevant. So instead of finding the same stories and information on your site, returning visitors will find information that's new and current. It's easy, inexpensive and will keep your visitors coming back time and time again.

Grow Into The Future: Building a dynamic, database-driven site is strategically superior because changes to the site are incredibly easy to make. Want a new look for the site? No problem, since design (presentation) is separated from the site's content. Need to change content? That's only a few keystrokes away with easy-to-use administrative interfaces. Want to add new pages or sections? Not a problem when you have built your site on a foundation that's both solid and flexible.

Manage Visitors Securely: With a data-driven site you can let visitors see only the information that you want them to see. Build members-only sections, handle passwords, lock out unwanted requests, handle subscription services and allow your staff access to areas where others are not allowed. A database-backed site can perform these secure functions with ease.

Be Searchable: Letting visitors find the information they need quickly and easily is a snap with a dynamic site. Whether you are a publisher hosting thousands of articles or a merchant selling hundreds of widgets, a dynamic site allows your visitors to find what they need in a heartbeat.

Harness Your Site's Potential: Unlike traditional "static" sites, a dynamic site is far more useful than simple "brochureware". With dynamic architecture, your site can be put to an infinite variety of valuable uses. For example, you can easily connect a visitor with a nearby distributor, connect a specific salesperson to a customer or deliver an instant response to a customer service request. In short, a dynamic site delivers more than a "static" site ever could.

Spend Less Time Managing Your Site: A dynamic site can reduce or eliminate many of the most time-consuming functions facing your staff, because many administrative functions can now be automated. For example, if a deadline has passed or an inventory item has sold out, the site can automatically remove those items from display. It could send notifications automatically and update product pages on its own. Now your staff can spend less time managing the site and more time doing the things they do best.

Handle Complex Tasks: While dynamic sites are superb for publishing and e-commerce, they can also be used for complex tasks such as quoting, estimating and presenting customized sales information anywhere, any time. Handling complex tasks is par for the course with a dynamic site.

Connect To Your Customers: When visitors come to your site, do you gather information that can help you serve them better? With a built-in database, a dynamic site is a natural for gathering customer preferences. Ask visitors if they want to subscribe to newsletters or if they are interested in new products. Test-market new products. Survey them for valuable feedback. A dynamic web site can help you connect to customers in ways you were never able to before.

Customize Your Message: Is it possible to respond on an individual basis to an infinite number of site visitors? If you build a dynamic site, it is. From greeting customers individually after log-in to sending carefully crafted, customized emails, a dynamic site will help you send the message that your customers are more than just numbers.

Developing A Dynamic Site:

In most cases this requires the use of a database, which contains the site's information, and some kind of scripting setup that is programmed to retrieve the information from the database. Some popular scripting languages are ASP, ASP.NET, PHP, Perl, ColdFusion and JavaScript. Some popular databases are MySQL, MS SQL Server, Oracle, etc.

ASP:  Active  Server  Pages,  Microsoft’s  technology  to  enables  HTML  pages  to  be  dynamic  and  interactive  by  embedding  scripts  i.e.  either  VBScript  or  Jscript,  Microsoft’s  alternative  of  JavaScript. Since  the  scripts  in  ASP  pages ( suffix asp )   are  processed  by  the  server,  any  browser  can  work  with ASP  pages  regardless  of  its  Support  for  the  scripting  language  used  there  in.

ASP.NET: Microsoft ASP.NET is a set of technologies in the Microsoft .NET Framework for building web applications and XML web services. ASP.NET pages execute on the server and generate markup such as HTML, WML, or XML that is sent to a desktop or mobile browser. ASP.NET pages use a compiled, event-driven programming model that improves performance and enables the separation of application logic and user interface. ASP.NET pages and ASP.NET XML web service files contain server-side logic (as opposed to client-side logic) written in Microsoft Visual Basic .NET, Microsoft Visual C# .NET, or any other .NET Framework-compatible language.

PHP: A recursive acronym for "PHP: Hypertext Preprocessor," PHP is an open-source server-side scripting language designed for creating robust and reliable dynamic web pages for e-commerce and other mission-critical web applications.

Perl: Practical Extraction and Report Language. A robust programming language frequently used for creating CGI programs on web servers because it is faster than UNIX shell scripts, can read and write binary files, and can process very large files.

ColdFusion: An application server and web development platform that works in conjunction with a database from which it draws information. You can use ColdFusion to create dynamic web pages that display a variety of data, depending on what the viewer clicks on.

JavaScript: A scripting language from Netscape that is only marginally related to Java; Java and JavaScript are not the same thing. JavaScript was designed to resemble Java, which in turn looks a lot like C and C++. The difference is that Java was built as a general-purpose object-oriented language, while JavaScript is intended to be a quicker and simpler language embedded in a web page and interpreted and executed by the web client. Scripts are controlled from within the web document and are often triggered by mouse movements, button clicks, or other user actions. JavaScript can be used to control Netscape and Microsoft web browsers, including all the familiar browser attributes.

MySQL: The MySQL database server is the world's most popular open-source database. Over six million installations use MySQL to power high-volume websites and other critical business systems, including industry leaders such as the Associated Press, Yahoo, NASA, Sabre Holdings, and Suzuki. MySQL is an attractive alternative to higher-cost, more complex database technology. Its award-winning speed, scalability, and reliability make it the right choice for corporate IT departments, web developers, and packaged software vendors.

Microsoft SQL Server: Microsoft SQL Server is a relational database management system produced by Microsoft. It supports a dialect of SQL, the most common database language. It is commonly used by governments and businesses for small to medium-sized databases and competes with other SQL databases in this market segment.

Oracle Database: Strictly speaking, an Oracle database is a collection of data, although the term is sometimes used imprecisely to refer to the DBMS software itself. The data are managed by an Oracle database management system, which can be referred to without ambiguity as the Oracle DBMS or, the databases it manages being relational in character, as the Oracle RDBMS. The useful distinction between the managed data (the database) and the software that manages it (the DBMS/RDBMS) is blurred by Oracle Corporation itself, which nowadays refers to the Oracle RDBMS, the software it sells to manage databases, as "Oracle Database"; in Oracle's marketing literature the distinction relies on the capitalization of the word "database." The Oracle DBMS is produced and marketed by Oracle Corporation and is extensively used by database applications on most popular computing platforms.

Scripts & Databases:

We used PHP as the scripting language and MySQL as the backend database. In the upcoming chapter we will briefly discuss PHP and MySQL and their advantages over similar tools.

Publishing Web Pages:

Web page production is available to individuals outside the mass media. To publish a web page, one does not have to go through a publisher or other media institution, and potential readers can be found in all corners of the globe. Many different kinds of information are available on the web, and for those who wish to learn about other societies, cultures, and peoples, doing so has become easier. The increased opportunity to publish materials is observable in the countless personal and social-networking pages, as well as in sites by families, small shops, and the like, facilitated by the emergence of free web-hosting services.

Web Analytics:

Web analytics is the study of the behavior of website visitors. In a commercial context, web analytics especially refers to the use of data collected from a website to determine which aspects of the site work toward the business objectives; for example, which landing pages encourage people to make a purchase.

Data collected almost always includes web traffic reports. It may also include e-mail response rates, direct-mail campaign data, sales and lead information, user performance data such as click heat mapping, or other custom metrics as needed. This data is typically compared against key performance indicators and used to improve a website's or marketing campaign's audience response. Many different vendors provide web analytics software and services.
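As a toy illustration of the landing-page comparison just described, the sketch below computes a per-page conversion rate from a hypothetical visit log; the data, its layout, and the `conversion_rates` helper are all invented for the example.

```python
# Hypothetical visit log: (landing page, whether the visit ended in a purchase).
visits = [
    ("/promo", True), ("/promo", False), ("/promo", True),
    ("/home", False), ("/home", False), ("/home", True),
]

def conversion_rates(records):
    """Fraction of visits to each landing page that led to a purchase."""
    totals, purchases = {}, {}
    for page, bought in records:
        totals[page] = totals.get(page, 0) + 1
        purchases[page] = purchases.get(page, 0) + int(bought)
    return {page: purchases[page] / totals[page] for page in totals}

rates = conversion_rates(visits)
```

Here the rates would suggest that /promo converts visitors about twice as often as /home, which is exactly the kind of comparison against a business objective that web analytics aims to support.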



Tim Berners-Lee

Berners-Lee invented the World Wide Web in 1989. His key concepts are familiar to anyone using the Web: HTML (HyperText Markup Language), HTTP (HyperText Transfer Protocol), and URLs (Uniform Resource Locators). When he came to MIT in 1994, he formed the World Wide Web Consortium (W3C) at MIT’s Computer Science and Artificial Intelligence Laboratory to create and maintain open standards for this essential global system. In 2017 he won the Turing Award, the most prestigious honor in computer science.



The birth of the Web


Tim Berners-Lee, a British scientist, invented the World Wide Web (WWW) in 1989, while working at CERN. The web was originally conceived and developed to meet the demand for automated information-sharing between scientists in universities and institutes around the world.


The first website at CERN – and in the world – was dedicated to the World Wide Web project itself and was hosted on Berners-Lee's NeXT computer. In 2013, CERN launched a project to restore this first-ever website: info.cern.ch.

On 30 April 1993, CERN put the World Wide Web software in the public domain. Later, CERN made a release available with an open licence, a surer way to maximise its dissemination. These actions allowed the web to flourish.


Funding a Revolution: Government Support for Computing Research (National Academies Press, 1999)

Chapter 7: Development of the Internet and the World Wide Web

The recent growth of the Internet and the World Wide Web makes it appear that the world is witnessing the arrival of a completely new technology. In fact, the Web—now considered to be a major driver of the way society accesses and views information—is the result of numerous projects in computer networking, mostly funded by the federal government, carried out over the last 40 years. The projects produced communications protocols that define the format of network messages, prototype networks, and application programs such as browsers. This research capitalized on the ubiquity of the nation's telephone network, which provided the underlying physical infrastructure upon which the Internet was built.

This chapter traces the development of the Internet, one aspect of the broader field of data networking. The chapter is not intended to be comprehensive; rather, it focuses on the federal role in both funding research and supporting the deployment of networking infrastructure. This history is divided into four distinct periods. Before 1970, individual researchers developed the underlying technologies, including queuing theory, packet switching, and routing. During the 1970s, experimental networks, notably the ARPANET, were constructed. These networks were primarily research tools, not service providers. Most were federally funded, because, with a few exceptions, industry had not yet realized the potential of the technology. During the 1980s, networks were widely deployed, initially to support scientific research. As their potential to improve personal communications and collaboration became apparent, additional academic disciplines and industry began to use the technology. In this era, the National Science Foundation (NSF) was the major supporter of networking, primarily through the NSFNET, which evolved into the Internet. Most recently, in the early 1990s, the invention of the Web made it much easier for users to publish and access information, thereby setting off the rapid growth of the Internet. The final section of the chapter summarizes the lessons to be learned from history.

By focusing on the Internet, this chapter does not address the full scope of computer networking activities that were under way between 1960 and 1995. It specifically ignores other networking activities of a more proprietary nature. In the mid-1980s, for example, hundreds of thousands of workers at IBM were using electronic networks (such as the VNET) for worldwide e-mail and file transfers; banks were performing electronic funds transfer; Compuserve had a worldwide network; Digital Equipment Corporation (DEC) had value-added networking services; and a VNET-based academic network known as BITNET had been established. These were proprietary systems that, for the most part, owed little to academic research, and indeed were to a large extent invisible to the academic computer networking community. By the late 1980s, IBM's proprietary SNA data networking business unit already had several billions of dollars of annual revenue for networking hardware, software, and services. The success of such networks in many ways limited the interest of companies like IBM and Compuserve in the Internet. The success of the Internet can therefore, in many ways, be seen as the success of an open system and open architecture in the face of proprietary competition.

Early Steps: 1960-1970

Approximately 15 years after the first computers became operational, researchers began to realize that an interconnected network of computers could provide services that transcended the capabilities of a single system. At this time, computers were becoming increasingly powerful, and a number of scientists were beginning to consider applications that went far beyond simple numerical calculation. Perhaps the most compelling early description of these opportunities was presented by J.C.R. Licklider (1960), who argued that, within a few years, computers would become sufficiently powerful to cooperate with humans in solving scientific and technical problems. Licklider, a psychologist at the Massachusetts Institute of Technology (MIT), would begin realizing his vision when he became director of the Information Processing Techniques Office (IPTO) at the Advanced Research Projects Agency (ARPA) in 1962. Licklider remained at ARPA until 1964 (and returned for a second tour in 1974-1975), and he convinced his successors, Ivan Sutherland and Robert Taylor, of the importance of attacking difficult, long-term problems.

Taylor, who became IPTO director in 1966, worried about the duplication of expensive computing resources at the various sites with ARPA contracts. He proposed a networking experiment in which users at one site accessed computers at another site, and he co-authored, with Licklider, a paper describing both how this might be done and some of the potential consequences (Licklider and Taylor, 1968). Taylor was a psychologist, not a computer scientist, and so he recruited Larry Roberts of MIT's Lincoln Laboratory to move to ARPA and oversee the development of the new network. As a result of these efforts, ARPA became the primary supporter of projects in networking during this period.

In contrast to the NSF, which awarded grants to individual researchers, ARPA issued research contracts. The IPTO program managers, typically recruited from academia for 2-year tours, had considerable latitude in defining projects and identifying academic and industrial groups to carry them out. In many cases, they worked closely with the researchers they sponsored, providing intellectual leadership as well as financial support. A strength of the ARPA style was that it not only produced artifacts that furthered its missions but also built and trained a community of researchers. In addition to holding regular meetings of principal investigators, Taylor started the "ARPA games," meetings that brought together the graduate students involved in programs. This innovation helped build the community that would lead the expansion of the field and growth of the Internet during the 1980s.

During the 1960s, a number of researchers began to investigate the technologies that would form the basis for computer networking. Most of this early networking research concentrated on packet switching, a technique of breaking up a conversation into small, independent units, each of which carries the address of its destination and is routed through the network independently. Specialized computers at the branching points in the network can vary the route taken by packets on a moment-to-moment basis in response to network congestion or link failure.
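The mechanics just described, breaking a conversation into independently addressed units and restoring the original order at the destination, can be sketched as follows. This is an illustrative Python toy, not a real protocol; the packet fields and the `packetize`/`reassemble` names are invented for the demonstration.

```python
def packetize(message, dest, size=4):
    """Break a message into small, independent units, each carrying the
    destination address and a sequence number so it can travel on its own."""
    return [{"dest": dest, "seq": i, "data": message[i:i + size]}
            for i in range(0, len(message), size)]

def reassemble(packets):
    """Packets may arrive in any order; the sequence numbers restore it."""
    return "".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))

packets = packetize("HELLO ARPANET", dest="UCLA")
```

Because each unit carries its own address, the network is free to route units over different paths as conditions change, which is the property that distinguishes packet switching from a fixed circuit.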

One of the earliest pioneers of packet switching was Paul Baran of the RAND Corporation, who was interested in methods of organizing networks to withstand nuclear attack. (His research interest is the likely source of a widespread myth concerning the ARPANET's original purpose [Hafner and Lyon, 1996]). Baran proposed a richly interconnected set of network nodes, with no centralized control system—both properties of today's Internet. Similar work was under way in the United Kingdom, where Donald Davies and Roger Scantlebury of the National Physical Laboratory (NPL) coined the term "packet."

Of course, the United States already had an extensive communications network, the public switched telephone network (PSTN), in which digital switches and transmission lines were deployed as early as 1962. But the telephone network did not figure prominently in early computer networking. Computer scientists working to interconnect their systems spoke a different language than did the engineers and scientists working in traditional voice telecommunications. They read different journals, attended different conferences, and used different terminology. Moreover, data traffic was (and is) substantially different from voice traffic. In the PSTN, a continuous connection, or circuit, is set up at the beginning of a call and maintained for the duration. Computers, on the other hand, communicate in bursts, and unless a number of "calls" can be combined on a single transmission path, line and switching capacity is wasted. Telecommunications engineers were primarily interested in improving the voice network and were skeptical of alternative technologies. As a result, although telephone lines were used to provide point-to-point communication in the ARPANET, the switching infrastructure of the PSTN was not used. According to Taylor, some Bell Laboratories engineers stated flatly in 1967 that "packet switching wouldn't work."

At the first Association for Computing Machinery (ACM) Symposium on Operating System Principles in 1967, Lawrence Roberts, then an IPTO program manager, presented an initial design for the packet-switched network that was to become the ARPANET (Roberts, 1967). In addition, Roger Scantlebury presented the NPL work (Davies et al., 1967), citing Baran's earlier RAND report. The reaction was positive, and Roberts issued a request for quotation (RFQ) for the construction of a four-node network.

From the more than 100 respondents to the RFQ, Roberts selected Bolt, Beranek, and Newman (BBN) of Cambridge, Massachusetts; familiar names such as IBM Corporation and Control Data Corporation chose not to bid. The contract to produce the hardware and software was issued in December 1968. The BBN group was led by Frank Heart, and many of the scientists and engineers who would make major contributions to networking in future years participated. Robert Kahn, who with Vinton Cerf would later develop the Transmission Control Protocol/Internet Protocol (TCP/IP) suite used to control the transmission of packets in the network, helped develop the network architecture. The network hardware consisted of a rugged military version of a Honeywell Corporation minicomputer that connected a site's computers to the communication lines. These interface message processors (IMPs)—each the size of a large refrigerator and painted battleship gray—were highly sought after by DARPA-sponsored researchers, who viewed possession of an IMP as evidence they had joined the inner circle of networking research.

The first ARPANET node was installed in September 1969 at Leonard Kleinrock's Network Measurement Center at the University of California at Los Angeles (UCLA). Kleinrock (1964) had published some of the earliest theoretical work on packet switching, and so this site was an appropriate choice. The second node was installed a month later at Stanford Research Institute (SRI) in Menlo Park, California, using Douglas Engelbart's On Line System (known as NLS) as the host. SRI also operated the Network Information Center (NIC), which maintained operational and standards information for the network. Two more nodes were soon installed at the University of California at Santa Barbara, where Glen Culler and Burton Fried had developed an interactive system for mathematics education, and the University of Utah, which had one of the first computer graphics groups.

Initially, the ARPANET was primarily a vehicle for experimentation rather than a service, because the protocols for host-to-host communication were still being developed. The first such protocol, the Network Control Protocol (NCP), was completed by the Network Working Group (NWG) led by Stephen Crocker in December 1970 and remained in use until 1983, when it was replaced by TCP/IP.

Expansion of the ARPANET: 1970-1980

Initially conceived as a means of sharing expensive computing resources among ARPA research contractors, the ARPANET evolved in a number of unanticipated directions during the 1970s. Although a few experiments in resource sharing were carried out, and the Telnet protocol was developed to allow a user on one machine to log onto another machine over the network, other applications became more popular.

The first of these applications was enabled by the File Transfer Protocol (FTP), developed in 1971 by a group led by Abhay Bhushan of MIT (Bhushan, 1972). This protocol enabled a user on one system to connect to another system for the purpose of either sending or retrieving a particular file. The concept of an anonymous user was quickly added, with constrained access privileges, to allow users to connect to a system and browse the available files. Using Telnet, a user could read the remote files but could not do anything with them. With FTP, users could now move files to their own machines and work with them as local files. This capability spawned several new areas of activity, including distributed client-server computing and network-connected file systems.

Occasionally in computing, a "killer application" appears that becomes far more popular than its developers expected. When personal computers (PCs) became available in the 1980s, the spreadsheet (initially VisiCalc) was the application that accelerated the adoption of the new hardware by businesses. For the newly minted ARPANET, the killer application was electronic mail, or e-mail. The first e-mail program was developed in 1972 by Ray Tomlinson of BBN. Tomlinson had built an earlier e-mail system for communication between users on BBN's Tenex time-sharing system, and it was a simple exercise to modify this system to work over the network. By combining the immediacy of the telephone with the precision of written communication, e-mail became an instant hit. Tomlinson's syntax (user@domain) remains in use today.

Telnet, FTP, and e-mail were examples of the leverage that research typically provided in early network development. As each new capability was added, the efficiency and speed with which knowledge could be disseminated improved. E-mail and FTP made it possible for geographically distributed researchers to collaborate and share results much more effectively. These programs were also among the first networking applications that were valuable not only to computer scientists, but also to scholars in other disciplines.

From ARPANET to Internet

Although the ARPANET was ARPA's largest networking effort, it was by no means the only one. The agency also supported research on terrestrial packet radio and packet satellite networks. In 1973, Robert Kahn and Vinton Cerf began to consider ways to interconnect these networks, which had quite different bandwidth, delay, and error properties than did the telephone lines of the ARPANET. The result was TCP/IP, first described in 1973 at an International Network Working Group meeting in England. Unlike NCP, which enabled the hosts of a single network to communicate, TCP/IP was designed to interconnect multiple networks to form an Internet. This protocol suite defined the packet format and a flow-control and error-recovery mechanism to allow the hosts to recover gracefully from network errors. It also specified an addressing mechanism that could support an Internet comprising up to 4 billion hosts.
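The "up to 4 billion hosts" figure follows from the 32-bit address field of IP version 4: 2^32 = 4,294,967,296 distinct addresses. A brief illustration using Python's standard ipaddress module (the specific address chosen is arbitrary):

```python
import ipaddress

# An IPv4 address is a 32-bit integer, so the address space holds 2**32 hosts.
address_space = 2 ** 32

# The familiar dotted-decimal notation is just that integer written as four
# 8-bit fields; conversion in both directions is exact.
addr = ipaddress.IPv4Address("10.0.0.1")
as_int = int(addr)  # 10 * 2**24 + 1
```

The eventual exhaustion of this address space is what later motivated the much larger 128-bit addresses of IPv6, though that lies outside the period this chapter covers.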

The work necessary to transform TCP/IP from a concept into a useful system was performed under ARPA contract by groups at Stanford University, BBN, and University College London. Although TCP/IP has evolved over the years, it is still in use today as the Internet's basic packet transport protocol.

By 1975, the ARPANET had grown from its original four nodes to nearly 100 nodes. Around this time, two phenomena—the development of local area networks (LANs) and the integration of networking into operating systems—contributed to a rapid increase in the size of the network.

Local Area Networks

While ARPANET researchers were experimenting with dedicated telephone lines for packet transmission, researchers at the University of Hawaii, led by Norman Abramson, were trying a different approach, also with ARPA funding. Like the ARPANET group, they wanted to provide remote access to their main computer system, but instead of a network of telephone lines, they used a shared radio network. It was shared in the sense that all stations used the same channel to reach the central station. This approach had a potential drawback: if two stations attempted to transmit at the same time, then their transmissions would interfere with each other, and neither one would be received. But such interruptions were unlikely because the data were typed on keyboards, which sent very short pulses to the computer, leaving ample time between pulses during which the channel was clear to receive keystrokes from a different user.

Abramson's system, known as Aloha, generated considerable interest in using a shared transmission medium, and several projects were initiated to build on the idea. Two of the best-known projects were the Atlantic Packet Satellite Experiment and Ethernet. The packet satellite network demonstrated that the protocols developed in Aloha for handling contention between simultaneous users, combined with more traditional reservation schemes, resulted in efficient use of the available bandwidth. However, the long latency inherent in satellite communications limited the usefulness of this approach.

Ethernet, developed by a group led by Robert Metcalfe at Xerox Corporation's Palo Alto Research Center (PARC), is one of the few examples of a networking technology that was not directly funded by the government. This experiment demonstrated that using coaxial cable as a shared medium resulted in an efficient network. Unlike the Aloha system, in which transmitters could not receive any signals, Ethernet stations could detect that collisions had occurred, stop transmitting immediately, and retry a short time later (at random). This approach improved the efficiency of the Aloha technique and made it practical for actual use. Shared-media LANs became the dominant form of computer-to-computer communication within a building or local area, although variations from IBM (Token Ring) and others also captured part of this emerging market.

Ethernet was initially used to connect a network of approximately 100 of PARC's Alto PCs, using the center's time-sharing system as a gateway to the ARPANET. Initially, many believed that the small size and limited performance of PCs would preclude their use as network hosts, but, with DARPA funding, David Clark's group at MIT, which had received several Altos from PARC, built an efficient TCP implementation for that system and, later, for the IBM PC. The proliferation of PCs connected by LANs in the 1980s dramatically increased the size of the Internet.

Integrated Networking

Until the 1970s, academic computer science research groups used a variety of computers and operating systems, many of them constructed by the researchers themselves. Most were time-sharing systems that supported a number of simultaneous users. By 1970, many groups had settled on the Digital Equipment Corporation (DEC) PDP-10 computer and the Tenex operating system developed at BBN. This standardization enabled researchers at different sites to share software, including networking software.

By the late 1970s, the Unix operating system, originally developed at Bell Labs, had become the system of choice for researchers, because it ran on DEC's inexpensive (relative to other systems) VAX line of computers. During the late 1970s and early 1980s, an ARPA-funded project at the University of California at Berkeley (UC-Berkeley) produced a version of Unix (the Berkeley System Distribution, or BSD) that included tightly integrated networking capabilities. The BSD was rapidly adopted by the research community because the availability of source code made it a useful experimental tool. In addition, it ran on both VAX machines and the personal workstations provided by the fledgling Sun Microsystems, Inc., several of whose founders came from the Berkeley group. The TCP/IP suite was now available on most of the computing platforms used by the research community.

Standards and Management

Unlike the various telecommunications networks, the Internet has no owner. It is a federation of commercial service providers, local educational networks, and private corporate networks, exchanging packets using TCP/IP and other, more specialized protocols. To become part of the Internet, a user need only connect a computer to a port on a service provider's router, obtain an IP address, and begin communicating. To add an entire network to the Internet is a bit trickier, but not extraordinarily so, as demonstrated by the tens of thousands of networks with tens of millions of hosts that constitute the Internet today.

The primary technical problem in the Internet is the standardization of its protocols. Today, this is accomplished by the Internet Engineering Task Force (IETF), a voluntary group interested in maintaining and expanding the scope of the Internet. Although this group has undergone many changes in name and makeup over the years, it traces its roots directly to Stephen Crocker's NWG, which defined the first ARPANET protocol in 1969. The NWG defined the system of requests for comments (RFCs) that are still used to specify protocols and discuss other engineering issues. Today's RFCs are still formatted as they were in 1969, eschewing the decorative fonts and styles that pervade today's Web.

Joining the IETF is a simple matter of asking to be placed on its mailing list, attending thrice-yearly meetings, and participating in the work. This grassroots group is far less formal than organizations such as the International Telecommunications Union, which defines telephony standards through the work of members who are essentially representatives of various governments. The open approach to Internet standards reflects the academic roots of the network.

Closing the Decade

The 1970s were a time of intensive research in networking. Much of the technology used today was developed during this period. Several networks other than ARPANET were assembled, primarily for use by computer scientists in support of their own research. Most of the work was funded by ARPA, although the NSF provided educational support for many researchers and was beginning to consider establishing a large-scale academic network.

During this period, ARPA pursued high-risk research with the potential for high payoffs. Its work was largely ignored by AT&T, and the major computer companies, notably IBM and DEC, began to offer proprietary networking solutions that competed with, rather than applied, the ARPA-developed technologies. 3 Yet the technologies developed under ARPA contract ultimately resulted in today's Internet. It is debatable whether a more risk-averse organization lacking the hands-on program management style of ARPA could have produced the same result.

Operation of the ARPANET was transferred to the Defense Communication Agency in 1975. By the end of the decade, the ARPANET had matured sufficiently to provide services. It remained in operation until 1989, when it was superseded by subsequent networks. The stage was now set for the Internet, which was first used by scientists, then by academics in many disciplines, and finally by the world at large.

The NSFNET Years: 1980-1990

During the late 1970s, several networks were constructed to serve the needs of particular research communities. These networks—typically funded by the federal agency that was the primary supporter of the research area—included MFENet, which the Department of Energy established to give its magnetic fusion energy researchers access to supercomputers, and NASA's Space Physics Analysis Network (SPAN). The NSF began supporting network infrastructure with the establishment of CSNET, which was intended to link university computer science departments with the ARPANET. The CSNET had one notable property that the ARPANET lacked: it was open to all computer science researchers, whereas only ARPA contractors could use the ARPANET. An NSF grant to plan the CSNET was issued to Larry Landweber at the University of Wisconsin in 1980.

The CSNET was used throughout the 1980s, but as it and other regional networks began to demonstrate their usefulness, the NSF launched a much more ambitious effort, the NSFNET. From the start, the NSFNET was designed to be a network of networks—an ''internet''—with a high-speed backbone connecting NSF's five supercomputer centers and the National Center for Atmospheric Research. To oversee the new network, the NSF hired Dennis Jennings from Trinity College, Dublin. In the early 1980s, Jennings had been responsible for the Irish Higher Education Authority network (HEANet), and so he was well-qualified for the task. One of Jennings' first decisions was to select TCP/IP as the primary protocol suite for the NSFNET.

Because the NSFNET was to be an internet (the beginning of today's Internet), specialized computers called routers were needed to pass traffic between networks at the points where the networks met. Today, routers are the primary products of multibillion-dollar companies (e.g., Cisco Systems Incorporated, Bay Networks), but in 1985, few commercial products were available. The NSF chose the "Fuzzball" router designed by David Mills at the University of Delaware (Mills, 1988). Working with ARPA support, Mills improved the protocols used by the routers to communicate the network topology among themselves, a critical function in a large-scale network.

Another technology required for the rapidly growing Internet was the Domain Name Service (DNS). Developed by Paul Mockapetris at the University of Southern California's Information Sciences Institute, the DNS provides for hierarchical naming of hosts. An administrative entity, such as a university department, can assign host names as it wishes. It also has a domain name, issued by the higher-level authority of which it is a part. (Thus, a host named xyz in the computer science department at UC-Berkeley would be named xyz.cs.berkeley.edu. ) Servers located throughout the Internet provide translation between the host names used by human users and the IP addresses used by the Internet protocols. The name-distribution scheme has allowed the Internet to grow much more rapidly than would be possible with centralized administration.
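The hierarchy that the DNS imposes on host names can be made concrete with a short sketch. Using the text's own example, xyz.cs.berkeley.edu, the function below simply enumerates the successively more specific zones a resolver would walk, from the top-level domain down to the full host name; it is an illustration of the naming scheme, not an implementation of a DNS resolver.

```javascript
// Split a DNS host name into the hierarchy of zones a resolver walks,
// from the top-level domain down to the full host name.
// Example host name taken from the text: xyz.cs.berkeley.edu
function dnsZones(hostname) {
  const labels = hostname.split(".");        // ["xyz", "cs", "berkeley", "edu"]
  const zones = [];
  for (let i = labels.length - 1; i >= 0; i--) {
    zones.push(labels.slice(i).join("."));   // "edu", then "berkeley.edu", ...
  }
  return zones;
}

console.log(dnsZones("xyz.cs.berkeley.edu"));
// [ 'edu', 'berkeley.edu', 'cs.berkeley.edu', 'xyz.cs.berkeley.edu' ]
```

Each step in the returned list corresponds to a delegation: the authority for edu points to berkeley.edu, which points to cs.berkeley.edu, and so on, which is what lets each administrative entity assign names as it wishes.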

Jennings left the NSF in 1986. He was succeeded by Stephen Wolff, who oversaw the deployment and growth of the NSFNET. During Wolff's tenure, the speed of the backbone, originally 56 kilobits per second, was increased 1,000-fold, and a large number of academic and regional networks were connected to the NSFNET. The NSF also began to expand the reach of the NSFNET beyond its supercomputing centers through its Connections program, which targeted the research and education community. In response to the Connections solicitation, the NSF received innovative proposals from what would become two of the major regional networks: SURANET and NYSERNET. These groups proposed to develop regional networks with a single connection to the NSFNET, instead of connecting each institution independently.

Hence, the NSFNET evolved into a three-tiered structure in which individual institutions connected to regional networks that were, in turn, connected to the backbone of the NSFNET. The NSF agreed to provide seed funding for connecting regional networks to the NSFNET, with the expectation that, as a critical mass was reached, the private sector would take over the management and operating costs of the Internet. This decision helped guide the Internet toward self-sufficiency and eventual commercialization (Computer Science and Telecommunications Board, 1994).

As the NSFNET expanded, opportunities for privatization grew. Wolff saw that commercial interests had to participate and provide financial support if the network were to continue to expand and evolve into a large, single internet. The NSF had already (in 1987) contracted with Merit Computer Network Incorporated at the University of Michigan to manage the backbone. Merit later formed a consortium with IBM and MCI Communications Corporation called Advanced Network and Services (ANS) to oversee upgrades to the NSFNET. Instead of reworking the existing backbone, ANS added a new, privately owned backbone for commercial services in 1991. 4

Emergence of the Web: 1990 to the Present

By the early 1990s, the Internet was international in scope, and its operation had largely been transferred from the NSF to commercial providers. Public access to the Internet expanded rapidly thanks to the ubiquitous nature of the analog telephone network and the availability of modems for connecting computers to this network. Digital transmission became possible throughout the telephone network with the deployment of optical fiber, and the telephone companies leased their broadband digital facilities for connecting routers and regional networks to the developers of the computer network. In April 1995, all commercialization restrictions on the Internet were lifted. Although still primarily used by academics and businesses, the Internet was growing, with the number of hosts reaching 250,000. Then the invention of the Web catapulted the Internet to mass popularity almost overnight.

The idea for the Web was simple: provide a common format for documents stored on server computers, and give each document a unique name that can be used by a browser program to locate and retrieve the document. Because the unique names (called universal resource locators, or URLs) are long, including the DNS name of the host on which they are stored, URLs would be represented as shorter hypertext links in other documents. When the user of a browser clicks a mouse on a link, the browser retrieves and displays the document named by the URL.

This idea was implemented by Timothy Berners-Lee and Robert Cailliau at CERN, the high-energy physics laboratory in Geneva, Switzerland, funded by the governments of participating European nations. Berners-Lee and Cailliau proposed to develop a system of links between different sources of information. Certain parts of a file would be made into nodes, which, when called up, would link the user to other, related files. The pair devised a document format called Hypertext Markup Language (HTML), a variant of the Standard Generalized Markup Language used in the publishing industry since the 1980s. It was released at CERN in May 1991. In July 1992, a new Internet protocol, the Hypertext Transfer Protocol (HTTP), was introduced to improve the efficiency of document retrieval. Although the Web was originally intended to improve communications within the physics community at CERN, it—like e-mail 20 years earlier—rapidly became the new killer application for the Internet.
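Part of HTTP's appeal was that a request is just a few lines of plain text. The sketch below builds the text of a minimal HTTP/1.0-style GET request for a URL; the CERN address is used purely as an illustrative example, and nothing is actually sent over the network.

```javascript
// Build the raw text of a minimal HTTP/1.0-style GET request for a URL.
// Illustrative only: nothing is sent over the network.
function buildGetRequest(urlString) {
  const url = new URL(urlString);   // WHATWG URL parser, built into Node and browsers
  const path = url.pathname || "/";
  return [
    `GET ${path} HTTP/1.0`,         // request line: method, path, protocol version
    `Host: ${url.hostname}`,        // which host's document we want
    "",                             // blank line terminates the headers
    "",
  ].join("\r\n");
}

console.log(buildGetRequest("http://info.cern.ch/hypertext/WWW/TheProject.html"));
```

A browser sends text of this shape to port 80 of the named host; the server replies with a status line, headers, and the HTML document itself.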

The idea of hypertext was not new. One of the first demonstrations of a hypertext system, in which a user could click a mouse on a highlighted word in a document and immediately access a different part of the document (or, in fact, another document entirely), occurred at the 1967 Fall Joint Computer Conference in San Francisco. At this conference, Douglas Engelbart of SRI gave a stunning demonstration of his NLS (Engelbart, 1986), which provided many of the capabilities of today's Web browsers, albeit limited to a single computer. Engelbart's Augment project was supported by funding from NASA and ARPA. Engelbart was awarded the Association for Computing Machinery's 1997 A. M. Turing Award for this work. Although it never became commercially successful, the mouse-driven user interface inspired researchers at Xerox PARC, who were developing personal computing technology.

Widespread use of the Web, which now accounts for the largest volume of Internet traffic, was accelerated by the development in 1993 of the Mosaic graphical browser. This innovation, by Marc Andreessen at the NSF-funded National Center for Supercomputer Applications, enabled the use of hyperlinks to video, audio, and graphics, as well as text. More important, it provided an effective interface that allowed users to point-and-click on a menu or fill in a blank to search for information.

The development of the Internet and the World Wide Web has had a tremendous impact on the U.S. economy and society more broadly. By January 1998, almost 30 million host computers were connected to the Internet (Zakon, 1998), and more than 58 million users in the United States and Canada were estimated to be online (Nielsen Media Research, 1997). Numerous companies now sell Internet products worth billions of dollars. Cisco Systems, a leader in network routing technology, for example, reported sales of $8.5 billion in 1998. Netscape Communications Corporation, which commercialized the Mosaic browser, had sales exceeding $530 million in 1997. 5 Microsoft Corporation also entered the market for Web browsers and now competes head-to-head with Netscape. A multitude of other companies offer hardware and software for Internet-based systems.

The Internet has also paved the way for a host of services. Companies like Yahoo! and InfoSeek provide portals to the Internet and have attracted considerable attention from Wall Street investors. Other companies, like Amazon.com and Barnes & Noble, have established online stores. Amazon had online sales of almost $150 million for books in 1997. 6 Electronic commerce, more broadly, is taking hold in many types of organizations, from PC manufacturers to retailers to travel agencies. Although estimates of the value of these services vary widely, they all reflect a growing sector of the economy that is wholly dependent on the Internet. Internet retailing could reach $7 billion by the year 2000, and online sales of travel services are expected to approach $8 billion around the turn of the century. Forrester Research estimates that businesses will buy and sell $327 billion worth of goods over the Internet by the year 2002 (Blane, 1997).

The Web has been likened to the world's largest library—with the books piled in the middle of the floor. Search engines, which are programs that follow the Web's hypertext links and index the material they discover, have improved the organization somewhat but are difficult to use, frequently deluging the user with irrelevant information. Although developments in computing and networking over the last 40 years have realized some of the potential described by visionaries such as Licklider and Engelbart, the field continues to offer many opportunities for innovation.

Lessons from History

The development of the Internet demonstrates that federal support for research, applied at the right place and right time, can be extremely effective. DARPA's support gave visibility to the work of individual researchers on packet switching and resulted in the development of the first large-scale packet-switched network. Continued support for experimentation led to the development of networking protocols and applications, such as e-mail, that were used on the ARPANET and, subsequently, the Internet.

By bringing together a diverse mix of researchers from different institutions, such federal programs helped the Internet gain widespread acceptance and established it as a dominant mode of internetworking. Government programs such as ARPANET and NSFNET created a large enough base of users to make the Internet more attractive in many applications than proprietary networking systems being offered by a number of vendors. Though a number of companies continue to sell proprietary systems for wide area networking, some of which are based on packet-switched technology, these systems have not achieved the ubiquity of the Internet and are used mainly within private industry.

Research in packet switching evolved in unexpected directions and had unanticipated consequences. It was originally pursued to make more-efficient use of limited computing capabilities and later seen as a means of linking the research and education communities. The most notable result, however, was the Internet, which has dramatically improved communication across society, changing the way people work, play, and shop. Although DARPA and the NSF were successful in creating an expansive packet-switched network to facilitate communication among researchers, it took the invention of the Web and its browsers to make the Internet more broadly accessible and useful to society.

The widespread adoption of Internet technology has created a number of new companies in industries that did not exist 20 years ago, and most companies that did exist 20 years ago are incorporating Internet technology into their business operations. Companies such as Cisco Systems, Netscape Communications, Yahoo!, and Amazon.com are built on Internet technologies and their applications and generate billions of dollars annually in combined sales revenues. Electronic commerce is also maturing into an established means of conducting business.

The complementary missions and operating styles of federal agencies are important to the development and implementation of new technologies. Whereas DARPA supported early research on packet switching and development of the ARPANET, it was not prepared to support an operational network, nor did it expand its network beyond DARPA-supported research institutions. With its charter to support research and education, the NSF both supported an operational network and greatly expanded its reach, effectively building the infrastructure for the Internet.

1. Several other case studies of the Internet have also been written in recent years. In addition to the references cited in the text, see Leiner et al. (1998) and SRI International (1997).

2. Personal communication from Robert W. Taylor, former director of the Information Processing Techniques Office, Defense Advanced Research Projects Agency, August 1988.

3. IBM and AT&T did support some in-house research on packet switching, but at the level of individual researchers. This work did not figure prominently in AT&T's plans for network deployment, nor did it receive significant attention at IBM, though researchers in both organizations published important papers.

4. Ferreiro, Mirna. 1996. "The Past and Future History of the Internet," research paper for International 610. George Mason University, Fairfax, Va., November.

5. Sales figures in this paragraph derive from annual reports filed by the companies cited.

6. Sales revenues as reported in Amazon.com's 1997 Annual Report, available online at < >.

The past 50 years have witnessed a revolution in computing and related communications technologies. The contributions of industry and university researchers to this revolution are manifest; less widely recognized is the major role the federal government played in launching the computing revolution and sustaining its momentum. Funding a Revolution examines the history of computing since World War II to elucidate the federal government's role in funding computing research, supporting the education of computer scientists and engineers, and equipping university research labs. It reviews the economic rationale for government support of research, characterizes federal support for computing research, and summarizes key historical advances in which government-sponsored research played an important role.

Funding a Revolution contains a series of case studies in relational databases, the Internet, theoretical computer science, artificial intelligence, and virtual reality that demonstrate the complex interactions among government, universities, and industry that have driven the field. It offers a series of lessons that identify factors contributing to the success of the nation's computing enterprise and the government's role within it.


Session 01 - The Internet, World Wide Web, Web Browsers, Web Sites, and HTML

Harvard Extension School   Fall 2023

Course Web Site: https://cscie12.dce.harvard.edu/

How does the web work?


Presentation contains 48 slides

CSCI E-12, Fundamentals of Website Development Fall Term 2023 Harvard Extension School

Essential Questions to Consider

Three phases for tonight.

  • A bit about the course
  • Getting started with the web and HTML
  • Orientation to the first assignment

What happens when you enter https://extension.harvard.edu/ into your browser?

harvard extension school home page screenshot

  • Web Browser
  • Web Address (URL) and Communication
  • Web Content (HTML, CSS, JS, images)

web parts

1. Web Browser (HTTP Client)

http client

2. Web Server (HTTP Server)

server-side

3. Communication

Communication between the web browser and web server, including how they communicate (HTTP) and the network.

Visualization of the routing paths of the Internet.

Image credit: Barrett Lyon / The Opte Project, "Visualization of the routing paths of the Internet." Used under the Creative Commons Attribution-NonCommercial 4.0 International License.

The Internet: Schematic

Internet

The Internet came before the Web, and Web "traffic" is not the only type of traffic on the Internet

Tim Berners-Lee on The World Wide Web

Suppose all the information stored on computers everywhere were linked. Suppose I could program my computer to create a space in which everything could be linked to everything. Tim Berners-Lee
The Web evolved into a powerful, ubiquitous tool because it was built on egalitarian principles and because thousands of individuals, universities and companies have worked, both independently and together as part of the World Wide Web Consortium, to expand its capabilities based on those principles. Tim Berners-Lee in Long Live the Web (Scientific American, Nov/Dec 2010)
Today, and throughout this year, we should celebrate the Web's first 25 years. But though the mood is upbeat, we also know we are not done. We have much to do for the Web to reach its full potential. We must continue to defend its core principles and tackle some key challenges. Tim Berners-Lee in Welcome to the Web's 25th Anniversary (2014)
The web is for everyone, and collectively we hold the power to change it. It won’t be easy. But if we dream a little and work a lot, we can get the web we want. Tim Berners-Lee interview on 30 years of the world wide web in The Guardian (2019)

What is Tim Berners-Lee up to today? He's concerned about personal data sovereignty — and has software and a company to help address it.

The World Wide Web — key aspects

  • HyperText Information System
  • Cross-Platform and Cross-Device ...then and now
  • 255 million unique domains, 201 million active sites, and 12 million "web-facing" computers ( Netcraft Web Server Survey , August 2023)
  • HTML, CSS, JavaScript, HTTP, networking
  • Browsers and engines (Chromium, Webkit, Mozilla), languages (PHP, Python), server software (Apache, nginx, WordPress)
  • Information, Shopping, Banking and Finance, Communication, Business workflows, etc.
  • Dynamic, Interactive, Evolving

How has the web changed? A brief look at Southwest Airlines website evolution

Southwest

  • Physical Desk

Southwest

  • Main categories and functions

Southwest

  • Quick Links

Southwest

  • Travel Tools as icons

Southwest

  • Travel tools as panels
  • Main imagery changes

Southwest

  • Travel Products

Southwest

  • Travel Products expanded, multiple locations
  • Travel tool panels
  • Mega footer beginning

Southwest

  • Less is more
  • "Action" or "Do" panel clearer
  • Social media links

Southwest

Screenshots from my collection and from Internet Archive Wayback Machine

Understand the parts, and how they work individually and how they work together.

  • HTML, CSS, JavaScript
  • Hosting, Web server software, programming languages (JavaScript, Python, etc.)
  • User Experience and Design

This works at multiple layers too —

  • we need to understand HTML, CSS, and JavaScript individually as well as how they interact or relate to one another
  • we also need to understand how we use HTML, CSS, and JavaScript together to accomplish a specific design or interface that will be useful to our users

How do we make the web work — some tools for our toolbox

  • A code editor . Recommendation: Visual Studio Code with a few extensions: Live Server , HTMLHint , Prettier - Code Formatter

  • Additional web browsers . I'm not going to ask you to change the browser that you know and use all the time, but I will ask you to have a couple more in the rotation for testing, and to remember that not everyone who views your site will be using your favorite browser.

  • An SFTP client (SFTP = secure file transfer protocol), such as Cyberduck . An SFTP client lets you move files you have edited locally to your web server account.
  • VPN client (VPN = virtual private network). A VPN client lets you connect securely to a network in order to access restricted resources. You will need to be using Harvard VPN in order to SFTP content to your course web hosting account.

There are some more to add to the list eventually, but for now, let's keep it with these!

Approaching a Web Project

elements of user experience

5 Planes from The Elements of User Experience: User-Centered Design for the Web

world wide web assignment

  • Class & Sections
  • Announcements

CSCI E-12 — Goals for the Course

  • Think like a web developer (programmer, content, project management)
  • Learn syntax of HTML, CSS, JavaScript.
  • Break problems down in parts and build parts up to whole in iterations
  • Learn tools and workflows to improve results and efficiency
  • Understand how things relate to one another.
  • Troubleshoot effectively when things are not working as expected.

CSCI E-12 — Key factors for success in the course

  • Complete and submit the assignments (and to do this you'll likely need to watch the class recordings, attend or view sections, engage in the readings or other resources)
  • Connect with students and course staff
  • Engage with resources available to you: class meetings, sections, Slack, textbook, office hours
  • Start early (even if it is just reading through it)
  • Work to understand a concept outside the context of an assignment — use a 'playground' or 'sandbox' folder on your computer or even something like Codepen or JSFiddle, then apply your understanding to the assignment.
  • Leave yourself time to review and revise an assignment
  • Seek out help! Running into roadblocks? Read through that MDN doc! Check out the text! Attend or watch sections! Communicate early with David or your TA through Slack or Canvas Inbox!

URL = Uniform Resource Locator

"The crucial thing is the URL. The crucial thing is that you can link to anything." — Tim Berners-Lee

URL components - scheme, host, and path

URL/URI https://www.archives.gov/historical-docs/voting-rights-act

  • Scheme (also called Protocol ): https
  • Host (also called Authority ): www.archives.gov
  • Path : /historical-docs/voting-rights-act
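The three components called out above can also be pulled apart programmatically. The WHATWG URL class built into browsers and Node.js names them protocol, hostname, and pathname; this is a quick check using the slide's own example URL.

```javascript
// Decompose the slide's example URL into scheme, host, and path
// using the WHATWG URL class built into browsers and Node.js.
const url = new URL("https://www.archives.gov/historical-docs/voting-rights-act");

console.log(url.protocol);  // "https:"  (the scheme, with a trailing colon)
console.log(url.hostname);  // "www.archives.gov"
console.log(url.pathname);  // "/historical-docs/voting-rights-act"
```

Note the small naming wrinkle: URL calls the scheme `protocol` and includes the trailing colon in its value.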

Aside: URLs, URIs, and URNs

In the context of the web, the URIs you encounter are almost always URLs!

  • URI : Uniform Resource Identifier
  • URL : Uniform Resource Locator
  • URN : Uniform Resource Name

URI, URN, URL

Huh? A name may be unique, but may not tell you anything about how to locate it.

"Designing Your New Work Life" by Bill Burnett and Dave Evans is a book. It can be uniquely identified by the URN isbn:9780593467459 , and various URLs can be used to locate it (well, if not locate it, at least locate how to purchase it or find it in a library):

  • https://www.amazon.com/Designing-Your-Work-Life-Happiness/dp/0593467450/
  • https://www.barnesandnoble.com/w/designing-your-new-work-life-bill-burnett/1140537816
  • https://designingyour.life/designing-your-new-work-life/
  • https://www.abebooks.com/9780593467459/Designing-New-Work-Life-Thrive-0593467450/plp
  • https://www.worldcat.org/title/1256628247?oclcNum=1256628247

web parts

  • Presentation
  • Manipulations

Our Solar System: Markup

markup

Our Solar System: Markup + Style

markup + style

Our Solar System: Markup + Style + Function

markup + style + function

  • solarsystem.css

Markup - HTML

How a browser displays it.

web page

How Your Browser Thinks About It

dom tree

HTML Elements - the basic building blocks structure

  • Element Name
  • Attribute and Value Pairs

A Hypertext Link

Markup for a Hypertext link:

Harvard Link in a web browser

Start Tag <a href="https://www.harvard.edu/"> Harvard</a>

Element Name < a href="https://www.harvard.edu/">Harvard</a>

Attribute <a href ="https://www.harvard.edu/">Harvard</a>

Attribute Value <a href=" https://www.harvard.edu/ ">Harvard</a>

Content <a href="https://www.harvard.edu/"> Harvard </a>

End Tag <a href="https://www.harvard.edu/">Harvard </a>

HTML Elements - Content can be other HTML elements

ul and li nodes

ul is an unordered list; li is a list item

HTML Elements - Sometimes you will have more than one attribute

img node with two attributes

img is to embed an image

HTML Elements that are "empty"

Note the "end tag" is part of the "start tag" — <link />

img node with two attributes

link is used to reference a CSS stylesheet, a separate document that contains style rules to apply to the HTML document

HTML Elements - Sometimes HTML allows you to leave off end tags

In these cases the "end tags" are "implied" because of what follows.

Learning tip: Always use end tags!

HTML5 Logo

More information: HTML5 Living Standard from the WHATWG . Section 4 contains the List of elements in HTML .

I've highlighted the 23 elements that you will use and/or see most commonly.

  • h1 , h2 , h3 , h4 , h5 , h6

Most commonly used or seen elements

Learning about HTML elements

How to find out more about HTML elements?

Two places that I would start are:

HTML Purpose

  • Gives structure and meaning to our content

Think about three aspects of structure :

  • HTML document structure html , head , body
  • Web page structure header , main , nav , footer
  • Content structure Headings ( h1 , h2 , h3 ), lists ( ul and li ), paragraphs ( p ), images ( img ), text, etc.

Example: solarsystem.html

Web Page Structure - header, main, footer

First, recall the basic document structure:

header, main, footer

MDN HTML elements reference: header, main, footer.
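A sketch of how those three elements might organize a page body (headings and text here are illustrative):

```html
<body>
  <header>
    <h1>The Solar System</h1>
    <nav>
      <ul>
        <li><a href="planets.html">Planets</a></li>
        <li><a href="moons.html">Moons</a></li>
      </ul>
    </nav>
  </header>
  <main>
    <h2>Planets</h2>
    <p>Eight planets orbit the Sun.</p>
  </main>
  <footer>
    <p>Page footer content.</p>
  </footer>
</body>
```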

HTML5 Document Template
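A minimal template along these lines (the title, stylesheet name, and body content are placeholders):

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8" />
    <title>Page Title</title>
    <link rel="stylesheet" href="styles/site.css" />
  </head>
  <body>
    <h1>Page Title</h1>
    <p>Page content goes here.</p>
  </body>
</html>
```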

Benefits of web standards.

  • Markup (HTML): Nu Html Checker, https://validator.w3.org/nu/
  • Style (CSS): CSS Validation Service, https://jigsaw.w3.org/css-validator/
  • Function (JavaScript)
  • Search engines
  • Forward-compatibility and backward-compatibility
  • Simpler, cleaner pages
  • Easier maintenance
  • Easier redesign
  • Validation provides a baseline when you go to edit

" Postel's Law " or the " Robustness Principle "

HTML Best Practices to start out with

  • Use start and end tags, even if optional
  • Lower case element and attribute names
  • Use quotes around attribute values

These best practices essentially follow the "XML" syntax rules for HTML5.
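For instance, both of the following are valid HTML5, but the second follows the best practices above (lowercase names, quoted attribute values, explicit end tags); the class name is illustrative:

```html
<!-- Terse: uppercase names, unquoted value, implied end tags -->
<UL CLASS=steps><LI>First item<LI>Second item</UL>

<!-- Best practices: lowercase, quoted attribute values, explicit end tags -->
<ul class="steps">
  <li>First item</li>
  <li>Second item</li>
</ul>
```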

The anchor (a) element is at the center of the key "hypertext" feature of the web. The a element is how you create hyperlinks from one resource to another!

Paired with the a element is the href attribute. The value of the href attribute is the URL that the browser will load when the link is activated (e.g., by a mouse click).

The following paragraph was taken from "'Sunshine vitamin' looks a little brighter", Harvard Gazette, February 5, 2013 :

Adequate levels of vitamin D during young adulthood may reduce the risk of adult-onset type 1 diabetes by as much as 50 percent, according to researchers at the Harvard School of Public Health (HSPH). If confirmed in future studies, the findings could lead to a role for vitamin D supplementation in preventing this serious autoimmune disease in adults.

Creating Links

Build confidence by making your links predictable and differentiable .

  • Am I getting 'closer' to my goal?
  • What is the difference between clicking here or clicking there?

Good links give users "scent": cues about where a link will lead.

  • Link several words or a phrase, not just one or two words
  • Use the "title" attribute to elaborate
  • Don't lie or mislead
  • Avoid bare "Click Here" links
  • Prefer descriptive link text such as "Find out more in this knowledge base article"

URL https://www.archives.gov/historical-docs/voting-rights-act

  • Scheme: https
  • Host: www.archives.gov

Absolute and Relative Locations

  • Where does https://summer.harvard.edu/ go to?
  • How about /images/mug.png ?
  • What about ../styles/site.css ?

Relative locations (URLs) are resolved according to the location (URL) of the containing (starting) document!

Absolute or Fully Qualified URLs

Absolute, or fully-qualified, URLs specify the complete information (scheme, host, port, path).

https://news.harvard.edu/gazette/story/2020/07/public-health-experts-unite-to-bring-clarity-to-coronavirus-response/

Relative or Partial URLs

Relative, or partial, URLs specify partial information. The information not provided is resolved from the current location.

<a href="slide2.html">Slide 2</a>

Relative to Server Root

Is this relative or absolute? The scheme, host, and port are resolved from the current location, but the path is absolute.

<a href="/copyright.html">copyright information</a>

Relative Paths to Parent Locations

  • ../ refers to the parent directory
  • ./ refers to current directory
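These resolution rules can be checked with the standard URL API (in a browser console or Node.js); the base and relative URLs below are illustrative:

```javascript
// Resolving relative URLs against the URL of the containing document.
const base = 'https://example.com/a/b/index.html';

// Plain relative path: resolved against the document's directory.
console.log(new URL('slide2.html', base).href);
// https://example.com/a/b/slide2.html

// Leading "/": relative to the server root.
console.log(new URL('/copyright.html', base).href);
// https://example.com/copyright.html

// "../": go up to the parent directory first.
console.log(new URL('../styles/site.css', base).href);
// https://example.com/a/styles/site.css
```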

Relative links are "transportable": a folder of documents linked with relative URLs can be moved to a new location and the links still work.


Hello World - Publishing a web page

  • Get the tools (editor, browser, VPN client, SFTP client) in place, and begin to use them.
  • Work with ".zip" files with Expand/Compress (Windows) or Unzip/Zip (Mac)
  • Keep everything together in a folder and work with that folder (e.g. "File → Open Folder")
  • Edit an existing HTML document
  • Practice validating your HTML
  • Go through the publishing process on your course web hosting server account
  • Understand how to determine the URL to your published content

For Your Class Work

  • Create a directory or folder for your class work.
  • Create a "playground" or "sandbox" folder where you can play and experiment without worrying.
  • Assignments: unzip/extract the materials, then move them into your class work folder

For Web Sites

  • Use folders or directories to help organize files. A recommended set of folder names: styles (for CSS files), scripts (for JavaScript files), and images (for images).

    .
    ├── images/
    ├── index.html
    ├── scripts/
    └── styles/
        └── site.css
  • Use index.html filename as appropriate
  • Prefer filenames that contain only lowercase letters, digits, underscores, or dashes (e.g. avoid spaces and other characters like !@#$%^&*(){}\|?/>)

See: Course Web Hosting Server (Dreamhost) on the course web site.

URL → File

  • https://cwe871.students.cs12.net/index.html → /home/dh_xyz45/cwe871.students.cs12.net/index.html
  • https://cwe871.students.cs12.net/big_ideas/extension_school.html → /home/users/cwe871/public_html/big_ideas/extension_school.html
  • https://cwe871.students.cs12.net/big_ideas/elective_system.html → /home/dh_xyz45/cwe871.students.cs12.net/big_ideas/elective_system.html

Directory Requests and "index.html"

Some URL paths map to a directory rather than a file. For example, the request https://cwe871.students.cs12.net/big_ideas/ returns the index.html document in the big_ideas directory (e.g. /users/cwe871/public_html/big_ideas/index.html).

Setup Once for Course

  • Editor: Install Visual Studio Code, with the Live Server, W3C Validation, and HTMLHint extensions
  • SFTP client: Install Cyberduck or use the Dreamhost File Manager (https://files.dreamhost.com)
  • VPN client: Install the Harvard Cisco AnyConnect VPN client software
  • On your computer, create a "course work folder" to keep your work for the course. I recommend something like cscie12-work on your Desktop.

For Assignments

  • Accept the assignment via the GitHub link that creates a repo for you. Then get that repo code locally to edit.
  • Download the assignment ZIP file.
  • Unzip or Extract the ZIP file.
  • Move the folder into your "course work folder" you created above. Move the entire folder.
  • Start Visual Studio Code, choose "File → Open Folder", and navigate to the assignment folder. In VS Code, I recommend always opening the assignment folder rather than individual files (treat the assignment folder contents as a unit!)
  • Use VS Code "Live Server" to provide a place to check your work.
  • Edit the HTML, CSS, and/or JavaScript. Save.
  • Periodically check in the browser (a reload may be needed) as well as other checks that may be needed such as validation or accessibility.
  • Repeat the "Edit, Check" cycle until you've satisfied the rubric and yourself, or until the assignment is due.
  • Check in your browser that you can access your work. Copy the URL in your browser, and submit that URL via Canvas
  • Submit the code by giving us the URL to your GitHub repo, or give us the ZIP file of your completed assignment.

The Internet and the Web

Introduction:

The internet is a global network of interconnected computers and servers that allows people to communicate, share information, and access resources from anywhere in the world. It was created in the 1960s by the US Department of Defense as a way to connect computers and share information between researchers and scientists.

The World Wide Web, or simply the web, is a system of interconnected documents and resources, linked together by hyperlinks and URLs. It was created by Tim Berners-Lee in 1989 as a way for scientists to share information more easily. The web quickly grew to become the most popular way to access information on the internet.

Together, the internet and the web have revolutionized the way we communicate, do business, and access information. They have made it possible for people all over the world to connect with each other instantly and have transformed many industries, from media and entertainment to education and healthcare.

 1. The Internet:  In simplest terms, the Internet is a global network of smaller interconnected networks that communicate using standardized protocols. The Internet standards describe a framework known as the Internet protocol suite, which divides communication methods into a layered system of protocols.

These layers are as follows:

  • Application layer (highest): concerned with the data (URLs, content types, etc.). This is where HTTP, HTTPS, etc., come in.
  • Transport layer: responsible for end-to-end communication over a network.
  • Network layer: provides the route the data takes.
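As a sketch of what the application layer contributes, the following builds the text of a minimal HTTP/1.1 request that a browser would hand to the transport layer (TCP) to be carried and routed by the layers below; the host and path are illustrative:

```javascript
// Build the application-layer payload: an HTTP/1.1 GET request.
// TCP (transport layer) carries these bytes end to end;
// IP (network layer) routes the packets between hosts.
function buildHttpRequest(host, path) {
  return [
    'GET ' + path + ' HTTP/1.1',
    'Host: ' + host,
    'Connection: close',
    '',
    '', // blank line terminates the request headers
  ].join('\r\n');
}

console.log(buildHttpRequest('www.example.com', '/index.html'));
```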

The Internet provides a variety of information and communication facilities, including forums, databases, email, and hypertext. It consists of private, public, academic, business, and government networks of local to global scope, linked by a broad array of electronic, wireless, and optical networking technologies.

2. The World Wide Web:  The Web is the most popular way (though not the only way) to access information over the Internet. It's a system of Internet servers that support specially formatted documents. The documents are formatted in a markup language called HTML, or "HyperText Markup Language", which supports a number of features including links and multimedia. These documents are interlinked using hypertext links and are accessible via the Internet.

To link hypertext to the Internet, we need: 

  • The markup language, i.e., HTML.
  • The transfer protocol, e.g., HTTP.
  • Uniform Resource Locator (URL), the address of the resource. 

We access the Web using Web browsers . 

Difference between Web and Internet:

  • Definition: The Internet is the "network of networks," allowing two or more computers to exchange data; the Web is a way to access and share information using the Internet.
  • Access: The Internet is accessible in a variety of ways; the Web is accessed through a web browser.
  • Protocols: The Internet uses network protocols such as TCP/IP, FTP, SMTP, and POP3 to transport data; the Web uses HTTP and HTTPS.
  • Infrastructure: The Internet consists of routers, switches, servers, and other networking hardware; the Web consists of web servers, web browsers, and other software.
  • Creator: The Internet has no single creator; the Web was created by Tim Berners-Lee.
  • Role: The Internet provides the underlying infrastructure for the Web, email, and other online services; the Web provides a platform for publishing and accessing web pages, multimedia content, and other resources on the Internet.

URI:  URI stands for 'Uniform Resource Identifier'. A URI can be a name, locator, or both for an online resource, whereas a URL is just the locator; URLs are a subset of URIs. A URL is human-readable text designed to replace the numbers (IP addresses) that computers use to communicate with servers.

A URL consists of a protocol, domain name, and path (which includes the specific subfolder structure where a page is located) like-

       protocol://WebSiteName.topLevelDomain/path

  • protocol: HTTP or HTTPS.
  • WebSiteName: geeksforgeeks, google, etc.
  • topLevelDomain: .com, .edu, .in, etc.
  • path: the specific folders and/or subfolders on a given website.
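The standard URL API (available in browsers and Node.js) splits an address into these parts; the URL below is illustrative:

```javascript
// Parse a URL into protocol, host, and path components.
const u = new URL('https://www.example.com/web/intro');

console.log(u.protocol);  // "https:"
console.log(u.hostname);  // "www.example.com"
console.log(u.pathname);  // "/web/intro"
```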

Who governs the Internet? 

The Internet has no single governing authority. The closest thing to ultimate authority over where the Internet is going rests with the Internet Society, or ISOC, a voluntary membership organization whose purpose is to promote global information exchange through Internet technology.

  • ISOC appoints the IAB (Internet Architecture Board), which meets regularly to review standards and allocate resources, like addresses.
  • The IETF (Internet Engineering Task Force) is another volunteer organization that meets regularly to discuss operational and technical problems.

Uses of Internet and the Web :

  • Communication: The internet and web have made communication faster and easier than ever before. We can now send emails, chat online, make video calls, and use social media platforms to connect with people all over the world.
  • Information sharing: The web has made it possible to access vast amounts of information on any topic from anywhere in the world. We can read news articles, watch videos, listen to podcasts, and access online libraries and databases.
  • Online shopping: The internet and web have revolutionized the way we shop. We can now browse and purchase products online, from clothes and groceries to electronics and furniture.
  • Entertainment: The internet and web provide a wealth of entertainment options, from streaming movies and TV shows to playing online games and listening to music.
  • Education: The web has made it possible to access educational resources from anywhere in the world. We can take online courses, access e-books and digital libraries, and connect with educators and other learners through online communities.
  • Business: The internet and web have transformed the way businesses operate. Companies can now use e-commerce platforms to sell products and services, collaborate with remote workers, and access global markets.
  • Research: The internet and web have made it easier for researchers to access and share information. We can now access scientific journals and databases, collaborate with other researchers online, and conduct surveys and experiments through online platforms.

Issues in Internet and the Web :

  • Privacy and security: The internet and web are vulnerable to various security threats, such as hacking, identity theft, and phishing attacks. These threats can compromise our personal information, such as login credentials, financial information, and personal data.
  • Cyberbullying: The anonymity of the internet and web can lead to cyberbullying, where individuals are harassed or threatened online. Cyberbullying can have severe consequences, including depression, anxiety, and suicide.
  • Online addiction: The internet and web can be addictive, and individuals can spend hours browsing social media or playing online games, leading to neglect of other important aspects of their lives.
  • Disinformation: The internet and web are filled with inaccurate or false information, which can lead to misinformation, propaganda, and conspiracy theories.
  • Digital divide: Access to the internet and web is not universal, and many individuals, particularly those in low-income areas or rural communities, lack access to reliable and high-speed internet.
  • Online censorship: Some governments or organizations may censor or restrict access to certain websites or information, limiting freedom of speech and expression.
  • Environmental impact: The internet and web consume a significant amount of energy, contributing to carbon emissions and climate change.


Leading the web to its full potential

The World Wide Web Consortium (W3C) is an international public-interest non-profit organization where Member organizations, a full-time staff, and the public work together to develop Web standards. Founded by Web inventor Tim Berners-Lee and led by President & CEO Seth Dobbs and a Board of Directors , the Web Consortium's mission is to lead the web to its full potential.

W3C's history

Web inventor Tim Berners-Lee founded the World Wide Web Consortium in 1994 to ensure the long-term growth of the Web. He remains W3C's Emeritus Director and Honorary Member of the Board of Directors.

From the start the World Wide Web Consortium (W3C) has been an international multi-stakeholder community where member organizations , a full-time  staff , and the public work together to develop open web standards .

Read the W3C history

W3C Community

W3C's global standards constitute the toolkit for web solutions that scale, enabling innovators to solve hard problems, providing the proper foundations to meet requirements for accessibility, internationalization, privacy, and security on the web.

Standards that meet the varied needs of society are created not by one company but through the work of the Web Consortium community:

  • Members : More than 350 Members from around the world lead the development and implementation of standards.
  • Staff : W3C is a public-interest non-profit organization whose revenues come primarily from Membership dues. These and some grants support a staff of about 50 people who are direct employees or employees of W3C Partners (former Hosts): ERCIM (Europe), Keio University (Japan), Beihang University (China). 
  • Developers : Over 14,700 developers worldwide participate in the standards development.

Learn how W3C is led

Some of our unique advantages

Open standards that make a difference.

Our community has developed several hundred open standards that have enabled the creation of two billion websites and the emergence of flourishing business ecosystems, and have made the Web more accessible, inclusive, and secure.

Royalty-free to boost adoption

W3C standards may be used by anyone at no cost: if they were not free, developers would ignore them.

Built-in inclusivity

W3C technologies and guidelines make it possible for people with disabilities to access the web. The web supports communication in many of the world's languages and writing systems.

Securing the web

W3C standards improve web security through the development of authentication technologies that can replace weak passwords and reduce phishing and other sophisticated cyberattacks.

"The Web is humanity connected by technology." Sir Tim Berners-Lee, inventor of the Web

W3C standard development process

The proven standards development process upheld at the Web Consortium promotes fairness and enables progress.

Our standards work is accomplished in the open, under the W3C Process Document and royalty-free W3C Patent Policy, with input from the broader community. Decisions are taken by consensus. Technical direction and Recommendations require review by W3C Members – large and small. The Advisory Board guides the community-driven enhancement of the Process Document. The Technical Architecture Group is our highest authority on technical matters.

International Participation

W3C conducts its work primarily in English. Organizations located all over the world and involved in many different fields join W3C as Members to participate in a vendor-neutral forum for the creation of Web standards. W3C Members and a dedicated full-time staff of experts have earned W3C international recognition for contributions to the Web. W3C's global efforts include:

  • Liaisons with national, regional and international organizations around the globe. These contacts help W3C maintain a culture of global participation in the development of the World Wide Web. W3C coordinates particularly closely with other organizations that are developing standards for the Web or Internet in order to enable clear progress.
  • The W3C Chapters Program, which promotes adoption of W3C Recommendations among developers, application builders, and standards setters, and encourages stakeholder organizations to join W3C and take part in creating future standards.
  • Translations of Web standards and other materials from dedicated volunteers in the W3C community. W3C also has a policy for authorized translations of W3C materials. Authorized W3C Translations can be used for official purposes in languages other than English.
  • Talks around the world in a variety of languages on Web standards by people closely involved in the creation of the standards.
  • W3C's Internationalization activity helps ensure that the Web is available to people whatever their language or writing system.

Recognition

In orchestrating these activities, the Web Consortium has earned a reputation for fairness, quality, and efficiency.

Though not well-known by the general public, the Web Consortium has earned recognition for its global impact: the Boston Globe ranked W3C the most important achievement associated with MIT (the first W3C historical Host).

The Web Consortium's impact even extends beyond this planet: NASA regularly uses W3C standards in Mars and space exploration missions.

The organization has won three Emmy Awards : in 2016 for its work to make online videos more accessible with captions and subtitles, in 2019 for standardization of a Full TV Experience on the web, and again in 2022 for standardizing font technology for custom downloadable fonts and typography for web and TV devices.

Organizational Structure

In administrative terms, W3C became its own legal entity in January 2023, moving to a public-interest non-profit organization after 28 years with an atypical organizational structure in which legal and fiduciary roles were assumed by four host institutions around the world. Read more about the W3C history.

In process terms, the W3C Process Document, Member Agreement, Patent Policy, and a few other documents establish the roles and responsibilities of the parties involved in the making of W3C standards.

The Process governs the standards-setting aspect of W3C. The Bylaws govern the operation of the corporation that supports the standards process and W3C’s other efforts to pursue its mission .

Funding model

W3C sources of revenue include:

  • W3C Member dues
  • Research grants and other sources of private and public funding
  • Sponsorship and donations

Internet history timeline: ARPANET to the World Wide Web

The internet history timeline shows how today's vast network evolved from the initial concept


In internet history, credit for the initial concept that developed into the internet is typically given to Leonard Kleinrock. In 1961, he published "Information Flow in Large Communication Nets," a paper on packet switching, the theory underlying ARPANET, the predecessor of the internet.

According to the journal Management and Business Review (MBR), Kleinrock, along with other innovators such as J.C.R. Licklider, the first director of the Information Processing Technology Office (IPTO), provided the backbone for the ubiquitous stream of emails, media, Facebook postings and tweets that are now shared online every day.


The precursor to the internet was jumpstarted in the early days of the history of computers , in 1969 with the U.S. Defense Department's Advanced Research Projects Agency Network (ARPANET), according to the journal American Scientist . ARPA-funded researchers developed many of the protocols used for internet communication today. This timeline offers a brief history of the internet’s evolution:

Internet timeline: 1960s

1965: Two computers at MIT Lincoln Lab communicate with one another using packet-switching technology.

1968: Bolt Beranek and Newman, Inc. (BBN) unveils the final version of the Interface Message Processor (IMP) specifications, and wins the ARPANET contract.

1969: On Oct. 29, UCLA's Network Measurement Center, Stanford Research Institute (SRI), the University of California, Santa Barbara, and the University of Utah install nodes. The first message is "LO," an attempt by student Charles Kline to type "LOGIN" to the SRI computer; the SRI system crashed before the message could be completed.


1970–1980

1972: BBN’s Ray Tomlinson introduces network email. The Internet Working Group (INWG) forms to address need for establishing standard protocols.

1973: Global networking becomes a reality as University College London (England) and the NORSAR facility (Norway) connect to ARPANET. The term internet is born.

1974: The first Internet Service Provider (ISP) is born with the introduction of a commercial version of ARPANET, known as Telenet.

1974: Vinton Cerf and Bob Kahn (the duo said by many to be the Fathers of the Internet) publish "A Protocol for Packet Network Interconnection," which details the design of TCP .

1976: Queen Elizabeth II hits the “send button” on her first email.

1979: USENET forms to host news and discussion groups.

1980–1990

1981: The National Science Foundation (NSF) provides a grant to establish the Computer Science Network (CSNET) to provide networking services to university computer scientists.

1982: Transmission Control Protocol (TCP) and Internet Protocol (IP), as the protocol suite, commonly known as TCP/IP, emerge as the protocol for ARPANET. This results in the fledgling definition of the internet as connected TCP/IP internets. TCP/IP remains the standard protocol for the internet.

1983: The Domain Name System (DNS) establishes the familiar .edu, .gov, .com, .mil, .org, .net, and .int system for naming websites. These names are easier to remember than numeric addresses such as 192.0.2.10.

1984: William Gibson, author of "Neuromancer," is the first to use the term "cyberspace."

1985: Symbolics.com, the website for Symbolics Computer Corp. in Massachusetts, becomes the first registered domain.

1986: The National Science Foundation's NSFNET goes online, connecting supercomputer centers at 56,000 bits per second — the speed of a typical dial-up computer modem. Over time the network speeds up, and regional research and education networks, supported in part by NSF, are connected to the NSFNET backbone — effectively expanding the Internet throughout the United States. The NSFNET was essentially a network of networks that connected academic users along with the ARPANET.

1987: The number of hosts on the internet exceeds 20,000. Cisco ships its first router.

1989: World.std.com becomes the first commercial provider of dial-up access to the internet.


1990–2000

1990: Tim Berners-Lee, a scientist at CERN, the European Organization for Nuclear Research, develops HyperText Markup Language (HTML). This technology continues to have a large impact on how we navigate and view the internet today.

1991: CERN introduces the World Wide Web to the public.

1992: The first audio and video are distributed over the internet. The phrase "surfing the internet" is popularized.

1993: The number of websites reaches 600, and the White House and United Nations go online. Marc Andreessen develops the Mosaic Web browser at the University of Illinois Urbana-Champaign. The number of computers connected to NSFNET grows from 2,000 in 1985 to more than 2 million in 1993. The National Science Foundation leads an effort to outline a new internet architecture that would support the burgeoning commercial use of the network.

1994: Netscape Communications is born. Microsoft creates a Web browser for Windows 95.

1994: Yahoo! is created by Jerry Yang and David Filo, two electrical engineering graduate students at Stanford University. The site was originally called "Jerry and David's Guide to the World Wide Web." The company was later incorporated in March 1995.

1995: Compuserve, America Online and Prodigy begin to provide internet access. Amazon.com, Craigslist and eBay go live. The original NSFNET backbone is decommissioned as the internet’s transformation to a commercial enterprise is largely completed.

1995: The first online dating site, Match.com, launches.

1996: The browser war, primarily between the two major players Microsoft and Netscape, heats up. CNET buys tv.com for $15,000.

1996: A 3D animation dubbed " The Dancing Baby " becomes one of the first viral videos.

1997: Netflix is founded by Reed Hastings and Marc Randolph as a company that sends users DVDs by mail.


1997: PC makers can remove or hide Microsoft’s internet software on new versions of Windows 95, thanks to a settlement with the Justice Department. Netscape announces that its browser will be free.

1998: The Google search engine is born, changing the way users engage with the internet.

1998: Internet Protocol version 6 (IPv6) is introduced to allow for future growth of internet addresses. The most widely used protocol, version 4 (IPv4), uses 32-bit addresses, allowing for 4.3 billion unique addresses; IPv6, with 128-bit addresses, allows 3.4 × 10^38 unique addresses, or 340 trillion trillion trillion.

1999: AOL buys Netscape. Peer-to-peer file sharing becomes a reality as Napster arrives on the Internet, much to the displeasure of the music industry.

2000–2010

2000: The dot-com bubble bursts. Websites such as Yahoo! and eBay are hit by a large-scale denial-of-service attack, highlighting the vulnerability of the internet. AOL merges with Time Warner.

2001: A federal judge shuts down Napster, ruling that it must find a way to stop users from sharing copyrighted material before it can go back online.

2003: The SQL Slammer worm spreads worldwide in just 10 minutes. Myspace, Skype and the Safari web browser debut.

2003: The blog publishing platform WordPress is launched.

2004: Facebook goes online and the era of social networking begins. Mozilla unveils the Mozilla Firefox browser.

2005: YouTube.com launches. The social news site Reddit is also founded. 

2006: AOL changes its business model, offering most services for free and relying on advertising to generate revenue. The Internet Governance Forum meets for the first time.

2006: Twitter launches. The company's founder, Jack Dorsey, sends out the very first tweet: "just setting up my twttr."

2009: The internet marks its 40th anniversary.

2010–2020

2010: Facebook reaches 400 million active users.

2010: The social media sites Pinterest and Instagram are launched.

2011: Twitter and Facebook play a large role in the Middle East revolts.

2012: President Barack Obama's administration announces its opposition to major parts of the Stop Online Piracy Act and the Protect Intellectual Property Act, which would have enacted broad new rules requiring internet service providers to police copyrighted content. The successful push to stop the bill, involving technology companies such as Google and nonprofit organizations including Wikipedia and the Electronic Frontier Foundation, is considered a victory for sites such as YouTube that depend on user-generated content, as well as "fair use" on the internet.

2013: Edward Snowden, a former CIA employee and National Security Agency (NSA) contractor, reveals that the NSA had in place a monitoring program capable of tapping the communications of thousands of people, including U.S. citizens.

2013: Fifty-one percent of U.S. adults report that they bank online, according to a survey conducted by the Pew Research Center.


2015: Instagram, the photo-sharing site, reaches 400 million users, outpacing Twitter, which would go on to reach 316 million users by the middle of the same year.

2016: Google unveils Google Assistant, a voice-activated personal assistant program, marking the entry of the internet giant into the "smart" computerized assistant marketplace. Google joins Amazon's Alexa, Siri from Apple, and Cortana from Microsoft.

2018: There is a significant rise in internet-enabled devices, with the Internet of Things (IoT) growing to around seven billion connected devices by the end of the year.

2019: Fifth-generation (5G) networks launch, enabling speedier internet connections on some wireless devices.

2020–2022

2021: By January 2021, 4.66 billion people, more than half the global population, are connected to the internet.

2022: Low-Earth-orbit satellite internet comes closer to reality. By early January 2022, SpaceX has launched more than 1,900 Starlink satellites, and the constellation is providing broadband service in select areas around the world.

For more on the history of the internet, including an interview with pioneer Leonard Kleinrock, see the following resources:

  • " Leonard Kleinrock Internet Pioneer ". Management and Business Review (2022). 
  • " The Science of Computing: The ARPANET after Twenty Years ". American Scientist (1989). 
  • " A brief history of the internet ". Association for Computing Machinery (AGM) (2009). 
  • " Internet Protocol, Version 6 (IPv6) Specification ". S. Deering, R. Hinden (1998). 
  • " Distributed denial of service attacks ". IEEE International Conference on Systems, Man and Cybernetics (2000). 
  • " Statistics and Social Network of YouTube Videos ". 2008 16th Interntional Workshop on Quality of Service (2008). 
  • " Social Media and Crisis Communication ".  (Routledge, 2017). 

Kim Ann Zimmermann is a contributor to Live Science and sister site Space.com, writing mainly evergreen reference articles that provide background on myriad scientific topics, from astronauts to climate, and from culture to medicine. Her work can also be found in Business News Daily and KM World. She holds a bachelor’s degree in communications from Glassboro State College (now known as Rowan University) in New Jersey. 

What if the web could conjure up exactly the information you needed in exactly the format you wanted -- before you knew enough to ask for it? This could someday be the reality of Web 3.0, the next version of the web. This guide provides answers to common questions and has hyperlinks to articles that go into depth about the business opportunities and risks. It also has detailed explanations of key Web 3.0 concepts, such as the effects of decentralization on web governance and data management, and what enterprises can do today to test the Web 3.0 waters.

World Wide Web (WWW)

Rahul Awati


What is the World Wide Web (WWW, W3)?

The World Wide Web -- also known as the web, WWW or W3 -- refers to all the public websites or pages that users can access on their local computers and other devices through the internet. These pages and documents are interconnected by means of hyperlinks that users click on for information. This information can be in different formats, including text, images, audio and video.

The term World Wide Web isn't synonymous with the internet. Rather, the World Wide Web is part of the internet.

How does the World Wide Web work?

Paving the way for an internet revolution that has transformed the world in only three decades, the World Wide Web consists of multiple components that enable users to access various resources, documents and web pages on the internet. Thus, the WWW is like a vast electronic book whose pages are stored or hosted on different servers worldwide.

These pages are the primary component or building blocks of the WWW and are linked through hyperlinks, which provide access from one specific spot in a hypertext or hypermedia document to another spot within that document or a different one. Hyperlinks are another defining concept of the WWW and provide its identity as a collection of interconnected documents.

Hypertext is a method for instant information cross-referencing that supports communications on the web. Hypertext makes it easy to link content on one web page to content on another web page or site. Hypertext and HTTP enable people to access the millions of websites active on the WWW.


The Hypertext Transfer Protocol (HTTP) is another key component of the WWW. It enables users to access web pages by standardizing communications and data transfer between the internet's servers and clients.
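What that standardization looks like on the wire can be sketched in a few lines. This is a simplified illustration rather than a full HTTP implementation; www.example.com is a placeholder host, and no network traffic is sent:

```python
# A minimal sketch of the request/response text HTTP standardizes.

def build_get_request(host: str, path: str = "/") -> str:
    """Assemble the plain-text HTTP/1.1 request a browser would send."""
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n"
        "\r\n"
    )

def parse_status_line(response: str) -> tuple[str, int, str]:
    """Split the first line of a response into version, code and reason."""
    status_line = response.split("\r\n", 1)[0]
    version, code, reason = status_line.split(" ", 2)
    return version, int(code), reason

request = build_get_request("www.example.com")
version, code, reason = parse_status_line(
    "HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n\r\n<html>...</html>"
)
```

Every browser and web server agrees on this same text layout, which is what lets any client talk to any server.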

every URL is also a URI, but not vice versa

Most web documents and pages are created using Hypertext Markup Language (HTML), a text-based way of describing how content within an HTML file is structured. HTML describes the structure of web pages using elements, or tags, and displays the content of these pages through a web browser.
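As a rough illustration, Python's standard html.parser can walk the tag structure of a tiny page, much as a browser does before rendering; the page content here is invented:

```python
from html.parser import HTMLParser

# A tiny, invented HTML page: elements (tags) describe its structure.
PAGE = """
<html>
  <head><title>Example page</title></head>
  <body>
    <h1>Hello, web</h1>
    <p>A paragraph with a <a href="https://example.com">hyperlink</a>.</p>
  </body>
</html>
"""

class TagCollector(HTMLParser):
    """Record every opening tag the parser encounters, in document order."""
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

collector = TagCollector()
collector.feed(PAGE)
# A browser builds a similar tree of elements before rendering them.
```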

To access one of these pages, a user and their client machine supply a universal identifier to the web server via a browser. This identifier may be a uniform resource locator (URL) or uniform resource identifier (URI) and is unique to each web page.

A collection of web pages belonging to a URL is called a website. For example, www.techtarget.com is a website, while https://www.techtarget.com/whatis/definition/World-Wide-Web is a web page.

The browser accepts the URL or URI provided by the user and communicates it to the web server. The server then retrieves the web page associated with that URL or URI and presents it to the user in the browser window of their client machine.
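A sketch of that identifier, split into the parts each side acts on, using Python's standard urllib.parse and the web-page URL from the example above:

```python
from urllib.parse import urlparse

# The web-page URL from the example above, broken into the parts the
# browser and server each act on.
url = "https://www.techtarget.com/whatis/definition/World-Wide-Web"
parts = urlparse(url)

scheme = parts.scheme  # "https": which protocol to speak
host = parts.netloc    # "www.techtarget.com": which server to contact
path = parts.path      # "/whatis/definition/World-Wide-Web": which page to fetch
```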

basic structure of a URL

History of the World Wide Web

British computer scientist Tim Berners-Lee invented the World Wide Web. Along with colleagues at Geneva-based CERN -- the European Organization for Nuclear Research -- Berners-Lee had been working on the concept since 1989. Their goal was to combine available technologies and data networks to create a user-friendly system for global communication and information sharing. At the time, they began work on the first WWW server, which they called httpd. They also dubbed the first client WWW.

Originally, WWW was a what-you-see-is-what-you-get (WYSIWYG) hypertext browser/editor that ran in the NeXTSTEP environment. In 1990, Berners-Lee demonstrated the first web server and browser at CERN to explain his idea of a World Wide Web. The web then entered the public eye in 1991 when Berners-Lee announced his creation on the alt.hypertext newsgroup; at the same time, he created the world's first web page, with the address http://info.cern.ch/hypertext/WWW/TheProject.html.

This page, which remains operational as of 2022, includes information and links about the WWW project and web servers. In 1993, CERN made the W3 technology publicly available on a royalty-free basis.

Web browser evolution and the growth of the World Wide Web

Berners-Lee and his team developed a text-based web browser that was released in early 1992. However, it took the release of the more user-friendly Mosaic browser in 1993 to kick-start the rapid acceptance and adoption of the WWW. Mosaic provided the kind of point-and-click graphical interface that people had been using on personal computers for a few years. This familiarity increased public interest in the WWW and led to its rapid growth around the world.

Entrepreneur and software engineer Marc Andreessen and others developed Mosaic in the United States. They also developed the Netscape Navigator browser, which quickly became dominant after its 1994 release, until Microsoft's Internet Explorer, introduced in 1995, overtook it later in the decade. IE dominated the web browser space until it was challenged by browsers such as Mozilla Firefox, released in 2004, and Google Chrome, released in 2008. In 2015, Microsoft introduced the Microsoft Edge browser as IE's successor, and IE was formally retired in 2022.


After inventing the web, Tim Berners-Lee also founded the World Wide Web Consortium (W3C), a nonprofit international consortium that aims to standardize the web through specifications and reference software.


World Wide Web versus the internet

The web is often confused with the internet even though they're different. While the two are intricately connected, the web is just one of many applications built on top of the internet, a vast, global network of multiple smaller networks. The internet incorporates supporting infrastructure and other technologies that connect networks, websites and users to each other. In contrast, the web is a communications model or platform that enables the retrieval or exchange of information over the internet through HTTP. Through the WWW, users can access web pages over the internet by following a series of HTTP links. To retrieve and view these pages, users need to use a browser installed on the computer, such as Microsoft Edge, Google Chrome or Mozilla Firefox.

Both the internet and the web operate within a client-server model. A server is a program that accepts requests from other computers on the network, known as clients, and stores and transmits documents to them. Clients request documents from a server when a user asks for them and then display those documents on the user's screen.
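The client-server exchange can be made concrete with a short sketch. Assuming Python's standard socket module, a socketpair stands in for the network connection so the example runs without any network; example.org and /index.html are invented placeholders:

```python
import socket

# The internet moves raw bytes between machines; the web's contribution is
# an agreed-upon text format (HTTP) carried inside those bytes. A socketpair
# stands in for a real TCP/IP connection in this sketch.
client, server = socket.socketpair()

# Client side: send an HTTP-formatted request over the byte stream.
client.sendall(b"GET /index.html HTTP/1.1\r\nHost: example.org\r\n\r\n")

# Server side: receive plain bytes, then decode the HTTP structure
# (method, path, version) from the first line.
raw = server.recv(4096)
method, path, version = raw.decode().split("\r\n")[0].split(" ")

client.close()
server.close()
```

A real server would now locate the document named by the path and write an HTTP response back over the same connection.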

The first web server in the United States went online in 1991. By the end of that year, there were only about 10 web servers around the world. Two years later, there were 500 operational web servers; by 2016, the number had grown to more than 100 million.

Since the release of CERN's first web browser, the WWW has evolved into a massive ecosystem of websites and users. As of 2022, approximately 5 billion people -- or 63% of the world's population -- use the web, which is believed to contain approximately 1.88 billion websites.

how HTTP works

What will Web 3.0 look like compared to Web 1.0 and 2.0?

The World Wide Web continues to evolve. The first generation of the Web, Web 1.0, which Berners-Lee originally defined in 1989, had no video content and a page format similar to that of a printed page. Web 1.0 was primarily static and focused on providing information.

Around the beginning of the 21st century, Web 2.0 ushered in a new era that was more interactive and dynamic than its predecessor and focused on user collaboration, universal network connectivity and communications channels. As smartphones, mobile internet access and social networks spurred the growth of Web 2.0, applications -- such as Airbnb, TikTok, Twitter and Uber -- which increased online interactivity and utility, became increasingly popular.

With a lofty goal of creating more intelligent, connected and open websites, Web 3.0 is still in its infancy and has yet to be defined fully. Unlike Web 2.0, which includes applications and websites that entail user-generated content, Web 3.0 is expected to be fully decentralized; this places content creation in the hands of the creators rather than platform owners.

three iterations of the World Wide Web

Smarter and more autonomous technologies, including artificial intelligence and machine learning, are expected to define Web 3.0. Encrypted digital currencies like Bitcoin and Ethereum may be used to pay for transactions. As peer-to-peer technologies, such as blockchain, and security technologies become more important, Web 3.0 is expected to gain momentum.




Internet vs. World Wide Web

The World Wide Web (WWW) is one set of software services running on the Internet. The Internet itself is a global, interconnected network of computing devices. This network supports a wide variety of interactions and communications between its devices. The World Wide Web is a subset of these interactions and supports websites and URIs.

Comparison chart

Internet versus World Wide Web comparison chart

  • Estimated year of origin: Internet, 1969 (though the network opened to commercial interests only in 1988); Web, 1993
  • Name of the first version: Internet, ARPANET; Web, the WorldWideWeb browser and CERN httpd server
  • Comprises: Internet, a network of computers, copper wires, fiber-optic cables and wireless links; Web, the files, folders and documents stored on those computers
  • Governed by: Internet, the Internet Protocol (IP); Web, the Hypertext Transfer Protocol (HTTP)
  • Dependency: Internet, the base, independent of the World Wide Web; Web, depends on the Internet to work
  • Nature: Internet, hardware; Web, software

The Internet is a huge network that is accessible to everyone, everywhere in the world. It is composed of sub-networks, each comprising a number of computers that transmit data in packets, and it is governed by a set of rules known as the Internet Protocol (IP). The sub-networks range from defense networks to academic networks to commercial networks to individual PCs. The Internet provides information and services in the form of email, chat and file transfers. It also provides access to the World Wide Web and its interlinked web pages.

The Internet and the World Wide Web (the Web), though used interchangeably, are not synonymous. The Internet is the hardware part: a collection of computer networks connected through copper wires, fiber-optic cables or wireless connections. The World Wide Web can be termed the software part: a collection of web pages connected through hyperlinks and URLs. In short, the World Wide Web is one of the services provided by the Internet; others include email, chat and file transfer. All of these services can be provided to consumers by businesses, by governments, or by individuals creating their own networks or platforms.

Another way to differentiate the two is by protocol. While the Internet as a whole is governed by the Internet Protocol suite, which deals with data and its transmission in packets, the World Wide Web is governed by the Hypertext Transfer Protocol (HTTP), which deals with the linking of files, documents and other resources of the World Wide Web.

The Advanced Research Projects Agency (ARPA), created by the U.S. in 1958 in response to the USSR's launch of Sputnik, led to the creation of a department called the Information Processing Technology Office (IPTO), which drew on the Semi-Automatic Ground Environment (SAGE), a system linking the radar systems of the U.S. together. Amid intense research across the world, the University of California, Los Angeles (UCLA) became the first node of ARPANET, the precursor of the Internet, in 1969. Since then, the Internet has taken huge strides in technology and connectivity to reach its current position. In 1978, the International Packet Switched Service (IPSS) was created in Europe by the British Post Office in collaboration with Tymnet and Western Union International, and this network slowly spread to the U.S. and Australia. In 1986, the U.S. National Science Foundation (NSF) launched NSFNET, a wide-area backbone network linking academic networks. These sub-networks merged after 1985 under new definitions of the Transmission Control Protocol/Internet Protocol (TCP/IP) for optimization of resources.

The Web was invented by Sir Tim Berners-Lee. In March 1989, Berners-Lee wrote a proposal that described the Web as an elaborate information management system. With help from Robert Cailliau, he published a more formal proposal for the World Wide Web on November 12, 1990. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the first web browser (which was a web editor as well), the first web server, and the first web pages, which described the project itself. On August 6, 1991, he posted a short summary of the World Wide Web project on the alt.hypertext newsgroup. This date also marked the debut of the Web as a publicly available service on the Internet.

Berners-Lee's breakthrough was to marry hypertext to the Internet. In his book Weaving the Web, he explains that he had repeatedly suggested to members of both technical communities that a marriage between the two technologies was possible, but when no one took up his invitation, he finally tackled the project himself. In the process, he developed a system of globally unique identifiers for resources on the Web and elsewhere: the Uniform Resource Identifier.

The World Wide Web had a number of differences from other hypertext systems that were then available. The Web required only unidirectional links rather than bidirectional ones. This made it possible for someone to link to another resource without action by the owner of that resource. It also significantly reduced the difficulty of implementing web servers and browsers (in comparison to earlier systems), but in turn presented the chronic problem of link rot . Unlike predecessors such as HyperCard , the World Wide Web was non-proprietary, making it possible to develop servers and clients independently and to add extensions without licensing restrictions.
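The link-rot consequence of unidirectional links can be sketched with an invented site graph: because only the source of a link knows about it, deleting a target page leaves dangling references behind.

```python
# An invented site: each page maps to its outbound links. Links are
# unidirectional, so targets are never consulted when a link is created.
pages = {
    "/index.html": ["/about.html", "/old-news.html"],
    "/about.html": ["/index.html"],
}

def dead_links(pages):
    """Return (source, target) pairs whose target page no longer exists."""
    return [
        (src, dst)
        for src, targets in pages.items()
        for dst in targets
        if dst not in pages
    ]

# /old-news.html was deleted without /index.html knowing: a rotten link.
broken = dead_links(pages)
```

A bidirectional-link system would have forced /old-news.html to notify its referrers; the Web traded that guarantee for the freedom to link without the target's cooperation.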

For more details see The History of the Internet and The History of the World Wide Web .

Internet of Things

In recent years, the phrase Internet of Things (IoT) has been used to denote a subset of the Internet that connects physical devices, such as home appliances, vehicles and industrial sensors. Historically, the devices connected to the Internet have been computers, cell phones and tablets. With the Internet of Things, other devices, like refrigerators, HVAC systems, light bulbs, cars, thermostats, video cameras and locks, can also connect to the Internet. This allows better monitoring and more control of the physical world through the Internet.

About the Author

Nick Jasuja


"Internet vs World Wide Web." Diffen.com. Diffen LLC, n.d. Web. 10 Sep 2024. < >



The Web at 25 in the U.S.

The overall verdict: The internet has been a plus for society and an especially good thing for individual users

Table of Contents

  • About This Report
  • Part 1: How the internet has woven itself into American life
  • Part 2: Americans’ views about the role of the internet in their lives

Summary of Findings

The World Wide Web turns 25 on March 12, 2014. It is one of the most important and heavily used parts of the network of computer networks that make up the internet. Indeed, the invention of the Web by Sir Tim Berners-Lee was instrumental in turning the internet from a geeky data-transfer system embraced by specialists and a small number of enthusiasts into a mass-adopted technology easily used by hundreds of millions around the world. 1

Internet use 1995 - 2014

The Web’s birthday provides an occasion to take stock of the impact of the rapid growth of the internet since its invention and the attendant rise of mobile connectivity. Since 1995, the Pew Research Center has documented this explosive adoption of the internet and its wide-ranging impacts on everything from the way people get, share, and create news; the way they take care of their health; the way they perform their jobs; the way they learn; the nature of their political activity; their interactions with government; the style and scope of their communications with friends and family; and the way they organize in communities.

In a new national survey to mark the 25th anniversary of the Web, Pew Research finds further confirmation of the incredible spread and impact of the internet:

Adoption: 87% of American adults now use the internet, with near-saturation usage among those living in households earning $75,000 or more (99%), young adults ages 18-29 (97%), and those with college degrees (97%). Fully 68% of adults connect to the internet with mobile devices like smartphones or tablet computers.

The adoption of related technologies has also been extraordinary: Over the course of Pew Research Center polling, adult ownership of cell phones has risen from 53% in our first survey in 2000 to 90% now. Ownership of smartphones has grown from 35% when we first asked in 2011 to 58% now.

Impact: Asked for their overall judgment about the impact of the internet, toting up all the pluses and minuses of connected life, the public’s verdict is overwhelmingly positive:

  • 90% of internet users say the internet has been a good thing for them personally and only 6% say it has been a bad thing, while 3% volunteer that it has been some of both.
  • 76% of internet users say the internet has been a good thing for society, while 15% say it has been a bad thing and 8% say it has been equally good and bad.

Is the internet a good or bad thing?

Digital technology is viewed as increasingly essential

Technologies that would be very hard to give up

We asked the adults who use basic technologies whether it would be hard to give them up, and users of the internet and mobile phones made clear that those technologies feel increasingly essential, while more traditional technologies, like landline phones and television, are becoming easier to part with:

  • 53% of internet users say the internet would be, at minimum, “very hard” to give up, compared with 38% in 2006. That amounts to 46% of all adults who now say the internet would be very hard to give up.
  • 49% of cell phone owners say the same thing about their cell, up from 43% in 2006. That amounts to 44% of all adults who now say cell phones would be very hard to give up.
  • Overall, 35% of all adults say their television would be very hard to give up, a share that has dipped from 44% who said that in 2006.
  • 28% of landline telephone owners say their phone would be very hard to give up, a major drop from 2006 when 48% of landline owners said it would be very hard to give up their wired phone. That amounts to 17% of all adults who now say their landline phones would be very hard to give up.

In addition to this enthusiasm, a notable share of Americans say the internet is essential to them. Among those internet users who said it would be very hard to give up net access, most (61% of this group) said being online was essential for job-related or other reasons. Translated to the whole population, about four in ten adults (39%) feel they absolutely need to have internet access. Among those most deeply tied to the internet, about half as many (some 30%) said it would be hard to give up access because they simply enjoy being online.

Most internet users think online communication has strengthened their relationships and the majority report the environment is kind

There is considerable debate about whether online communication—through email, messaging, or social media—has strengthened or weakened relationships. Internet users’ own verdict is overwhelmingly positive when it comes to their own ties to family and friends: 67% of internet users say their online communication with family and friends has generally strengthened those relationships, while 18% say it generally weakens those relationships.

Interestingly enough, there are no significant demographic differences tied to users’ feelings about the impact of online communication on relationships. Equal proportions of online men and women, young and old, rich and poor, highly educated and less-well educated, veterans and relative newbies say by 3-to-1 or better that online communication is a relationship enhancer, rather than a relationship detractor.

The online social climate is mostly kind

Asked for a broad perspective about the civility or incivility they have either witnessed or encountered during their online tenure, 76% of internet users said the people they witnessed or encountered online were mostly kind and 13% said people were mostly unkind.

People were also considerably more likely to say they themselves had been treated kindly than they had been treated unkindly or attacked. And internet users were more likely to say online group behavior they had seen had been helpful, rather than harmful.

  • 70% of internet users say they had been treated kindly or generously by others online. That compares with 25% who say they have been treated unkindly or been attacked.
  • 56% of internet users say they have seen an online group come together to help a person or a community solve a problem. That compares with 25% who say they have left an online group because the interaction became too heated or members were unpleasant to one another.

About this survey

The results in this report are based on data from telephone interviews conducted by Princeton Survey Research Associates International from January 9-12, 2014, among a sample of 1,006 adults, age 18 and older. Telephone interviews were conducted in English and Spanish by landline and cell phone. For results based on the total sample, one can say with 95% confidence that the error attributable to sampling is plus or minus 3.5 percentage points. For results based on internet users (N=857), the margin of sampling error is plus or minus 3.9 percentage points.
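As a rough check on the reported figures, the textbook margin-of-error formula for a simple random sample can be computed directly. The survey's reported ±3.5 and ±3.9 points are slightly larger than this formula gives, because weighting adjustments add a design effect the simple formula ignores:

```python
import math

# 95% margin of error for a proportion near 50% in a simple random sample:
# z * sqrt(p * (1 - p) / n), with z = 1.96 for 95% confidence.

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Margin of error in percentage points for a simple random sample."""
    return 100 * z * math.sqrt(p * (1 - p) / n)

moe_total = margin_of_error(1006)  # full sample: about 3.1 points
moe_users = margin_of_error(857)   # internet users: about 3.3 points
```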

  • The internet and the Web are not the same thing. The Web is a service that uses the internet’s architecture and is technologically distinct from some other internet functions such as email and peer-to-peer file sharing. In our survey questions, we broadly use the word “internet” when we are asking about what people do online. Many of the things people report to us involve Web activities, even if respondents do not necessarily know that is the layer of the internet they are using. As a result, it is a common practice for us in this report and earlier work to use the words “internet” and “Web” interchangeably, even though they are different things.


EXCERPTS

  1. Session 01

    How the web browser and web server communicate over the network using HTTP. The Internet and the World Wide Web. Image credit: Barrett Lyon / The Opte Project, "Visualization of the routing paths of the Internet." Used under the Creative Commons Attribution-NonCommercial 4.0 International License. The Internet ...

  2. Reading: The World Wide Web

    The World Wide Web enabled the spread of information over the Internet through an easy-to-use and flexible format. It thus played an important role in popularizing use of the Internet. Although the two terms are sometimes conflated in popular use, World Wide Web is not synonymous with Internet. The Web is an information space containing ...

  3. World Wide Web (WWW)

    World Wide Web (WWW) The World Wide Web (WWW), often called the Web, is a system of interconnected webpages and information that you can access using the Internet. It was created to help people share and find information easily, using links that connect different pages together. The Web allows us to browse websites, watch videos, shop online ...

  4. Overview of the World Wide Web Flashcards

    World Wide Web. Provides access to Internet information through documents including text, graphics, audio, and video files that use a special formatting language called Hypertext Markup Language. Hypertext Markup Language (HTML) Publishes hypertext on the World Wide Web, which allows users to move from one document to another simply by clicking ...

  5. World Wide Web

    A web page from Wikipedia displayed in Google Chrome. The World Wide Web (WWW or simply the Web) is an information system that enables content sharing over the Internet through user-friendly ways meant to appeal to users beyond IT specialists and hobbyists. [1] It allows documents and other web resources to be accessed over the Internet according to specific rules of the Hypertext Transfer ...

  6. World Wide Web

    The development of the World Wide Web was begun in 1989 by Tim Berners-Lee and his colleagues at CERN, an international scientific organization based in Geneva, Switzerland. They created a protocol, HyperText Transfer Protocol (HTTP), which standardized communication between servers and clients. Their text-based Web browser was made available for general release in January 1992.

  7. World Wide Web

    WWW Activities, Parts 1 & 2. While some slight differences do exist among browsers such as Netscape or Internet Explorer, they all serve the basic purpose of letting you access materials and information on the web. This information might be in the form of plain HTML documents, multimedia files, or any combination thereof.

  8. Assignment on World Wide Web

    Assignment on World Wide Web. Web Site: A web site is a collection of web pages, images, videos or other digital assets that is hosted on one or several web server(s), usually accessible via the Internet, cell phone or a LAN. A web page is a document, typically written in HTML, that is almost always accessible via HTTP, a protocol ...

  9. The internet and the World Wide Web

    Describe how the web pages for the website are requested and displayed on a user's computer. Ahmed uses the Internet for some time and is puzzled by the terminology. Draw a line to match each description to the appropriate technical term. Ahmed sees the message "Set your browser to accept cookies".

  10. PDF The world-wide web

    The world-wide web T.J. Berners-Lee, R. Cailliau and J.-F. Groff CERN, 1211 Geneva 23, Switzerland Abstract Berners-Lee, T.J., R. Cailliau and J.-F. Groff, The world-wide web, Computer Networks and ISDN Systems 25 (1992) 454-459. This paper describes the World-Wide Web (W3) global information system initiative, its protocols and data formats ...

  11. Lesson: The World Wide Web

    Key learning points. In this lesson, we will introduce the key components of the World Wide Web. We will understand the difference between HTTP and HTTPS protocols. This content is made available by Oak National Academy Limited and its partners and licensed under Oak's terms & conditions (Collection 1), except where otherwise stated.

  12. The World Wide Web

    When he came to MIT in 1994, he formed the World Wide Web Consortium (W3C) at MIT's Computer Science and Artificial Intelligence Laboratory to create and maintain open standards for this essential global system. In 2017 he won the Turing Award, the most prestigious honor in computer science. With your support, we will build a better world.

  13. The birth of the Web

    In 2013, CERN launched a project to restore this first ever website: info.cern.ch. On 30 April 1993, CERN put the World Wide Web software in the public domain. Later, CERN made a release available with an open licence, a surer way to maximise its dissemination. These actions allowed the web to flourish.

  14. Development of the Internet and the World Wide Web

    The recent growth of the Internet and the World Wide Web makes it appear that the world is witnessing the arrival of a completely new technology. In fact, the Web—now considered to be a major driver of the way society accesses and views information—is the result of numerous projects in computer networking, mostly funded by the federal ...

  15. Session 01

    Then get that repo code locally to edit. Download the assignment ZIP file. Unzip or Extract the ZIP file. Move the folder into your "course work folder" you created above. Move the entire folder. Start Visual Studio Code, and "File → Open Folder" and navigate to the assignment folder to open.

  16. The Internet and the Web

    Web 1.0 was all about fetching and reading information. Web 2.0 is all about reading, writing, creating, and interacting with the end user; it was famously called the participative social web. Web 3.0, the third generation of the World Wide Web, is a vision of a decentralized web that is currently a work in progress. It is all about reading

  17. World Wide Web Assignment

    Unlike the other assignments, all you need to do is print this page or copy the assignment down. As you work through the assignment, the most effective way to provide evidence of the completion of a step is to print the web page. If your search has returned a very large web document, just print the first page. Part One

  18. Our mission

    Read about the history of the World Wide Web Consortium, founded in 1994 by Web inventor Sir Tim Berners-Lee at the urging of companies investing increasingly in the Web, to foster a consistent architecture and robust web standards. About W3C web standards: this page explains more about W3C web standards, including the value of creating ...

  19. The World Wide Web as an Instructional Tool

    The Internet—and its graphically attractive application, the World Wide Web (WWW)—is potentially a powerful educational tool. Barrie and Presti discuss three ways in which the WWW can be profitably used in education: as a giant encyclopedia, as a virtual classroom, and as a supplement to conventional courses. The Internet was born in ...

  20. About us

    The World Wide Web Consortium (W3C) is an international public-interest non-profit organization where Member organizations, a full-time staff, and the public work together to develop Web standards. Founded by Web inventor Tim Berners-Lee and led by President & CEO Seth Dobbs and a Board of Directors, the Web Consortium's mission is to lead the ...

  21. Internet history timeline: ARPANET to the World Wide Web

    The internet is older than the World Wide Web (WWW). 1990-2000. 1990: Tim Berners-Lee, a scientist at CERN, the European Organization for Nuclear Research, develops ...

  22. World Wide Web (WWW)

    Web browser evolution and the growth of the World Wide Web Berners-Lee and his team developed a text-based web browser that was released in early 1992. However, it took the release of the more user-friendly Mosaic browser in 1993 to kickstart the rapid acceptance and adoption of the WWW.

  23. Internet vs World Wide Web

    The World Wide Web (WWW) is one set of software services running on the Internet. The Internet itself is a global, interconnected network of computing devices. This network supports a wide variety of interactions and communications between its devices. The World Wide Web is a subset of these interactions and supports websites and URIs.

  24. 25th Web Anniversary

    Summary of Findings. The World Wide Web turns 25 on March 12, 2014. It is one of the most important and heavily-used parts of the network of computer networks that make up the internet. Indeed, the invention of the Web by Sir Tim Berners-Lee was instrumental in turning the internet from a geeky data-transfer system embraced by specialists and a ...
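Several excerpts above name the Web's building blocks: a protocol (HTTP or HTTPS), web servers reachable over the internet, and addresses that identify documents. A minimal Python sketch using only the standard library shows how a Web address decomposes into those parts. The `describe` helper is illustrative, and the example URL is the address of CERN's restored first website mentioned in the excerpts.

```python
from urllib.parse import urlparse


def describe(url: str) -> dict:
    """Break a Web address into the parts the excerpts describe:
    scheme = the protocol to speak (HTTP or HTTPS),
    host   = the web server to contact over the internet,
    path   = which document on that server to request."""
    parts = urlparse(url)
    return {
        "scheme": parts.scheme,
        "host": parts.netloc,
        "path": parts.path or "/",
    }


# CERN's restored "first website" address, cited in the excerpts above.
print(describe("http://info.cern.ch/hypertext/WWW/TheProject.html"))
# → {'scheme': 'http', 'host': 'info.cern.ch', 'path': '/hypertext/WWW/TheProject.html'}
```

A browser performs the same decomposition before it opens a connection to the host and sends an HTTP request for the path.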