Prior to the widespread inter-networking that led to the Internet, most communication networks were limited by their nature to only allow communications between the stations on the network, and the prevalent computer networking method was based on the central mainframe. In the 1960s, computer researchers J. C. R. Licklider and Robert W. Taylor pioneered calls for a joined-up global network to address interoperability problems. Concurrently, the research programs of Donald Davies (NPL), Paul Baran (RAND Corporation), and Leonard Kleinrock (MIT) investigated the principles of networking between separate physical networks, and this work led to the development of packet switching.
This led to the development of several packet-switched networking solutions in the late 1960s and 1970s, including ARPANET and X.25. Additionally, public access and hobbyist networking systems grew in popularity, including UUCP. These were, however, still disjointed, separate networks, served only by limited gateways between them. This led to the application of packet switching to develop a protocol for inter-networking, whereby multiple different networks could be joined together into a super-framework of networks. By defining a simple common network system, the Internet protocol suite, the concept of the network could be separated from its physical implementation. This spread of inter-networking grew into the idea of a global network that would be called 'the Internet', which expanded quickly as pre-existing networks were converted to become compatible with it. The Internet first spread across the advanced telecommunication networks of the western world, and then penetrated the rest of the world as it became the de facto international standard and global network. However, the disparity of growth led to a digital divide that is still a concern today.
Following commercialisation and the introduction of privately run Internet service providers in the 1980s, and the Internet's expansion into mass popular use in the 1990s, the Internet has had a drastic impact on culture and commerce. This includes the rise of near-instant communication by e-mail, text-based discussion forums, and the World Wide Web. Investor speculation in the new markets provided by these innovations also led to the inflation and collapse of the dot-com bubble, a major market collapse. Despite this, the Internet has continued to grow.
Before the Internet
In the 1950s and early 1960s, prior to the widespread inter-networking that led to the Internet, most communication networks were limited by their nature to only allow communications between the stations on the network. Some networks had gateways or bridges between them, but these bridges were often limited or built specifically for a single use. One prevalent computer networking approach was based on the central mainframe, simply allowing its terminals to be connected via long leased lines. This method was used in the 1950s by Project RAND to support researchers such as Herbert Simon, in Pittsburgh, Pennsylvania, when collaborating across the continent with researchers in Sullivan, Illinois, on automated theorem proving and artificial intelligence.
Three terminals and an ARPA
Main articles: RAND and ARPANET
A fundamental pioneer in the call for a global network, J.C.R. Licklider, articulated the ideas in his January 1960 paper, Man-Computer Symbiosis.
"A network of such [computers], connected to one another by wide-band communication lines [which provided] the functions of present-day libraries together with anticipated advances in information storage and retrieval and [other] symbiotic functions."
—J.C.R. Licklider, [1]
In October 1962, Licklider was appointed head of the information processing office at the United States Department of Defense's Advanced Research Projects Agency, now known as DARPA. There he formed an informal group within DARPA to further computer research. As part of the information processing office's role, three network terminals had been installed: one for System Development Corporation in Santa Monica, one for Project Genie at the University of California, Berkeley, and one for the Multics project at the Massachusetts Institute of Technology (MIT). The need for inter-networking that Licklider had identified was made obvious by the problems this arrangement caused.
"For each of these three terminals, I had three different sets of user commands. So if I was talking online with someone at S.D.C. and I wanted to talk to someone I knew at Berkeley or M.I.T. about this, I had to get up from the S.D.C. terminal, go over and log into the other terminal and get in touch with them. [...] I said, it's obvious what to do (But I don't want to do it): If you have these three terminals, there ought to be one terminal that goes anywhere you want to go where you have interactive computing. That idea is the ARPAnet."
Packet switching
At the core of the inter-networking problem lay the issue of connecting separate physical networks to form one logical network, with much capacity wasted inside the assorted separate networks. During the 1960s, Donald Davies (NPL), Paul Baran (RAND Corporation), and Leonard Kleinrock (MIT) developed and implemented packet switching. The notion that the Internet was developed to survive a nuclear attack has its roots in the early theories developed by RAND, but is an urban legend, not supported by any Internet Engineering Task Force or other document. Early networks used for the command and control of nuclear forces were message switched, not packet switched, although current strategic military networks are, indeed, packet-switching and connectionless. Baran's research had approached packet switching from studies of decentralisation to avoid combat damage compromising the entire network.[3]
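To make the concept concrete, here is a minimal, purely illustrative Python sketch of the basic idea of packet switching (not any particular historical implementation): a message is split into independently addressed, numbered packets that may travel separately and are reassembled at the destination. The packet format, field names and sizes are invented for illustration.

```python
import random

# Illustrative only: split a message into addressed, numbered packets,
# let them arrive out of order, and reassemble them at the destination.

PACKET_SIZE = 8  # bytes of payload per packet (arbitrary for this example)

def packetize(message: bytes, src: str, dst: str):
    """Split a message into packets carrying addressing and a sequence number."""
    return [
        {"src": src, "dst": dst, "seq": offset, "payload": message[offset:offset + PACKET_SIZE]}
        for offset in range(0, len(message), PACKET_SIZE)
    ]

def reassemble(packets):
    """Reorder packets by sequence number and join their payloads."""
    return b"".join(p["payload"] for p in sorted(packets, key=lambda p: p["seq"]))

if __name__ == "__main__":
    msg = b"Packets may arrive out of order over different routes."
    packets = packetize(msg, src="host-a", dst="host-b")
    random.shuffle(packets)              # simulate packets taking different routes
    assert reassemble(packets) == msg
    print(f"{len(packets)} packets reassembled correctly")
```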
Networks that led to the Internet
ARPANET
Promoted to the head of the information processing office at DARPA, Robert Taylor intended to realize Licklider's ideas of an interconnected networking system. Bringing in Larry Roberts from MIT, he initiated a project to build such a network. The first ARPANET link was established between the University of California, Los Angeles and the Stanford Research Institute on 29 October 1969. By 5 December 1969, a 4-node network was connected by adding the University of Utah and the University of California, Santa Barbara. Building on ideas developed in ALOHAnet, the ARPANET grew rapidly. By 1981, the number of hosts had grown to 213, with a new host being added approximately every twenty days.[5][6]
ARPANET became the technical core of what would become the Internet, and a primary tool in developing the technologies used. ARPANET development was centered around the Request for Comments (RFC) process, still used today for proposing and distributing Internet Protocols and Systems. RFC 1, entitled "Host Software", was written by Steve Crocker from the University of California, Los Angeles, and published on April 7, 1969. These early years were documented in the 1972 film Computer Networks: The Heralds of Resource Sharing.
International collaborations on ARPANET were sparse. For various political reasons, European developers were concerned with developing the X.25 networks. Notable exceptions were the Norwegian Seismic Array (NORSAR) in 1972, followed in 1973 by Sweden with satellite links to the Tanum Earth Station and University College London.
X.25 and public access
Following on from ARPA's research, packet switching network standards were developed by the International Telecommunication Union (ITU) in the form of X.25 and related standards. In 1974, X.25 formed the basis for the SERCnet network between British academic and research sites, which later became JANET. The initial ITU Standard on X.25 was approved in March 1976. This standard was based on the concept of virtual circuits.
The British Post Office, Western Union International and Tymnet collaborated to create the first international packet switched network, referred to as the International Packet Switched Service (IPSS), in 1978. This network grew from Europe and the US to cover Canada, Hong Kong and Australia by 1981. By the 1990s it provided a worldwide networking infrastructure.[7]
Unlike ARPANET, X.25 was also commonly available for business use. Telenet offered its Telemail electronic mail service, but this was oriented to enterprise use rather than the general email of ARPANET.
The first dial-in public networks used asynchronous TTY terminal protocols to reach a concentrator operated by the public network. Some public networks, such as CompuServe, used X.25 to multiplex the terminal sessions into their packet-switched backbones, while others, such as Tymnet, used proprietary protocols. In 1979, CompuServe became the first service to offer electronic mail capabilities and technical support to personal computer users. The company broke new ground again in 1980 as the first to offer real-time chat with its CB Simulator. There were also the America Online (AOL) and Prodigy dial-in networks and many bulletin board system (BBS) networks such as FidoNet. FidoNet in particular was popular amongst hobbyist computer users, many of them hackers and amateur radio operators.
UUCP
In 1979, two students at Duke University, Tom Truscott and Jim Ellis, came up with the idea of using simple Bourne shell scripts to transfer news and messages over a serial line with the nearby University of North Carolina at Chapel Hill. Following the public release of the software, the mesh of UUCP hosts forwarding on the Usenet news rapidly expanded. UUCPnet, as it would later be named, also created gateways and links between FidoNet and dial-up BBS hosts. UUCP networks spread quickly due to the lower costs involved, and the ability to use existing leased lines, X.25 links or even ARPANET connections. By 1981 the number of UUCP hosts had grown to 550, nearly doubling to 940 in 1984.
Merging the networks and creating the Internet
TCP/IP
With so many different network methods, something was needed to unify them. Robert E. Kahn of DARPA and ARPANET recruited Vinton Cerf of Stanford University to work with him on the problem. By 1973, they had worked out a fundamental reformulation, in which the differences between network protocols were hidden by using a common internetwork protocol and, instead of the network being responsible for reliability, as in the ARPANET, the hosts became responsible. Cerf credits Hubert Zimmerman, Gerard LeLann and Louis Pouzin (designer of the CYCLADES network) with important work on this design.[8]
At this time, the earliest known use of the term Internet was by Vinton Cerf, who wrote:
"Specification of Internet Transmission Control Program."
—Request for Comments No. 675, Network Working Group (1974)[9]
With the role of the network reduced to the bare minimum, it became possible to join almost any networks together, no matter what their characteristics were, thereby solving Kahn's initial problem. DARPA agreed to fund development of prototype software, and after several years of work, the first somewhat crude demonstration of a gateway between the Packet Radio network in the SF Bay area and the ARPANET was conducted. On November 22, 1977[10] a three network demonstration was conducted including the ARPANET, the Packet Radio Network and the Atlantic Packet Satellite network—all sponsored by DARPA. Stemming from the first specifications of TCP in 1974, TCP/IP emerged in mid-late 1978 in nearly final form. By 1981, the associated standards were published as RFCs 791, 792 and 793 and adopted for use. DARPA sponsored or encouraged the development of TCP/IP implementations for many operating systems and then scheduled a migration of all hosts on all of its packet networks to TCP/IP. On 1 January 1983, TCP/IP protocols became the only approved protocol on the ARPANET, replacing the earlier NCP protocol.[11]
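The design principle described above, with the hosts rather than the network responsible for reliability, can be illustrated with a deliberately simplified Python sketch (a stop-and-wait scheme, not TCP itself). The loss model and all names are invented for illustration.

```python
import random

# Deliberately simplified end-to-end reliability sketch (not real TCP):
# the "network" may silently drop data, so the sending host retransmits
# each numbered message until it sees an acknowledgement.

def unreliable_send(packet, loss_rate=0.3):
    """Simulate a lossy channel: return the packet, or None if it was dropped."""
    return None if random.random() < loss_rate else packet

def deliver(messages, loss_rate=0.3):
    """Stop-and-wait delivery: retransmit each message until it is acknowledged."""
    received = {}                                # receiver's store, keyed by sequence number
    for seq, data in enumerate(messages):
        acked = False
        while not acked:
            pkt = unreliable_send((seq, data), loss_rate)
            if pkt is None:
                continue                         # data lost in transit: host retransmits
            received[pkt[0]] = pkt[1]            # duplicates simply overwrite, so they are harmless
            acked = unreliable_send(seq, loss_rate) is not None   # the ACK itself may be lost
    return [received[s] for s in sorted(received)]

if __name__ == "__main__":
    msgs = ["reliability", "is", "handled", "end", "to", "end"]
    assert deliver(msgs) == msgs
    print("all messages delivered despite a lossy channel")
```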
ARPANET to Several Federal Wide Area Networks: MILNET, NSI, and NSFNet
After the ARPANET had been up and running for several years, ARPA looked for another agency to hand off the network to; ARPA's primary mission was funding cutting edge research and development, not running a communications utility. Eventually, in July 1975, the network had been turned over to the Defense Communications Agency, also part of the Department of Defense. In 1983, the U.S. military portion of the ARPANET was broken off as a separate network, the MILNET. MILNET subsequently became the unclassified but military-only NIPRNET, in parallel with the SECRET-level SIPRNET and JWICS for TOP SECRET and above. NIPRNET does have controlled security gateways to the public Internet.
The networks based around the ARPANET were government funded and therefore restricted to noncommercial uses such as research; unrelated commercial use was strictly forbidden. This initially restricted connections to military sites and universities. During the 1980s, the connections expanded to more educational institutions, and even to a growing number of companies such as Digital Equipment Corporation and Hewlett-Packard, which were participating in research projects or providing services to those who were.
Several other branches of the U.S. government, the National Aeronautics and Space Administration (NASA), the National Science Foundation (NSF), and the Department of Energy (DOE), became heavily involved in Internet research and started development of a successor to ARPANET. In the mid-1980s, all three of these branches developed the first wide area networks based on TCP/IP. NASA developed the NASA Science Network, NSF developed CSNET, and DOE evolved the Energy Sciences Network, or ESNet.
More explicitly, NASA developed a TCP/IP based Wide Area Network, NASA Science Network (NSN), in the mid 1980s connecting space scientists to data and information stored anywhere in the world. In 1989, the DECnet-based Space Physics Analysis Network (SPAN) and the TCP/IP-based NASA Science Network (NSN) were brought together at NASA Ames Research Center creating the first multiprotocol wide area network called the NASA Science Internet, or NSI. NSI was established to provide a total integrated communications infrastructure to the NASA scientific community for the advancement of earth, space and life sciences. As a high-speed, multiprotocol, international network, NSI provided connectivity to over 20,000 scientists across all seven continents.
In 1984 NSF developed CSNET exclusively based on TCP/IP. CSNET connected with ARPANET using TCP/IP, and ran TCP/IP over X.25, but it also supported departments without sophisticated network connections, using automated dial-up mail exchange. This grew into the NSFNet backbone, established in 1986, and intended to connect and provide access to a number of supercomputing centers established by the NSF.
Transition toward an Internet
The term "Internet" was adopted in the first RFC published on the TCP protocol (RFC 675[1]: Internet Transmission Control Program, December 1974). It was around the time when ARPANET was interlinked with NSFNet, that the term Internet came into more general use,[13] with "an internet" meaning any network using TCP/IP. "The Internet" came to mean a global and large network using TCP/IP. Previously "internet" and "internetwork" had been used interchangeably, and "internet protocol" had been used to refer to other networking systems such as Xerox Network Services.[14]
As interest in widespread networking grew and new applications for it arrived, the Internet's technologies spread throughout the rest of the world. TCP/IP's network-agnostic approach meant that it was easy to use any existing network infrastructure, such as the IPSS X.25 network, to carry Internet traffic. In 1984, University College London replaced its transatlantic satellite links with TCP/IP over IPSS.
Many sites unable to link directly to the Internet started to create simple gateways to allow transfer of e-mail, at that time the most important application. Sites which only had intermittent connections used UUCP or FidoNet and relied on the gateways between these networks and the Internet. Some gateway services went beyond simple e-mail peering, such as allowing access to FTP sites via UUCP or e-mail.
TCP/IP becomes worldwide
The first ARPANET connection outside the US was established to NORSAR in Norway in 1973, just ahead of the connection to Great Britain. These links were all converted to TCP/IP in 1982, at the same time as the rest of the ARPANET.
CERN, the European internet, the link to the Pacific and beyond
Between 1984 and 1988 CERN began installation and operation of TCP/IP to interconnect its major internal computer systems, workstations, PCs and an accelerator control system. CERN continued to operate a limited self-developed system, CERNET, internally, and several incompatible (typically proprietary) network protocols externally. There was considerable resistance in Europe towards more widespread use of TCP/IP, and the CERN TCP/IP intranets remained isolated from the rest of the Internet until 1989.
In 1988 Daniel Karrenberg, from CWI in Amsterdam, visited Ben Segal, CERN's TCP/IP Coordinator, looking for advice about the transition of the European side of the UUCP Usenet network (much of which ran over X.25 links) over to TCP/IP. In 1987, Ben Segal had met with Len Bosack from the then still small company Cisco about purchasing some TCP/IP routers for CERN, and was able to give Karrenberg advice and forward him on to Cisco for the appropriate hardware. This expanded the European portion of the Internet across the existing UUCP networks, and in 1989 CERN opened its first external TCP/IP connections.[15] This coincided with the creation of Réseaux IP Européens (RIPE), initially a group of IP network administrators who met regularly to carry out co-ordination work together. Later, in 1992, RIPE was formally registered as a cooperative in Amsterdam.
At the same time as the rise of internetworking in Europe, ad hoc networking to ARPA and between Australian universities formed, based on various technologies such as X.25 and UUCPNet. These were limited in their connection to the global networks, due to the cost of making individual international UUCP dial-up or X.25 connections. In 1989, Australian universities joined the push towards using IP protocols to unify their networking infrastructures. AARNet was formed in 1989 by the Australian Vice-Chancellors' Committee and provided a dedicated IP-based network for Australia.
The Internet began to penetrate Asia in the late 1980s. Japan, which had built the UUCP-based network JUNET in 1984, connected to NSFNet in 1989, and hosted the annual meeting of the Internet Society, INET'92, in Kobe. Singapore developed TECHNET in 1990, and Thailand gained a global Internet connection between Chulalongkorn University and UUNET in 1992.
Digital divide
While developed countries with technological infrastructures were joining the Internet, developing countries began to experience a digital divide separating them from the Internet. On an essentially continental basis, they are building organizations for Internet resource administration and sharing operational experience, as more and more transmission facilities go into place.
Africa
At the beginning of the 1990s, African countries relied upon X.25 IPSS and 2400 baud modem UUCP links for international and internetwork computer communications. In 1996 a USAID funded project, the Leland initiative, started work on developing full Internet connectivity for the continent. Guinea, Mozambique, Madagascar and Rwanda gained satellite earth stations in 1997, followed by Côte d'Ivoire and Benin in 1998.
Africa is building an Internet infrastructure. AfriNIC, headquartered in Mauritius, manages IP address allocation for the continent. As in the other Internet regions, there is an operational forum, the Internet Community of Operational Networking Specialists.[17]
A wide range of programs are providing high-performance transmission plant, and the western and southern coasts have undersea optical cable. High-speed cables join North Africa and the Horn of Africa to intercontinental cable systems. Undersea cable development is slower for East Africa; the original joint effort between the New Partnership for Africa's Development (NEPAD) and the East Africa Submarine System (Eassy) has broken off and may become two efforts.[18]
Asia and Oceania
The Asia Pacific Network Information Centre (APNIC), headquartered in Australia, manages IP address allocation for the Asia-Pacific region. APNIC sponsors an operational forum, the Asia-Pacific Regional Internet Conference on Operational Technologies (APRICOT).[19]
In 1991, the People's Republic of China saw its first TCP/IP college network, Tsinghua University's TUNET. The PRC went on to make its first global Internet connection in 1995, between the Beijing Electro-Spectrometer Collaboration and Stanford University's Linear Accelerator Center. However, China later created a digital divide of its own by implementing a country-wide content filter.
Latin America
As with the other regions, the Latin American and Caribbean Internet Addresses Registry (LACNIC) manages the IP address space and other resources for its area. LACNIC, headquartered in Uruguay, operates DNS root, reverse DNS, and other key services.
Opening the network to commerce
The interest in commercial use of the Internet became a hotly debated topic. Although commercial use was forbidden, the exact definition of commercial use could be unclear and subjective. UUCPNet and the X.25 IPSS had no such restrictions, which would eventually see the official barring of UUCPNet use of ARPANET and NSFNet connections. Some UUCP links still remained connecting to these networks, however, as administrators turned a blind eye to their operation.
During the late 1980s, the first Internet service provider (ISP) companies were formed. Companies like PSINet, UUNET, Netcom, and Portal Software were formed to provide service to the regional research networks and provide alternate network access, UUCP-based email and Usenet News to the public. The first dial-up ISP on the West Coast was Best Internet[2] (now Verio Communications), opened in 1986. The first dial-up ISP on the East Coast was world.std.com, opened in 1989.
This caused controversy amongst university users, who were outraged at the idea of noneducational use of their networks. Eventually, it was the commercial Internet service providers who brought prices low enough that junior colleges and other schools could afford to participate in the new arenas of education and research.
By 1990, ARPANET had been overtaken and replaced by newer networking technologies and the project came to a close. In 1994, the NSFNet, now renamed ANSNET (Advanced Networks and Services) and allowing non-profit corporations access, lost its standing as the backbone of the Internet. Both government institutions and competing commercial providers created their own backbones and interconnections. Regional network access points (NAPs) became the primary interconnections between the many networks and the final commercial restrictions ended.
IETF and a standard for standards
The Internet has developed a significant subculture dedicated to the idea that the Internet is not owned or controlled by any one person, company, group, or organization. Nevertheless, some standardization and control is necessary for the system to function.
The liberal Request for Comments (RFC) publication procedure engendered confusion about the Internet standardization process and led to more formalization of officially accepted standards. The IETF started in January 1986 as a quarterly meeting of U.S. government-funded researchers. Representatives from non-government vendors were invited starting with the fourth IETF meeting in October of that year.
Acceptance of an RFC by the RFC Editor for publication does not automatically make the RFC into a standard. It may be recognized as such by the IETF only after experimentation, use, and acceptance have proved it to be worthy of that designation. Official standards are numbered with a prefix "STD" and a number, similar to the RFC naming style. However, even after becoming a standard, most are still commonly referred to by their RFC number.
In 1992, the Internet Society, a professional membership society, was formed and the IETF was transferred to operation under it as an independent international standards body.
NIC, InterNIC, IANA and ICANN
The first central authority to coordinate the operation of the network was the Network Information Centre (NIC) at Stanford Research Institute (SRI) in Menlo Park, California. In 1972, management of these issues was given to the newly created Internet Assigned Numbers Authority (IANA). In addition to his role as the RFC Editor, Jon Postel worked as the manager of IANA until his death in 1998.
As the early ARPANET grew, hosts were referred to by names, and a HOSTS.TXT file would be distributed from SRI International to each host on the network. As the network grew, this became cumbersome. A technical solution came in the form of the Domain Name System, created by Paul Mockapetris. The Defense Data Network—Network Information Center (DDN-NIC) at SRI handled all registration services, including the top-level domains (TLDs) of .mil, .gov, .edu, .org, .net, .com and .us, root nameserver administration and Internet number assignments under a United States Department of Defense contract.[21] In 1991, the Defense Information Systems Agency (DISA) awarded the administration and maintenance of DDN-NIC (managed by SRI up until this point) to Government Systems, Inc., who subcontracted it to the small private-sector Network Solutions, Inc.[22] [23]
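As a present-day illustration of the shift this paragraph describes, from a single distributed HOSTS.TXT table to the distributed Domain Name System, the short Python sketch below consults a local hosts-style mapping first and otherwise falls back to a DNS query via the standard library. The table contents and host names are invented examples, and the DNS lookup requires network access.

```python
import socket

# Illustrative only: a tiny resolver that mimics the old HOSTS.TXT idea
# (a locally held name-to-address table) and falls back to DNS, the
# distributed system that replaced it.

LOCAL_HOSTS = {                  # stand-in for a HOSTS.TXT-style table
    "example-host": "192.0.2.10",
}

def resolve(name: str) -> str:
    if name in LOCAL_HOSTS:               # the pre-DNS approach: look in a static table
        return LOCAL_HOSTS[name]
    return socket.gethostbyname(name)     # the DNS approach: query the distributed system

if __name__ == "__main__":
    print(resolve("example-host"))        # answered from the local table
    print(resolve("www.wikipedia.org"))   # answered by DNS (requires network access)
```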
Since at this point in history most of the growth on the Internet was coming from non-military sources, it was decided that the Department of Defense would no longer fund registration services outside of the .mil TLD. In 1993 the U.S. National Science Foundation, after a competitive bidding process in 1992, created the InterNIC to manage the allocations of addresses and management of the address databases, and awarded the contract to three organizations. Registration Services would be provided by Network Solutions; Directory and Database Services would be provided by AT&T; and Information Services would be provided by General Atomics.[24]
In 1998 both IANA and InterNIC were reorganized under the control of ICANN, a California non-profit corporation contracted by the US Department of Commerce to manage a number of Internet-related tasks. The role of operating the DNS system was privatized and opened up to competition, while the central management of name allocations would be awarded on a contract tender basis.
Use and culture
E-mail and Usenet
E-mail is often called the killer application of the Internet. However, it actually predates the Internet and was a crucial tool in creating it. E-mail started in 1965 as a way for multiple users of a time-sharing mainframe computer to communicate. Although the history is unclear, among the first systems to have such a facility were SDC's Q32 and MIT's CTSS.[25]
The ARPANET computer network made a large contribution to the evolution of e-mail. There is one report[26] indicating experimental inter-system e-mail transfers on it shortly after ARPANET's creation. In 1971 Ray Tomlinson created what was to become the standard Internet e-mail address format, using the @ sign to separate user names from host names.[27]
A number of protocols were developed to deliver e-mail among groups of time-sharing computers over alternative transmission systems, such as UUCP and IBM's VNET e-mail system. E-mail could be passed this way between a number of networks, including ARPANET, BITNET and NSFNet, as well as to hosts connected directly to other sites via UUCP.
In addition, UUCP allowed the publication of text files that could be read by many others. The News software developed by Steve Daniel and Tom Truscott in 1979 was used to distribute news and bulletin board-like messages. This quickly grew into discussion groups, known as newsgroups, on a wide range of topics. On ARPANET and NSFNet, similar discussion groups would form via mailing lists, discussing both technical issues and more culturally focused topics (such as science fiction, discussed on the sflovers mailing list).
From gopher to the WWW
As the Internet grew through the 1980s and early 1990s, many people realized the increasing need to be able to find and organize files and information. Projects such as Gopher, WAIS, and the FTP Archive list attempted to create ways to organize distributed data. Unfortunately, these projects fell short in being able to accommodate all the existing data types and in being able to grow without bottlenecks.[citation needed]
One of the most promising user interface paradigms during this period was hypertext. The technology had been inspired by Vannevar Bush's "Memex"[28] and developed through Ted Nelson's research on Project Xanadu and Douglas Engelbart's research on NLS.[29] Many small self-contained hypertext systems had been created before, such as Apple Computer's HyperCard. Gopher became the first commonly-used hypertext interface to the Internet. While Gopher menu items were examples of hypertext, they were not commonly perceived in that way.
In 1989, whilst working at CERN, Tim Berners-Lee invented a network-based implementation of the hypertext concept. By releasing his invention to public use, he ensured the technology would become widespread.[30] One early popular web browser, modeled after HyperCard, was ViolaWWW.
Scholars generally agree,[citation needed] however, that the turning point for the World Wide Web began with the introduction[31] of the Mosaic web browser[32] in 1993, a graphical browser developed by a team at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign (NCSA-UIUC), led by Marc Andreessen. Funding for Mosaic came from the High-Performance Computing and Communications Initiative, a funding program initiated by then-Senator Al Gore's High Performance Computing and Communication Act of 1991, also known as the Gore Bill.[33] Indeed, Mosaic's graphical interface soon became more popular than Gopher, which at the time was primarily text-based, and the WWW became the preferred interface for accessing the Internet. (Gore's reference to his role in "creating the Internet", however, was ridiculed in his presidential election campaign: see the full article Al Gore contributions to the internet and technology.)
Mosaic was eventually superseded in 1994 by Andreessen's Netscape Navigator, which replaced Mosaic as the world's most popular browser. While it held this title for some time, eventually competition from Internet Explorer and a variety of other browsers almost completely displaced it. Another important event held on January 11, 1994, was The Superhighway Summit at UCLA's Royce Hall. This was the "first public conference bringing together all of the major industry, government and academic leaders in the field [and] also began the national dialogue about the Information Superhighway and its implications."[34]
24 Hours in Cyberspace, the "largest one-day online event" (February 8, 1996) up to that date, took place on the then-active website cyber24.com.[35][36] It was headed by photographer Rick Smolan.[37] A photographic exhibition was unveiled at the Smithsonian Institution's National Museum of American History on 23 January 1997, featuring 70 photos from the project.[38]
Search engines
Even before the World Wide Web, there were search engines that attempted to organize the Internet. The first of these was the Archie search engine from McGill University in 1990, followed in 1991 by WAIS and Gopher. All three of those systems predated the invention of the World Wide Web but all continued to index the Web and the rest of the Internet for several years after the Web appeared. There are still Gopher servers as of 2006, although there are a great many more web servers.
As the Web grew, search engines and Web directories were created to track pages on the Web and allow people to find things. The first full-text Web search engine was WebCrawler in 1994. Before WebCrawler, only Web page titles were searched. Another early search engine, Lycos, was created in 1993 as a university project, and was the first to achieve commercial success. During the late 1990s, both Web directories and Web search engines were popular—Yahoo! (founded 1995) and Altavista (founded 1995) were the respective industry leaders.
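To make the contrast between title-only search and full-text search concrete, here is a minimal Python sketch using an inverted index over a few invented "pages"; it is an illustration of the general technique, not of any particular engine's implementation.

```python
from collections import defaultdict

# Illustrative only: contrast title-only search with full-text search
# over a tiny invented set of pages.

PAGES = {
    "History of the Internet": "packet switching arpanet tcp ip nsfnet backbone",
    "Packet switching":        "a method of grouping transmitted data into packets",
    "World Wide Web":          "hypertext documents linked by urls over the internet",
}

def title_search(term):
    """Pre-WebCrawler style: only the page titles are examined."""
    return [title for title in PAGES if term.lower() in title.lower()]

def build_inverted_index(pages):
    """Full-text style: map every word in the body text to the pages containing it."""
    index = defaultdict(set)
    for title, body in pages.items():
        for word in body.lower().split():
            index[word].add(title)
    return index

if __name__ == "__main__":
    index = build_inverted_index(PAGES)
    print(title_search("packet"))             # matches only the title "Packet switching"
    print(sorted(index["packets"]))           # found via body text: ['Packet switching']
    print(sorted(index["internet"]))          # pages whose body mentions "internet"
```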
By August 2001, the directory model had begun to give way to search engines, tracking the rise of Google (founded 1998), which had developed new approaches to relevancy ranking. Directory features, while still commonly available, became after-thoughts to search engines.
Database size, which had been a significant marketing feature through the early 2000s, was similarly displaced by emphasis on relevancy ranking, the methods by which search engines attempt to sort the best results first. Relevancy ranking first became a major issue circa 1996, when it became apparent that it was impractical to review full lists of results. Consequently, algorithms for relevancy ranking have continuously improved. Google's PageRank method for ordering the results has received the most press, but all major search engines continually refine their ranking methodologies with a view toward improving the ordering of results. As of 2006, search engine rankings are more important than ever, so much so that an industry has developed ("search engine optimizers", or "SEO") to help web-developers improve their search ranking, and an entire body of case law has developed around matters that affect search engine rankings, such as use of trademarks in metatags. The sale of search rankings by some search engines has also created controversy among librarians and consumer advocates.
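As a rough illustration of link-based relevancy ranking of the kind described above, here is a simplified PageRank computed by power iteration over an invented link graph. This is a textbook sketch of the published algorithm, not Google's actual implementation.

```python
# Simplified PageRank by power iteration over an invented link graph.
# An illustration of link-based ranking, not any search engine's actual code.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:                          # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
        rank = new_rank
    return rank

if __name__ == "__main__":
    graph = {                                         # invented example graph
        "home": ["about", "news"],
        "about": ["home"],
        "news": ["home", "about"],
    }
    for page, score in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
        print(f"{page}: {score:.3f}")
```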
Dot-com bubble
The suddenly low price of reaching millions worldwide, and the possibility of selling to or hearing from those people at the same moment when they were reached, promised to overturn established business dogma in advertising, mail-order sales, customer relationship management, and many more areas. The web was a new killer app—it could bring together unrelated buyers and sellers in seamless and low-cost ways. Visionaries around the world developed new business models, and ran to their nearest venture capitalist. Of course some of the new entrepreneurs were truly talented at business administration, sales, and growth; but the majority were just people with ideas, and didn't manage the capital influx prudently. Additionally, many dot-com business plans were predicated on the assumption that by using the Internet, they would bypass the distribution channels of existing businesses and therefore not have to compete with them; when the established businesses with strong existing brands developed their own Internet presence, these hopes were shattered, and the newcomers were left attempting to break into markets dominated by larger, more established businesses. Many did not have the ability to do so.
The dot-com bubble burst on March 10, 2000, when the technology heavy NASDAQ Composite index peaked at 5048.62 (intra-day peak 5132.52), more than double its value just a year before. By 2001, the bubble's deflation was running full speed. A majority of the dot-coms had ceased trading, after having burnt through their venture capital, often without ever making a gross profit.
Worldwide Online Population Forecast
In its "Worldwide Online Population Forecast, 2006 to 2011," JupiterResearch anticipates that a 38 percent increase in the number of people with online access will mean that, by 2011, 22 percent of the Earth's population will surf the Internet regularly.
JupiterResearch says the worldwide online population will increase at a compound annual growth rate of 6.6 percent during the next five years, far outpacing the 1.1 percent compound annual growth rate for the planet's population as a whole. The report says 1.1 billion people currently enjoy regular access to the Web.
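As a rough arithmetic check of these figures, the short sketch below compounds the reported 1.1 billion users at 6.6 percent per year for five years; the world-population figure used for the percentage is an assumed value of roughly 6.9 billion for 2011, not a number taken from the report.

```python
# Rough arithmetic check of the JupiterResearch figures quoted above.

current_users = 1.1e9            # "1.1 billion people currently enjoy regular access"
cagr = 0.066                     # 6.6 percent compound annual growth rate
years = 5                        # 2006 to 2011

projected_users = current_users * (1 + cagr) ** years
world_population_2011 = 6.9e9    # assumption used only for this check

print(f"Projected users in 2011: {projected_users / 1e9:.2f} billion")
print(f"Share of world population: {projected_users / world_population_2011:.0%}")
# About 1.51 billion users, roughly 22% of the world's population,
# which is consistent with the forecast quoted above.
```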
North America will remain on top in terms of the number of people with online access. According to JupiterResearch, online penetration rates on the continent will increase from the current 70 percent of the overall North American population to 76 percent by 2011. However, Internet adoption has "matured," and its adoption pace has slowed, in more developed countries including the United States, Canada, Japan and much of Western Europe, notes the report.
As the online population of the United States and Canada grows by about only 3 percent, explosive adoption rates in China and India will take place, says JupiterResearch. The report says China should reach an online penetration rate of 17 percent by 2011 and India should hit 7 percent during the same time frame. This growth is directly related to infrastructure development and increased consumer purchasing power, notes JupiterResearch.
By 2011, Asians will make up about 42 percent of the world's population with regular Internet access, 5 percent more than today, says the study.
Penetration levels similar to North America's are found in Scandinavia and bigger Western European nations such as the United Kingdom and Germany, but JupiterResearch says that a number of Central European countries "are relative Internet laggards."
Brazil "with its soaring economy," is predicted by JupiterResearch to experience a 9 percent compound annual growth rate, the fastest in Latin America, but China and India are likely to do the most to boost the world's online penetration in the near future.
For the study, JupiterResearch defined "online users" as people who regularly access the Internet by "dedicated Internet access" devices. Those devices do not include cell phones.
Historiography
Some concerns have been raised over the historiography of the Internet's development, due to the lack of centralised documentation for much of the early work that led to the Internet.
"The Arpanet period is somewhat well documented because the corporation in charge - BBN - left a physical record. Moving into the NSFNET era, it became an extraordinarily decentralised process. The record exists in people's basements, in closets. [...] So much of what happened was done verbally and on the basis of individual trust."
Footnotes
1. ^ J. C. R. Licklider (1960). "Man-Computer Symbiosis".
2. ^ An Internet Pioneer Ponders the Next Revolution. Retrieved on November 25, 2005.
3. ^ About Rand. Paul Baran and the Origins of the Internet. Retrieved on January 14, 2006.
4. ^ "The history of the Internet," http://www.lk.cs.ucla.edu/personal_history.html
5. ^ Hafner, Katie (1998). Where Wizards Stay Up Late: The Origins Of The Internet. Simon & Schuster. ISBN 0-684-83267-4.
6. ^ Ronda Hauben (2001). "From the ARPANET to the Internet".
7. ^ Events in British Telecomms History. Retrieved on November 25, 2005.
8. ^ Barry M. Leiner, Vinton G. Cerf, David D. Clark, Robert E. Kahn, Leonard Kleinrock, Daniel C. Lynch, Jon Postel, Larry G. Roberts, Stephen Wolff (2003). "A Brief History of Internet".
9. ^ "The Yale Book of Quotations" (2006) Yale University Press edited by Fred R. Shapiro
10. ^ Computer History Museum and Web History Center Celebrate 30th Anniversary of Internet Milestone. Retrieved on November 22, 2007.
11. ^ Jon Postel, NCP/TCP Transition Plan, RFC 801
12. ^ David Roessner, Barry Bozeman, Irwin Feller, Christopher Hill, Nils Newman (1997). "The Role of NSF's Support of Engineering in Enabling Technological Innovation".
13. ^ Tanenbaum, Andrew S. (1996). Computer Networks. Prentice Hall. ISBN 0-13-394248-1.
14. ^ Mike Muuss (5th January 1983). "Aucbvax.5690 TCP-IP Digest, Vol 1 #10". fa.tcp-ip. (Web link).
15. ^ Ben Segal (1995). "A Short History of Internet Protocols at CERN".
16. ^ Internet History in Asia. 16th APAN Meetings/Advanced Network Conference in Busan. Retrieved on December 25, 2005.
17. ^ ICONS webpage
18. ^ Nepad, Eassy partnership ends in divorce,(South African) Financial Times FMTech, 2007
19. ^ APRICOT webpage
20. ^ A brief history of the Internet in China. China celebrates 10 years of being connected to the Internet. Retrieved on December 25, 2005.
21. ^ DDN NIC. IAB Recommended Policy on Distributing Internet Identifier Assignment. Retrieved on December 26, 2005.
22. ^ GSI-Network Solutions. TRANSITION OF NIC SERVICES. Retrieved on December 26, 2005.
23. ^ Thomas v. NSI, Civ. No. 97-2412 (TFH), Sec. I.A. (DCDC April 6, 1998)
24. ^ NIS Manager Award Announced. NSF NETWORK INFORMATION SERVICES AWARDS. Retrieved on December 25, 2005.
25. ^ The Risks Digest. Great moments in e-mail history. Retrieved on April 27, 2006.
26. ^ The History of Electronic Mail. Retrieved on December 23, 2005.
27. ^ The First Network Email. Retrieved on December 23, 2005.
28. ^ Vannevar Bush (1945). "As We May Think".
29. ^ Douglas Engelbart (1962). "Augmenting Human Intellect: A Conceptual Framework".
30. ^ The Early World Wide Web at SLAC: Documentation of the Early Web at SLAC. Retrieved on November 25, 2005.
31. ^ Mosaic Web Browser History - NCSA, Marc Andreessen, Eric Bina
32. ^ NCSA Mosaic - September 10, 1993 Demo
33. ^ Vice President Al Gore's ENIAC Anniversary Speech
34. ^ UCLA Center for Communication Policy
35. ^ Mirror of Official site map
36. ^ Mirror of Official Site
37. ^ "24 Hours in Cyberspace" (and more)
38. ^ The human face of cyberspace, painted in random images
39. ^ Brazil, Russia, India and China to Lead Internet Growth Through 2011
40. ^ An Internet Pioneer Ponders the Next Revolution. Illuminating the net's Dark Ages. Retrieved on February 26, 2008.
References
* Campbell-Kelly, Martin; Aspray, William. Computer: A History of the Information Machine. New York: BasicBooks, 1996.
* Graham, Ian S. The HTML Sourcebook: The Complete Guide to HTML. New York: John Wiley and Sons, 1995.
* Krol, Ed. Hitchhiker's Guide to the Internet, 1987.
* Krol, Ed. Whole Internet User's Guide and Catalog. O'Reilly & Associates, 1992.
* Scientific American Special Issue on Communications, Computers, and Networks, September, 1991
* Ellen Rony and Peter Rony, The Domain Name Handbook: High Stakes and Strategies in Cyberspace (R&D Books 1998) Out of Print
External links
* Thomas Greene, Larry Landweber, George Strawn (2003). "A Brief History of NSF and the Internet".
* Internet History: People. Retrieved on July 3, 2006.
* Internet History Timeline. Retrieved on November 25, 2005.
* Internet History. Retrieved on November 25, 2005.
* Hobbes' Internet Timeline v8.1. Retrieved on November 25, 2005.
* Internet Society. Retrieved on December 1, 2007.
* "Overhearing the Internet" —by Robert Wright, The New Republic, 1993
* Cybertelecom :: Internet History Focusing on government, legal, and policy history of the Internet
Internet
The Internet is a worldwide, publicly accessible series of interconnected computer networks that transmit data by packet switching using the standard Internet Protocol (IP). It is a "network of networks" that consists of millions of smaller domestic, academic, business, and government networks, which together carry various information and services, such as electronic mail, online chat, file transfer, and the interlinked web pages and other resources of the World Wide Web (WWW).
Terminology
The Internet and the World Wide Web are not synonymous. The Internet is a collection of interconnected computer networks, linked by copper wires, fiber-optic cables, wireless connections, etc. In contrast, the Web is a collection of interconnected documents and other resources, linked by hyperlinks and URLs. The World Wide Web is one of the services accessible via the Internet, along with various others including e-mail, file sharing, online gaming and others described below.
History
Main article: History of the Internet
Creation
Main article: ARPANET
The USSR's launch of Sputnik spurred the United States to create the Advanced Research Projects Agency, known as ARPA, in February 1958 to regain a technological lead.[1][2] ARPA created the Information Processing Technology Office (IPTO) to further the research of the Semi Automatic Ground Environment (SAGE) program, which had networked country-wide radar systems together for the first time. J. C. R. Licklider was selected to head the IPTO, and saw universal networking as a potential unifying human revolution.
Licklider moved from the Psycho-Acoustic Laboratory at Harvard University to MIT in 1950, after becoming interested in information technology. At MIT, he served on a committee that established Lincoln Laboratory and worked on the SAGE project. In 1957 he became a Vice President at BBN, where he bought the first production PDP-1 computer and conducted the first public demonstration of time-sharing.
At the IPTO, Licklider recruited Lawrence Roberts to head a project to implement a network, and Roberts based the technology on the work of Paul Baran,[citation needed] who had written an exhaustive study for the U.S. Air Force that recommended packet switching (as opposed to circuit switching) to make a network highly robust and survivable. After much work, the first two nodes of what would become the ARPANET were interconnected between UCLA and SRI International in Menlo Park, California, on October 29, 1969. The ARPANET was one of the "eve" networks of today's Internet.
Following on from the demonstration that packet switching worked on the ARPANET, the British Post Office, Telenet, DATAPAC and TRANSPAC collaborated in 1978 to create the first international packet-switched network service, referred to in the UK as the International Packet Switched Service (IPSS). The collection of X.25-based networks grew from Europe and the US to cover Canada, Hong Kong and Australia by 1981. The X.25 packet-switching standard had been developed in the CCITT (now called ITU-T) around 1976. X.25 was independent of the TCP/IP protocols that arose from the experimental work of DARPA on the ARPANET, Packet Radio Net and Packet Satellite Net during the same period.
Vinton Cerf and Robert Kahn developed the first description of the TCP protocols during 1973 and published a paper on the subject in May 1974. Use of the term "Internet" to describe a single global TCP/IP network originated in December 1974 with the publication of RFC 675, the first full specification of TCP, written by Vinton Cerf, Yogen Dalal and Carl Sunshine, then at Stanford University. During the next nine years, work proceeded to refine the protocols and to implement them on a wide range of operating systems.
The first TCP/IP-based wide area network was operational by January 1, 1983, when all hosts on the ARPANET were switched over from the older NCP protocol to TCP/IP. In 1985, the United States' National Science Foundation (NSF) commissioned the construction of a university 56 kilobit/second network backbone using computers called "fuzzballs" by their inventor, David L. Mills. The following year, NSF sponsored the development of a higher speed 1.5 megabit/second backbone that became the NSFNet. A key decision to use the DARPA TCP/IP protocols was made by Dennis Jennings, then in charge of the Supercomputer program at NSF.
The opening of the network to commercial interests began in 1988. The US Federal Networking Council approved the interconnection of the NSFNET to the commercial MCI Mail system in that year and the link was made in the summer of 1989. Other commercial electronic email services were soon connected, including OnTyme, Telemail and Compuserve. In that same year, three commercial Internet Service Providers were created: UUNET, PSINET and CERFNET. Important, separate networks that offered gateways into, then later merged with the Internet include Usenet and BITNET. Various other commercial and educational networks, such as Telenet, Tymnet, Compuserve and JANET were interconnected with the growing Internet. Telenet (later called Sprintnet) was a large privately-funded national computer network with free dial-up access in cities throughout the U.S. that had been in operation since the 1970s. This network was eventually interconnected with the others in the 1980s as the TCP/IP protocol became increasingly popular. The ability of TCP/IP to work over virtually any pre-existing communication networks allowed for a great ease of growth although the rapid growth of the Internet was due primarily to the availability of commercial routers from companies such as Cisco Systems, Proteon and Juniper, the availability of commercial Ethernet equipment for local area networking and the widespread implementation of TCP/IP on the UNIX operating system.
Growth
Although the basic applications and guidelines that make the Internet possible had existed for almost a decade, the network did not gain a public face until the 1990s. On August 6, 1991, CERN, which straddles the border between France and Switzerland, publicized the new World Wide Web project. The Web was invented by English scientist Tim Berners-Lee in 1989.
An early popular web browser was ViolaWWW based upon HyperCard. It was eventually replaced in popularity by the Mosaic web browser. In 1993, the National Center for Supercomputing Applications at the University of Illinois released version 1.0 of Mosaic, and by late 1994 there was growing public interest in the previously academic/technical Internet. By 1996 usage of the word "Internet" had become commonplace, and consequently, so had its misuse as a reference to the World Wide Web.
Meanwhile, over the course of the decade, the Internet successfully accommodated the majority of previously existing public computer networks (although some networks, such as FidoNet, have remained separate). During the 1990s, it was estimated that the Internet grew by 100% per year, with a brief period of explosive growth in 1996 and 1997.[3] This growth is often attributed to the lack of central administration, which allows organic growth of the network, as well as the non-proprietary open nature of the Internet protocols, which encourages vendor interoperability and prevents any one company from exerting too much control over the network.[citation needed]
University students' appreciation and contributions
New findings in the field of communications during the 1960s, 1970s and 1980s were quickly adopted by universities across the United States.
Examples of early university Internet communities are Cleveland FreeNet, Blacksburg Electronic Village, and NSTN in Nova Scotia ([1]). Students took up the opportunity of free communications and saw this new phenomenon as a tool of liberation. Personal computers and the Internet would free them from corporations and governments (Nelson, Jennings, Stallman).
Graduate students played a huge part in the creation of ARPANET. In the 1960s, the Network Working Group, which did most of the design for ARPANET's protocols, was composed mainly of graduate students.
Today's Internet
Aside from the complex physical connections that make up its infrastructure, the Internet is facilitated by bi- or multi-lateral commercial contracts (e.g., peering agreements), and by technical specifications or protocols that describe how to exchange data over the network. Indeed, the Internet is essentially defined by its interconnections and routing policies.
As of September 30, 2007, 1.244 billion people use the Internet according to Internet World Stats. Writing in the Harvard International Review, philosopher N. J. Slabbert, a writer on policy issues for the Washington, DC-based Urban Land Institute, has asserted that the Internet is fast becoming a basic feature of global civilization, so that what has traditionally been called "civil society" is now becoming identical with information technology society as defined by Internet use. Some suggest that as little as 2% of the world's population regularly accesses the Internet.[4] (http://www.edchange.org/multicultural/quiz/quiz_key.pdf)
Internet protocols
For more details on this topic, see Internet Protocols.
In this context, there are three layers of protocols:
* At the lower level (OSI layer 3) is IP (Internet Protocol), which defines the datagrams or packets that carry blocks of data from one node to another. The vast majority of today's Internet uses version four of the IP protocol (i.e. IPv4), and although IPv6 is standardized, it exists only as "islands" of connectivity, and there are many ISPs without any IPv6 connectivity. [2]. ICMP (Internet Control Message Protocol) also exists at this level. ICMP is connectionless; it is used for control, signaling, and error reporting purposes.
* TCP (Transmission Control Protocol) and UDP (User Datagram Protocol) exist at the next layer up (OSI layer 4); these are the protocols by which data is transmitted. TCP makes a virtual 'connection', which gives some level of guarantee of reliability. UDP is a best-effort, connectionless transport, in which data packets that are lost in transit will not be re-sent.
* The application protocols sit on top of TCP and UDP and occupy layers 5, 6, and 7 of the OSI model. These define the specific messages and data formats sent and understood by the applications running at each end of the communication. Examples of these protocols are HTTP, FTP, and SMTP. A minimal sketch of this layering appears after this list.
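As a minimal illustration of the layering described in the list above (a sketch assuming Python's standard library and outbound network access; the host name is only an example), the code below opens a TCP connection, the layer 4 transport carried over IP, and speaks HTTP, an application-layer protocol, over that stream.

```python
import socket

# Minimal illustration of the protocol layering described above: IP carries
# the packets, TCP (created here through the sockets API) provides a reliable
# byte stream over IP, and HTTP is the application protocol spoken over that
# stream. Requires outbound network access.

HOST = "example.com"   # example host; any public web server would do
PORT = 80              # the conventional HTTP port

with socket.create_connection((HOST, PORT), timeout=10) as conn:   # TCP over IP
    request = (
        f"GET / HTTP/1.1\r\n"
        f"Host: {HOST}\r\n"
        f"Connection: close\r\n\r\n"
    )
    conn.sendall(request.encode("ascii"))      # application-layer message (HTTP)
    response = b""
    while chunk := conn.recv(4096):            # read until the server closes the stream
        response += chunk

print(response.split(b"\r\n", 1)[0].decode())  # status line, e.g. "HTTP/1.1 200 OK"
```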
Internet structure
There have been many analyses of the Internet and its structure. For example, it has been determined that the Internet IP routing structure and hypertext links of the World Wide Web are examples of scale-free networks.
Similar to the way the commercial Internet providers connect via Internet exchange points, research networks tend to interconnect into large subnetworks such as:
* GEANT
* GLORIAD
* The Internet2 Network (formerly known as the Abilene Network)
* JANET (the UK's national research and education network)
These in turn are built around smaller networks. See also the list of academic computer network organizations.
In network diagrams, the Internet is often represented by a cloud symbol, into and out of which network communications can pass.
ICANN
For more details on this topic, see ICANN.
The Internet Corporation for Assigned Names and Numbers (ICANN) is the authority that coordinates the assignment of unique identifiers on the Internet, including domain names, Internet Protocol (IP) addresses, and protocol port and parameter numbers. A globally unified namespace (i.e., a system of names in which there is at most one holder for each possible name) is essential for the Internet to function. ICANN is headquartered in Marina del Rey, California, but is overseen by an international board of directors drawn from across the Internet technical, business, academic, and non-commercial communities. The US government continues to have the primary role in approving changes to the root zone file that lies at the heart of the domain name system. Because the Internet is a distributed network comprising many voluntarily interconnected networks, the Internet, as such, has no governing body. ICANN's role in coordinating the assignment of unique identifiers distinguishes it as perhaps the only central coordinating body on the global Internet, but the scope of its authority extends only to the Internet's systems of domain names, IP addresses, protocol ports and parameter numbers.
On November 16, 2005, the World Summit on the Information Society, held in Tunis, established the Internet Governance Forum (IGF) to discuss Internet-related issues.
Language
For more details on this topic, see English on the Internet.
Further information: Unicode
The prevalent language for communication on the Internet is English. This may be a result of the Internet's origins, as well as English's role as the lingua franca. It may also be related to the poor capability of early computers, largely originating in the United States, to handle characters other than those in the English variant of the Latin alphabet.
After English (31% of Web visitors) the most-requested languages on the World Wide Web are Chinese 16%, Spanish 9%, Japanese 7%, German 5% and French 5% (from Internet World Stats, updated June 30, 2007).
By continent, 37% of the world's Internet users are based in Asia, 27% in Europe, 19% in North America, and 9% in Latin America and the Caribbean ([3], updated September 30, 2007).
The Internet's technologies have developed enough in recent years, especially in the use of Unicode, that good facilities are available for development and communication in most widely used languages. However, some glitches such as mojibake (incorrect display of foreign language characters, also known as kryakozyabry) still remain.
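The mechanics of mojibake are easy to reproduce: text encoded as UTF-8 but decoded with a legacy single-byte codec turns into the familiar garbage. The snippet below is an illustration added here, using an arbitrary mixed-script string rather than any example from the article.

    text = "Привет, héllo"                 # Cyrillic plus accented Latin, arbitrary example
    utf8_bytes = text.encode("utf-8")

    # Wrong codec: every byte still decodes, but the result is nonsense ("kryakozyabry").
    print(utf8_bytes.decode("latin-1"))    # prints something like ÐÑÐ¸Ð²ÐµÑ, hÃ©llo
    # Right codec (or Unicode-aware software end to end) recovers the original text.
    print(utf8_bytes.decode("utf-8"))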
Internet and the workplace
The Internet is allowing greater flexibility in working hours and location, especially with the spread of unmetered high-speed connections and Web applications.
The Internet viewed on mobile devices
The Internet can now be accessed virtually anywhere by numerous means. Mobile phones, datacards, handheld game consoles and cellular routers allow users to connect to the Internet from anywhere there is a cellular network supporting that device's technology.
Common uses of the Internet
E-mail
For more details on this topic, see E-mail.
The concept of sending electronic text messages between parties in a way analogous to mailing letters or memos predates the creation of the Internet. Even today it can be important to distinguish between Internet and internal e-mail systems. Internet e-mail may travel and be stored unencrypted on many other networks and machines out of both the sender's and the recipient's control. During this time it is quite possible for the content to be read and even tampered with by third parties, if anyone considers it important enough. Purely internal or intranet mail systems, where the information never leaves the corporate or organization's network, are much more secure, although in any organization there will be IT and other personnel whose job may involve monitoring, and occasionally accessing, the email of other employees not addressed to them.
The World Wide Web
For more details on this topic, see World Wide Web.
Many people use the terms Internet and World Wide Web (or just the Web) interchangeably, but, as discussed above, the two terms are not synonymous.
The World Wide Web is a huge set of interlinked documents, images and other resources, linked by hyperlinks and URLs. These hyperlinks and URLs allow the web-servers and other machines that store originals, and cached copies, of these resources to deliver them as required using HTTP (Hypertext Transfer Protocol). HTTP is only one of the communication protocols used on the Internet.
Web services also use HTTP to allow software systems to communicate in order to share and exchange business logic and data.
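A single HTTP exchange is short enough to show in full. The sketch below (added here as an illustration; the URL is a generic placeholder) fetches one resource with Python's standard library and prints the status, the declared content type, and the size of the HTML returned.

    from urllib.request import urlopen

    with urlopen("http://www.example.com/") as response:   # placeholder URL
        print(response.status, response.reason)            # e.g. 200 OK
        print(response.headers.get("Content-Type"))        # e.g. text/html; charset=UTF-8
        body = response.read()                              # the resource itself
        print(len(body), "bytes received")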
Software products that can access the resources of the Web are correctly termed user agents. In normal use, web browsers such as Internet Explorer and Firefox access web pages and allow users to navigate from one to another via hyperlinks. Web documents may contain almost any combination of computer data, including photographs, graphics, sounds, text, video, multimedia and interactive content such as games, office applications and scientific demonstrations.
Through keyword-driven Internet research using search engines such as Yahoo! and Google, millions of people worldwide have easy, instant access to a vast and diverse amount of online information. Compared to encyclopedias and traditional libraries, the World Wide Web has enabled a sudden and extreme decentralization of information and data.
It is also easier, using the Web, than ever before for individuals and organisations to publish ideas and information to an extremely large audience. Anyone can find ways to publish a web page or build a website for very little initial cost. Publishing and maintaining large, professional websites full of attractive, diverse and up-to-date information is still a difficult and expensive proposition, however.
Many individuals and some companies and groups use "web logs" or blogs, which are largely used as easily-updatable online diaries. Some commercial organizations encourage staff to fill them with advice on their areas of specialization in the hope that visitors will be impressed by the expert knowledge and free information, and be attracted to the corporation as a result. One example of this practice is Microsoft, whose product developers publish their personal blogs in order to pique the public's interest in their work.
Collections of personal web pages published by large service providers remain popular, and have become increasingly sophisticated. Whereas operations such as Angelfire and GeoCities have existed since the early days of the Web, newer offerings from, for example, Facebook and MySpace currently have large followings. These operations often brand themselves as social network services rather than simply as web page hosts.
Advertising on popular web pages can be lucrative, and e-commerce or the sale of products and services directly via the Web continues to grow.
In the early days, web pages were usually created as sets of complete and isolated HTML text files stored on a web server. More recently, web sites are more often created using a content management system (CMS) or wiki software with, initially, very little content. Users of these systems, who may be paid staff, members of a club or other organisation, or members of the public, fill the underlying databases with content using editing pages designed for that purpose, while casual visitors view and read this content in its final HTML form. There may or may not be editorial, approval and security systems built into the process of taking newly entered content and making it available to the target visitors.
Remote access
Further information: Remote access
The Internet allows computer users to connect to other computers and information stores easily, wherever they may be across the world. They may do this with or without the use of security, authentication and encryption technologies, depending on the requirements.
This is encouraging new ways of working from home, collaboration and information sharing in many industries. An accountant sitting at home can audit the books of a company based in another country, on a server situated in a third country that is remotely maintained by IT specialists in a fourth. These accounts could have been created by home-working book-keepers, in other remote locations, based on information e-mailed to them from offices all over the world. Some of these things were possible before the widespread use of the Internet, but the cost of private, leased lines would have made many of them infeasible in practice.
An office worker away from his desk, perhaps the other side of the world on a business trip or a holiday, can open a remote desktop session into their normal office PC using a secure Virtual Private Network (VPN) connection via the Internet. This gives the worker complete access to all of their normal files and data, including e-mail and other applications, while away from the office.
This concept is also referred to by some network security people as the Virtual Private Nightmare, because it extends the secure perimeter of a corporate network into its employees' homes; this has been the source of some notable security breaches, but also provides security for the workers.
Collaboration
See also: Collaborative software
The low cost and nearly instantaneous sharing of ideas, knowledge, and skills has made collaborative work dramatically easier. Not only can a group cheaply communicate and test, but the wide reach of the Internet allows such groups to easily form in the first place, even among niche interests. An example of this is the free software movement in software development which produced GNU and Linux from scratch and has taken over development of Mozilla and OpenOffice.org (formerly known as Netscape Communicator and StarOffice). Films such as Zeitgeist, Loose Change and Endgame have had extensive coverage on the internet, while being virtually ignored in the mainstream media.
Internet 'chat', whether in the form of IRC 'chat rooms' or channels or via instant messaging systems, allows colleagues to stay in touch in a very convenient way when working at their computers during the day. Messages can be sent and viewed even more quickly and conveniently than via e-mail. Extensions to these systems may allow files to be exchanged and 'whiteboard' drawings to be shared, as well as voice and video contact between team members.
Version control systems allow collaborating teams to work on shared sets of documents without either accidentally overwriting each other's work or having members wait until they get 'sent' documents to be able to add their thoughts and changes.
File sharing
For more details on this topic, see File sharing.
A computer file can be e-mailed to customers, colleagues and friends as an attachment. It can be uploaded to a Web site or FTP server for easy download by others. It can be put into a "shared location" or onto a file server for instant use by colleagues. The load of bulk downloads to many users can be eased by the use of "mirror" servers or peer-to-peer networks.
In any of these cases, access to the file may be controlled by user authentication; the transit of the file over the Internet may be obscured by encryption, and money may change hands before or after access to the file is given. The price can be paid by the remote charging of funds from, for example, a credit card whose details are also passed—hopefully fully encrypted—across the Internet. The origin and authenticity of the file received may be checked by digital signatures or by MD5 or other message digests.
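Checking a received file against a published digest, as mentioned above, takes only a few lines. The sketch below uses Python's hashlib with MD5 (SHA-256 works the same way); the filename and the expected digest are made-up placeholders, not values from the article.

    import hashlib

    def md5_of(path, chunk_size=1 << 20):
        """Hash a file in chunks so large downloads need not fit in memory."""
        digest = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    expected = "d41d8cd98f00b204e9800998ecf8427e"   # placeholder published digest
    actual = md5_of("downloaded_file.bin")          # hypothetical local file
    print("OK" if actual == expected else "MISMATCH", actual)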
These simple features of the Internet, over a world-wide basis, are changing the basis for the production, sale, and distribution of anything that can be reduced to a computer file for transmission. This includes all manner of print publications, software products, news, music, film, video, photography, graphics and the other arts. This in turn has caused seismic shifts in each of the existing industries that previously controlled the production and distribution of these products.
Internet collaboration technology enables business and project teams to share documents, calendars and other information. Such collaboration occurs in a wide variety of areas including scientific research, software development, conference planning, political activism and creative writing.
Streaming media
Many existing radio and television broadcasters provide Internet 'feeds' of their live audio and video streams (for example, the BBC). They may also allow time-shift viewing or listening such as Preview, Classic Clips and Listen Again features. These providers have been joined by a range of pure Internet 'broadcasters' who never had on-air licenses. This means that an Internet-connected device, such as a computer or something more specific, can be used to access on-line media in much the same way as was previously possible only with a television or radio receiver. The range of material is much wider, from pornography to highly specialized technical Web-casts. Podcasting is a variation on this theme, where—usually audio—material is first downloaded in full and then may be played back on a computer or shifted to a digital audio player to be listened to on the move. These techniques using simple equipment allow anybody, with little censorship or licensing control, to broadcast audio-visual material on a worldwide basis.
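Under the hood, a podcast is simply a feed listing audio files for the client to download in full. The self-contained sketch below (an illustration added here; the feed, titles and URLs are invented) parses a small RSS document with Python's standard library and prints the enclosure URLs a podcast client would fetch.

    import xml.etree.ElementTree as ET

    sample_feed = """<?xml version="1.0"?>
    <rss version="2.0"><channel>
      <title>Example Podcast</title>
      <item><title>Episode 1</title>
        <enclosure url="http://www.example.com/ep1.mp3" type="audio/mpeg" length="12345"/></item>
      <item><title>Episode 2</title>
        <enclosure url="http://www.example.com/ep2.mp3" type="audio/mpeg" length="23456"/></item>
    </channel></rss>"""

    root = ET.fromstring(sample_feed)
    for item in root.iter("item"):
        # A real client would download each enclosure in full, then play or sync it.
        print(item.findtext("title"), "->", item.find("enclosure").get("url"))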
Webcams can be seen as an even lower-budget extension of this phenomenon. While some webcams can give full frame rate video, the picture is usually either small or updates slowly. Internet users can watch animals around an African waterhole, ships in the Panama Canal, the traffic at a local roundabout or their own premises, live and in real time. Video chat rooms, video conferencing, and remote controllable webcams are also popular. Many uses can be found for personal webcams in and around the home, with and without two-way sound.
YouTube, founded on February 15, 2005, is sometimes described as an Internet phenomenon because of its vast number of users and the speed with which the site's popularity has grown. It is now the leading website for free streaming video. It uses a Flash-based web player which streams video files in the FLV format. Users are able to watch videos without signing up; however, users who do sign up can upload an unlimited number of videos and are given their own personal profile. It is currently estimated that there are 64,000,000 videos on YouTube and that 825,000 new videos are uploaded every day.
Voice telephony (VoIP)
For more details on this topic, see VoIP.
VoIP stands for Voice over IP, where IP refers to the Internet Protocol that underlies all Internet communication. This phenomenon began as an optional two-way voice extension to some of the Instant Messaging systems that took off around the year 2000. In recent years many VoIP systems have become as easy to use and as convenient as a normal telephone. The benefit is that, as the Internet carries the actual voice traffic, VoIP can be free or cost much less than a normal telephone call, especially over long distances and especially for those with always-on Internet connections such as cable or ADSL.
Thus VoIP is maturing into a viable alternative to traditional telephones. Interoperability between different providers has improved and the ability to call or receive a call from a traditional telephone is available. Simple inexpensive VoIP modems are now available that eliminate the need for a PC.
Voice quality can still vary from call to call but is often equal to and can even exceed that of traditional calls.
Remaining problems for VoIP include emergency telephone number dialling and reliability. Currently, a few VoIP providers offer an emergency service, but it is not universally available. Traditional phones are line-powered and operate during a power failure; VoIP does not, unless there is a backup power source for the electronics.
Most VoIP providers offer unlimited national calling but the direction in VoIP is clearly toward global coverage with unlimited minutes for a low monthly fee.
VoIP has also become increasingly popular within the gaming world, as a form of communication between players. Popular gaming VoIP clients include Ventrilo and Teamspeak, and there are others available also. The PlayStation 3 and Xbox 360 also offer VoIP chat features.
Internet by region
Internet in Africa
Sovereign states Algeria · Angola · Benin · Botswana · Burkina Faso · Burundi · Cameroon · Cape Verde · Central African Republic · Chad · Comoros · Democratic Republic of the Congo · Republic of the Congo · Côte d'Ivoire (Ivory Coast) · Djibouti · Egypt · Equatorial Guinea · Eritrea · Ethiopia · Gabon · The Gambia · Ghana · Guinea · Guinea-Bissau · Kenya · Lesotho · Liberia · Libya · Madagascar · Malawi · Mali · Mauritania · Mauritius · Morocco · Mozambique · Namibia · Niger · Nigeria · Rwanda · São Tomé and Príncipe · Senegal · Seychelles · Sierra Leone · Somalia · South Africa · Sudan · Swaziland · Tanzania · Togo · Tunisia · Uganda · Zambia · Zimbabwe
Dependencies, autonomies and other territories Canary Islands (Spain) · Ceuta (Spain) · Madeira (Portugal) · Mayotte (France) · Melilla (Spain) · Puntland · Réunion (France) · St. Helena (UK) · Socotra (Yemen) · Somaliland · Southern Sudan · Western Sahara · Zanzibar (Tanzania)
Internet in North America
Sovereign states Antigua and Barbuda · Bahamas · Barbados · Belize · Canada · Costa Rica · Cuba · Dominica · Dominican Republic · El Salvador · Grenada · Guatemala · Haiti · Honduras · Jamaica · Mexico · Nicaragua · Panama* · Saint Kitts and Nevis · Saint Lucia · Saint Vincent and the Grenadines · Trinidad and Tobago* · United States
Dependencies and other territories Anguilla · Aruba* · Bermuda · British Virgin Islands · Cayman Islands · Greenland · Guadeloupe · Martinique · Montserrat · Navassa Island · Netherlands Antilles* · Puerto Rico · Saint Barthélemy · Saint Martin · Saint Pierre and Miquelon · Turks and Caicos Islands · U. S. Virgin Islands
* Territories also in or commonly reckoned elsewhere in the Americas (South America).
Internet in South America
Sovereign states Argentina · Bolivia · Brazil · Chile · Colombia · Ecuador · Guyana · Panama* · Paraguay · Peru · Suriname · Trinidad and Tobago* · Uruguay · Venezuela
Dependencies Aruba* (Netherlands) · Falkland Islands (UK) · French Guiana (France) · Netherlands Antilles* (Netherlands) · South Georgia and the South Sandwich Islands (UK)
* Territories also in or commonly reckoned elsewhere in the Americas (North America).
Internet in Asia
Sovereign states and other territories Afghanistan · Armenia · Azerbaijan1 · Bahrain · Bangladesh · Bhutan · Brunei · Burma · Cambodia · China (People's Republic of China [Hong Kong · Macau]) · Cyprus · East Timor1 · Egypt1 · Georgia1 · India · Indonesia1 · Iran · Iraq · Israel · Japan · Jordan · Kazakhstan1 · Korea (North Korea · South Korea) · Kuwait · Kyrgyzstan · Laos · Lebanon · Malaysia · Maldives · Mongolia · Nepal · Oman · Pakistan · Palestinian territories · Philippines · Qatar · Russia1 · Saudi Arabia · Singapore · Sri Lanka · Republic of China (Taiwan) · Syria · Tajikistan · Thailand · Turkey1 · Turkmenistan · United Arab Emirates · Uzbekistan · Vietnam · Yemen1
1 countries spanning more than one continent
Internet in Europe
Sovereign states Albania · Andorra · Armenia1 · Austria · Azerbaijan2 · Belarus · Belgium · Bosnia and Herzegovina · Bulgaria · Croatia · Cyprus1 · Czech Republic · Denmark · Estonia · Finland · France · Georgia2 · Germany · Greece · Hungary · Iceland · Ireland · Italy · Kazakhstan2 · Latvia · Liechtenstein · Lithuania · Luxembourg · Republic of Macedonia · Malta · Moldova · Monaco · Montenegro · Netherlands · Norway · Poland · Portugal · Romania · Russia3 · San Marino · Serbia · Slovakia · Slovenia · Spain · Sweden · Switzerland · Turkey3 · Ukraine · United Kingdom
1 Entirely in Southwest Asia; included here because of cultural, political and historical association with Europe. 2 Partially or entirely in Asia, depending on the definition of the border between Europe and Asia. 3 Mostly in Asia.
Internet in Oceania
Australasia Australia · Norfolk Island · Christmas Island · Cocos (Keeling) Islands · New Zealand
Melanesia East Timor1 · Fiji · Indonesia1 · New Caledonia · Papua New Guinea · Solomon Islands · Vanuatu
Micronesia Guam · Kiribati · Marshall Islands · Northern Mariana Islands · Federated States of Micronesia · Nauru · Palau
Polynesia American Samoa · Cook Islands · French Polynesia · Niue · Pitcairn · Samoa · Tokelau · Tonga · Tuvalu · Wallis and Futuna
1 countries spanning more than one continent
Censorship
For more details on this topic, see Internet censorship.
Some governments, such as those of Cuba, Iran, North Korea, Myanmar, the People's Republic of China, and Saudi Arabia, restrict what people in their countries can access on the Internet, especially political and religious content. This is accomplished through software that filters domains and content so that they may not be easily accessed or obtained without elaborate circumvention.
In Norway, Denmark, Finland and Sweden, major Internet service providers have voluntarily (possibly to avoid such an arrangement being turned into law) agreed to restrict access to sites listed by police. While this list of forbidden URLs is only supposed to contain addresses of known child pornography sites, the content of the list is secret.[citation needed]
Many countries, including the United States, have enacted laws making the possession or distribution of certain material, such as child pornography, illegal, but do not use filtering software.
There are many free and commercially available software programs with which a user can choose to block offensive Web sites on individual computers or networks, such as to limit a child's access to pornography or violence. See Content-control software.
Internet access
For more details on this topic, see Internet access.
Common methods of home access include dial-up, landline broadband (over coaxial cable, fiber optic or copper wires), Wi-Fi, satellite and 3G technology cell phones.
Public places to use the Internet include libraries and Internet cafes, where computers with Internet connections are available. There are also Internet access points in many public places such as airport halls and coffee shops, in some cases just for brief use while standing. Various terms are used, such as "public Internet kiosk", "public access terminal", and "Web payphone". Many hotels now also have public terminals, though these are usually fee-based. These terminals are widely used for purposes such as ticket booking, bank deposits, and online payments. Wi-Fi provides wireless access to computer networks, and therefore can do so to the Internet itself. Hotspots providing such access include Wi-Fi cafes, where a would-be user needs to bring their own wireless-enabled device such as a laptop or PDA. These services may be free to all, free to customers only, or fee-based. A hotspot need not be limited to a confined location; a whole campus or park, or even an entire city, can be enabled. Grassroots efforts have led to wireless community networks. Commercial Wi-Fi services covering large city areas are in place in London, Vienna, Toronto, San Francisco, Philadelphia, Chicago and Pittsburgh. The Internet can then be accessed from such places as a park bench.[5]
Apart from Wi-Fi, there have been experiments with proprietary mobile wireless networks like Ricochet, various high-speed data services over cellular phone networks, and fixed wireless services.
High-end mobile phones such as smartphones generally come with Internet access through the phone network. Web browsers such as Opera are available on these advanced handsets, which can also run a wide variety of other Internet software. More mobile phones have Internet access than PCs, though this is not as widely used. An Internet access provider and protocol matrix differentiates the methods used to get online.
Leisure
The Internet has been a major source of leisure since before the World Wide Web, with entertaining social experiments such as MUDs and MOOs being conducted on university servers, and humor-related Usenet groups receiving much of the main traffic. Today, many Internet forums have sections devoted to games and funny videos; short cartoons in the form of Flash movies are also popular. Over 6 million people use blogs or message boards as a means of communication and for the sharing of ideas.
The pornography and gambling industries have both taken full advantage of the World Wide Web, and often provide a significant source of advertising revenue for other Web sites. Although many governments have attempted to put restrictions on both industries' use of the Internet, this has generally failed to stop their widespread popularity.
One main area of leisure on the Internet is multiplayer gaming. This form of leisure creates communities, bringing people of all ages and origins to enjoy the fast-paced world of multiplayer games. These range from MMORPG to first-person shooters, from role-playing games to online gambling. This has revolutionized the way many people interact and spend their free time on the Internet.
While online gaming has been around since the 1970s, modern modes of online gaming began with services such as GameSpy and MPlayer, which players of games would typically subscribe to. Non-subscribers were limited to certain types of gameplay or certain games.
Many use the Internet to access and download music, movies and other works for their enjoyment and relaxation. As discussed above, there are paid and unpaid sources for all of these, using centralized servers and distributed peer-to-peer technologies. Discretion is needed as some of these sources take more care over the original artists' rights and over copyright laws than others.
Many use the World Wide Web to access news, weather and sports reports, to plan and book holidays and to find out more about their random ideas and casual interests.
People use chat, messaging and e-mail to make and stay in touch with friends worldwide, sometimes in the same way as some previously had pen pals. Social networking Web sites like MySpace and Facebook, and many others like them, also put and keep people in contact for their enjoyment.
The Internet has seen a growing number of Internet operating systems, where users can access their files, folders, and settings via the Internet. An example of an open-source web OS is eyeOS.
Cyberslacking has become a serious drain on corporate resources; the average UK employee spends 57 minutes a day surfing the Web at work, according to a study by Peninsula Business Services [4].
Complex architecture
Many computer scientists see the Internet as a "prime example of a large-scale, highly engineered, yet highly complex system".[6] The Internet is extremely heterogeneous. (For instance, data transfer rates and physical characteristics of connections vary widely.) The Internet exhibits "emergent phenomena" that depend on its large-scale organization. For example, data transfer rates exhibit temporal self-similarity. Further adding to the complexity of the Internet is the ability of more than one computer to use the Internet through only one node, thus creating the possibility for a very deep, hierarchical sub-network that can theoretically be extended infinitely (disregarding the programmatic limitations of the IPv4 protocol). However, since the principles of this architecture date back to the 1960s, it might not be a solution best suited to modern needs, and the possibility of developing alternative structures is currently being investigated.[7]
According to a June 2007 article in Discover Magazine, the combined weight of all the electrons moved within the internet in a day is 0.2 millionths of an ounce.[8] Others have estimated this at nearer 2 ounces (50 grams).[9]
Marketing
The Internet has also become a large market for companies; some of the biggest companies today have grown by taking advantage of the efficient nature of low-cost advertising and commerce through the Internet, also known as e-commerce. It is the fastest way to spread information to a vast number of people simultaneously. The Internet has also revolutionized shopping—for example, a person can order a CD online and receive it in the mail within a couple of days, or download it directly in some cases. The Internet has also greatly facilitated personalized marketing, which allows a company to market a product to a specific person or a specific group of people more effectively than any other advertising medium.
Examples of personalized marketing include online communities such as MySpace, Friendster, Orkut, Facebook and others which thousands of Internet users join to advertise themselves and make friends online. Many of these users are young teens and adolescents ranging from 13 to 25 years old. In turn, when they advertise themselves they advertise interests and hobbies, which online marketing companies can use as information as to what those users will purchase online, and advertise their own companies' products to those users.
Further information: Disintermediation#Impact of Internet-related disintermediation upon various industries and Travel agency#The Internet threat
The name Internet
For more details on this topic, see Internet capitalization conventions.
Look up Internet, internet in Wiktionary, the free dictionary.
Internet is traditionally written with a capital first letter, as it is a proper noun. The Internet Society, the Internet Engineering Task Force, the Internet Corporation for Assigned Names and Numbers, the World Wide Web Consortium, and several other Internet-related organizations use this convention in their publications.
Many newspapers, newswires, periodicals, and technical journals capitalize the term (Internet). Examples include The New York Times, the Associated Press, Time, The Times of India, Hindustan Times, and Communications of the ACM.
Others assert that the first letter should be in lower case (internet), and that the specific article “the” is sufficient to distinguish “the internet” from other internets. A significant number of publications use this form, including The Economist, the Canadian Broadcasting Corporation, the Financial Times, The Guardian, The Times, and The Sydney Morning Herald. As of 2005, many publications using internet appear to be located outside of North America—although one U.S. news source, Wired News, has adopted the lower-case spelling.
Historically, Internet and internet have had different meanings, with internet meaning “an interconnected set of distinct networks,” and Internet referring to the world-wide, publicly-available IP internet. Under this distinction, "the Internet" is the familiar network via which websites exist, whereas "an internet" can exist between any two remote locations.[10] Any group of distinct networks connected together is an internet; each of these networks may or may not be part of the Internet. The distinction was evident in many RFCs, books, and articles from the 1980s and early 1990s (some of which, such as RFC 1918, refer to "internets" in the plural), but has recently fallen into disuse.[citation needed] Instead, the term intranet is generally used for private networks, whether they are connected to the Internet or not. See also: extranet.
Some people use the lower-case form when referring to the Internet as a medium (as with radio or newspaper, e.g. "I found it on the internet"), and the capitalized form when referring to the global network itself.
See also
Main articles: List of basic Internet topics and List of Internet topics
Major aspects and issues
* Internet democracy
* History of the Internet
* Net neutrality
* Privacy on the Internet
Functions
* E-mail
* File-sharing
* Instant messaging
* Internet fax
* World Wide Web
* Voice over IP
* Mobile VoIP
Underlying infrastructure
* Internet Protocol (IP)
* Internet Service Provider (ISP)
Regulatory bodies
* Internet Assigned Numbers Authority (IANA)
* Internet Corporation for Assigned Names and Numbers (ICANN)
Notes
1. ^ ARPA/DARPA. Defense Advanced Research Projects Agency. Retrieved on 2007-05-21.
2. ^ DARPA Over the Years. Defense Advanced Research Projects Agency. Retrieved on 2007-05-21.
3. ^ Coffman, K. G; Odlyzko, A. M. (1998-10-02). "The size and growth rate of the Internet". AT&T Labs. Retrieved on 2007-05-21.
4. ^ Slabbert,N.J. The Technologies of Peace, Harvard International Review, June 2006.
5. ^ "Toronto Hydro to Install Wireless Network in Downtown Toronto". Bloomberg.com. Retrieved 19-Mar-2006.
6. ^ Walter Willinger, Ramesh Govindan, Sugih Jamin, Vern Paxson, and Scott Shenker (2002). Scaling phenomena in the Internet. In Proceedings of the National Academy of Sciences, 99, suppl. 1, 2573–2580.
7. ^ "Internet Makeover? Some argue it's time". The Seattle Times, April 16, 2007.
8. ^ "How Much Does The Internet Weigh?". Discover Magazine, June 2007.
9. ^ How Much Does The Internet Weigh? - The Unbearable Lightness Of Fact Checking
10. ^ What is the Internet?
References
* Media Freedom Internet Cookbook by the OSCE Representative on Freedom of the Media Vienna, 2004
* Living Internet—Internet history and related information, including information from many creators of the Internet
* First Monday peer-reviewed journal on the Internet
* How Much Does The Internet Weigh? by Stephen Cass, Discover 2007
* Rehmeyer, Julie J. 2007. Mapping a medusa: The Internet spreads its tentacles. Science News 171(June 23):387-388. Available at http://www.sciencenews.org/articles/20070623/fob2.asp .
* Sohn, Emily. 2006. Internet generation. Science News for Kids (Oct. 25). Available at http://www.sciencenewsforkids.org/articles/20061025/Feature1.asp .
* Castells, M. 1996. Rise of the Network Society. 3 vols. Vol. 1. Cambridge, MA: Blackwell Publishers.
* Castells, M. (2001), “Lessons from the History of Internet”, in “The Internet Galaxy”, Ch. 1, pp 9-35. Oxford Univ. Press.
External links
* "10 Years that changed the world" — Wired looks back at the evolution of the Internet over last 10 years
* Berkman Center for Internet and Society at Harvard
* A comprehensive history with people, concepts and quotations
* CBC Digital Archives—Inventing the Internet Age
* How the Internet Came to Be
* Internet Explained
* Global Internet Traffic Report
* The Internet Society History Page
* RFC 801, planning the TCP/IP switchover
* Archive CBC Video Circa 1990 about the Internet
* "The beginners guide to the internet."
* "Warriors of the net - A movie about the internet."
* "History of Nova Scotia-First on the Net"
THE INTERNET DEBACLE - AN ALTERNATIVE VIEW
Originally written for Performing Songwriter Magazine, May 2002
"The Internet, and downloading, are here to stay... Anyone who thinks otherwise should prepare themselves to end up on the slagheap of history." (Janis Ian during a live European radio interview, 9-1-98) *Please see author's note at end
When I research an article, I normally send 30 or so emails to friends and acquaintances asking for opinions and anecdotes. I usually receive 10-20 in reply. But not so on this subject! I sent 36 emails requesting opinions and facts on free music downloading from the Net. I stated that I planned to adopt the viewpoint of devil's advocate: free Internet downloads are good for the music industry and its artists.
I've received, to date, over 300 replies, every single one from someone legitimately "in the music business." What's more interesting than the emails are the phone calls. I don't know anyone at NARAS (home of the Grammy Awards), and I know Hilary Rosen (head of the Recording Industry Association of America, or RIAA) only vaguely. Yet within 24 hours of sending my original email, I'd received two messages from Rosen and four from NARAS requesting that I call to "discuss the article."
Huh. Didn't know I was that widely read.
Ms. Rosen, to be fair, stressed that she was only interested in presenting RIAA's side of the issue, and was kind enough to send me a fair amount of statistics and documentation, including a number of focus group studies RIAA had run on the matter. However, the problem with focus groups is the same problem anthropologists have when studying peoples in the field - the moment the anthropologist's presence is known, everything changes. Hundreds of scientific studies have shown that any experimental group wants to please the examiner. For focus groups, this is particularly true. Coffee and donuts are the least of the pay-offs.
The NARAS people were a bit more pushy. They told me downloads were "destroying sales", "ruining the music industry", and "costing you money". Costing me money? I don't pretend to be an expert on intellectual property law, but I do know one thing. If a music industry executive claims I should agree with their agenda because it will make me more money, I put my hand on my wallet…and check it after they leave, just to make sure nothing's missing.
Am I suspicious of all this hysteria? You bet. Do I think the issue has been badly handled? Absolutely. Am I concerned about losing friends, opportunities, my 10th Grammy nomination by publishing this article? Yeah. I am. But sometimes things are just wrong, and when they're that wrong, they have to be addressed.
The premise of all this ballyhoo is that the industry (and its artists) are being harmed by free downloading. Nonsense.
Let's take it from my personal experience. My site (www.janisian.com) gets an average of 75,000 hits a year. Not bad for someone whose last hit record was in 1975. When Napster was running full-tilt, we received about 100 hits a month from people who'd downloaded Society's Child or At Seventeen for free, then decided they wanted more information. Of those 100 people (and these are only the ones who let us know how they'd found the site), 15 bought CDs. Not huge sales, right? No record company is interested in 180 extra sales a year. But… that translates into $2700, which is a lot of money in my book. And that doesn't include the ones who bought the CDs in stores, or who came to my shows.
Or take author Mercedes Lackey, who occupies entire shelves in stores and libraries. As she said herself: "For the past ten years, my three "Arrows" books, which were published by DAW about 15 years ago, have been generating a nice, steady royalty check per pay-period each. A reasonable amount, for fifteen-year-old books. However... I just got the first half of my DAW royalties...And suddenly, out of nowhere, each Arrows book has paid me three times the normal amount!...And because those books have never been out of print, and have always been promoted along with the rest of the backlist, the only significant change during that pay-period was something that happened over at Baen, one of my other publishers. That was when I had my co-author Eric Flint put the first of my Baen books on the Baen Free Library site. Because I have significantly more books with DAW than with Baen, the increases showed up at DAW first. There's an increase in all of the books on that statement, actually, and what it looks like is what I'd expect to happen if a steady line of people who'd never read my stuff encountered it on the Free Library - a certain percentage of them liked it, and started to work through my backlist, beginning with the earliest books published.
"The really interesting thing is, of course, that these aren't Baen books, they're DAW---another publisher---so it's 'name loyalty' rather than 'brand loyalty.' I'll tell you what, I'm sold. Free works."
I've found that to be true myself; every time we make a few songs available on my website, sales of all the CDs go up. A lot. And I don't know about you, but as an artist with an in-print record catalogue that dates back to 1965, I'd be thrilled to see sales on my old catalogue rise.
Now, RIAA and NARAS, as well as most of the entrenched music industry, are arguing that free downloads hurt sales. (More than hurt - they're saying it's destroying the industry.) Alas, the music industry needs no outside help to destroy itself. We're doing a very adequate job of that on our own, thank you.
Here are a few statements from the RIAA's website:
1. "Analysts report that just one of the many peer-to-peer systems in operation is responsible for over 1.8 billion unauthorized downloads per month". (Hilary B. Rosen letter to the Honorable Rick Boucher, Congressman, February 28, 2002) "Sales of blank CD-R discs have…grown nearly 2 ½ times in the last two years…if just half the blank discs sold in 2001 were used to copy music, the number of burned CDs worldwide is about the same as the number of CDs sold at retail." (Hilary B. Rosen letter to the Honorable Rick Boucher, Congressman, February 28, 2002) "Music sales are already suffering from the impact…in the United States, sales decreased by more than 10% in 2001." (Hilary B. Rosen letter to the Honorable Rick Boucher, Congressman, February 28, 2002)
2. "In a recent survey of music consumers, 23%…said they are not buying more music because they are downloading or copying their music for free." (Hilary B. Rosen letter to the Honorable Rick Boucher, Congressman, February 28, 2002)
Let's take these points one by one, but before that, let me remind you of something: the music industry had exactly the same response to the advent of reel-to-reel home tape recorders, cassettes, DATs, minidiscs, VHS, BETA, music videos ("Why buy the record when you can tape it?"), MTV, and a host of other technological advances designed to make the consumer's life easier and better. I know, because I was there. The only reason they didn't react that way publicly to the advent of CDs was because they believed CD's were uncopyable. I was told this personally by a former head of Sony marketing, when they asked me to license Between the Lines in CD format at a reduced royalty rate. ("Because it's a brand new technology.")
1. Who's to say that any of those people would have bought the CD's if the songs weren't available for free? I can't find a single study on this, one where a reputable surveyor such as Gallup actually asks people that question. I think no one's run one because everyone is afraid of the truth - most of the downloads are people who want to try an artist out, or who can't find the music in print.
And if a percentage of that 1.8 billion is because people are downloading a current hit by Britney or In Sync, who's to say it really hurt their sales? Soft statistics are easily manipulated. How many of those people went out and bought an album that had been over-played at radio for months, just because they downloaded a portion of it? Sales of blank CDs have grown? You bet. I bought a new Vaio computer in December (ironically enough, made by Sony), and now back up all my files onto CD. I go through 7-15 CD's a week that way, or about 500 a year. Most new PC's come with XP, which makes backing up to CD painless; how many people are doing what I'm doing?
2. Additionally, when I buy a new CD, I make a copy for my car, a copy for upstairs, and a copy for my partner. That's three blank discs per CD. So I alone account for around 750 blank CDs yearly. I'm sure the sales decrease had nothing to do with the economy's decrease, or a steady downward spiral in the music industry, or the garbage being pushed by record companies. Aren't you?
There were 32,000 new titles released in this country in 2001, and that's not including re-issues, DIY's, or smaller labels that don't report to SoundScan. Our "Unreleased" series, which we haven't bothered SoundScanning, sold 6,000+ copies last year. A conservative estimate would place the number of "newly available" CD's per year at 100,000. That's an awful lot of releases for an industry that's being destroyed.
And to make matters worse, we hear music everywhere, whether we want to or not; stores, amusement parks, highway rest stops. The original concept of Muzak (to be played in elevators so quietly that its soothing effect would be subliminal) has run amok. Why buy records when you can learn the entire Top 40 just by going shopping for groceries?
3. Which music consumers? College kids who can't afford to buy 10 new CDs a month, but want to hear their favorite groups? When I bought my nephews a new Backstreet Boys CD, I asked why they hadn't downloaded it instead. They patiently explained to their senile aunt that the download wouldn't give them the cool artwork, and more important, the video they could see only on the CD.
Realistically, why do most people download music? To hear new music, or records that have been deleted and are no longer available for purchase. Not to avoid paying $5 at the local used CD store, or taping it off the radio, but to hear music they can't find anywhere else. Face it - most people can't afford to spend $15.99 to experiment. That's why listening booths (which labels fought against, too) are such a success.
You can't hear new music on radio these days; I live in Nashville, "Music City USA", and we have exactly one station willing to play a non-top-40 format. On a clear day, I can even tune it in. The situation's not much better in Los Angeles or New York. College stations are sometimes bolder, but their wattage is so low that most of us can't get them.
One other major point: in the hysteria of the moment, everyone is forgetting the main way an artist becomes successful - exposure. Without exposure, no one comes to shows, no one buys CDs, no one enables you to earn a living doing what you love. Again, from personal experience: in 37 years as a recording artist, I've created 25+ albums for major labels, and I've never once received a royalty check that didn't show I owed them money. So I make the bulk of my living from live touring, playing for 80-1500 people a night, doing my own show. I spend hours each week doing press, writing articles, making sure my website tour information is up to date. Why? Because all of that gives me exposure to an audience that might not come otherwise. So when someone writes and tells me they came to my show because they'd downloaded a song and gotten curious, I am thrilled!
Who gets hurt by free downloads? Save a handful of super-successes like Celine Dion, none of us. We only get helped.
But not to hear Congress tell it. Senator Fritz Hollings, chairman of the Senate Commerce Committee studying this, said "When Congress sits idly by in the face of these [file-sharing] activities, we essentially sanction the Internet as a haven for thievery", then went on to charge "over 10 million people" with stealing. [Steven Levy, Newsweek 3/11/02]. That's what we think of consumers - they're thieves, out to get something for nothing.
Baloney. Most consumers have no problem paying for entertainment. One has only to look at the success of Fictionwise.com and the few other websites offering books and music at reasonable prices to understand that. If the music industry had a shred of sense, they'd have addressed this problem seven years ago, when people like Michael Camp were trying to obtain legitimate licenses for music online. Instead, the industry-wide attitude was "It'll go away". That's the same attitude CBS Records had about rock 'n' roll when Mitch Miller was head of A&R. (And you wondered why they passed on The Beatles and The Rolling Stones.)
I don't blame the RIAA for Holling's attitude. They are, after all, the Recording Industry Association of America, formed so the labels would have a lobbying group in Washington. (In other words, they're permitted to make contributions to politicians and their parties.) But given that our industry's success is based on communication, the industry response to the Internet has been abysmal. Statements like the one above do nothing to help the cause.
Of course, communication has always been the artist's job, not the executives'. That's why it's so scary when people like current NARAS president Michael Greene begin using shows like the Grammy Awards to drive their point home.
Grammy viewership hit a six-year low in 2002. Personally, I found the program so scintillating that it made me long for Rob Lowe dancing with Snow White, which at least was so bad that it was entertaining. Moves like the ridiculous Elton John-Eminem duet did little to make people want to watch again the next year. And we're not going to go into the Los Angeles Times' Pulitzer Prize-winning series on Greene and NARAS, where they pointed out that MusiCares has spent less than 10% of its revenue on disbursing emergency funds for people in the music industry (its primary purpose), or that Greene recorded his own album, pitched it to record executives while discussing Grammy business, then negotiated a $250,000 contract with Mercury Records for it (later withdrawn after the public flap). Or that NARAS quietly paid out at least $650,000 to settle a sexual harassment suit against him, a portion of which the non-profit Academy paid. Or that he's paid two million dollars a year, along with "perks" like his million-dollar country club membership and Mercedes. (Though it does make one wonder when he last entered a record store and bought something with his own hard-earned money.)
Let's just note that in his speech he told the viewing audience that NARAS and RIAA were, in large part, taking their stance to protect artists. He hired three teenagers to spend a couple of days doing nothing but downloading, and they managed to download "6,000 songs". Come on. For free "front-row seats" at the Grammys and an appearance on national TV, I'd download twice that amount! But…who's got time to download that many songs? Does Greene really think people out there are spending twelve hours a day downloading our music? If they are, they must be starving to death, because they're not making a living or going to school. How many of us can afford a T-1 line?
This sort of thing is indicative of the way statistics and information are being tossed around. It's dreadful to think that consumers are being asked to take responsibility for the industry's problems, which have been around far longer than the Internet. It's even worse to think that the consumer is being told they are charged with protecting us, the artists, when our own industry squanders the dollars we earn on waste and personal vendettas.
Greene went on to say that "Many of the nominees here tonight, especially the new, less-established artists, are in immediate danger of being marginalized out of our business." Right. Any "new" artist who manages to make the Grammys has millions of dollars in record company money behind them. The "real" new artists aren't people you're going to see on national TV, or hear on most radio. They're people you'll hear because someone gave you a disc, or they opened at a show you attended, or were lucky enough to be featured on NPR or another program still open to playing records that aren't already hits.
As to artists being "marginalized out of our business," the only people being marginalized out are the employees of our Enron-minded record companies, who are being fired in droves because the higher-ups are incompetent.
And it's difficult to convince an educated audience that artists and record labels are about to go down the drain because they, the consumer, are downloading music. Particularly when they're paying $50-$125 apiece for concert tickets, and $15.99 for a new CD they know costs less than a couple of dollars to manufacture and distribute.
I suspect Greene thinks of downloaders as the equivalent of an old-style television drug dealer, lurking next to playgrounds, wearing big coats and whipping them open for wide-eyed children who then purchase black market CD's at generous prices.
What's the new industry byword? Encryption. They're going to make sure no one can copy CDs, even for themselves, or download them for free. Brilliant, except that it flouts previous court decisions about blank cassettes, blank videotapes, etc. And it pisses people off.
How many of you know that many car makers are now manufacturing all their CD players to also play DVD's? Or that part of the encryption record companies are using doesn't allow your store-bought CD to be played on a DVD player, because that's the same technology as your computer? And if you've had trouble playing your own self-recorded copy of O Brother Where Art Thou in the car, it's because of this lunacy.
The industry's answer is to put on the label: "This audio CD is protected against unauthorized copying. It is designed to play in standard audio CD players and computers running Windows O/S; however, playback problems may be experienced. If you experience such problems, return this disc for a refund."
Now I ask you. After three or four experiences like that, shlepping to the store to buy it, then shlepping back to return it (and you still don't have your music), who's going to bother buying CD's?
The industry has been complaining for years about the stranglehold the middle-man has on their dollars, yet they wish to do nothing to offend those middle-men. (BMG has a strict policy for artists buying their own CDs to sell at concerts - $11 per CD. They know very well that most of us lose money if we have to pay that much; the point is to keep the big record stores happy by ensuring sales go to them. What actually happens is no sales to us or the stores.) NARAS and RIAA are moaning about the little mom & pop stores being shoved out of business; no one worked harder to shove them out than our own industry, which greeted every new Tower or mega-music store with glee, and offered steep discounts to Target and WalMart et al for stocking CDs. The Internet has zero to do with store closings and lowered sales.
And for those of us with major label contracts who want some of our music available for free downloading… well, the record companies own our masters, our outtakes, even our demos, and they won't allow it. Furthermore, they own our voices for the duration of the contract, so we can't even post a live track for downloading!
If you think about it, the music industry should be rejoicing at this new technological advance! Here's a fool-proof way to deliver music to millions who might otherwise never purchase a CD in a store. The cross-marketing opportunities are unbelievable. It's instantaneous, costs are minimal, shipping non-existent…a staggering vehicle for higher earnings and lower costs.
Instead, they're running around like chickens with their heads cut off, bleeding on everyone and making no sense.
As an alternative to encrypting everything, and tying up money for years (potentially decades) fighting consumer suits demanding their first amendment rights be protected (which have always gone to the consumer, as witness the availability of blank and unencrypted VHS tapes and cassettes), why not take a tip from book publishers and writers? Baen Free Library is one success story. SFWA is another. The SFWA site is one of the best out there for hands-on advice to writers, featuring in-depth articles about everything from agent and publisher scams, to a continuously updated series of reports on various intellectual property issues. More important, many of the science fiction writers it represents have been heavily involved in the Internet since its inception. Each year, when the science fiction community votes for the Hugo and Nebula Awards (their equivalent of the Grammys), most of the works nominated are put on the site in their entirety, allowing voters and non-voters the opportunity to peruse them. Free. If you are a member or associate (at a nominal fee), you have access to even more works. The site is also full of links to members' own web pages and on-line stories, even when they aren't nominated for anything. Reading this material, again for free, allows browsers to figure out which writers they want to find more of - and buy their books.
Wouldn't it be nice if all the records nominated for awards each year were available for free downloading, even if it were only the winners? People who hadn't bought the albums might actually listen to the singles, then go out and purchase the records.
I have no objection to Greene et al trying to protect the record labels, who are the ones fomenting this hysteria. RIAA is funded by them. NARAS is supported by them. However, I object violently to the pretense that they are in any way doing this for our benefit. If they really wanted to do something for the great majority of artists, who eke out a living against all odds, they could tackle some of the real issues facing us:
* The normal industry contract is for seven albums, with no end date, which would be considered at best indentured servitude (and at worst slavery) in any other business. In fact, it would be illegal.
* A label can shelve your project, then extend your contract by one more album because what you turned in was "commercially or artistically unacceptable". They alone determine that criteria.
* Singer-songwriters have to accept the "Controlled Composition Clause" (which dictates that they'll be paid only 75% of the rates set by Congress in publishing royalties) for any major or subsidiary label recording contract, or lose the contract. Simply put, the clause demanded by the labels provides that a) if you write your own songs, you will only be paid 3/4 of what Congress has told the record companies they must pay you, and b) if you co-write, you will use your "best efforts" to ensure that other songwriters accept the 75% rate as well. If they refuse, you must agree to make up the difference out of your share.
* Congressionally set writer/publisher royalties have risen from their 1960's high (2 cents per side) to a munificent 8 cents. Many of us began in the 50's and 60's; our records are still in release, and we're still being paid royalty rates of 2% (if anything) on them. If we're not songwriters, and not hugely successful commercially (as in platinum-plus), we don't make a dime off our recordings. Recording industry accounting procedures are right up there with films.
* Worse yet, when records go out-of-print, we don't get them back! We can't even take them to another company. Careers have been deliberately killed in this manner, with the record company refusing to release product or allow the artist to take it somewhere else.
* And because a record label "owns" your voice for the duration of the contract, you can't go somewhere else and re-record those same songs they turned down.
* And because of the re-record provision, even after your contract is over, you can't record those songs for someone else for years, and sometimes decades.
* Last but not least, America is the only country I am aware of that pays no live performance royalties to songwriters. In Europe, Japan, Australia, when you finish a show, you turn your set list in to the promoter, who files it with the appropriate organization, and then pays a small royalty per song to the writer. It costs the singer nothing, the rates are based on venue size, and it ensures that writers whose songs no longer get airplay, but are still performed widely, can continue receiving the benefit from those songs.
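To make the Controlled Composition Clause concrete, here is a minimal sketch of the arithmetic in Python. The 8-cent statutory rate and the 75% controlled rate come from the points above; the ten-song album and 100,000 copies sold are hypothetical figures, chosen only to illustrate the scale of the difference.

# Minimal sketch of the Controlled Composition Clause arithmetic described above.
# The 8-cent statutory mechanical rate and the 75% controlled rate are taken from
# the article; the album length and sales figures are hypothetical illustrations.

STATUTORY_RATE = 0.08    # dollars per song per unit sold (the "8 cents" cited above)
CONTROLLED_SHARE = 0.75  # under the clause, labels pay 75% of the statutory rate

def mechanical_royalties(songs_on_album, units_sold, controlled):
    """Total songwriter mechanical royalties for one album release."""
    rate = STATUTORY_RATE * (CONTROLLED_SHARE if controlled else 1.0)
    return songs_on_album * units_sold * rate

full = mechanical_royalties(10, 100_000, controlled=False)   # $80,000.00
clause = mechanical_royalties(10, 100_000, controlled=True)  # $60,000.00
print(f"Full statutory rate:    ${full:,.2f}")
print(f"Controlled composition: ${clause:,.2f}")
print(f"Shortfall to the writer: ${full - clause:,.2f}")     # $20,000.00

On those hypothetical numbers, the clause costs the songwriter $20,000 in mechanical royalties on a single album.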
Additionally, we should be speaking up, and Congress should be listening. At this point they're only hearing from multi-platinum acts. What about someone like Ani Difranco, one of the most trusted voices in college entertainment today? What about those of us who live most of our lives outside the big corporate system, and who might have very different views on the subject?
There is zero evidence that material available for free online downloading is financially harming anyone. In fact, most of the hard evidence is to the contrary.
Greene and the RIAA are correct in one thing - these are times of great change in our industry. But at a time when there are arguably only four record labels left in America (Sony, AOL/Time/Warner, Universal, BMG - and where is the RICO Act when we need it?) … when entire genres are glorifying the gangster mentality and losing their biggest voices to violence … when executives change positions as often as Zsa Zsa Gabor changed clothes, and "A&R" has become a euphemism for "Absent & Redundant" … well, we have other things to worry about.
It's absurd for us, as artists, to sanction - or countenance - the shutting down of something like this. It's sheer stupidity to rejoice at the Napster decision. Short-sighted, and ignorant.
Free exposure is practically a thing of the past for entertainers. Getting your record played at radio costs more money than most of us dream of ever earning. Free downloading gives a chance to every do-it-yourselfer out there. Every act that can't get signed to a major, for whatever reason, can reach literally millions of new listeners, enticing them to buy the CD and come to the concerts. Where else can a new act, or one that doesn't have a label deal, get that kind of exposure?
Please note that I am not advocating indiscriminate downloading without the artist's permission. I am not saying copyrights are meaningless. I am objecting to the RIAA spin that they are doing this to protect "the artists", and make us more money. I am annoyed that so many records I once owned are out of print, and the only place I could find them was Napster. Most of all, I'd like to see an end to the hysteria that causes a group like RIAA to spend over 45 million dollars in 2001 lobbying "on our behalf", when every record company out there is complaining that they have no money.
We'll turn into Microsoft if we're not careful, folks, insisting that any household wanting an extra copy for the car, the kids, or the portable CD player, has to go out and "license" multiple copies.
As artists, we have the ear of the masses. We have the trust of the masses. By speaking out in our concerts and in the press, we can do a great deal to damp this hysteria, and put the blame for the sad state of our industry right back where it belongs - in the laps of record companies, radio programmers, and our own apparent inability to organize ourselves in order to better our own lives - and those of our fans. If we don't take the reins, no one will.
Sources:
Baenbooks.com, BMG Records, Chicago Tribune, CNN.com, Congressional Record, Eonline.com, Grammy.com, LATimes.com, Newsweek, Radiocrow.com, RIAA.org, personal communications
* For more information on the Free Library, go to www.baen.com/library.
Read Janis' follow-up to this article: FALLOUT - a follow up to The Internet Debacle
This article has been revised to ensure factual accuracy.
Author's note: You are welcome to post this article on any cooperating website, or in any print magazine, provided that you include a link directed to http://www.janisian.com and writer's credit!
Additionally, we put our money where our mouths are. We offer songs in mp3 format for free downloading... and if we can ever afford the server space, we'll try to put a bunch of them up there at once! These are songs to which I own and control both the copyright and master; you are welcome to share these files with your friends.
Want to know how your politicians are voting on these issues? Go to www.vote-smart.org/
Write to your representative and be heard on this subject!