So what is meant by Web 2.0? Is it really anything fundamentally new, or just an advancement in the use of web technology? Really just the latter. Web 2.0 is about neat, tidy websites that put the reader first: more minimalist in design, and offering services beyond mere content. Probably the most obvious difference between Web 2.0 and what came before is the way readers can become contributors to websites.
Web 2.0 refers to a supposed second generation of Internet-based services — such as social networking sites, wikis, communication tools, and folksonomies — that let people collaborate and share information online in previously unavailable ways. O’Reilly Media, in collaboration with MediaLive International, used the phrase as a title for a series of conferences, and since 2004 it has become a popular (though ill-defined and often criticized) buzzword amongst certain technical and marketing communities.
Alluding to the version numbers that commonly designate software upgrades, the phrase “Web 2.0” hints at an improved form of the World Wide Web, and some people have used the term in this sense for several years.
In the opening talk of the first Web 2.0 conference, Tim O’Reilly and John Battelle summarized key principles they believed characterized Web 2.0 applications:
- The Web as a platform
- Data as the driving force
- Network effects created by an architecture of participation
- Innovation in assembly of systems and sites composed by pulling together features from distributed, independent developers (a kind of “open source” development)
- Lightweight business models enabled by content and service syndication
- The end of the software adoption cycle (“the perpetual beta”)
- Software above the level of a single device, leveraging the power of the Long Tail
Earlier users of the phrase “Web 2.0” employed it as a synonym for “semantic web,” and indeed the two concepts complement each other. The combination of social-networking systems such as FOAF and XFN with the development of tag-based folksonomies, delivered through blogs and wikis, sets up a basis for a semantic environment. Although the technologies and services that make up Web 2.0 lack the power of a web in which machines can understand and extract meaning (as proponents of the Semantic Web envision), Web 2.0 represents a step in its direction.
As used by its proponents, the phrase “Web 2.0” refers to one or more of the following:
- The transition of websites from isolated information silos to sources of content and functionality, thus becoming computing platforms serving web applications to end users
- A social phenomenon embracing an approach to generating and distributing Web content itself, characterized by open communication, decentralization of authority, freedom to share and re-use, and “the market as a conversation”
- More organized and categorized content, with a far more developed deep-linking web architecture
- A shift in the economic value of the Web, possibly surpassing that of the dot-com boom of the late 1990s
- A marketing term used to differentiate new web businesses from those of the dot-com boom, which the subsequent bust left looking discredited
- The resurgence of excitement around the implications of innovative web applications and services, which gained considerable momentum around mid-2005
Many find it easiest to define Web 2.0 by associating it with companies or products that embody its principles. Tim O’Reilly gave examples in his description of his “four plus one” levels in the hierarchy of Web 2.0-ness:[1]
Level-3 applications, the most “Web 2.0”, which could only exist on the Internet, deriving their power from the human connections and network effects Web 2.0 makes possible, and growing in effectiveness the more people use them. O’Reilly gives as examples: eBay, craigslist, Wikipedia, del.icio.us, Skype, dodgeball, and AdSense.
Level-2 applications, which can operate offline but which gain advantages from going online. O’Reilly cited Flickr, which benefits from its shared photo-database and from its community-generated tag database.
Level-1 applications, also available offline but which gain features online. O’Reilly pointed to Writely (gaining group-editing capability online) and iTunes (because of its music-store portion).
Level-0 applications would work as well offline. O’Reilly gave the examples of MapQuest, Yahoo! Local, and Google Maps.
Examples of Web 2.0 other than those cited by O’Reilly include Digg, Pligg, Shoutwire, last.fm, and Technorati.
Commentators see many recently-developed concepts and technologies as contributing to Web 2.0, including weblogs, linklogs, wikis, podcasts, RSS feeds and other forms of many-to-many publishing; social software, web APIs, web standards, online web services, and others.
Proponents of the Web 2.0 concept say that it differs from early web development (retrospectively labeled Web 1.0) in that it moves away from static websites, the use of search engines, and surfing from one website to the next, towards a more dynamic and interactive World Wide Web. Others argue that later developments have not actually superseded the original and fundamental concepts of the web. Skeptics may see the term “Web 2.0” as little more than a buzzword, or may suggest that it means whatever its proponents want it to mean in order to convince customers, investors and the media that they have begun building something fundamentally new, rather than continuing to develop and use well-established technologies[2].
Earlier web applications, or “Web 1.0” (so dubbed after the event by proponents of Web 2.0), often consisted of static HTML pages that were rarely (if ever) updated. They depended solely on HTML, which a new Internet content provider could learn fairly easily. The success of the dot-com era depended on a more dynamic Web (sometimes labeled Web 1.5) in which content-management systems served dynamic HTML web pages, generated on the fly from a content database that was more amenable to change than raw HTML code. In both senses, marketers regarded so-called “eyeballing” as intrinsic to the Web experience, making page hits and visual aesthetics important factors.
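The static-to-dynamic shift described above can be sketched in a few lines; the content store and template below are hypothetical stand-ins for a real content-management system, not any particular product:

```python
# Minimal sketch of "Web 1.5"-style dynamic publishing: HTML is
# rendered on the fly from a content store instead of being
# hand-written as static files. The store and template are
# illustrative only.

articles = {  # stands in for a content database
    "welcome": {"title": "Welcome", "body": "Hello, reader."},
    "news": {"title": "Site News", "body": "We redesigned the site."},
}

TEMPLATE = (
    "<html><head><title>{title}</title></head>"
    "<body><h1>{title}</h1><p>{body}</p></body></html>"
)

def render_page(slug):
    """Build the HTML for one page from the content store."""
    entry = articles[slug]
    return TEMPLATE.format(**entry)

print(render_page("welcome"))
```

Editing the content database changes every future page view, with no HTML files to touch, which is the property that made this model more amenable to change than raw HTML.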
Proponents of the Web 2.0 approach believe that Web usage has begun to move increasingly towards interaction and towards rudimentary social networks, which can serve content that exploits network effects with or without creating a visual, interactive web page. In one view, Web 2.0 sites act more as points of presence, or user-dependent web portals, than as traditional websites. Such sites have become so internally complex that new Internet users cannot create analogous websites, but remain mere users of web services provided by specialist professional experts.
Access to consumer-generated content facilitated by Web 2.0 brings the web closer to Tim Berners-Lee’s original concept of the web as a democratic, personal, and DIY medium of communication.
Characteristics of Web 2.0
While interested parties continue to debate the definition of a Web 2.0 application, some suggest that a Web 2.0 website may exhibit some basic characteristics. These might include:
- “Network as platform” — delivering (and allowing users to use) applications entirely through a web browser[3][4]
- Users owning the data on the site and exercising control over that data[5][3]
- An architecture of participation and democracy that encourages users to add value to the application as they use it[3][6]
- A rich, interactive, user-friendly interface based on Ajax[3][6]
- Some social-networking aspects[5][3]
Technology overview
The complex and evolving technology infrastructure of Web 2.0 includes server software, content syndication, messaging protocols, standards-based browsers with plugins and extensions, and various client applications. These differing but complementary approaches provide Web 2.0 with information-storage, creation, and dissemination capabilities that go beyond what the public formerly expected of websites.
A Web 2.0 website typically features a number of the following techniques:
- Ajax and other rich Internet application techniques
- CSS
- Semantically valid XHTML markup and/or the use of Microformats
- Syndication and aggregation of data in RSS/Atom
- Clean and meaningful URLs
- Extensive use of folksonomies (in the form of tags or tagclouds, for example)
- Weblog publishing
- Mashups
- REST or XML web-service APIs
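As one illustration of the techniques above, a folksonomy-driven tag cloud boils down to counting user-applied tags and turning the counts into relative weights; the tag data here is invented for the sketch:

```python
# Illustrative folksonomy sketch: user-applied tags are counted and
# mapped to relative weights -- the computation behind a simple tag
# cloud. The (user, item, tag) triples are made up for the example,
# in the shape a social-bookmarking site might store.
from collections import Counter

taggings = [
    ("alice", "page1", "ajax"), ("bob", "page1", "ajax"),
    ("alice", "page2", "rss"), ("carol", "page1", "web2.0"),
    ("bob", "page3", "ajax"), ("carol", "page3", "rss"),
]

def tag_weights(triples):
    """Return each tag's share of all taggings, for sizing in a cloud."""
    counts = Counter(tag for _, _, tag in triples)
    total = sum(counts.values())
    return {tag: n / total for tag, n in counts.items()}

weights = tag_weights(taggings)  # e.g. "ajax" gets the largest weight
```

No central taxonomy is involved: the categories emerge entirely from what users choose to type, which is what distinguishes a folksonomy from a conventional classification scheme.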
Innovations associated with “Web 2.0” – Web-based communities
Some websites that potentially sit under the Web 2.0 umbrella have built new online social networks amongst the general public. Some run social software through which people work together; some reproduce several individuals’ RSS feeds on one page; others provide deep linking between individual websites.
The syndication and messaging capabilities of Web 2.0 have fostered, to a greater or lesser degree, a tightly woven social fabric among individuals. Arguably, the nature of online communities has changed in recent months and years. The meaning of these inferred changes, however, has pundits divided. The ideological lines run roughly as follows: Web 2.0 either empowers the individual and provides an outlet for the “voice of the voiceless”, or it elevates the amateur to the detriment of professionalism, expertise and clarity.
Web-based applications and desktops
The richer user experience afforded by Ajax has prompted the development of websites that mimic personal computer applications, such as word processors, spreadsheets, and slide-show presentations. WYSIWYG wiki sites replicate many features of PC authoring applications. Still other sites perform collaboration and project-management functions. Java enables sites that provide computation-intensive video capability. Google, Inc. acquired one of the best-known sites of this broad class, Writely, in early 2006.
Several browser-based “operating systems” or “online desktops” have also appeared. They essentially function as application platforms, not as operating systems per se. These services mimic the user experience of desktop operating-systems, offering features and applications similar to a PC environment. They have as their distinguishing characteristic the ability to run within any modern browser.
Numerous web-based application services appeared during the dot-com bubble of 1997–2001 and then vanished, having failed to gain a critical mass of customers. In 2005 WebEx acquired the best-known of these, Intranets.com, for slightly more than the total it had raised in venture capital after six years of trading.
Rich Internet applications
Recently, rich Internet application techniques such as Ajax, Adobe Flash and Flex have evolved that can improve the user experience in browser-based web applications. Ajax involves a web page requesting an update for some part of its content, and altering that part in the browser, without refreshing the whole page at the same time.
Server-side software
The functionality of Web 2.0 rich Internet applications builds on the existing web server architecture, but puts much greater emphasis on back-end software. Syndication differs only nominally from the methods of publishing using dynamic content management, but web services typically require much more robust database and workflow support, and become very similar to the traditional intranet functionality of an application server. Vendor approaches to date fall under either a universal server approach, which bundles most of the necessary functionality in a single server platform, or a web-server plugin approach, which uses standard publishing tools enhanced with API interfaces and other tools.
Client-side software
The extra functionality provided by Web 2.0 depends on the ability of users to work with the data stored on servers. This can come about through forms in an HTML page, through a scripting language such as JavaScript, or through Flash or Java. These methods all make use of the client computer to reduce the server workload.
RSS
The first and, from one point of view, most important step in the evolution towards Web 2.0 involves the syndication of website content, using standardized protocols that permit end-users to make use of a site’s data in another context, ranging from another website to a browser plugin or a separate desktop application. Protocols which permit syndication include RSS (Really Simple Syndication, also known as web syndication), RDF (as in RSS 1.1), and Atom, all of them flavors of XML. Specialized protocols such as FOAF and XFN (both for social networking) extend the functionality of sites or permit end-users to interact without centralized websites. See microformats for more specialized data formats.
Due to the recent development of these trends, many of these protocols remain de facto (rather than formal) standards.
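A minimal sketch of consuming a syndicated feed: the RSS 2.0 document below is inlined as a stand-in for one fetched over HTTP, and the item titles are extracted with a standard-library XML parser.

```python
# Parse a small RSS 2.0 feed and pull out the item titles -- the
# core of what an aggregator or browser plugin does with syndicated
# content. The feed itself is a made-up example.
import xml.etree.ElementTree as ET

FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Weblog</title>
    <item><title>First post</title><link>http://example.com/1</link></item>
    <item><title>Second post</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

def item_titles(feed_xml):
    """Return the title of every <item> in an RSS 2.0 document."""
    root = ET.fromstring(feed_xml)
    return [item.findtext("title") for item in root.iter("item")]

titles = item_titles(FEED)
```

Because the feed is plain XML over HTTP, the same few lines work whether the consumer is another website, a desktop aggregator, or a browser extension, which is precisely the re-use in another context that syndication enables.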
Web protocols
Web communication protocols provide a key element of the Web 2.0 infrastructure. Major protocols include REST and SOAP.
- REST (Representational State Transfer) indicates a way to access and manipulate data on a server using the HTTP verbs GET, POST, PUT, and DELETE.
- SOAP involves POSTing XML messages and requests to a server; the messages may contain quite complex, but pre-defined, instructions for the server to follow.
In both cases, an API defines access to the service. Often servers use proprietary APIs, but standard web-service APIs (for example, for posting to a blog) have also come into wide use. Most (but not all) communications with web services involve some form of XML (eXtensible Markup Language).
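The REST style described above can be sketched as a dispatch from HTTP verbs to operations on an in-memory resource store; the paths and representations here are invented for the example, and a real service would of course sit behind an HTTP server:

```python
# Illustrative REST sketch: the four HTTP verbs mapped onto a simple
# in-memory store of resources, keyed by path. Only the dispatch
# logic is modeled; networking, headers, and content types are left out.

resources = {}  # path -> stored representation

def handle(verb, path, body=None):
    """Dispatch one request the way a minimal REST API might."""
    if verb == "GET":
        return resources.get(path, "404 Not Found")
    if verb in ("PUT", "POST"):
        resources[path] = body   # create or replace the resource
        return "200 OK"
    if verb == "DELETE":
        resources.pop(path, None)
        return "200 OK"
    return "405 Method Not Allowed"

handle("PUT", "/posts/1", "Hello, Web 2.0")
```

The point of the style is that the verb carries the operation and the URL names the resource, so the same small set of methods covers every resource the service exposes.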
See also WSDL (Web Services Description Language), the standard way of publishing a SOAP API, and the list of Web service specifications for links to many other web service standards, including the many whose names begin with ‘WS-’.
Criticism
Given the lack of set standards as to what “Web 2.0” actually means, implies, or requires, the term can mean radically different things to different people. For instance, many people pushing Web 2.0 talk about well-formed, validated HTML; however, few production sites actually adhere to this standard. Many people will also talk about websites “degrading gracefully” (designing a website so that its fundamental features remain usable by people who access it with software that does not support every technology employed by the site); however, the addition of Ajax scripting can render a website completely unusable to anyone browsing with JavaScript turned off, or using a slightly older browser. Many have complained that the proliferation of Ajax scripts, in combination with unknowledgeable webmasters, has increased the instances of “tag soup”: websites where coders have apparently thrown script tags and other semantically useless tags about the HTML file with little organization in mind, in a way that occurred more commonly during the dot-com boom, and which many standards proponents have tried to avoid. Some critics also object to cluttered, arcane navigation structures in Web 2.0 websites.
Many of the ideas of Web 2.0 featured on networked systems well before the term “Web 2.0″ emerged; Amazon.com, for instance, has allowed users to write reviews and consumer guides since its inception, in a form of self-publishing; and opened up its API to outside developers in 2002[7]. Prior art also comes from research in Computer Supported Collaborative Learning and Computer Supported Cooperative Work.
Conversely, when a website proclaims itself “Web 2.0” for the use of some trivial feature (such as blogs or gradient boxes), observers may generally consider it more an attempt at self-promotion than an actual endorsement of the ideas behind Web 2.0. “Web 2.0” in such circumstances has sometimes sunk to the status of a mere marketing buzzword, like ‘synergy’, that can mean whatever a salesperson wants it to mean, with little connection to most of the worthy but (currently) unrelated ideas originally brought together under the “Web 2.0” banner. The argument also exists that “Web 2.0” does not represent a new version of the World Wide Web at all, but merely continues to use “Web 1.0” technologies and concepts.
Other criticism has included the term “a second bubble”, suggesting that too many Web 2.0 companies attempt to develop the same product while lacking a viable business model. The Economist magazine has written of “Bubble 2.0”.
Some venture capitalists have noted that the second generation of web applications has too few users to make it an economically viable target for consumer applications. Josh Kopelman noted that Web 2.0 excited only 53,651 people (the then number of subscribers to TechCrunch, a weblog covering Web 2.0 matters).
Trademark controversy
In November 2003, CMP Media applied to the USPTO for a service mark on the use of the term “WEB 2.0” for live events[8]. On the basis of this application, CMP Media sent a cease-and-desist demand to the Irish non-profit organization IT@Cork on May 24, 2006[9], but retracted it two days later[10]. The “WEB 2.0” service mark registration passed final PTO Examining Attorney review on May 10, 2006, but as of June 12, 2006 the PTO had not published the mark for opposition. The European Union application (which would confer unambiguous status in Ireland) remains pending (app no 004972212) after its filing on March 23, 2006.
References
^ Tim O’Reilly (2006-07-17). Levels of the Game: The Hierarchy of Web 2.0 Applications. O’Reilly radar. Retrieved on 2006-08-08.
^ Jeffrey Zeldman (2006-01-16). Web 3.0. A List Apart. Retrieved on 2006-05-27.
^ a b c d e Tim O’Reilly (2005-09-30). What Is Web 2.0. O’Reilly Network. Retrieved on 2006-08-06.
^ Web operating system
^ a b Dion Hinchcliffe (2006-04-02). The State of Web 2.0. Web Services Journal. Retrieved on 2006-08-06.
^ a b Paul Graham (November 2005). Web 2.0. Retrieved on 2006-08-02.
^ Tim O’Reilly (2002-06-18). Amazon Web Services API. O’Reilly Network. Retrieved on 2006-05-27.
^ USPTO serial number 78322306
^ O’Reilly and CMP Exercise Trademark on ‘Web 2.0’. Slashdot (2006-05-26). Retrieved on 2006-05-27.
^ Nathan Torkington (2006-05-26). O’Reilly’s coverage of Web 2.0 as a service mark. O’Reilly Radar. Retrieved on 2006-06-01.
This article is licensed under the GNU Free Documentation License. It uses material from the Wikipedia article “Web 2.0”.