Web 2.0

Web 2.0 describes World Wide Web websites that emphasize user-generated content, usability (ease of use, even by non-experts), and interoperability (this means that a website can work well with other products, systems and devices) for end users. The term was popularized by Tim O'Reilly and Dale Dougherty at the O'Reilly Media Web 2.0 Conference in late 2004, though it was coined by Darcy DiNucci in 1999. Web 2.0 does not refer to an update to any technical specification, but to changes in the way Web pages are designed and used.

A Web 2.0 website may allow users to interact and collaborate with each other in a social media dialogue as creators of user-generated content in a virtual community, in contrast to the first generation of Web 1.0-era websites where people were limited to the passive viewing of content. Examples of Web 2.0 include social networking sites and social media sites (e.g., Facebook), blogs, wikis, folksonomies ("tagging" keywords on websites and links), video sharing sites (e.g., YouTube), hosted services, Web applications ("apps"), collaborative consumption platforms, and mashup applications.

Whether Web 2.0 is substantively different from prior Web technologies has been challenged by World Wide Web inventor Tim Berners-Lee, who describes the term as jargon. His original vision of the Web was "a collaborative medium, a place where we [could] all meet and read and write". On the other hand, the term Semantic Web (sometimes referred to as Web 3.0) was coined by Berners-Lee to refer to a web of content where the meaning can be processed by machines.

"Web 1.0"

Web 1.0 is a retronym referring to the first stage of the World Wide Web's evolution. According to Cormode and Krishnamurthy (2008): "content creators were few in Web 1.0 with the vast majority of users simply acting as consumers of content." Personal web pages were common, consisting mainly of static pages hosted on ISP-run web servers, or on free web hosting services such as GeoCities. With the advent of Web 2.0, it became more common for the average web user to have social networking profiles on sites such as Myspace and Facebook, as well as personal blogs on one of the new low-cost web hosting services or a dedicated blog host like Blogger or LiveJournal. The content for both was generated dynamically from stored content, allowing readers to comment directly on pages in a way that was not previously common.

Some Web 2.0 capabilities were present in the days of Web 1.0, but they were implemented differently. For example, a Web 1.0 site may have had a guestbook page to publish visitor comments instead of a comment section at the end of each page, because server performance and bandwidth considerations meant that a long comment thread on every page could slow the whole site down. Terry Flew, in the third edition of New Media, described the differences between Web 1.0 and Web 2.0 as a:

"move from personal websites to blogs and blog site aggregation, from publishing to participation, from web content as the outcome of large up-front investment to an ongoing and interactive process, and from content management systems to links based on "tagging" website content using keywords (folksonomy)".

Flew believed that these factors formed the basic shift in trends that resulted in the onset of the Web 2.0 "craze".

Characteristics

Some design elements of a Web 1.0 site include:

  • Static pages instead of dynamic HTML. With static pages, the web user can read the text and look at digital photos or other images, but cannot modify the content or interact with it beyond following hyperlinks.
  • Content served from the server's filesystem instead of a relational database management system (RDBMS).
  • Pages built using Server Side Includes or Common Gateway Interface (CGI) instead of a web application written in a dynamic programming language such as Perl, PHP, Python or Ruby.
  • The use of HTML 3.2-era elements such as frames and tables to position and align elements on a page. These were often used in combination with spacer GIFs.
  • Proprietary HTML extensions, such as the <blink> and <marquee> tags, introduced during the first browser war.
  • Online guestbooks.
  • GIF buttons, graphics (typically 88x31 pixels in size) promoting web browsers, operating systems, text editors and various other products.
  • HTML forms sent via email. Support for server side scripting was rare on shared servers during this period. To provide a feedback mechanism for web site visitors, mailto forms were used. A user would fill in a form, and upon clicking the form's submit button, their email client would launch and attempt to send an email containing the form's details. The popularity and complications of the mailto protocol led browser developers to incorporate email clients into their browsers.
Web 2.0

    The term "Web 2.0" was first used in January 1999 by Darcy DiNucci, an information architecture consultant. In her article, "Fragmented Future", DiNucci writes:

    The Web we know now, which loads into a browser window in essentially static screenfuls, is only an embryo of the Web to come. The first glimmerings of Web 2.0 are beginning to appear, and we are just starting to see how that embryo might develop. The Web will be understood not as screenfuls of text and graphics but as a transport mechanism, the ether through which interactivity happens. It will [...] appear on your computer screen, [...] on your TV set [...] your car dashboard [...] your cell phone [...] hand-held game machines [...] maybe even your microwave oven.

    Writing when Palm Inc. was introducing its first Web-capable personal digital assistant, supporting Web access with WAP, DiNucci saw the Web "fragmenting" into a future that extended beyond the browser/PC combination it was identified with. She focused on how the basic information structure and hyperlinking mechanism introduced by HTTP would be used by a variety of devices and platforms. As such, her use of the "2.0" designation refers to a next version of the Web that does not directly relate to the term's current use.

    The term Web 2.0 did not resurface until 2002, when writers began focusing on the concepts now associated with it; as Scott Dietzen put it, "the Web becomes a universal, standards-based integration platform". In 2004, the term began its rise in popularity when O'Reilly Media and MediaLive hosted the first Web 2.0 conference. In their opening remarks, John Battelle and Tim O'Reilly outlined their definition of the "Web as Platform", where software applications are built upon the Web as opposed to upon the desktop. The unique aspect of this migration, they argued, is that "customers are building your business for you". They argued that the activities of users generating content (in the form of ideas, text, videos, or pictures) could be "harnessed" to create value. O'Reilly and Battelle contrasted Web 2.0 with what they called "Web 1.0", a term they associated with the business models of Netscape and the Encyclopædia Britannica Online. For example,

    Netscape framed "the web as platform" in terms of the old software paradigm: their flagship product was the web browser, a desktop application, and their strategy was to use their dominance in the browser market to establish a market for high-priced server products. Control over standards for displaying content and applications in the browser would, in theory, give Netscape the kind of market power enjoyed by Microsoft in the PC market. Much like the "horseless carriage" framed the automobile as an extension of the familiar, Netscape promoted a "webtop" to replace the desktop, and planned to populate that webtop with information updates and applets pushed to the webtop by information providers who would purchase Netscape servers.

    In short, Netscape focused on creating software, releasing updates and bug fixes, and distributing it to the end users. O'Reilly contrasted this with Google, a company that did not at the time focus on producing end-user software, but instead on providing a service based on data such as the links Web page authors make between sites. Google exploits this user-generated content to offer Web search based on reputation through its "PageRank" algorithm. Unlike software, which undergoes scheduled releases, such services are constantly updated, a process called "the perpetual beta". A similar difference can be seen between the Encyclopædia Britannica Online and Wikipedia: while the Britannica relies upon experts to write articles and releases them periodically in publications, Wikipedia relies on trust in (sometimes anonymous) community members to constantly write and edit content. Wikipedia editors are not required to have educational credentials, such as degrees, in the subjects they edit. Wikipedia is not based on subject-matter expertise, but rather on an adaptation of the open source software adage "given enough eyeballs, all bugs are shallow". This maxim states that if enough users are able to look at a software product's code (or a website), then these users will be able to fix any "bugs" or other problems. Wikipedia's volunteer editor community produces, edits and updates articles constantly. O'Reilly's Web 2.0 conferences have been held every year since 2004, attracting entrepreneurs, representatives from large companies, tech experts and technology reporters.

    The popularity of Web 2.0 was acknowledged when TIME magazine named "You" its 2006 Person of the Year; that is, TIME selected the masses of users who were participating in content creation on social networks, blogs, wikis, and media sharing sites. In the cover story, Lev Grossman explains:

    It's a story about community and collaboration on a scale never seen before. It's about the cosmic compendium of knowledge Wikipedia and the million-channel people's network YouTube and the online metropolis MySpace. It's about the many wresting power from the few and helping one another for nothing and how that will not only change the world but also change the way the world changes.

    Characteristics

    Instead of merely reading a Web 2.0 site, a user is invited to contribute to the site's content by commenting on published articles or creating a user account or profile on the site, which may enable increased participation. By increasing emphasis on these already-extant capabilities, Web 2.0 sites encourage the user to rely more on their browser for user interface, application software ("apps") and file storage facilities. This has been called "network as platform" computing. Major features of Web 2.0 include social networking websites, self-publishing platforms (e.g., WordPress' easy-to-use blog and website creation tools), "tagging" (which enables users to label websites, videos or photos in some fashion), "like" buttons (which enable a user to indicate that they are pleased by online content), and social bookmarking. Users can provide the data that is on a Web 2.0 site and exercise some control over that data. These sites may have an "architecture of participation" that encourages users to add value to the application as they use it. Users can add value in many ways, such as by commenting on a news story on a news website, by uploading a relevant photo on a travel website, or by adding a link to a video or TED talk which is pertinent to the subject being discussed on a website. Some scholars argue that cloud computing is an example of Web 2.0 because cloud computing is simply an implication of computing on the Internet.
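
As an illustration of the tagging mechanism described above, the following minimal TypeScript sketch shows one plausible way a folksonomy can be represented: individual users attach free-form tags to items, and an aggregate classification emerges from the collective counts. The item identifiers, user names and tags are invented for illustration and do not refer to any particular service.

```typescript
// Minimal folksonomy sketch: users attach free-form tags to items,
// and a collective classification emerges from the aggregate counts.
type Tagging = { user: string; item: string; tag: string };

// Hypothetical taggings collected from many users.
const taggings: Tagging[] = [
  { user: "alice", item: "video-42", tag: "death metal" },
  { user: "bob",   item: "video-42", tag: "live" },
  { user: "carol", item: "video-42", tag: "death metal" },
];

// Build item -> (tag -> count), i.e. the emergent "folk taxonomy".
function buildFolksonomy(data: Tagging[]): Map<string, Map<string, number>> {
  const index = new Map<string, Map<string, number>>();
  for (const { item, tag } of data) {
    const tags = index.get(item) ?? new Map<string, number>();
    tags.set(tag, (tags.get(tag) ?? 0) + 1);
    index.set(item, tags);
  }
  return index;
}

const folksonomy = buildFolksonomy(taggings);
console.log(folksonomy.get("video-42")); // Map { 'death metal' => 2, 'live' => 1 }
```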

    Web 2.0 offers almost all users the same freedom to contribute. While this opens the possibility for serious debate and collaboration, it also increases the incidence of "spamming", "trolling", and can even create a venue for racist hate speech, cyberbullying and defamation. The impossibility of excluding group members who do not contribute to the provision of goods (i.e., to the creation of a user-generated website) from sharing the benefits (of using the website) gives rise to the possibility that serious members will prefer to withhold their contribution of effort and "free ride" on the contributions of others. This requires what is sometimes called radical trust by the management of the Web site. According to Best, the characteristics of Web 2.0 are: rich user experience, user participation, dynamic content, metadata, Web standards, and scalability. Further characteristics, such as openness, freedom and collective intelligence by way of user participation, can also be viewed as essential attributes of Web 2.0. Some websites require users to contribute user-generated content to have access to the website, to discourage "free riding".

    The key features of Web 2.0 include:

    1. Folksonomy - free classification of information; allows users to collectively classify and find information (e.g. "tagging" of websites, images, videos or links)
    2. Rich user experience - dynamic content that is responsive to user input (e.g., a user can "click" on an image to enlarge it or find out more information)
    3. User participation - information flows two ways between site owner and site users by means of evaluation, review, and online commenting. Site users also typically create user-generated content for others to see (e.g., Wikipedia, an online encyclopedia that anyone can write articles for or edit)
    4. Software as a service (SaaS) - Web 2.0 sites developed APIs to allow automated usage, such as by a Web "app" (software application) or a mashup (see the sketch following this list)
    5. Mass participation - near-universal web access leads to differentiation of concerns, from the traditional Internet user base (who tended to be hackers and computer hobbyists) to a wider variety of users
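
The mashup mentioned in item 4 can be made concrete with a minimal TypeScript sketch that combines two hypothetical JSON APIs (a photo service and a geocoding service) into a single result. The endpoints and response shapes are assumptions for illustration only, not real services.

```typescript
// Hypothetical shapes returned by two separate, unrelated services.
interface Photo { id: string; title: string; place: string; }
interface Coordinates { lat: number; lon: number; }

// A mashup: pull recent photos from one API, enrich each with
// coordinates from a second API, and return the combined records.
async function photosWithCoordinates(): Promise<Array<Photo & Coordinates>> {
  const photos: Photo[] = await (await fetch("https://photos.example.com/api/recent")).json();
  return Promise.all(
    photos.map(async (photo) => {
      const coords: Coordinates = await (
        await fetch(`https://geo.example.com/api/lookup?q=${encodeURIComponent(photo.place)}`)
      ).json();
      return { ...photo, ...coords };
    })
  );
}

photosWithCoordinates().then((result) => console.log(result));
```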

    Comparison with Web 1.0

    In 2005, Tim O'Reilly and Dale Dougherty held a brainstorming session to elucidate the characteristics and components of the transition from Web 1.0 to Web 2.0 and what had changed.

    Technologies

    The client-side (Web browser) technologies used in Web 2.0 development include Ajax and JavaScript frameworks. Ajax programming uses JavaScript and the Document Object Model to update selected regions of the page without undergoing a full page reload. To allow users to continue to interact with the page, communications such as data requests going to the server are separated from data coming back to the page (asynchronously). Otherwise, the user would have to routinely wait for the data to come back before they could do anything else on that page, just as a user has to wait for a full page reload to complete. This can also improve the perceived performance of the site, since requests can be issued without blocking on, or queueing behind, the data being returned to the client. The data fetched by an Ajax request is typically formatted in XML or JSON (JavaScript Object Notation), two widely used structured data formats. Since both of these formats are natively understood by JavaScript, a programmer can easily use them to transmit structured data in their Web application. When this data is received via Ajax, the JavaScript program then uses the Document Object Model (DOM) to dynamically update the Web page based on the new data, allowing for a rapid and interactive user experience. In short, using these techniques, Web designers can make their pages function like desktop applications. For example, Google Docs uses this technique to create a Web-based word processor.
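
The following minimal sketch illustrates the Ajax pattern just described, written in TypeScript and using the modern fetch API rather than the older XMLHttpRequest object; the /api/comments endpoint, the response shape and the element id are hypothetical.

```typescript
// Hypothetical shape of the JSON returned by the server.
interface Comment {
  author: string;
  text: string;
}

// Request JSON asynchronously and patch only part of the page via the DOM,
// instead of reloading the whole document.
async function refreshComments(): Promise<void> {
  const response = await fetch("/api/comments"); // page stays interactive meanwhile
  const comments: Comment[] = await response.json();

  // Update only the comment list, not the rest of the page.
  const list = document.getElementById("comments");
  if (!list) return;
  list.innerHTML = "";
  for (const c of comments) {
    const item = document.createElement("li");
    item.textContent = `${c.author}: ${c.text}`;
    list.appendChild(item);
  }
}

refreshComments();
```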

    As a widely available plugin independent of W3C standards (the World Wide Web Consortium is the governing body of Web standards and protocols), Adobe Flash is capable of doing many things that were not possible pre-HTML5. Of Flash's many capabilities, the most commonly used is its ability to integrate streaming multimedia into HTML pages. With the introduction of HTML5 in 2010 and growing concerns with Flash's security, the role of Flash is decreasing. In addition to Flash and Ajax, JavaScript/Ajax frameworks have become a very popular means of creating Web 2.0 sites. At their core, these frameworks build on the same technologies: JavaScript, Ajax, and the DOM. However, frameworks smooth over inconsistencies between Web browsers and extend the functionality available to developers. Many of them also come with customizable, prefabricated 'widgets' that accomplish such common tasks as picking a date from a calendar, displaying a data chart, or making a tabbed panel. On the server side, Web 2.0 uses many of the same technologies as Web 1.0. Languages such as Perl, PHP, Python and Ruby, as well as Enterprise Java (J2EE) and the Microsoft .NET Framework, are used by developers to output data dynamically using information from files and databases. This allows websites and web services to share machine-readable formats such as XML (Atom, RSS, etc.) and JSON. When data is available in one of these formats, another website can use it to integrate a portion of that site's functionality.
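
On the server side, the idea of dynamically outputting stored content in a machine-readable format can be sketched as follows. This is a minimal illustration assuming Node.js and TypeScript; the article data and the /feed.json path are invented for the example, and a real site would read from a file or database rather than an in-memory array.

```typescript
// Minimal sketch: serve stored content as machine-readable JSON that
// another site, feed reader or mashup could consume.
import { createServer } from "node:http";

// Stand-in for content that would normally live in a file or database.
const articles = [
  { title: "Hello Web 2.0", url: "https://example.com/hello", published: "2006-01-01" },
];

createServer((req, res) => {
  if (req.url === "/feed.json") {
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ items: articles }));
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);
```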

    Concepts

    Web 2.0 can be described in three parts:

  • Rich Internet application (RIA) — defines the experience brought from desktop to browser, whether it is "rich" from a graphical point of view or a usability/interactivity or features point of view.
  • Web-oriented architecture (WOA) — defines how Web 2.0 applications expose their functionality so that other applications can leverage and integrate the functionality providing a set of much richer applications. Examples are feeds, RSS feeds, web services, mashups.
  • Social Web — defines how Web 2.0 websites tend to interact much more with the end user and make the end user an integral part of the website, either by adding her profile, adding comments on content, uploading new content, or adding user-generated content (e.g., personal digital photos).

    As such, Web 2.0 draws together the capabilities of client- and server-side software, content syndication and the use of network protocols. Standards-oriented Web browsers may use plug-ins and software extensions to handle the content and the user interactions. Web 2.0 sites provide users with information storage, creation, and dissemination capabilities that were not possible in the environment now known as "Web 1.0".

    Web 2.0 sites include the following features and techniques, referred to as the acronym SLATES by Andrew McAfee:

    Search
    Finding information through keyword search.
    Links to other websites
    Connects information sources together using the model of the Web.
    Authoring
    The ability to create and update content leads to the collaborative work of many authors. Wiki users may extend, undo, redo and edit each other's work. Comment systems allow readers to contribute their viewpoints.
    Tags
    Categorization of content by users adding "tags" — short, usually one- or two-word descriptions — to facilitate searching. For example, a user can tag a metal song as "death metal". Collections of tags created by many users within a single system may be referred to as "folksonomies" (i.e., folk taxonomies).
    Extensions
    Software that makes the Web an application platform as well as a document server. Examples include Adobe Reader, Adobe Flash, Microsoft Silverlight, ActiveX, Oracle Java, QuickTime, and Windows Media.
    Signals
    The use of syndication technology, such as RSS feeds to notify users of content changes.

    While SLATES forms the basic framework of Enterprise 2.0, it does not contradict all of the higher level Web 2.0 design patterns and business models. It includes discussions of self-service IT, the long tail of enterprise IT demand, and many other consequences of the Web 2.0 era in enterprise uses.

    Usage

    A third important part of Web 2.0 is the social web. The social Web consists of a number of online tools and platforms where people share their perspectives, opinions, thoughts and experiences. Web 2.0 applications tend to interact much more with the end user. As such, the end user is not only a user of the application but also a participant by:

  • Podcasting
  • Blogging
  • Tagging
  • Curating with RSS
  • Social bookmarking
  • Social networking
  • Social media
  • Wikis
  • Web content voting

    The popularity of the term Web 2.0, along with the increasing use of blogs, wikis, and social networking technologies, has led many in academia and business to append a flurry of 2.0's to existing concepts and fields of study, including Library 2.0, Social Work 2.0, Enterprise 2.0, PR 2.0, Classroom 2.0, Publishing 2.0, Medicine 2.0, Telco 2.0, Travel 2.0, Government 2.0, and even Porn 2.0. Many of these 2.0s refer to Web 2.0 technologies as the source of the new version in their respective disciplines and areas. For example, in the Talis white paper "Library 2.0: The Challenge of Disruptive Innovation", Paul Miller argues

    Blogs, wikis and RSS are often held up as exemplary manifestations of Web 2.0. A reader of a blog or a wiki is provided with tools to add a comment or even, in the case of the wiki, to edit the content. This is what we call the Read/Write web. Talis believes that Library 2.0 means harnessing this type of participation so that libraries can benefit from increasingly rich collaborative cataloging efforts, such as including contributions from partner libraries as well as adding rich enhancements, such as book jackets or movie files, to records from publishers and others.

    Here, Miller links Web 2.0 technologies and the culture of participation that they engender to the field of library science, supporting his claim that there is now a "Library 2.0". Many of the other proponents of new 2.0s mentioned here use similar methods. The meaning of Web 2.0 is role dependent. For example, some use Web 2.0 to establish and maintain relationships through social networks, while some marketing managers might use this promising technology to "end-run traditionally unresponsive I.T. department[s]." There is a debate over the use of Web 2.0 technologies in mainstream education. Issues under consideration include the understanding of students' different learning modes; the conflicts between ideas entrenched in informal on-line communities and educational establishments' views on the production and authentication of 'formal' knowledge; and questions about privacy, plagiarism, shared authorship and the ownership of knowledge and information produced and/or published on line.

    Marketing

    Web 2.0 is used by companies, non-profit organizations and governments for interactive marketing. A growing number of marketers are using Web 2.0 tools to collaborate with consumers on product development, customer service enhancement, product or service improvement and promotion. Companies can use Web 2.0 tools to improve collaboration with both their business partners and consumers. Among other things, company employees have created wikis—Web sites that allow users to add, delete, and edit content—to list answers to frequently asked questions about each product, and consumers have added significant contributions. Another Web 2.0 marketing tactic is to make sure consumers can use the online community to network among themselves on topics of their own choosing. Mainstream media usage of Web 2.0 is increasing. Saturating media hubs—like The New York Times, PC Magazine and Business Week—with links to popular new Web sites and services is critical to achieving the threshold for mass adoption of those services. User web content can be used to gauge consumer satisfaction. In an article for Bank Technology News, Shane Kite describes how Citigroup's Global Transaction Services unit monitors social media outlets to address customer issues and improve products. According to Google Timeline, the term Web 2.0 was discussed and indexed most frequently in 2005, 2007 and 2008, and its average use has declined by 2–4% per quarter since April 2008.

    Education

    Web 2.0 could allow for more collaborative education. For example, blogs give students a public space to interact with one another and with the content of the class. Some studies suggest that Web 2.0 can increase the public's understanding of science, which could improve governments' policy decisions. A 2012 study by researchers at the University of Wisconsin-Madison notes that "...the internet could be a crucial tool in increasing the general public’s level of science literacy. This increase could then lead to better communication between researchers and the public, more substantive discussion, and more informed policy decision."

    Web-based applications and desktops

    Ajax has prompted the development of Web sites that mimic desktop applications, such as word processors, spreadsheets, and slide-show presentation tools. WYSIWYG wiki and blogging sites replicate many features of PC authoring applications. Several browser-based services have emerged, including EyeOS and YouOS (no longer active). Although named operating systems, many of these services are application platforms. They mimic the user experience of desktop operating systems, offering features and applications similar to a PC environment, and are able to run within any modern browser. However, these so-called "operating systems" do not directly control the hardware on the client's computer. Numerous web-based application services appeared during the dot-com bubble of 1997–2001 and then vanished, having failed to gain a critical mass of customers.

    XML and RSS

    Many regard syndication of site content as a Web 2.0 feature. Syndication uses standardized protocols to permit end-users to make use of a site's data in another context (such as another Web site, a browser plugin, or a separate desktop application). Protocols permitting syndication include RSS (really simple syndication, also known as Web syndication), RDF (as in RSS 1.1), and Atom, all of which are XML-based formats. Observers have started to refer to these technologies as Web feeds. Specialized protocols such as FOAF and XFN (both for social networking) extend the functionality of sites and permit end-users to interact without centralized Web sites.
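
As a rough illustration of how a client might consume such a Web feed, the following TypeScript sketch fetches an RSS 2.0 document in the browser and extracts item titles and links. The feed URL is hypothetical, and a real consumer would also need to handle Atom and RDF-based variants.

```typescript
// Fetch an RSS 2.0 feed and pull out each item's title and link.
async function readFeed(url: string): Promise<{ title: string; link: string }[]> {
  const xml = await (await fetch(url)).text();
  // DOMParser (available in browsers) turns the XML text into a document.
  const doc = new DOMParser().parseFromString(xml, "application/xml");
  return Array.from(doc.querySelectorAll("item")).map((item) => ({
    title: item.querySelector("title")?.textContent ?? "",
    link: item.querySelector("link")?.textContent ?? "",
  }));
}

readFeed("https://example.com/feed.rss").then((items) => console.log(items));
```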

    Web APIs

    Web 2.0 often uses machine-based interactions such as REST and SOAP. Servers often expose proprietary application programming interfaces (APIs), but standard APIs (for example, for posting to a blog or notifying a blog update) have also come into use. Most communications through APIs involve XML or JSON payloads. REST APIs, through their use of self-descriptive messages and hypermedia as the engine of application state, should be self-describing once an entry URI is known. Web Services Description Language (WSDL) is the standard way of publishing a SOAP API, and there is a range of other Web service specifications.
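
A minimal sketch of a REST-style interaction of the kind described above might look as follows in TypeScript: a client creates a blog post by sending a JSON payload to a hypothetical endpoint and then reads the Location header identifying the new resource. The URL and payload fields are assumptions for illustration, not any particular blog API.

```typescript
// Create a resource by POSTing JSON to a hypothetical blog API.
async function createPost(title: string, body: string): Promise<string | null> {
  const response = await fetch("https://example.com/api/posts", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ title, body }),
  });
  // In a REST-style API the response identifies the newly created resource,
  // typically via a Location header pointing at its URI.
  return response.headers.get("Location");
}

createPost("Hello", "First post").then((location) => console.log(location));
```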

    Criticism

    Critics of the term claim that "Web 2.0" does not represent a new version of the World Wide Web at all, but merely continues to use so-called "Web 1.0" technologies and concepts. First, techniques such as Ajax do not replace underlying protocols like HTTP, but add an additional layer of abstraction on top of them. Second, many of the ideas of Web 2.0 were already featured in implementations on networked systems well before the term "Web 2.0" emerged. Amazon.com, for instance, has allowed users to write reviews and consumer guides since its launch in 1995, in a form of self-publishing. Amazon also opened its API to outside developers in 2002. Previous developments also came from research in computer-supported collaborative learning and computer supported cooperative work (CSCW) and from established products like Lotus Notes and Lotus Domino, all phenomena that preceded Web 2.0. Tim Berners-Lee, who developed the initial technologies of the Web, has been an outspoken critic of the term, while supporting many of the elements associated with it. In the environment where the Web originated, each workstation had a dedicated IP address and always-on connection to the Internet. Sharing a file or publishing a web page was as simple as moving the file into a shared folder.

    Perhaps the most common criticism is that the term is unclear or simply a buzzword. For many people who work in software, version numbers like 2.0 and 3.0 are for software versioning or hardware versioning only, and assigning "2.0" arbitrarily to many technologies with a variety of real version numbers has no meaning. The web does not have a version number. For example, in a 2006 interview with IBM developerWorks podcast editor Scott Laningham, Tim Berners-Lee described the term "Web 2.0" as jargon:

    "Nobody really knows what it means... If Web 2.0 for you is blogs and wikis, then that is people to people. But that was what the Web was supposed to be all along... Web 2.0, for some people, it means moving some of the thinking [to the] client side, so making it more immediate, but the idea of the Web as interaction between people is really what the Web is. That was what it was designed to be... a collaborative space where people can interact."

    Other critics labeled Web 2.0 "a second bubble" (referring to the Dot-com bubble of 1997–2000), suggesting that too many Web 2.0 companies attempt to develop the same product with a lack of business models. For example, The Economist has dubbed the mid- to late-2000s focus on Web companies as "Bubble 2.0".

    In terms of Web 2.0's social impact, critics such as Andrew Keen argue that Web 2.0 has created a cult of digital narcissism and amateurism, which undermines the notion of expertise by allowing anybody, anywhere to share and place undue value upon their own opinions about any subject and post any kind of content, regardless of their actual talent, knowledge, credentials, biases or possible hidden agendas. Keen's 2007 book, Cult of the Amateur, argues that the core assumption of Web 2.0, that all opinions and user-generated content are equally valuable and relevant, is misguided. Additionally, Sunday Times reviewer John Flintoff has characterized Web 2.0 as "creating an endless digital forest of mediocrity: uninformed political commentary, unseemly home videos, embarrassingly amateurish music, unreadable poems, essays and novels... [and that Wikipedia is full of] mistakes, half truths and misunderstandings". In a 1994 Wired interview, Steve Jobs, forecasting the future development of the web for personal publishing, said "The Web is great because that person can't foist anything on you - you have to go get it. They can make themselves available, but if nobody wants to look at their site, that's fine. To be honest, most people who have something to say get published now." Michael Gorman, former president of the American Library Association, has been vocal in his opposition to Web 2.0 because of what he sees as its disregard for expertise, though he believes that there is hope for the future.

    "The task before us is to extend into the digital world the virtues of authenticity, expertise, and scholarly apparatus that have evolved over the 500 years of print, virtues often absent in the manuscript age that preceded print".

    There is also a growing body of critique of Web 2.0 from the perspective of political economy. Since, as Tim O'Reilly and John Battelle put it, Web 2.0 is based on the "customers... building your business for you," critics have argued that sites such as Google, Facebook, YouTube, and Twitter are exploiting the "free labor" of user-created content. Web 2.0 sites use Terms of Service agreements to claim perpetual licenses to user-generated content, and they use that content to create profiles of users to sell to marketers. This is part of increased surveillance of user activity happening within Web 2.0 sites. Jonathan Zittrain of Harvard's Berkman Center for the Internet and Society argues that such data can be used by governments who want to monitor dissident citizens. The rise of AJAX-driven web sites where much of the content must be rendered on the client has meant that users of older hardware are given worse performance than with a site composed purely of HTML, where the processing takes place on the server. Accessibility for disabled or impaired users may also suffer in a Web 2.0 site.

    Trademark

    In November 2004, CMP Media applied to the USPTO for a service mark on the use of the term "WEB 2.0" for live events. On the basis of this application, CMP Media sent a cease-and-desist demand to the Irish non-profit organization IT@Cork on May 24, 2006, but retracted it two days later. The "WEB 2.0" service mark registration passed final PTO Examining Attorney review on May 10, 2006, and was registered on June 27, 2006. The European Union application (which would confer unambiguous status in Ireland) was declined on May 23, 2007.
