By Adedapo Adesanya
The Web as we know it can be said to have started with Web1, introduced by Tim Berners-Lee, a British computer scientist who worked at the Organisation Européenne pour la Recherche Nucléaire (CERN), in 1989.
He submitted the proposal for what is known today as the World Wide Web as an effective communication system at CERN and envisioned the Web in three stages: a Web of documents (Web 1.0), a Web of people (Web 2.0) and a Web of data (Web 3.0).
Web1 is often referred to as the “read-only web”: there were few visuals besides text, and in comparison to the Web 2.0 era, which allows users to interact with information online, users of Web1 were largely passive and could only read what was published.
There was no comment section, as there is on Twitter or Facebook today, where users could air their views or share opinions on someone else’s posts or articles.
This limitation gave rise to Web2, which largely displaced Web1 and brought about the beginning of user-generated content on the web, meaning people could create their own posts and write their own articles.
Web 2.0 is regarded as the business revolution in the computer industry caused by the move to the Internet as a platform, and the attempt to understand the rules for success on that new platform, chief among them building applications that harness network effects to get better the more people use them.
Web2 was truly transformative; it birthed the social media networks, such as Facebook and Twitter, that have dominated online social interaction to date, as well as cloud computing, e-commerce, and online financial services.
Many of the things we enjoy on the Internet today only became possible with the creation of Web 2.0.
Web 2.0 applications tend to interact much more with the end-user. As such, the end-user is not only a consumer of the application but also a participant, using tools such as podcasting, blogging, tagging, curating with RSS, social bookmarking, social networking, social media, and web content voting.
But as the saying goes, change is the only constant, and the next big thing is Web3. Web3 jobs are already emerging as Web 3.0 aims to make the Internet more inclusive and take control away from big corporations like Facebook, Google, and Amazon by decentralising the Internet.
So, with Web3, people will be able to control their own data: instead of being held by services like Facebook and Google, information will be spread across many computing devices, making the web act more like a peer-to-peer network with no single authority.
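For a developer curious about what “no single authority” means in practice, a rough idea can be seen in how a Web3 application reads data: instead of calling one company’s server, it can ask any node on a public blockchain network. The sketch below assumes the ethers.js library (v6) and a public Ethereum RPC endpoint; the endpoint URL and address are placeholders for illustration, not part of any specific product.

```typescript
// Sketch: reading public data straight from a decentralised network,
// assuming ethers.js (v6) and any public Ethereum RPC endpoint.
import { JsonRpcProvider, formatEther } from "ethers";

async function main() {
  // Any node on the network can answer this query; there is no single
  // company's API. The URL below is a placeholder for a public RPC endpoint.
  const provider = new JsonRpcProvider("https://rpc.example.org");

  // The latest block number, as agreed by the peer-to-peer network.
  const blockNumber = await provider.getBlockNumber();
  console.log(`Current block: ${blockNumber}`);

  // The balance of an address: data controlled by whoever holds the keys,
  // readable by anyone, held by no central service. Placeholder address.
  const address = "0x0000000000000000000000000000000000000000";
  const balance = await provider.getBalance(address);
  console.log(`Balance: ${formatEther(balance)} ETH`);
}

main().catch(console.error);
```

The point of the sketch is the absence of a gatekeeper: any compliant node can serve the same answer, which is the property Web3 advocates describe as decentralisation.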
Another claimed benefit of Web3 is resistance to hacks and data leaks: because data is not held in a single central database but controlled by individual users, there is, in theory, greater data security and privacy.
Once it becomes a reality, the virtual world will see resources, applications, and content that are accessible to all.
It has also been noted that Web3 will create room for the advancement of technologies like cryptocurrencies, virtual reality, augmented reality, Non-Fungible Tokens (NFTs), and other digital enhancements.
However, Web3 has had its critics, with the world’s richest man, Elon Musk, saying the concept is more of a “marketing buzzword” than a reality, while former Twitter CEO Jack Dorsey argued that it would ultimately end up being owned by venture capitalists.
Nevertheless, it never hurts for a full-stack developer to take advantage of what Web3 offers.