“
In the beginning, there was the internet: the physical infrastructure of wires and servers that lets computers, and the people in front of them, talk to each other. The U.S. government’s Arpanet sent its first message in 1969, but the web as we know it today didn’t emerge until 1991, when HTML and URLs made it possible for users to navigate between static pages. Consider this the read-only web, or Web1.
In the early 2000s, things started to change. For one, the internet was becoming more interactive; it was an era of user-generated content, or the read/write web. Social media was a key feature of Web2 (or Web 2.0, as you may know it), and Facebook, Twitter, and Tumblr came to define the experience of being online. YouTube, Wikipedia, and Google, along with the ability to comment on content, expanded our ability to watch, learn, search, and communicate.
The Web2 era has also been one of centralization. Network effects and economies of scale have led to clear winners, and those companies (many of which I mentioned above) have produced mind-boggling wealth for themselves and their shareholders by scraping users’ data and selling targeted ads against it. This has allowed services to be offered for “free,” though users initially didn’t understand the implications of that bargain. Web2 also created new ways for regular people to make money, such as through the sharing economy and the sometimes-lucrative job of being an influencer.
”
Harvard Business Review (Web3: The Insights You Need from Harvard Business Review (HBR Insights Series))