Ethan Liu

The Great Virtualization

The physical world can be modeled as a network with layers of abstraction: it can be decomposed, at different levels of granularity, into groups of atoms that interact with one another. Computer-enabled networks, such as the Internet, were invented to connect electronic devices. But they are gradually turning into a gateway to the virtualization of the physical world, where simulated objects communicate via network protocols. We have already gone through a couple of iterations of this movement, and I believe we are currently in the third and final phase of this great virtualization.

The first wave of virtualization primarily targeted information, goods, and services, and manifested in the advent of search engines, portals, and ecommerce services in the early days of the Internet. These three aspects of the physical world require the least overall “onboarding” effort and are thus the easiest to virtualize. Pioneers such as Wikipedia and eBay made it relatively effortless to write an article or to take a picture of an item and sell it online. Because Internet companies provided better economic value and more convenience than their traditional counterparts, they progressively earned the trust of users, which paved the way for further development of Internet-enabled businesses.

Although information, goods, and services are relatively easy to virtualize, it is by no means easy to store and retrieve the bits representing them at scale. For example, to serve search queries quickly from anywhere in the world, companies need massive data storage and retrieval capabilities. These demands have propelled breakthroughs in software, hardware, and technical operations, forming a positive feedback loop in which businesses and technologies reinforce one another. The continuing innovations in data storage and processing that began with the first wave of Internet companies also shortened the time newcomers needed to scale a product, and handsomely rewarded those that found product-market fit ahead of their competitors. These innovations ultimately enabled ultra-fast product iterations, making blitzscaling possible.
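To make the retrieval problem concrete, here is a minimal sketch of the data structure at the heart of text search, the inverted index, which maps each term to the documents containing it so a query never has to scan the whole corpus. This is an illustrative toy in Python, not any search company’s actual implementation:

    from collections import defaultdict

    # A minimal inverted index: maps each term to the set of document IDs
    # that contain it, so a query looks up terms instead of scanning documents.
    class InvertedIndex:
        def __init__(self):
            self.postings = defaultdict(set)

        def add(self, doc_id, text):
            for term in text.lower().split():
                self.postings[term].add(doc_id)

        def search(self, query):
            # Intersect the posting sets: documents containing every query term.
            sets = [self.postings.get(t, set()) for t in query.lower().split()]
            return set.intersection(*sets) if sets else set()

    index = InvertedIndex()
    index.add(1, "the great virtualization of the physical world")
    index.add(2, "search engines virtualize information about the world")
    print(index.search("the world"))  # -> {1, 2}

Real systems shard a structure like this across thousands of machines and layer ranking on top, which is precisely where the storage and operations breakthroughs mentioned above come in.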

The second phase of virtualization, as suggested earlier, was fueled by the burgeoning trust of Internet users. As people became increasingly comfortable with the Internet, they expanded their sphere of online activities. The Internet became more personal, and this enabled the virtualization of sensitive matters. Social network services thus entered the stage to virtualize people’s social activities and record the characteristics and behaviors they demonstrate in different scenarios. Although search engines, portals, and ecommerce services can also track user actions, they have limited access to users’ characteristics and behaviors in social contexts. This difference gives social networks the unique power to understand individuals better than they understand themselves, and even to build predictive services. The net result is that, given enough data from these services, we could simulate people themselves. It is, after all, not hard to imagine training a chatbot that talks almost exactly like a particular person once it is fed copious amounts of that person’s conversations and writings.
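As a toy illustration of that last idea, here is a Python sketch that “trains” on a corpus of someone’s messages and generates text in a similar style. It uses a simple second-order Markov chain rather than the neural models a real chatbot would use, and the messages variable is a made-up stand-in for a real conversation archive:

    import random
    from collections import defaultdict

    # A toy "persona" generator: a second-order Markov chain over words.
    # A real person-mimicking chatbot would use a far richer model, but the
    # principle is the same: ingest someone's writings, sample in their style.
    def build_chain(corpus):
        chain = defaultdict(list)
        for sentence in corpus:
            words = sentence.split()
            for i in range(len(words) - 2):
                chain[(words[i], words[i + 1])].append(words[i + 2])
        return chain

    def generate(chain, length=12):
        state = random.choice(list(chain.keys()))
        words = list(state)
        for _ in range(length):
            followers = chain.get(state)
            if not followers:
                break
            nxt = random.choice(followers)
            words.append(nxt)
            state = (state[1], nxt)
        return " ".join(words)

    # `messages` stands in for a real archive of one person's chats.
    messages = [
        "i think the internet will absorb everything eventually",
        "i think the internet is getting more personal every year",
    ]
    print(generate(build_chain(messages)))

The more data such a model ingests, the more of the person’s idiosyncrasies it can reproduce, which is the point of the paragraph above.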

Another category of sensitive matters that has been drawn into the Internet is financial resources. From online brokers to mobile wallets to blockchain-based digital tokens, people have reached a consensus that it is safer and more convenient to virtualize their financial resources. An enormous amount of user trust is a prerequisite for people to put almost all of their assets in the digital world, so this, too, would not have been possible without the first wave of virtualization. Because the feedback loop between data and user trust is in effect, we can gauge the progress of virtualization by the amount of sensitive data people are willing to share with companies. The ongoing expansion of social network services and digital financial markets, however, does not mean their physical counterparts will disappear any time soon. Like all other parts of the physical world, in-person interactions and physical ledgers for financial assets will not be replaced, even as a growing portion of them becomes virtualized.
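To see why a virtualized ledger can be trusted at all, consider the mechanism behind blockchain-based tokens in miniature: each entry commits to the hash of the previous one, so any tampering with history is detectable. A deliberately simplified Python sketch, not the format of any real blockchain:

    import hashlib
    import json

    # A miniature append-only ledger: every block embeds the hash of its
    # predecessor, so altering any past entry invalidates everything after it.
    def block_hash(block):
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def append_block(chain, transaction):
        prev = block_hash(chain[-1]) if chain else "0" * 64
        chain.append({"prev": prev, "tx": transaction})

    def verify(chain):
        return all(chain[i]["prev"] == block_hash(chain[i - 1])
                   for i in range(1, len(chain)))

    ledger = []
    append_block(ledger, {"from": "alice", "to": "bob", "amount": 5})
    append_block(ledger, {"from": "bob", "to": "carol", "amount": 2})
    print(verify(ledger))             # True
    ledger[0]["tx"]["amount"] = 500   # tamper with history...
    print(verify(ledger))             # ...and verification fails: False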

At this point, we can be quite certain that the Internet will keep expanding and absorb even more aspects of the physical world. The last wave, which we are currently experiencing, involves the virtualization of physical spaces. The first two waves already virtualized many of the physical and metaphysical objects in our daily lives, so what remains is necessarily the hardest: the last phase has the highest barrier to entry. The most difficult hurdle is probably processing data that represents both the states and the interactions of everything inside a physical space. The technologies invented during the first two waves are simply not sufficient to record the states and interactions of all matter in a large space. Moreover, as Moore’s law begins to hit the limits imposed by quantum mechanics, we will likely need both theoretical and engineering advances to produce the computing devices required to simulate physical spaces completely.

Although we are still far from the end of this final phase, many of the solutions built to tackle the problem of simulating physical spaces have already shown great promise, at least at a coarse scale. IoT devices, for example, give us the means to monitor the physical world, and we have created both the algorithms and the compute infrastructure to analyze the data they generate. Ubiquitously deployed sensors help us monitor our cities and surrounding environment, collecting data that records their characteristics in their natural states. Though the collected data may come from sparse sources, we can use interpolation techniques to compose virtualized roads, shops, and even the atmosphere at a scale refined enough that humans could not distinguish them from their physical counterparts. Deploying IoT devices and processing the collected data may prove exponentially harder at finer scales, but I believe our desire to virtualize everything will eventually drive another round of technological innovation that enables us to address the issue of scale.
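The interpolation step can be sketched concretely: given readings from a few dozen scattered sensors, estimate the field everywhere on a fine grid. Below is a small Python example using scipy’s griddata; the sensor positions and the temperature field are fabricated for illustration:

    import numpy as np
    from scipy.interpolate import griddata

    rng = np.random.default_rng(0)

    # Sparse sensors: 50 random positions in a 1 km x 1 km area, each
    # reporting a temperature that varies smoothly across the area.
    sensor_xy = rng.uniform(0, 1000, size=(50, 2))
    readings = 20 + 0.004 * sensor_xy[:, 0] + 0.002 * sensor_xy[:, 1]

    # Dense grid: estimate the temperature everywhere from the sparse readings.
    gx, gy = np.meshgrid(np.linspace(0, 1000, 200), np.linspace(0, 1000, 200))
    field = griddata(sensor_xy, readings, (gx, gy), method="linear")

    # Points outside the sensors' convex hull come back as NaN.
    print(field.shape)                          # (200, 200)
    print(np.nanmin(field), np.nanmax(field))   # roughly 20 to 26

Fifty sensors reconstructing a forty-thousand-cell grid is exactly the sparse-to-dense trade the paragraph describes; the open question is how far the same trick can be pushed as the grid grows finer.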

Taken as a whole, the virtualization movement will be the most ambitious and difficult project humans have ever endeavored to carry out. And the trend suggests that we will end up putting ourselves in The Matrix. Will this actually happen? Only time will tell.




Drafted on Sep. 19, 2018; revised on Oct. 9, 2018