Beyond being a ludicrously well-funded game engine startup, Improbable has something conspicuous in the works. In its Series B announcement, after mentioning the typical game engine applications (video games, VR), the company said it’s planning to simulate the real world:
Over the longer term, we are also excited by the potential for our technology to inform the understanding of real world systems and their emergent complexity. We are already working on projects with telecommunications companies, governments and other enterprise clients to explore the ability of massive, detailed simulations to drive better decisions using real-world data and hope to talk more about this in the future.
This is a pretty radical ambition for a funding announcement.
And in the WIRED article that followed, the team revealed that it made a full simulation of Cambridge, UK, complete with 130,000 digital inhabitants and fully mapped out infrastructure, including transit, utilities, traffic, and internet. The CEO quipped that “AI gets all the press, [but] this idea of recreating reality is going to become something in the public consciousness that’s as important, as significant, as artificial intelligence.”
Now, it should be noted that Improbable isn’t the first to try this idea out. Wall Street has long simulated portfolio performance with Monte Carlo methods. Autonomous vehicle startups are using simulation to generate training data. Astrophysicists use it to model 25 billion galaxies. Computer scientists are building quantum software on top of simulation.
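The Monte Carlo idea is simple enough to sketch in a few lines of Python. This is a toy portfolio simulation, not how any Wall Street desk actually does it; all the parameters (drift, volatility, horizon) are invented for illustration:

```python
import random
import statistics

def simulate_portfolio(start_value, annual_return, annual_vol, years, trials, seed=42):
    """Toy Monte Carlo: draw one normally distributed return per year,
    compound it, and repeat across many independent trials."""
    rng = random.Random(seed)
    final_values = []
    for _ in range(trials):
        value = start_value
        for _ in range(years):
            value *= 1 + rng.gauss(annual_return, annual_vol)
        final_values.append(value)
    return final_values

outcomes = simulate_portfolio(100_000, 0.07, 0.15, years=10, trials=10_000)
print(f"median ending value: {statistics.median(outcomes):,.0f}")
print(f"5th percentile:      {sorted(outcomes)[len(outcomes) // 20]:,.0f}")
```

The output isn’t one prediction but a distribution of futures, which is exactly the pitch for simulating anything, a portfolio or a city: you get to ask “what fraction of runs end badly?” instead of “what happens?”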
And let’s not forget the long history of military simulation. As a kid, I first came across the idea in the 80s classic WarGames, in which Matthew Broderick hacks into NORAD’s war game supercomputer. As the resident computer whiz explains in this (must-see) clip, the WOPR supercomputer ingests real-world data (weather patterns, missile tests, and so on) and plays war games to simulate the best responses.
So it’s a pretty established sci-fi concept. But to me at least, Improbable is emblematic of a new wave of companies trying to “abstract away” the real world.
In many ways, “recreating reality” is the goal of IoT. The idea is to digitize real-world information (via connected hardware) and decide how to act automatically. The concept goes as far back as the 1950s, when Alan Turing proposed building machines that could sense and learn from their surroundings. And in 1966, Karl Steinbuch imagined a future where computers were stitched into everything.
Game engines like Improbable’s or Unreal share a similar goal: synthetically render objects and make them act with realistic physics, lighting, and behavior. Right now, Improbable is working on giving game developers the tools for immersive experiences. But Improbable also acknowledges that the Holy Grail for simulation is to render the real world as close to 1-to-1 as possible.
However, simulation requires IoT, and vice versa, to make this all a reality. Right now, the concept is difficult to absorb because our hardware is bulky, visible, and swinging in our pockets. But the arc of computing is nearing a point where chips and sensors become microscopic, enabling most physical-world processes worth studying to be captured by IoT. There’s a great post from Matt Turck at FirstMark Capital that explains that “the importance of the IoT perhaps emerges more clearly when you think about it as the final chapter of ‘software eats the world,’ where everything gets connected.” At some point, it will be more costly for a company NOT to connect every asset.
The magnitude of having a 1-to-1 (or even a “pretty good”) representation of reality could be as revolutionary as Improbable’s CEO claims. It’s not crazy to imagine a future where C-suite executives turn to a computer for advice on whether to expand into a new business area. People are going to want the tools afforded to scientists and presidents. As technology becomes table stakes in business, the leverage of a company’s single decision becomes greater and greater. And with IoT proliferating and computing getting ever cheaper, the dream of democratizing WarGames-for-Business™ could be attainable.
Obstacles in the way
“Gentlemen, I don’t trust this overgrown pile of microchips any further than I can throw it.” -General Beringer, WarGames
While I am enamored with these concepts (and dedicate my professional life to studying them), I think there may be inherent obstacles in the way.
Right now, I remain skeptical of a perfect simulation for a few reasons:
- Messy data: garbage data in means a garbage simulation out
- Coordination problems: getting everyone on board
- Complexity problems: too little compute or data to derive anything meaningful
A good simulation of the world requires high-fidelity, real-time data.
To ever pull off WarGames-for-Business™, the simulation needs to know the business climate, the marketing optics, the demand, the local inventory, the geospatial data, and so forth, all updated in real time. It would be Bloomberg crossed with AWS crossed with Unreal crossed with Netsuite crossed with Accuweather crossed with dozens of others.
Can we trust all these data vendors?
Now, I don’t know how Improbable got its transit data, but living in NYC has taught me there’s a vast difference between projected and actual train locations. This generalizes to the unpleasant reality of digitizing real-world data. The bits and bytes of game engines exist in perfect abstraction, whereas the world of atoms is messy.
At the moment, blockchain startups promising solutions to complex systems like supply chains are a dime a dozen. But the big issue remains: it’s difficult to trust any data about where physical stuff is.
Again, IoT will be necessary to verify that. Startups like Cartasite and Tive are jockeying to be that universal supply chain sensor. But until a trustworthy device network wins out, any simulation is working with noisy, imperfect data.
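The “noisy, imperfect data” problem compounds fast. Here’s a minimal sketch of the garbage-in-garbage-out point, with every number invented: start a dead-reckoning forecast (position plus speed times time) from a noisy sensor reading, and watch the average error grow with the forecast horizon:

```python
import random

def forecast_error(position_noise, speed_noise, horizon_steps, trials=5_000, seed=0):
    """Average absolute gap between a true trajectory and one
    dead-reckoned from a noisy position and noisy speed estimate."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        true_pos, true_speed = 0.0, 1.0
        est_pos = true_pos + rng.gauss(0, position_noise)
        est_speed = true_speed + rng.gauss(0, speed_noise)
        # project both the real and the estimated state forward
        true_final = true_pos + true_speed * horizon_steps
        est_final = est_pos + est_speed * horizon_steps
        total += abs(est_final - true_final)
    return total / trials

for h in (1, 10, 100):
    print(f"horizon {h:>3}: mean error {forecast_error(0.5, 0.05, h):.2f}")
```

Even a small error in the speed estimate dominates at long horizons, which is why a city-scale simulation fed by imperfect sensors drifts from reality the further out it tries to predict.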
For a meaningful simulation to take off, there’s gotta be something in it for the data providers. Getting every utility, bus, and person to consent and turn themselves into an API is probably unattainable. It’s likely we never get a 1-to-1 simulation, but maybe something like 75% accurate. And that might be good enough to be valuable, but even getting there requires massive coordination.
Coordination problems are inherent to any real-world protocol. The main issue is getting everyone to agree on the rules. Will every data vendor structure their data to Improbable’s liking? Could a decentralized Improbable come around instead?
This is already playing out in the world of shipping and supply chain blockchains. Right now, Maersk and a number of other shippers are testing their own proprietary blockchains, each in hopes of becoming the industry-wide solution. As Matt Levine eloquently explained, the irony is that “You do not quite get the benefits of trustless decentralization and seamless movement of commerce if each shipping company builds its own proprietary blockchain.”
Likewise, the best simulation will probably be the one built on the most comprehensive dataset. Will every data company flock to a single simulation house? Coordination will make or break a potential winner.
Finally, there is the possibility that simulating reality simply doesn’t work because computers cannot derive anything insightful from the deluge of data.
Jorge Luis Borges once wrote a cheeky short story on the map-territory relation, imagining cartographers who make a map so exact that it grows as large as the Empire itself.
More or less, Borges’s point is that a one-to-one map is tragically useless. An object’s representation can never be the object itself:
In time, those Unconscionable Maps no longer satisfied, and the Cartographers Guild drew a Map of the Empire whose size was that of the Empire, coinciding point for point with it. The following Generations, who were not so fond of the Study of Cartography saw the vast Map to be Useless and permitted it to decay and fray under the Sun and winters.
-From On Exactitude in Science
Do we expect computer simulation to be any different? I’m sure the folks at Improbable realize these limitations and are simply making something better than what currently exists. Philosophically, though, there may be pitfalls.
In any event, I’m excited to watch how software “eats” and eventually simulates the world. If it works to the point where it’s good enough, there is near-unlimited upside. Epic Games, for example, takes a 5% royalty from games using its Unreal Engine. Imagine the take-rate for a programmatic Bill Campbell. In theory, it goes all the way up.