When technology gets out of control
Humanity has a method for preventing new technologies from getting out of hand: analyze the possible negative consequences, involve all affected parties, and reach some agreement on ways to mitigate them.
However, new research suggests that the accelerated pace of change may soon render this approach ineffective.
People use laws, social norms, and international agreements to reap technology’s benefits while minimizing harms such as environmental damage. In searching for these rules of conduct, we are often guided by what game theorists call a Nash equilibrium, named after the mathematician and economist John Nash.
A Nash Equilibrium
In game theory, a Nash equilibrium is a set of strategies that, once discovered by a set of players, provides a stable fixed point at which no one has an incentive to deviate from their current strategy.
To achieve that balance, players must understand the consequences of the possible actions of everyone involved, including themselves. During the Cold War, for example, peace between the nuclear powers depended on the understanding that any attack would guarantee the destruction of all.
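The definition above can be checked mechanically in a tiny example. The following sketch uses a standard prisoner's dilemma payoff table (not a game from the article) and tests each pair of actions for the "no incentive to deviate" property:

```python
# Minimal illustration of a Nash equilibrium in a two-player game.
# Payoff table is a standard prisoner's dilemma (an assumption for
# illustration, not taken from the article).
# Each entry maps (row action, column action) -> (row payoff, column payoff).
payoffs = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}
actions = ["C", "D"]  # C(ooperate) or D(efect)

def is_nash(a_row, a_col):
    """True if neither player can improve by unilaterally deviating."""
    row_pay, col_pay = payoffs[(a_row, a_col)]
    # Row player tries every alternative action, column player held fixed.
    if any(payoffs[(alt, a_col)][0] > row_pay for alt in actions):
        return False
    # Column player tries every alternative action, row player held fixed.
    if any(payoffs[(a_row, alt)][1] > col_pay for alt in actions):
        return False
    return True

equilibria = [(r, c) for r in actions for c in actions if is_nash(r, c)]
print(equilibria)  # -> [('D', 'D')]: mutual defection is the only stable point
```

Note the equilibrium is stable but not optimal: both players would prefer mutual cooperation, yet each has an incentive to defect from it, so only mutual defection is a fixed point.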
Similarly, from local regulations to international law, negotiations can be seen as a gradual exploration of all possible moves, in search of a stable framework of acceptable rules that gives no one an incentive to cheat, because cheating would leave them worse off.
But what if technology becomes so complex, and begins to evolve so rapidly, that humans cannot foresee the consequences of a new action? That is the question analyzed by two scientists, Dimitri Kusnezov of the National Nuclear Security Administration and Wendell Jones, recently retired from Sandia National Laboratories. Their conclusion is disturbing: the concept of strategic equilibrium as an organizing principle may be nearly obsolete.
Kusnezov and Jones
Kusnezov and Jones derive this insight from recent mathematical studies of games with many players and many possible actions. A basic finding is a sharp division into two regimes, stable and unstable. Below a certain level of complexity, a Nash equilibrium usefully describes the likely outcomes.
Beyond it lies a chaotic zone in which players never manage to find stable and reliable strategies. Instead, they can only get ahead by constantly changing their behavior in highly irregular ways. What happens is essentially random and unpredictable.
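The failure to settle can be seen even in a toy game. The sketch below (my illustration, far simpler than the games the researchers study) runs best-response dynamics in matching pennies, a game with no pure-strategy equilibrium: each player, when it is their turn, switches to whatever action beats the other's current choice, and the pair cycles forever without reaching a fixed point:

```python
# Best-response dynamics in matching pennies (an assumed toy example;
# the studies discussed above concern far more complex games).
# Row player wants the coins to match; column player wants a mismatch.
payoffs = {
    ("H", "H"): (1, -1),
    ("H", "T"): (-1, 1),
    ("T", "H"): (-1, 1),
    ("T", "T"): (1, -1),
}
actions = ["H", "T"]

def best_response(player, opponent_action):
    """Action maximizing this player's payoff against a fixed opponent action."""
    if player == 0:  # row player
        return max(actions, key=lambda a: payoffs[(a, opponent_action)][0])
    return max(actions, key=lambda a: payoffs[(opponent_action, a)][1])

state = ("H", "T")
history = [state]
for step in range(8):
    if step % 2 == 0:  # row player revises its action
        state = (best_response(0, state[1]), state[1])
    else:              # column player revises its action
        state = (state[0], best_response(1, state[0]))
    history.append(state)

print(history)  # the profile cycles through 4 states; no fixed point is reached
```

Because every profile leaves one player wanting to switch, the dynamics return to the starting state every four revisions instead of converging, which is the simplest picture of the "no stable strategy" regime.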
The authors argue that emerging technologies, especially computing, software, and biotechnologies such as gene editing, are much more likely to fall into the unstable category. In those areas, shocks are getting bigger and more frequent as costs fall and shared platforms enable open innovation.
These technologies will therefore evolve faster than regulatory frameworks can respond, at least as those frameworks are traditionally conceived.
What can we do?
What can we do? Kusnezov and Jones do not have an easy answer. One clear consequence is that it is probably a mistake to copy techniques used for past technologies that evolved more slowly and were less widely available. Yet this is the usual default approach, as illustrated by proposals to regulate gene-editing techniques.
Such initiatives are likely doomed in a world where technologies develop through the parallel efforts of a global population with diverse goals and interests. Perhaps future regulation will itself have to rely on emerging technologies, as some are already exploring in finance.
Perhaps we are approaching a pivotal moment in history, in which the guiding idea of strategic equilibrium that we have trusted for 75 years hits its limits. In that case, regulation will become a completely different game.