There is no agenda for innovation more important for humanity to undertake than the identification and reduction of “existential risks” (human extinction risks). Following that are global catastrophic risks our civilization could survive.
The effort to reduce existential and global catastrophic risks involves information technology (rogue artificial intelligence, infrastructure threats), biotechnology (bioweapons, virulent contagious diseases), and nanotechnology (nanoweapons, nanorobots). These "golden triangle" technologies are also of great economic interest, and the need to reduce existential and global catastrophic risk can inspire young people to work in them. Like the anti-terrorism agenda, a focus on risk reduction also has the potential to unite the U.S. with other nations in applying science, technology, and innovation to cooperative efforts.
Expanding industry to meet the demands of such risk reduction, in the near term as well as the longer term through education, could generate many white- and blue-collar jobs and provide a boost to the economy that will not disappear over time: the demand is self-perpetuating (new risks will be identified, and risk reduction can always be improved) as well as in everyone's rational self-interest. Improving analysis of potential interactions between risks and proactively developing solutions and contingency plans (including plans for how risks may impact the economy) seems an especially promising area for commercial development. In turn, commercial development would spur more innovative solutions for risk reduction.
There may be tens of thousands of people working on specific risks, including arms control, severe climate scenarios, mega-scale natural disasters, asteroid impacts, technological weaponization, and many others. Instead of viewing these risks as fundamentally dissimilar, however, they can all be classified as "existential risks" or "global catastrophic risks." Popularizing these classifications, with objective weightings for factors such as significance of impact, likelihood of occurrence, and the time horizon available to avert them, would facilitate wiser allocation of (inter)governmental resources and foster synergies between the various risk reduction institutes and companies (e.g., via shared conferences, publications, web tools, and business solutions). This could be especially beneficial to those working on the reduction of existential risks, which receive much less attention than some global catastrophic risks, such as certain forms of large-scale terrorism.
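To illustrate, such a weighting could be as simple as combining impact, likelihood, and time horizon into a single priority score. The following is a minimal sketch of one hypothetical scheme (the `Risk` fields, the formula, and all example numbers are illustrative assumptions, not an established standard or real estimates):

```python
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    impact: float         # relative severity, e.g. 0-10 (10 = extinction-level)
    probability: float    # rough estimated chance of occurrence, 0-1
    horizon_years: float  # rough time available to avert, in years

def priority_score(risk: Risk) -> float:
    """Score rises with impact and probability, falls as the time to avert grows."""
    return risk.impact * risk.probability / risk.horizon_years

# Purely illustrative placeholder numbers, not real risk estimates.
risks = [
    Risk("rogue AI", impact=10, probability=0.05, horizon_years=50),
    Risk("engineered pandemic", impact=8, probability=0.10, horizon_years=20),
    Risk("asteroid impact", impact=9, probability=0.001, horizon_years=100),
]

for r in sorted(risks, key=priority_score, reverse=True):
    print(f"{r.name}: {priority_score(r):.5f}")
```

With these placeholder inputs, the scheme would rank the shorter-horizon, higher-probability risks above the rare long-horizon ones; in practice the weighting function itself would be a subject of study and debate.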
Ideally, a government panel would be formed as soon as possible to explore how to highlight and reward the study of existential and global catastrophic risks in education, and how to quickly catalyze industry around the reduction of such risks. This would serve that primary aim and, secondarily, create valuable jobs that help boost the economy, many of which would relate to information technology, biotechnology, and nanotechnology.
Some institutes that deal with existential risks include the Future of Humanity Institute (FHI) [http://www.fhi.ox.ac.uk/research], the Singularity Institute for Artificial Intelligence (SIAI) [http://www.singinst.org], the Institute for Ethics and Emerging Technologies (IEET) [http://www.ieet.org], and the Lifeboat Foundation [http://lifeboat.com/ex/about].