Federal Government Role

Establish a Permanent Panel or Program to Address Global Catastrophic Risks, Including AGI

Currently there is no government panel or program to address global catastrophic risks, including human extinction risks ("existential risks"); to collect proactive solutions that would prevent extinction-scale disasters or provide resilience against less severe global catastrophes; or to help coordinate governmental and intergovernmental initiatives, including research grants, that reduce the likelihood and severity of extinction threats.

One significant threat to the future of humanity that has received virtually no governmental attention is the development of a human-indifferent artificial general intelligence (AGI) capable of altering its own source code to become "superintelligent," i.e., smarter than any group of humans across multiple domains. Please note that these issues are distinct from those covered in the Nano-Bio-Info-Cogno (NBIC) Convergence events, though they also encompass information technology, nanotechnology, and biotechnology.

Institutions currently addressing such risks, from which panel members might be drawn, include the Future of Humanity Institute at Oxford University (http://www.fhi.ox.ac.uk/research/global_catastrophic_risks ), The Singularity Institute for Artificial Intelligence (http://singinst.org/aboutus/ourmission ), and the Institute for Ethics and Emerging Technologies (http://ieet.org/index.php/IEET/about ). If we get global catastrophic risks wrong, there might not be a future for humanity, period.


Submitted by

Stage: Active

Feedback Score: 65 votes (voting disabled)

Idea Details




  1. Comment
    neurobionetics ( Idea Submitter )

    I've just been informed that the Center for Responsible Nanotechnology is no longer in existence. However, public intellectuals from that organization have joined other highly relevant ones at the Institute for Ethics and Emerging Technologies (http://ieet.org/index.php/IEET/about ).

    Also, a request has been made to clarify that "existential risk" means "extinction risk."

    If I could alter the original proposal to make these changes, I would do so.

  2. Comment
    neurobionetics ( Idea Submitter )

    Corrections have been made to the document, my apologies.

  3. Comment

    If we do not handle existential risks, it will not matter how sustainably our energy is generated, how clean our air is, or how good our medical technology is at preventing disease.

    Don't get me wrong, other things are important too, but this should come first.

  4. Comment

    This is a good idea. Even if some of the potential catastrophes have very long odds, their impact (human extinction) makes them worthy of serious policy consideration. Clear thinking about worst-case scenarios can also help inform thinking about negative outcomes that are more likely and more immediately on the horizon. Finally, we need policy makers to consider risks and outcomes at longer time scales than the next election cycle. This kind of program can help cultivate such long-term thinking.

  5. Comment

    I voted for this idea, but it seems to be off-topic, since it doesn't appear to answer what PITAC is asking about -- namely, proposals "that will lead to new jobs and greater GDP."