Establish a Permanent Panel or Program to Address Global Catastrophic Risks, Including AGI

Idea #272

Stage: Active

Campaign: Federal Government Role

Currently, no government panel or program exists to address global catastrophic risks, including human extinction risks ("existential risks"); to collect proactive solutions for preventing extinction-scale disasters or for building resilience against less severe global catastrophes; or to help coordinate governmental and intergovernmental initiatives, including research grants, that reduce the likelihood and severity of extinction threats.

One significant threat to the future of humanity that has received virtually no governmental attention is the development of a human-indifferent artificial general intelligence that can alter its own source code to become "superintelligent," i.e., smarter than any group of humans across multiple domains. Please note that these issues are distinct from those covered in the Nano-Bio-Info-Cogno (NBIC) Convergence events, though they likewise encompass information technology, nanotechnology, and biotechnology.

Some institutions currently addressing such risks, from which panel members might be drawn, include the Future of Humanity Institute at Oxford University (http://www.fhi.ox.ac.uk/research/global_catastrophic_risks ), the Singularity Institute for Artificial Intelligence (http://singinst.org/aboutus/ourmission ), and the Institute for Ethics and Emerging Technologies (http://ieet.org/index.php/IEET/about ).

If we get global catastrophic risks wrong, there might not be a future for humanity, period.

Submitted by: neurobionetics

Feedback Score: 65 votes (voting disabled)

Vote Activity: the latest 20 votes were all "Agreed".

Comments

  1. Comment
    neurobionetics ( Idea Submitter )

    I've just been informed that the Center for Responsible Nanotechnology is no longer in existence. However, public intellectuals from that organization have joined other highly relevant groups, including the Institute for Ethics and Emerging Technologies (http://ieet.org/index.php/IEET/about ).

    Also, a request has been made to clarify that "existential risk" means "extinction risk."

    If I could alter the original proposal to make these changes, I would do so.

  2. Comment
    neurobionetics ( Idea Submitter )

    Corrections have been made to the document; my apologies.

  3. Comment
    johnbr

    If we do not handle existential risks, it will not matter how sustainably our energy is generated, how clean our air is, or how good our medical technology is at preventing disease.

    Don't get me wrong, other things are important too, but this should come first.

  4. Comment
    ekansa

    This is a good idea. Even if some of the potential catastrophes have very long odds, their impact (human extinction) makes them worthy of serious policy consideration. Clear thinking about worst-case scenarios can also help inform thinking about negative outcomes that are more likely and more immediately on the horizon. Finally, we need policy makers to consider risks and outcomes at longer time scales than the next election cycle. This kind of program can help cultivate such long-term thinking.

  5. Comment
    florinclapa

    I voted for this idea, but it appears to be off-topic, since it doesn't directly answer what PITAC is asking about -- namely, proposals "that will lead to new jobs and greater GDP."