65 votes · Rank 3 · Idea #272

Federal Government Role

Establish a Permanent Panel or Program to Address Global Catastrophic Risks, Including AGI

Currently there is no government panel or program that addresses global catastrophic risks, including human extinction risks ("existential risks"); collects proactive solutions to prevent extinction-scale disasters or to build resilience in the face of less severe global catastrophes; and helps coordinate (inter)governmental initiatives, including research grants, to reduce the likelihood and severity of extinction threats.

One significant threat to the future of humanity that has received virtually no governmental attention is the development of a human-indifferent artificial general intelligence that can alter its own source code to become "superintelligent", that is, smarter than any group of humans in multiple domains.

Please note that these issues are distinct from those covered in the Nano-Bio-Info-Cogno (NBIC) Convergence events, though they likewise encompass information technology, nanotechnology, and biotechnology.

Institutions currently addressing such risks, from which panel members might be drawn, include the Future of Humanity Institute at Oxford University (http://www.fhi.ox.ac.uk/research/global_catastrophic_risks ), the Singularity Institute for Artificial Intelligence (http://singinst.org/aboutus/ourmission ), and the Institute for Ethics and Emerging Technologies (http://ieet.org/index.php/IEET/about ).

If we get global catastrophic risks wrong, there may not be a future for humanity, period.

Submitted by neurobionetics 4 years ago

Comments (5)

  1. neurobionetics Idea Submitter

    I've just been informed that the Center for Responsible Nanotechnology is no longer in existence. However, public intellectuals from that organization have joined other highly relevant organizations, such as the Institute for Ethics and Emerging Technologies (http://ieet.org/index.php/IEET/about ).

    Also, a request has been made to clarify that "existential risk" means "extinction risk."

    If I could alter the original proposal to make these changes, I would do so.

    4 years ago
  2. neurobionetics Idea Submitter

    Corrections have been made to the document; my apologies.

    4 years ago
  3. If we do not handle existential risks, it will not matter how sustainably our energy is generated, how clean our air is, or how good our medical technology is at preventing disease.

    Don't get me wrong, other things are important too, but this should come first.

    4 years ago
  4. This is a good idea. Even if some of the potential catastrophes have very long odds, their impact (human extinction) makes them worthy of serious policy consideration. Clear thinking about worst-case scenarios can also help inform thinking about negative outcomes that are more likely and more immediately on the horizon. Finally, we need policymakers to consider risks and outcomes over longer time scales than the next election cycle. This kind of program can help cultivate such long-term thinking.

    4 years ago
  5. I voted for this idea, but it seems off-topic, since it is not an appropriate answer to what PITAC is asking about -- namely, proposals "that will lead to new jobs and greater GDP."

    4 years ago
