Researchers Attempting to Prevent Inevitable Rise of Terminators

The Singularity Institute is already deep in the future war against rogue artificial intelligence by researching ways to keep A.I. from becoming hostile when it reaches its pinnacle: free thought.

Composed of eight intelligent men and women from different fields, The Singularity Institute operates as a means of keeping The Terminator from becoming a terrifying reality. The group has spent years developing ways to keep computers from becoming our malevolent overlords, should A.I. ever reach such a point. If the Jeopardy-winning supercomputer Watson is any indication, that future could be very near.

While The Singularity Institute doesn’t necessarily expect that a free-thinking A.I. would become fixated on enslaving the human race, it does expect that if such a sentient computer began pursuing its own ends, it would focus on achieving specific goals with humanity on the back burner. A document outlining the group’s thoughts on reducing catastrophic risks warns that a “broad range of AI designs may initially appear safe, but if developed to the point of a Singularity could cause human extinction in the course of optimizing the Earth for their goals.” The group believes that resources such as solar and nuclear energy are just a few of the things certain types of A.I. would be compelled to control.

Through its research, the Singularity Institute hopes to push A.I. away from indifference toward humanity and toward the best likely scenario: safe A.I. that is compelled to help with efforts such as curing disease, preventing nuclear warfare, and otherwise furthering our race in a peaceful manner.

If you feel compelled to help stop the rise of the machines, The Singularity Institute is currently accepting $1 donations over at Philanthroper. A small price to keep Skynet at bay, if I do say so myself.

Source: The Singularity Institute via Gizmodo
