The Risk of Superintelligence
An intelligence explosion, once underway, may be unstoppable, and it carries profound existential risk. As technology steadily advances toward making this explosion a reality, humans must carefully consider its philosophical and moral ramifications now, before superintelligent AI appears. Once it does, it may be too late for contemplation.


Superintelligence asks what will happen once we manage to build computers smarter than we are: what we need to do, how it will work, and why it must be done exactly right to ensure the human race does not go extinct.
