Best quotes by Eliezer Yudkowsky on Intelligence
-
‟ By far, the greatest danger of Artificial Intelligence is that people conclude too early that they understand it.
- Eliezer Yudkowsky
-
‟ Anything that could give rise to smarter-than-human intelligence - in the form of Artificial Intelligence, brain-computer interfaces, or neuroscience-based human intelligence enhancement - wins hands down beyond contest as doing the most to change the world. Nothing else is even in the same league.
- Eliezer Yudkowsky
-
‟ We tend to see individual differences instead of human universals. Thus, when someone says the word 'intelligence,' we think of Einstein instead of humans.
- Eliezer Yudkowsky
-
‟ Since the rise of Homo sapiens, human beings have been the smartest minds around. But very shortly - on a historical scale, that is - we can expect technology to break the upper bound on intelligence that has held for the last few tens of thousands of years.
- Eliezer Yudkowsky
-
‟ I am a full-time Research Fellow at the Machine Intelligence Research Institute, a small 501(c)(3) public charity supported primarily by individual donations.
- Eliezer Yudkowsky
-
‟ I keep trying to explain to people that the archetype of intelligence is not Dustin Hoffman in 'Rain Man'; it is a human being, period. It is squishy things that explode in a vacuum, leaving footprints on their moon.
- Eliezer Yudkowsky
-
‟ There's a popular concept of 'intelligence' as book smarts, like calculus or chess, as opposed to, say, social skills. So people say that 'it takes more than intelligence to succeed in human society.' But social skills reside in the brain, not the kidneys.
- Eliezer Yudkowsky
-
‟ When you think of intelligence, don't think of a college professor; think of human beings as opposed to chimpanzees. If you don't have human intelligence, you're not even in the game.
- Eliezer Yudkowsky
-
‟ Intelligence is the source of technology. If we can use technology to improve intelligence, that closes the loop and potentially creates a positive feedback cycle.
- Eliezer Yudkowsky
-
‟ The purest case of an intelligence explosion would be an Artificial Intelligence rewriting its own source code. The key idea is that if you can improve intelligence even a little, the process accelerates. It's a tipping point. Like trying to balance a pen on one end - as soon as it tilts even a little, it quickly falls the rest of the way.
- Eliezer Yudkowsky