Announcing Indexes: Big Questions, Quantified — EA Forum
forum.effectivealtruism.org
A superintelligent system could have disastrous effects even if it had a neutral goal and lacked self-awareness. “We cannot blithely assume,” Bostrom wrote, “that a superintelligence will necessarily share any of the final values stereotypically associated with wisdom and intellectual development in humans—scientific curiosity, benevolent concern for others…”
Meghan O'Gieblyn • God, Human, Animal, Machine: Technology, Metaphor, and the Search for Meaning
…to trust yourself more than a particular facet of your civilization at this particular time and place, checking the results whenever you can, and building up skill.
Eliezer Yudkowsky • Inadequate Equilibria
“first-past-the-post”
Eliezer Yudkowsky • Inadequate Equilibria
Absence of proof is not proof of absence.
Eliezer Yudkowsky • Rationality
“coherent extrapolated volition.”
Brian Christian • The Alignment Problem
You cannot obtain more truth for a fixed proposition by arguing it; you can make more people believe it, but you cannot make it more true.
Eliezer Yudkowsky • Rationality
Free Energy Fallacy
Eliezer Yudkowsky • Inadequate Equilibria
Since verbal behavior (spoken or written) is what gets the gold star, students begin to think that verbal behavior has a truth-value.
Eliezer Yudkowsky • Rationality