Rationality: From AI to Zombies
A clear argument has to lay out an inferential pathway, starting from what the audience already knows or accepts. If you don’t recurse far enough, you’re just talking to yourself.
from Rationality: From AI to Zombies by Eliezer Yudkowsky
This is why rationalists put such a heavy premium on the paradoxical-seeming claim that a belief is only really worthwhile if you could, in principle, be persuaded to believe otherwise.
from Rationality: From AI to Zombies by Eliezer Yudkowsky
That sort of error is called “statistical bias.” When your method of learning about the world is biased, learning more may not help. Acquiring more data can even consistently worsen a biased prediction.
from Rationality: From AI to Zombies by Eliezer Yudkowsky
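A minimal sketch of the point above (not from the book; the constant-offset "instrument" is an assumption for illustration): when the measurement method itself is biased, averaging more readings only makes the estimate converge more confidently on the wrong value.

```python
# Hypothetical example: a measuring process with a constant systematic
# offset. More samples shrink the variance, but the estimate converges
# to TRUE_VALUE + BIAS, so acquiring more data never removes the error.
import random

TRUE_VALUE = 10.0
BIAS = 2.0  # assumed systematic offset of the instrument

def biased_measurement():
    # Every reading is off by a constant bias, plus random noise.
    return TRUE_VALUE + BIAS + random.gauss(0, 1)

for n in (10, 1_000, 100_000):
    estimate = sum(biased_measurement() for _ in range(n)) / n
    print(f"n={n:>7}: estimate={estimate:.3f}, error={estimate - TRUE_VALUE:+.3f}")
# The error settles near +2.0 (the bias) no matter how large n grows.
```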
A cognitive bias is a systematic way that your innate patterns of thought fall short of truth (or some other attainable goal, such as happiness).
from Rationality: From AI to Zombies by Eliezer Yudkowsky
I label an emotion as “not rational” if it rests on mistaken beliefs, or rather, on mistake-producing epistemic conduct.
from Rationality: From AI to Zombies by Eliezer Yudkowsky
This correspondence between belief and reality is commonly called “truth.”
from Rationality: From AI to Zombies by Eliezer Yudkowsky
Jonathan Wallace suggested that “God!” functions as a semantic stopsign—that it isn’t a propositional assertion, so much as a cognitive traffic signal: do not think past this point. Saying “God!” doesn’t so much resolve the paradox, as put up a cognitive traffic signal to halt the obvious continuation of the question-and-answer chain.
from Rationality: From AI to Zombies by Eliezer Yudkowsky
The representativeness heuristic: our tendency to assess phenomena by how representative they seem of various categories.
from Rationality: From AI to Zombies by Eliezer Yudkowsky
Of course, one didn’t use phlogiston theory to predict the outcome of a chemical transformation. You looked at the result first, then you used phlogiston theory to explain it. It’s not that phlogiston theorists predicted a flame would extinguish in a closed container; rather they lit a flame in a container, watched it go out, and then said, “The air must have become saturated with phlogiston.”
from Rationality: From AI to Zombies by Eliezer Yudkowsky