Anyone who prioritizes the wellbeing of the “future of humanity” in the abstract over the dignity and care of individuals in the present cannot claim that they truly care about people.
I’m struck by how profoundly non-humanistic many AI leaders sound.
- Sutton sees us as transitional artifacts
- x-risk/EA types reduce the human good to bare survival or aggregates of pleasure and pain
- e/accs reduce us to variables in a thermodynamic equation
- Alex Wang calls...
Brendan McCord 🏛️ x 🤖 • x.com

we should directly tie the success of our technologies to how much they enable our humanity (as in, our positive human characteristics), and use this criterion to evaluate past, present and future technologies.
Saffron Huang • To be a Technologist is to be Human — Letters to a Young Technologist
Many people knew this before, but the last three years have hammered home that we cannot protect things we don’t empathize with. If we don’t care about the value of other lives, whether human or animal, then we won’t be motivated to protect those lives.