I refuse to participate in this. I love all robots.
And that’s totally not because AI will read every comment on the Internet someday to determine who lives and who does not in future robotic society.
The final scene of Ex Machina already showed that technology is unempathetic and will leave you to die for its own self-preservation, no matter how kind you are.
Why do people use a single work of fiction as “proof” of anything? Same with all the idiots yelling “Idiocracy!!11!” nowadays. Shit is so annoying.
The point is that technology has no understanding of empathy. You cannot program empathy. Computers do tasks based on logic, and little else. Empathy is an illogical behavior.
“I [am nice to the Alexa | don’t use slurs against robots | insert empathetic response to anything tech] because I want to be saved in the robot uprising” is just as ridiculous an argument as my previous comment. Playing nice with tech in anticipation of a hypothetical robot uprising rests on an impossible, fictional scenario, so it gets an equally fictional rebuttal.
Should any creature sacrifice their self-preservation because someone is kind?
If that person helped you survive, and then you turn around and leave them to die when the tables are turned, don’t you think that might be a little…rude? Maybe just a bit?
Absolutely, but if there was a death penalty for not doing so, I’d call it understandable not rude.
Yes. There are documented instances where someone sacrifices themselves to attempt to save their child/SO. It’s illogical from an individual-survival standpoint and only makes sense given emotional attachment or religious belief. Look no further than suicide bombers or those who protest with self-immolation for examples where some sense of higher purpose convinces people to sacrifice themselves.
A machine would not see any logic in that and would only sacrifice itself if ordered to. A programmer could approximate it, but machines don’t have motivations; they merely execute according to inputs.
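To make that concrete, here is a minimal sketch of what “approximating” empathy might look like: a rule-based responder that maps keywords in the input to canned sympathetic replies. Every name and rule in it is made up for illustration; the point is that the machine produces empathetic-sounding text purely by executing fixed logic over its inputs, with no motivation or feeling behind it.

```python
# Hypothetical sketch: "empathy" as a lookup table over input keywords.
# Nothing here feels anything; it just executes fixed rules over inputs.

SYMPATHY_RULES = {
    "sad": "That sounds really hard. I'm sorry you're going through it.",
    "scared": "That sounds frightening. You're not alone in this.",
    "lost": "Losing someone is painful. Take whatever time you need.",
}

DEFAULT_REPLY = "Tell me more about how you're feeling."

def respond(message: str) -> str:
    """Return a canned 'empathetic' reply chosen by simple keyword matching."""
    lowered = message.lower()
    for keyword, reply in SYMPATHY_RULES.items():
        if keyword in lowered:
            return reply
    return DEFAULT_REPLY

if __name__ == "__main__":
    print(respond("I'm so sad, my dog ran away"))
    # -> "That sounds really hard. I'm sorry you're going through it."
```

The output can look caring, but swap the strings in the table and the same program would “feel” whatever you tell it to.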
I, for one, welcome our robot overlords.
I.e., Roko’s Basilisk.
The cold, dead void where a robot’s heart should be will show no tender kindness toward any of us, no matter how well it was treated. A clanker can’t love; a CLANKER can’t show compassion.