TL;DR: The new Reimage feature on the Google Pixel 9 phones is really good at AI manipulation, while being very easy to use. This is bad.
Some serious old-man-yelling-at-cloud energy
It’ll sink in for you when photographic evidence is no longer admissible in court
Photoshop has existed for a bit now. So incredibly shocking that it was only going to get better and easier to do. Move along with the times, old-timer.
Photoshop requires time and talent to make a believable image.
This requires neither.
But it has been possible, for more than a decade
You said “but” like it invalidated what I said, instead of being a true statement and a non sequitur.
You aren’t wrong, and I don’t think that changes what I said either.
Lmao, “but” means your statement can be true and irrelevant at the same time. From the day Photoshop could fool people, lawyers have been trying to mark any image as faked, misplaced, or out of context.
When you just now realise it’s an issue, that’s your problem. People can’t stop these tools from existing, so like, go yell at a cloud or something.
You are misunderstanding me.
I am not disagreeing with you, but it’s intellectually dishonest not to acknowledge the context of the reality we live in: it used to require genuine talent and skill with a paid tool to fake images, and now it is as easy as typing text into a free app on your phone, just describing what you want to see.
This is an exponential escalation of existing problems and technologies.
I never said I was just now worried about fake images. To say it myself: I’m worried about the now non-existent barrier that bad actors no longer need to clear to do whatever they want to do here.
Well yeah, I’m not concerned with its ease of use nowadays. I’m more concerned with computer forensics experts no longer being able to detect a fake, whereas Photoshop manipulation has always been detectable.
As the cat and mouse game continues, we ask ourselves, is water still wet?
Just wait, image manipulation will happen at image creation and there will be no “original”. Proving an image is unmanipulated will be a landmark legal precedent and set the standard for being able to introduce photographic evidence. It is already a problem for audio recordings and will be eventually for video.