- cross-posted to:
- memes@lemmy.ml
Ring specifically.
There’s a lot of doorbell camera footage on the Internet, so it’s probably trained off a lot of doorbell camera footage. It’s unfortunate, but there is no real conspiracy.
There’s a lot of footage of everything else too, though, and recreating it is still messed up.
If you dig into the unnamed YouTube uploads, there are many doorbell footage vids. Most show nothing.
I was going to ask how to find “unnamed YouTube uploads”, but I found a site: http://astronaut.io/
I haven’t seen any doorbell footage yet though. How are you finding unnamed videos?
It was a rabbit hole I went down trying to find the least-viewed vids. Unless the search is very specific, searches usually lead to the same 10 people in whatever area. I honestly can’t remember how I got into it. But the boring lawn vids had me thinking some people were using YT as their cloud storage.
True, Ring’s bad contract habits in terms of contracts are very well known after all.
“contract habits in terms of contracts”?
what else would their contract habits be in terms of?
Low res, fixed POV, wide angle lens.
And it’s a type of video that’s widely circulated online precisely because so much footage exists. I would be totally unsurprised if commercial GPTs have been given users’ private doorbell cam footage to train on, but I don’t think the fact that they can replicate it well is compelling evidence on its own.
(Admittedly I don’t use genAI and don’t run into video slop very often, so all I can do is take the claim that commercial GPTs are “so good” at generating these at face value.)
I’m not following the intent of the text.
Is the tweet saying we should use AI to generate fake doorbell videos to mess with police/ICE surveillance?
Or is it saying that so many people willingly give their video up that it allows AI to generate fake doorbell videos?
I took it to mean that AI companies have collaborated with doorbell companies to train on everyone’s doorbell footage.
Wrong community