• 0 Posts
  • 36 Comments
Joined 3 months ago
Cake day: August 1st, 2024







  • emmy67@lemmy.world to Microblog Memes@lemmy.world · *Permanently Deleted* · 2 months ago

    The brain-dead take is that companies should explore new technology without any qualifiers (I assume there aren't any, because you didn't add any and you applied it to AI). That's how we've ended up with such a huge amount of waste, pollution, and theft from small independents.

    Even if we narrow it to the field of AI, the waste and environmental damage from this kind of tech alone is absurd.

    Add to that the downsizing AI causes, the pathetic service disruptions, and the inevitable decline of a company's reputation from using such a thing, and it's nothing but a waste.





  • The fundamental problem is that all those results come from people with abnormal brain function, because of the corpus callosotomy.

    It can't be assumed that things work the same way in a normal brain.

    People do often make things up about themselves, especially in cases of dissonance. But that's in relation to themselves, not the things they know. Most people, if you asked them what the OP did, will either admit they don't know or tell you to look it up. The more specific the question, the less likely they are to make something up.







  • I wasn't the one attempting to prove that, though I think it's definitive.

    You were attempting to prove it could generate things not in its dataset, and I have disproved your theory.

    > To me, the takeaway here is that you can take a shitty 2-minute Photoshop doodle and by feeding it through AI it'll improve the quality of it by orders of magnitude.

    To me, the takeaway is that you know less about AI than you claim. Much less. Because we have actual instances, and many of them, where CSAM is in the training data. Don't believe me?

    Here's a link to it.