A nationally recognized online disinformation researcher has accused Harvard University of shutting down the project she led to protect its relationship with mega-donor and Facebook founder Mark Zuckerberg.
The allegations, made by Dr. Joan Donovan, raise questions about the influence the tech giant might have over seemingly independent research. Facebook’s parent company Meta has long sought to defend itself against research that implicates it in harming society: from the proliferation of election disinformation to creating addictive habits in children. Details of the disclosure were first reported by The Washington Post.
Beginning in 2018, Donovan worked for the Shorenstein Center at Harvard University’s John F. Kennedy School of Government, and ran its Technology and Social Change Research Project, where she led studies of media manipulation campaigns. But last year Harvard informed Donovan it was shutting the project down, Donovan claims.
Reminds me of cigarette companies burying research on lung disease.
Or the Oil industry hiding and then discrediting their own research into climate change?
Or the oil industry hiring economists to refute other economists conclusions when it comes to damages from oil spills (see Exxon Valdez)
Or the sugar industry convincing everyone that fat was the problem
Not to mention hiding the huge amount of slave labour that goes into sugar production.
Or the sugar industry burying studies blaming obesity on sugar (and blaming fatty foods instead)
This is a fantastic movie and everyone should see it in our age of grifters and misinformation.
It should be shown in schools, but it’s a little risque sometimes.
When the truth makes profit uncomfortable, guess who’s gonna overpower who.
And just like that, I lost a lot of trust in Harvard.
Harvard is where they got caught taking money from the sugar lobby and put out a paper pointing away from sugar as a leading cause of heart disease and towards saturated fat. This changed health policy in America, killing who knows how many.
Harvard has deserved no credibility for longer than we’ve been alive.
I never really trust institutions that put money above all else.
Not sure why people think colleges are exempt from scummy behavior. They’re a business, not a charity lol.
Harvard kisses the ring.
For $500 million. 😱
Yeah, they’re like “sure, we’ll happily shut this down for half a billion, just give us time to make it seem legit!”
I can’t open the link. Can someone please help?
Dr. Donovan is a fucking legend. You fucked up with this move, Harvard.
And if you want to see more about her, there’s tons of her speeches and interviews online.
Of course they shut it down, the problem with disinformation is solved!
(/s)
Problem with information solved!
Seems foolish in the face of AGI around the corner. We will be able to have as many disinformation researchers as we desire, even personal ones.
AGI is not around the corner. Not by a long shot. We have simply automated bullshit generators.
For AGI to be a thing, we first need to have the computer be able to communicate with us.
LLMs are just the first step, an important one at that.
It is like claiming babies learning to talk are bullshit generators; before you know it they surpass you in every way, shape, and form.
That really has not been my experience with ChatGPT4+. It is getting very good and catches me off guard daily with its level of understanding. Sure, it makes mistakes and is laughably off base at times, but so are we all as we learn about a complex world.
It’s only a bullshit generator if you use it for bullshit generation…
We’ve automated ways to accelerate problem solving, and now that it’s able to actually reason (AI that can actually do math is a big deal), that acceleration should increase significantly.
Such acceleration can make things like AGI actually around the corner, with that corner being 5-10 years from now. Though I think we have too many hardware limitations ATM, which will definitely hamper progress & capability.
But with companies like Microsoft seriously considering moves like “Nuclear Reactors to power AI”, issues with power consumption may not be as much of a barrier…
That’s like saying parrots are only a few generations away from being as intelligent as humans because they can already imitate human speech.
Clearly imitation does not require cognition and by all evidence so far does not lead to it.
Why do you think parrots aren’t intelligent or cognitive?
They are but do you think one will be able to help sort information, misinformation, and disinformation on Facebook any time soon? Or even have a real conversation? They are cognitive but mimicking our speech doesn’t mean they are close to our level.
It looks like no one is going to push her out of anything from that picture
Grow up.
It seems she has been growing for the both of us
You don’t believe anyone has anything of value to contribute to society if they’re unattractive?
This picture is exactly what I imagine the output would be if you typed “disinformation scholar” into an AI image generation prompt.