- cross-posted to:
- programming@programming.dev
He also said the AI-generated code is often full of bugs. He cited one issue that occurred before his arrival that meant there was no session handling in his employer’s application, so anybody could see the data of any organization using his company’s software.
It’s only financial software, NBD.
Well to be fair, financial data should be public, it would stop so many crimes, so much corruption.
Maybe AI saw the problems that hidden financial data causes and just decided to do the world a favor!
As per usual, those pushing for AI the most are the ones who don’t fucking use it.
Is AI good for printing out the syntax, or an example of a library you haven’t used before?
Sure, sometimes yes. Sometimes no.
Should it be a requirement to be a regular part of software development?
No. AI hallucinates very often and is imitative in nature, not innovative.
More generally, no one should be required to use any particular tool until not using it affects the team. Forcing people to work a certain way is beyond stupid.
I’ve been refusing to use any AI tools at all and luckily my manager respects that, even if he uses AI for basically everything he does. If the company ever decides to mandate it I’ll just have the AI write all my code and commit it with no checks. With the worker’s rights here, it’ll take several months to fire me anyways.
Managers are often idiots in over their heads. AI is really aggravating that problem.
My team have been trying it. So far, at best, it costs money but makes no difference in outcomes. Any productivity gains are wiped out by the time needed to diagnose and correct the errors it introduces.
I’d use Clippy before I use any of that time-wasting, unreliable, energy-guzzling crap.
This is just history repeating itself. A while ago it was typewriter repair persons vs. the keyboard. New tech won and time marched on. Having said that…fuck AI.
“We were still required to find some ways to use AI. The one corporate AI integration that was available to us was the Copilot plugin to Microsoft Teams. So everyone was required to use that at least once a week. The director of engineering checked our usage and nagged about it frequently in team meetings.”
The managerial idiocy is astounding.
It’s pretty easy to set up a cron job to fire off some sort of bullshit LLM request a handful of times a day during working hours. Just set it and forget it.
you could probably even get copilot to write it!
Not when you have to do SAML authentication to get a token for your AD account first.
You can even schedule it within copilot
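The set-and-forget cron idea above can be sketched roughly like this. The endpoint URL, token variable, and canned prompts are all placeholders (and, as noted, this won’t get past SAML-gated auth on its own):

```shell
#!/bin/sh
# fake_ai_usage.sh -- fire a token LLM request so the usage dashboard
# stays green. Endpoint, token variable, and prompts are placeholders;
# swap in whatever corporate API you're actually being nagged about.

PROMPTS="summarize this sentence
write a haiku about standups
explain a for loop"

# Pick one canned prompt pseudo-randomly so the requests aren't identical.
PROMPT=$(printf '%s\n' "$PROMPTS" | shuf -n 1)

# Hypothetical endpoint; errors are swallowed so the cron job never pages anyone.
curl -s -X POST "https://llm.example.corp/v1/chat" \
  -H "Authorization: Bearer ${CORP_AI_TOKEN:-dummy}" \
  -H "Content-Type: application/json" \
  -d "{\"prompt\": \"$PROMPT\"}" >/dev/null 2>&1 || true

# crontab entry -- three runs spread across working hours, weekdays only:
#   0 10,13,16 * * 1-5 /home/you/fake_ai_usage.sh
```

The `1-5` day-of-week field keeps it from suspiciously “working” on weekends.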
For the FAANG companies, they do it in part so they can turn around and make those flashy claims you see in headlines, like “95% of our devs use [insert AI product they are trying to sell] daily” or “60% of our codebase is now ‘written’ by our fancy AI”.
I’ll admit, some tools and automation are hugely improved with new ML smarts, but nothing feels dumber than hunting for problems to fit the boss’s pet solution.
Like what?
Claude performs acceptably at repetitive tasks when I have an existing pattern for it to follow: “Replicate PR 123, but add support for object Bar instead of Foo.” If I get some of this busywork in my queue, I typically just have Claude do it while I’m in a meeting.
I’d never let it do refactors or design work, but as a code generation tool that can use existing code as a template, it’s useful. I wouldn’t pay an arm and a leg for it, but burning $2 while I’m in a meeting to kill chore tasks is worth it to me.
For example, the tools for the really tedious stuff: large-codebase refactoring for style consistency, naming-convention adherence, all kinds of code smells, whatever. Lots of those tools have gotten ML upgrades and are a lot smarter and more powerful than what I remember from a decade ago (IntelliSense, JetBrains helper functions, various opinionated linter toolchains, and so forth).
While I’ve only experimented a little with some of the more explicitly generative LLM-based coding assistant plugins, I’ve been impressed (and a little spooked) at how good they often were at guessing what I was doing well before I finished doing it.
I haven’t used the prompt-based LLMs at all, because I’m just not used to it, but I’ve watched nearby devs use them for stuff like manipulating a bunch of files in a repeated pattern, breaking up a spaghetti method into reusable functions, or giving a descriptive overview of some gnarly undocumented legacy code. They seem pretty damn useful.
I’ll integrate the prompt-based tools once I can host them locally.
These scummy fucks even put it as a requirement in job descriptions these days.
This is a red flag for corpo culture shenanigans. Dodge the bullet.
What even is the requirement? “Must be able to ask a chatbot to do stuff”?
Then unionize! Nothing else will stop this.
Unions aren’t really a concept that’s available to devs. At least around here.
I just attended an organizer training, and 70% of the people there were devs. Don’t believe the corporate bullshit, unions are for everyone.
No need to unionise when you have the power to make a startup.
But then first you need the power and ability to make a startup.