Windows 11 often requires new hardware. But for a while, that hardware will be extremely pricey or come with very little RAM.
I don't believe that a single competent person works at Micro$oft anymore, but maybe, just maybe, this could lead them to make a less shitty OS?
And garbage software like Adobe Creative Cloud too?
They obviously don't care about users, but the pain could become too big.
It’s a really nice idea, but bad developers are already so deep in the sunk cost fallacy that they’ll likely just double down.
Nobody reassesses their dogma just because the justification for it is no longer valid. That’s not how people work.
Not likely. I expect the AI bubble will burst before those software optimization gears even start to turn.
Not just that: all of their AI slop code will be even more unoptimized.
Yeah, the systems in place right now took 40 years to build
Big AI is a bubble but AI in general is not.
If anything, the DRAM shortages will apply pressure on researchers to come up with more efficient AI models rather than more efficient (normal) software overall.
I suspect that as more software gets AI-assisted development we’ll actually see less efficient software but eventually, more efficient as adoption of AI coding assist becomes more mature (and probably more formalized/automated).
I say this because of experience: If you ask an LLM to write something for you it often does a terrible job with efficiency. However, if you ask it to analyze an existing code base to make it more efficient, it often does a great job. The dichotomy is due to the nature of AI prompting: It works best if you only give it one thing to do at a time.
In theory, if AI code assist becomes more mature and formalized, the “optimize this” step will likely be built-in, rather than something the developer has to ask for after the fact.
The “shortage” is temporary and artificial, so that’s a hard NO. The RAM shortage doesn’t create any incentive to make apps more efficient, because the hardware and software already in people’s homes won’t be affected by it, and neither will the people currently using that software. The very small percentage of people who will be affected by a temporary shortage doesn’t justify making changes to software that’s currently in development.
There’s no incentive for software companies to make their code more efficient until people stop using their software, so stop using it and it will get better. Just as an example: Adobe Reader is crap, just straight-up garbage, but people still use it, so the app stopped getting improvements many years ago. Then Adobe moved to a subscription-based system and a cloud service for selling your data, but guess what: it’s still the same app it was 10 years ago, just more expensive.
What crystal ball told you this was temporary? Every day for the past few years the consumer market moves further and further into serving only the wealthy. The people in power don’t care about selling RAM or other scraps to peasants.
Downvoted by libs with their collective heads in the sand.
It might not wind up working, but Altman, Nadella, et al. are trying to push all consumers to forever rent compute from them.
They do not want you to be able to run your own DeepSeek at home. They do not want you to control the hub of your smart home. They want to know what’s in the spreadsheet you saved, what’s in the business plan you typed up, and what the password is to any E2EE service you have an account with.
They want to forecast you like the weather.
It’s not just garbage software. So many programs are just electron apps which is about the most inefficient way of making them. If we could start actually making programs again instead of just shipping a webpage and a browser bundled together you’d see resource usage plummet.
In the gaming space even before the RAM shortage I’ve seen more developers begin doing optimization work again thanks to the prevalence of steam deck and such so the precedent is there and I’m hopeful other developers do start considering lower end hardware.
Probably a super unpopular take, but the Switch and Switch 2 have done more for game optimization than the Steam Deck has by sheer volume of consoles sold than the Steam Deck ever could. I agree the Steam Deck pushed things further but the catalyst is the Switch/2
So the developers of PC games like Clair Obscur: Expedition 33, which doesn’t have a Switch version of any kind, spent time, effort, and money to optimize specifically for the Steam Deck… because of the Switch’s market share? C’mon now, bud, that’s a straight-up ridiculous take.
I take it the Switch/S2 has many non-Nintendo games shared with other consoles? Hard to search through 4,000 titles on Wikipedia to find them at random, but I did see they had one Assassin’s Creed (Odyssey) at the game’s launch. I never really had Nintendo systems and just associate them with exclusive Nintendo games.
I’m choosing to believe the Steam Machine will do more of the same for PC games. Maybe it won’t force optimization at launch, but I hope it maintains itself as a benchmark for builds and provides demand for optimization to a certain spec.
I only own one Nintendo game on my Switch. I’m not going to sit here and pretend most of my games run great on it though. Slay the Spire and Stardew run well. But I’ve had quite a few crashes with Civilization and some hangs with Hades or Hollow Knight too
I try to follow the gaming space and I didn’t really see anyone talk about optimization until the Steam deck grew. I do wish more companies were open about their development process so we actually had some data. The switch/switch 2 very well could have pushed it, but I think with those consoles people just accept that they might not get all the full modern AAA games, they’re getting Pokemon and Mario and such. Where as the steam deck they want everything in their steam library. I dunno
I have no real data, just what I’ve seen people discussing.
Web apps are a godsend and probably the most important innovation to help move people off of Windows.
I would prefer improvements to web apps and electron/webview2 if I had to pick.
If those web apps were all using the same shared Electron backend, then they could be “a godsend”. But each of those web apps ships its own Electron backend.
The beauty of it is that Electron/WebView2 will probably get improved, and you don’t need to fix the apps.
I don’t disagree with that. But the problem is having one Electron backend for each web app instead of one backend for all web apps.
Idk, I don’t think the issue is Electron apps using 100 MB instead of 10 MB. The kind of apps that you write as HTML/JS are almost always inherently low-demand, so even 10x-ing their resources doesn’t really cause a problem, since you’re not typically doing other things at the same time.
The issue is the kind of apps that require huge system resources inherently (like graphically intensive games or research tools), or services that run in the background (because you’ll have a lot of them running at the same time).
You’re off by a large margin. I’ll use two well documented examples.
WhatsApp native used about 300 MB with large chats. CPU usage stayed relatively low and constant. Yes, it wasn’t great, but that’s a separate issue. The new WebView2 version hits over a gig and spikes the CPU more than some of my games.
Discord starts at 1 GB of memory usage and exceeds 4 GB during normal use. That’s straight from the developers. It’s so bad they have started rolling out an experimental update that makes the app restart itself when it hits 4 GB.
These are just two Electron apps meant mostly for chatting. That’s up to 5 GB with just those two apps. Electron and WebView2 both spin up full Node.js runtimes and multiple JavaScript heaps, plus whatever GPU threads they run, and they are exceedingly bad at releasing resources. That’s exactly why they are the problem. Yes, the actual JavaScript bundles Discord and WhatsApp use are probably relatively small, but you get full Chromium browsers and all of their memory usage issues stacked on top.
Right
But those are only problems because they use the resources in the background. When the foreground app uses a lot of resources it’s not a problem, because you only have one foreground app at a time (I know, not really, but kinda). Most apps don’t need to run in the background.

Yes, that’s the problem? I’m confused about what you’re not getting here. Those programs are made to run constantly. Many people need both for various reasons. Add a main program like Photoshop and then you don’t have enough RAM. People don’t load Discord, check a message, close it, load WhatsApp, check it, then load Photoshop.
The RAM usage doesn’t suddenly stop because you alt+tab to a different program.
There is no “shortage”, just capitalism testing the limits of various bubbles.
deleted by creator
You fool, humans are flexible enough to get used to slow experiences. Even if the average user needs to have Discord, Slack, 100 Chrome tabs, Word, and every other Electron app open simultaneously, they will just get on with their work. They may not be happy about it, but they’ll continue without changing their habits.
But to be honest, I goddamn hope you are right!
Sometimes I also think there’s no one competent left at Microsoft anymore, but they still have their flight sim team, so I guess that’s something.
Isn’t Microsoft just the publisher? Also, there are so many problems with MSFS.
Found the silver lining guy.
Love the optimism but yeah, the impact on software dev will be minimal, if there even is one.
Why do you believe so? Do you believe software developers earn too much to care about RAM prices and will continue to write software that requires more RAM than the rest of the world can afford?
As a software dev, there’s a lot of stuff that’s just bloat now. Electron apps are really easy to make pretty, easy for web devs to write, and super portable, but each one is literally an instance of a Chrome browser. There are still a lot of devs who care (to some degree) about performance and are willing to trim fat or take small shortcuts where viable.
However, there’s also the issue of management. I was once tasked with a problem at work involving the traveling salesman problem. I managed to make a very quick solution that worked fairly well, but it always left one point for last that probably should have been visited around third. Anyway, it was quick and mostly accurate, but my boss told me to “fix it,” and despite my explanation that he was asking me to solve an unsolved math problem, he persisted. I’m now ashamed of how slow that operation is, since instead of just finding the nearest point it now needs to look ahead a few steps to see which path is shorter.
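For illustration, here is a minimal sketch of the trade-off being described: a plain nearest-neighbor heuristic versus one that looks a few steps ahead before committing to each hop. All names and the sample data are hypothetical, not the commenter's actual code.

```python
import itertools
import math

def dist(a, b):
    """Euclidean distance between two 2D points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nearest_neighbor(points):
    """Greedy: always hop to the closest unvisited point. Very fast,
    but can strand a point that should have been visited earlier."""
    route = [points[0]]
    remaining = set(range(1, len(points)))
    while remaining:
        cur = route[-1]
        nxt = min(remaining, key=lambda i: dist(cur, points[i]))
        route.append(points[nxt])
        remaining.remove(nxt)
    return route

def lookahead(points, k=3):
    """At each step, evaluate every ordered k-subset of the remaining
    points and commit only the first hop of the cheapest one. Much
    slower: O(n^k) work per step instead of O(n)."""
    route = [points[0]]
    remaining = set(range(1, len(points)))
    while remaining:
        cur = route[-1]
        best = None
        for perm in itertools.permutations(remaining, min(k, len(remaining))):
            cost = dist(cur, points[perm[0]]) + sum(
                dist(points[perm[i]], points[perm[i + 1]])
                for i in range(len(perm) - 1))
            if best is None or cost < best[0]:
                best = (cost, perm[0])
        route.append(points[best[1]])
        remaining.remove(best[1])
    return route

def route_length(route):
    """Total length of a route, visiting points in order."""
    return sum(dist(route[i], route[i + 1]) for i in range(len(route) - 1))
```

On a line of points with one "trap" point behind the start (e.g. `[(0,0), (1,0), (2,0), (5,0), (-1,0)]`), the greedy version marches off and doubles back for the stranded point, while the lookahead version picks it up early and produces a shorter route, at a visibly higher cost per step.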
Because that kind of shift in mindset (going backwards, basically) will require far more pressure than a 1-2 year RAM shortage.
Enterprise developers are basically unaffected by this. And anyone writing software for mom & pop shops was already targeting 8 GB, because that’s what Office Depot is selling them.
This mostly hurts the enthusiast part of tech. Most people won’t notice, because they don’t know the difference between 8, 16, or over 9000 GB of RAM. I’ve had this discussion with ‘users’ so many times when they ask for PC recommendations, and they just don’t really get it, or care.
I remember a PC programmer in the 80s who wrote his programs in GWBASIC, and when I asked him why he was using that instead of a better language that could produce faster and smaller programs, his answer was: “if this doesn’t run fast enough on the client’s PC, then the client will buy a better PC.” That’s the mindset: “it’s not my problem once I sell it.”
The RAM shortage will end before any meaningful memory optimizations can be made.
Naw, it’s easy:

```cpp
void* malloc(size_t size) { return std::malloc(size / 2); }
```

Is that what happened after COVID with the chip shortage?
For the most part, the answer seems to be yes. Some products did also ship with missing or reduced feature sets for a time, too.
Dealing with memory usage will likely require significant rewrites and architectural changes. It will take years.
The “memory optimizations” we’ll see are the removal of features while charging the same price. Software shrinkflation. It will require the same amount of memory, though.
Misaligned incentives. The people making bloated software are not the people buying the RAM. In theory the people buying the RAM are the same people buying the software, and so might put pressure on the people making the software to make it more efficient, but that is a very loose feedback loop and I wouldn’t hold my breath.
I opened Photoshop, and I left it open with no document open. Just the main window. It started at 11 GB of RAM and went up to 28 GB without me doing anything.
If there was anything as good as Photoshop, I’d have switched years ago. But I’ve tried the alternatives, and there’s just nothing like it. Same for InDesign. Affinity Photo is really, really close, but it’s just not the same.
I’ve been using Photoshop for over 30 years. Even when the time comes, making the switch will be very difficult.
edit: I just tried opening PS again and letting it sit. It’s hovering around 3-3.5 GB of RAM usage. I think that last attempt was a fluke.
I’ve been using Photoshop for over 30 years. Even when the time comes
It’s not coming. Not for you, anyway.
lol, eat me hahaha
Is there something specific you do? I hated Gimp for eternity, but PhotoGimp plugin paired with the newest (v3) Gimp isn’t that bad. I do enjoy the experience, mostly. Perhaps I just got used to it, but it’s quite usable for me.
I do a few different types of graphic design and photo editing for some different clients, but it’s more about the workflows I’ve worked out in PS. GIMP does things very differently, and I think that’s why a lot of PS users hate it. I sure do. It took me years to master PS. I don’t want to have to go through all of that again, and I certainly don’t have the time.
I will check out the plugin you mentioned. I’ve heard of it, but didn’t think much of it before.
I think I feel much the same; I’ve absolutely hated Gimp for many years. The plugin doesn’t change much, so don’t expect miracles. Still, it helps me tolerate the pain of an absolutely awful UX, though you still need to relearn things. Also, check out Krita; it’s good, and similar to Photoshop in many regards. What I do is simply edit some images for web development. So it’s not much, and that’s how I can tolerate Gimp.
I actually have Krita. I use it for creating digital art with my tablet. I don’t really use it for editing/compositing like PS.
Just curious-- what would you say are the main ways in which modern GIMP doesn’t live up to PS?
@pantherina@feddit.org,
Windows 11 often requires new hardware.
This was true in my case, but it was also true that I’d been using a 10yr old machine, which is pretty ridiculous. Win10 was creaking along, and Firefox wasn’t helping. So, ahead of the deadline, I got myself a ~US$350 mini-computer with a modern AMD processor and 16 GB. It’s been flying.
So it was a comparatively tiny investment to stay with a modern machine, and also helpful in maintaining a chain of redundancy. (i.e. if this one has a problem for whatever reason, I have a temporary backup machine) So in a way, the Win11 jump actually helped me out a lot.
Checking just now, the computer has gone up US$50 since then.
I started out using PS but when they decided to be a subscription model, I started using GIMP.
I can’t stand trying to use Photoshop anymore, and while I would love the user experience to be improved and the interface to be a little more intuitive, I’ve never run into something I needed to do that GIMP couldn’t handle.
Maybe my needs are simpler than a lot of people’s here. I’m definitely not a photographer, so if you’re using it often enough, I suppose Photoshop could be the better choice.
Similar situation here, Chief; altho I got out of PS well before the absurd subscription hurdle.
It was definitely a powerhouse, but GIMP has been sufficient for me, too. I only use it sporadically these days, but GPT5.2 has been useful in helping me when there’s 3-4 ways to do something and I simply want to know which is easiest and most efficient. Of course, it doesn’t talk in Majel Barrett’s voice yet…
Of course, it doesn’t talk in Majel Barrett’s voice yet…
That can be arranged. I’ve heard they recorded her saying a bunch of frequently used words/phrases, as well as phonetic sounds, so that her voice could be used in the future for the computer. I’m not sure how true that is, but I am a little afraid to look it up.
I’ve always hoped to hear her in the new trek shows.
That reminds me… about 10-15yrs ago, someone scripted and CG-animated all new TOS episodes using audio samples of the original show, uploading them to YT. Me, I found the scripts surprisingly strong and the audio surprisingly effective and smooth. Unfortunately, the animation was by far the weakest link, but that was a long time ago in terms of advances. I’d love to see a modern effort.
Somewhat similarly, I love how these turned out:
TNG in TAS style:
https://www.youtube.com/watch?v=Jyz2pVqrEkI

VOY in TAS style:
https://www.youtube.com/watch?v=luEDui2zAUw
I’ve been running Linux on a 14-year-old Asus laptop with 4 GB of RAM; works fine.
Well, yes, Windows is a pretty disastrous OS in the first place.
FWIW, I’m hoping to get Linux installed at some point.
but it was also true that I’d been using a 10yr old machine, which is pretty ridiculous
Why do you think it’s ridiculous to use a 10yo machine?
Shit… that would probably require a long-arse, detailed essay to answer adequately.
Interesting you should mention your old machine. I recently upgraded my 2016 MacBook Pro to an M4 Pro MBP. Photoshop ran fine on the old one. Great, actually. The only difference I noticed was that PS launches faster and opens files faster, buuuut… that’s about it. PS already ran fine on my 9-year-old MBP. The new machine didn’t improve much, other than the RAM, which allows me to have more large docs open at once.
My last machine only had 16 GB of RAM, whereas this new one has 48 GB (the max for the MBP). Still, the performance is pretty close, although my old machine would probably struggle if I had a bunch of large PSDs open.
As an investment, I didn’t really have a choice: my old MBP died (SSD fried). I love this new machine, tho. It’s very fast.
When I was double-checking prices at Microcenter, I noticed a Mac Mini with a modern processor, etc., selling for a mere US$400. I had no idea such a thing existed, given Apple’s historically high prices. I reckon one could even use it somewhat like a laptop at times, or at the least as a portable. Just in general, though, it seems like it could help out a bunch of people who like Macs.
https://www.microcenter.com/product/688173/apple-mac-mini-mu9d3ll-a-(late-2024)-desktop-computer
Apple’s prices have been more reasonable in recent years, although still above-market. and the Mac Mini is a pretty powerful little machine for a good price.
OP is the optimistic type
Where I’m eyeing resource usage right now is in the cloud. I run a few Discourse instances, which seem really inefficient to me: 1.5 GB of RAM for just a discussion board. I have to dedicate a server to each one, whereas my Rust web servers can sit at more like 30 MB. They’re probably doing a lot less, but still.
…and grocery store prices will go back down, too.
I suspect companies behind needlessly memory-intensive software would rather push (harder) towards cloud services, or ignore the problem entirely - I’m sure they’ll find a way to enshittify their products in a way that solves the problem for them, or see lower profits and learn absolutely nothing.
If the software in question is something people need for their job, those companies can absolutely just decide that it’s not their problem and that you’ll just have to face the shortage head-on.

I recall listening to half of a video from SumitoMedia where his answer to your question was, quote, “do you hear how fucking stupid you sound?” (you can probably guess why I didn’t watch the rest of it).