might be a form of Jevons Paradox
(midwest.social)
https://raru.re/@ocean/116214437024389203
The whole industry needs to be rebuilt from the foundations. GRTT with a grading ring that tightly controls resources (including, but not limited to, RAM) as the fundamental calculus, instead of whatever JS happens to stick to the Chrome codebase and machine code spewed by your favorite C compiler.
If one of us ever wins the lotto we better get on funding that
Had to install (an old one mind you, 2019) Visual Studio on Windows...
...
...
First it's like 30GB, what the hell?? It's an advanced text editor with a compiler and some ..
Crashed a little less than what I remember
First it's like 30GB, what the hell??
Just be grateful it's SSD and not RAM.
Visual Studio is the IDE. VS Code is the text editor.
OP was clearly using a rhetorical reduction to make a point that VS is bloated.
VS Code is another project; Visual Studio is indeed an IDE, and it integrates it all. VS Code is also an integrated development environment. I don't really know what more to say.
My PC is 15 times faster than the one I had 10 years ago. It's the same old PC but I got rid of Windows.
Everything bad people said about web apps 20+ years ago has proved true.
It's like, great, now we have consistent cross-platform software. But it's all bloated, slow, and only "consistent" with itself (if even). The world raced to the bottom, and here we are. Everything is bound to lowest-common-denominator tech. Everything has all the disadvantages of client-server architecture even when it all runs (or should run) locally.
It is completely fucking insane how long I have to wait for lists to populate with data that could already be in memory.
But at least we're not stuck with Windows-only admin consoles anymore, so that's nice.
All the advances in hardware performance have been used to make it faster (more to the point, "cheaper") to develop software, not faster to run it.
And that us poors still on limited bandwidth plans get charged for going over our monthly quotas because everything has to be streamed or loaded from the cloud instead of installed (or at least cached) locally.
I'm dreading when poorly optimized vibe coding works its way into mainstream software and creates a glut of technical debt. Performance is gonna plummet over the next 5 years, just wait.
Already happening with Windows. Also supposedly with Nvidia GPU drivers, with some AMD execs pushing for the same now
If only bad people weren't the ones who said it, maybe we would have listened
I almost started a little rant about Ignaz Semmelweis before I got the joke. :P
Bloated electron apps are what makes Linux on desktop viable today at all, but you guys aren't ready for that conversation.
Yes, in that the existence of bloated electron apps tends to cause web apps to be properly maintained, as a side effect.
But thankfully, we don't actually have to use the Electron version, to benefit.
I can only think of a couple Electron apps I use, and none that are important or frequently used.
Uhhh like what?
Note, I don't know how comprehensive this wiki list is, just quick research
https://en.wikipedia.org/wiki/List_of_software_using_Electron
From those, I'm only currently using a handful.
balenaEtcher, Discord, Synergy, and Obsidian
The viability of linux isn't dependent on them though
Agreed. I wasn't the one that claimed that
"Let them eat ram"
I hate that our expectations have been lowered.
2016: "oh, that app crashed?? Pick a different one!"
2026: "oh, that app crashed again? They all crash, just start it again and cross your toes."
I'm starting to develop a conspiracy theory that MS is trying to make the desktop experience so terrible that everyone switches to mobile devices, such that they can be more easily spied on.
I bought a desktop PC for a little over 2k in late 2011, and still use it. I'm a back-end developer, and certainly I would like to be able to upgrade my 16 GB RAM to 32 GB in an affordable way.
Other than that, it's perfectly fine. IDE, a few docker containers, works.
And modern gaming is a scam anyway. Realistic graphics do not increase fun, they just eat electricity and our money. Retro gaming or not at all.
Imagine how things would be if they were built to be maintained for 15+ years.
2011 means it's probably DDR3, which is still fairly affordable
wow, you are right! I didn't bother to check this whole time of needless suffering, but for what I earn with it in less than an hour I could probably buy 2x8 GB DDR3, lol!
It just seemed a fair assumption that it would be insanely expensive ...
The same? Try worse. Most devices have seen input latency going up. Most applications have a higher latency post input as well.
Switching from an old system with old UI to a new system sometimes feels like molasses.
I work in support for a SaaS product and every single click on the platform takes a noticeable amount of time. I don't understand why anyone is paying any amount of money for this product. I have the FOSS equivalent of our software in a test VM and it's far more responsive.
Except for KDE. At least compared to cinnamon, I find KDE much more responsive.
AI-generated code will make things worse. They are good at providing solutions that generally give the correct output, but the code they generate tends to be shit by final-product standards.
Though perhaps performance will improve since at least the AI isn't limited by only knowing JavaScript.
I still have no idea what it is, but over time my computer, which has KDE on it, gets super slow and I HAVE to restart. Even if I close all applications it's still slow.
It's one reason I've been considering upgrading from 6 cores and 32 GB to 16 and 64.
Have you gone through settings and disabled unnecessary effects, indexing and such? With default settings it can get quite slow but with some small changes it becomes very snappy.
I have not, but also it's not slow immediately, it takes time under use to get slow. Fresh boot is quite fast. And then once it's slow, even if I close my IDE, browsers and everything, it remains slow, even if CPU usage is really low and there's theoretically plenty of memory that could be freed easily.
Have you tried disabling the file indexing service? I think it's called Baloo?
Usually it doesn't have too much overhead, but in combination with certain workflows it could be a bottleneck.
An upgrade isn't likely to help. If KDE is struggling on 6 cores @ 32 GB, you have something going on, and 16 @ 64 is only going to let it last twice as long before choking.
Wait till it's slow.
Check your RAM/CPU in top and the disk in iotop; hammering the disk/CPU (or a bad disk/SSD) can make KDE feel slow.
plasmashell --replace # this just dumps plasmashell's widgets/panels
See if you got a lot of RAM/CPU back or it's running well; if so, it might be a bad widget or panel
if it's still slow,
kwin_x11 --replace
or
kwin_wayland --replace &
This dumps everything and refreshes the graphics driver/compositor/window manager
If that makes it better, you're likely looking at a graphics driver issue
I've seen some stuff where going to sleep and coming out degrades perf
Hmm, I haven't noticed high CPU usage, but usually it only leaves me around 500MB actually free RAM, basically the entire rest of it is either in use or cache (often about 15 gigs for cache). Turning on the 64 gig swapfile usually still leaves me with close to no free RAM.
I'll see if it's slow already when I get home, I restarted yesterday. Then I'll try the tricks you suggested. For all I know maybe it's not even KDE itself.
Root and home are on separate NVMe drives and there's a SATA SSD for misc non-system stuff.
GPU is nvidia 3060ti with latest proprietary drivers.
The PC does not sleep at all.
To be fair I also want to upgrade to speed up Rust compilation when working on side projects and because I often have to store 40-50 gigs in tmpfs and would prefer it to be entirely in RAM so it's faster to both write and read.
Don't let me stop you from upgrading, that's got loads of upsides. Just suspecting you still have something else to fix before you'll really get to use it :)
It CAN be ok to have very low free RAM if it's used up by buffers/cache (which is freeable). If buff/cache gets below about 3GB on most systems, you'll start to struggle.
If you have 16GB, it's running low, and you can't account for it in top, you have something leaking somewhere.
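The "free vs. freeable" distinction being made here can be checked directly on Linux, which exposes both numbers in /proc/meminfo. A minimal sketch (the sample values below are invented; MemAvailable includes reclaimable cache, while MemFree does not):

```python
# Toy illustration: "free" vs "available" memory on Linux.
# Parses /proc/meminfo-style text; the sample values below are made up.

def mem_summary(meminfo_text):
    """Return (free_kb, available_kb) parsed from /proc/meminfo-style text."""
    fields = {}
    for line in meminfo_text.splitlines():
        key, _, rest = line.partition(":")
        if rest:
            fields[key.strip()] = int(rest.split()[0])  # values are in kB
    # MemAvailable counts reclaimable page cache; MemFree does not.
    return fields["MemFree"], fields["MemAvailable"]

sample = """\
MemTotal:       16303168 kB
MemFree:          512000 kB
MemAvailable:    7340032 kB
Cached:          6828032 kB
"""

free_kb, avail_kb = mem_summary(sample)
print(f"free: {free_kb // 1024} MiB, actually available: {avail_kb // 1024} MiB")
```

On a real system you'd feed it `open('/proc/meminfo').read()` instead of the sample string; a box can look "out of RAM" in the MemFree column while still having gigabytes reclaimable.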
Lol I sorted top by memory usage and realized I'm using 12 gigs on an LLM I was playing around with to get local code completion in my JetBrains IDE. It didn't work all that well anyway and I forgot to disable it.
I did have similar issues before this too, but I imagine blowing 12 gigs on an LLM must've exacerbated things. I'm wondering how long I can go now before I'm starting to run out of memory again. Though I was still sitting at 7 gigs buffer/cache and it hadn't slowed down yet.
12/16, That'll do it. Hopefully that's all, good luck out there and happy KDE'ing
I want to avoid building react native apps.
Windows 11 is the slowest Windows I've ever used, by far. Why do I have to wait 15-45 seconds to see my folders when I open explorer? If you have a slow or intermittent Internet connection it's literally unusable.
Even Windows 10 is literally unusable for me. When pressing the windows key it literally takes about 4 seconds until the search pops up, just for it to be literally garbage.
Found out about this while watching "Halt and Catch Fire" (AMC's effort to recreate the magic of Mad Men, but on the computer).
In 1982 Walter J. Doherty and Ahrvind J. Thadani published, in the IBM Systems Journal, a research paper that set the requirement for computer response time to be 400 milliseconds, not 2,000 (2 seconds), which had been the previous standard. When a human being's command was executed and returned an answer in under 400 milliseconds, it was deemed to exceed the Doherty threshold, and use of such applications was deemed to be "addicting" to users.
if it only occurs hours or days after boot, try killing the startmenuexperiencehost process. that's what I was doing until I switched to linux
I am using Windows like once a week at maximum and then it only takes about 10 minutes. So I kind of do not really care and am glad that I do not need to use it more often.
The Windows bloat each new generation is way out of control.
It takes forever to boot I know that and that's from fast food which is extra pathetic.
fast food
Too many nuggies
Maybe if Windows quit pigging out on tendies and slimmed down it would be as baf
Probably that's the folder explorer or whatever itself crashing.
yeah
and like why does it crash? it worked fine on Windows 10
I've given up trying to understand modern PC software. I can barely keep up with the little microcontrollers I work with. They aren't so little.
The tech debt problem will keep getting worse as product teams keep promising more in less time. Keep making developers move faster. I'm sure nothing bad will come of it.
Capitalism truly ruins everything good and pure. I used to love writing clean code and now it's just "prompt this AI to spit out sloppy code that mostly works so you can focus on what really matters… meetings!"
What really matters isn't meetings, it's profits.
I'll keep saying this: my 2009 i5 750 still feels as fast as my 2-year-old workstation and can play almost everything I want with the 1060.
They often are worse, because everything needed to be an electron app, so they could hire the cheaper web developers for it, and also can boast about "instant cross platform support" even if they don't release Linux versions.
Qt and GTK could do cross platform support, but not data collection, for big data purposes.
There's no difference whatsoever between qt or gtk and electron for data collection. You can add networking to your application in any of those frameworks.
I don't know why Electron has to use up so much memory though. It seems to use however much RAM is currently available when it boots; the more RAM the system has, the more Electron seems to think it needs.
Chromium is basically Tyrone Biggums asking if y'all got any more of that RAM, so bundling that into Electron is gonna lead to the same behavior.
Ib4 "uNusEd RAm iS wAStEd RaM!"
No, unused RAM keeps my PC running fast. I remember the days where accidentally hitting the windows key while in a game meant waiting a minute for it to swap the desktop pages in, only to have to swap the game pages back when you immediately click back into it, expecting it to either crash your computer or probably disconnect from whatever server you were connected to. Fuck that shit.
I mean unused RAM is still wasted: You'd want all the things cached in RAM already so they're ready to go.
I mean, I have access to a computer with a terabyte of RAM, and I'm gonna go ahead and say that most applications aren't going to need that much, and if they do use that much I'm gonna be cross.
Wellll
If you have a terabyte of RAM sitting around doing literally nothing, it's kinda being wasted. If you're actually using it for whatever application can make good use of it, which I'm assuming is some heavy-duty scientific computation or running full size AI models or something, then it's no longer being wasted.
And yes if your calculator uses the entire terabyte, that's also memory being wasted obviously.
I don't want my PC wasting resources trying to guess every possible next action I might take. Even I don't know for sure what games I'll play tonight.
Well you'd want your OS to cache the start menu in the scenario you highlighted above. The game could also run better if it can cache assets not currently in use instead of waiting for the last moment to load them. Etc.
Yeah, for things that will likely be used, caching is good. I just have a problem with the "memory is free, so find more stuff to cache to fill it" or "we have gigabytes of RAM so it doesn't matter how memory-efficient any program I write is".
"memory is free, so find more stuff to cache to fill it"
As long as it's being used responsibly and freed when necessary, I don't have a problem with this
"we have gigabytes of RAM so it doesn't matter how memory-efficient any program I write is"
On anything running on the end user's hardware, this I DO have a problem with.
I have no problem with a simple backend REST API being built on Spring Boot and requiring a damn gigabyte just to provide a /status endpoint or whatever. Because it runs on one or a few machines, controlled by the company developing it usually.
When a simple desktop application uses over a gigabyte because of shitty UI frameworks being used, I start having a problem with it, because that's a gigabyte used per every single end user, and end users are more numerous than servers AND they expect their devices to do multiple things, rather than running just one application.
They're actually slower than Windows Vista, there, I said it.
Vista honestly wasn't as bad as we all said/remember, but it was the start of Windows' optimization downturn. It worked great on top-of-the-line systems with tons of power, and was the best-looking Windows Microslop ever developed.
It just happened to coincide with the start of netbooks and low-power computers going mainstream, and marketing thought that an OS demanding an F1 car should also be sold on a 3-door hatchback with 60 horsepower.
Your mileage with Vista was wildly hardware-dependent. Prior to Vista, if you could run one version of Windows, the next version would run just about as well.
The Indexer and Glass were memory hungry. If you gave it a decent amount of ram, it could look like a dream while it did. If you turned off Aero on an under-specced machine, it could also run pretty well, but if you turned off Aero, you didn't have much of a reason not to just run 98se.
The other shoe was drivers. No one was ready for WDDM, and a LOT of the small to mid-sized hardware vendors emergency-released slow, buggy, memory-hungry drivers that just made Vista feel horrible.
I had some off-the-shelf Compaqs that ran beautifully; my dual P3/SCSI workstation with tons of RAM ran like hot garbage.
Websites are probably a better example, as the complexity and bloat have increased faster than the tech.
oh, yes, somebody made this a long time ago, in response to the performance of new webpages https://motherfuckingwebsite.com/
I love it
Well yeah, why would I learn html when I can learn React?!?
(/s but I actually did learn React before I had a grasp of semantic Html because my company needed React devs and only paid for React-specific education)
For anyone unsure: the Jevons paradox is that when there's more of a resource to consume, humans will consume more of the resource rather than use the gains to use the resource better.
Case in point: AI models could be written to be more efficient in token use (see DeepSeek), but instead AI companies just buy up all the GPUs and shove more compute in.
For the expansive bloat - same goes for phones. Our phones are orders of magnitude better than what they were 10 years ago, and now they're loaded with bloat because the manufacturer thinks "Well, there's more compute and memory. Let's shove more bloat in there!"
the Jevons paradox is that when there's more of a resource to consume, humans will consume more of the resource rather than use the gains to use the resource better
More specifically, it's when an improvement in efficiency causes the underlying resource to be used more, because the efficiency reduces cost, and then using that resource becomes even more economically attractive.
So when factories got more efficient at using coal in the 19th century, England saw a huge increase in coal demand, despite using less coal for any given task.
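The coal example can be sketched as a toy constant-elasticity demand model; all the numbers here are invented for illustration, and real demand curves are of course messier:

```python
# Toy Jevons-paradox model: efficiency improves, yet total coal use rises,
# because cheaper effective output stimulates more-than-proportional demand.
# All numbers are made up for illustration.

def total_resource_use(efficiency, base_demand, elasticity):
    # Cost per unit of output falls as efficiency rises...
    cost = 1.0 / efficiency
    # ...and demand responds with constant elasticity to that cost.
    demand = base_demand * cost ** (-elasticity)
    # Resource consumed = output demanded / output-per-unit-of-resource.
    return demand / efficiency

before = total_resource_use(efficiency=1.0, base_demand=100, elasticity=1.5)
after = total_resource_use(efficiency=2.0, base_demand=100, elasticity=1.5)
print(before, after)  # efficiency doubled, yet total resource use went up
```

With price elasticity above 1, the demand response outweighs the efficiency gain, which is exactly the paradox; below 1, efficiency gains really do reduce total use.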
Also Eli Whitney inventing the cotton gin to make separating cotton fiber from seed less of a tedious and backbreaking process, which led to a massive expansion of slave plantations in the American South due to the increased output and profitability of the crop.
I always felt American car companies were a really good example of that back in the 60s-70s when enormously long vehicles with giant engines were the order of the day. Why not bigger? Why not stronger? It also acted as a symbol of American strength, which was being measured by raw power just like today lol.
This also reminds me of the way video game programmers in the late 70s/early 80s had such tight limitations to work within that you had to get creative if you wanted to make something stand out. Some very interesting stories from that era.
I also love to think about the tricks the programmer of Prince of Persia had employed to get the "shadow prince" to work...
https://www.youtube.com/watch?v=sw0VfmXKq54
Case in point: AI models could be written to be more efficient in token use
They are being written to be more efficient in inference, but the gains are being offset by trying to wring more capabilities out of the models by ballooning token use.
Which is indeed a form of the Jevons paradox
Costs have been dropping by a factor of 3 per year, but token use increased 40x over the same period. So while the efficiency is contributing a bit to the use, the use is exploding even faster.
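Taking those two figures at face value, a back-of-envelope check shows why total spend still explodes:

```python
# Back-of-envelope check of the claim above: if per-token cost drops 3x
# while token use grows 40x over the same period, total spend still rises.
cost_factor = 1 / 3      # cost per token after one period (3x cheaper)
use_factor = 40          # tokens consumed after one period (40x more)
spend_growth = use_factor * cost_factor
print(round(spend_growth, 1))  # total spend grows ~13.3x despite cheaper tokens
```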
I think we're meaning the same thing.
Yes, but have you considered if I just rephrase what you just said but from a slightly different perspective?
I'm pretty sure the "unused RAM is wasted RAM" thing has caused its share of damage from shit developers who took it to mean use memory with reckless abandon.
With 32 and 64 GB systems I've never run out of RAM, so the RAM isn't the issue at all.
Optimization just sucks.
Would be nice if I could force programs to use more RAM though. I actually have 100GB of DDR4 in my desktop. I bought it over a year ago when DDR4 was unloved and cheap. But I have tried to force programs not to offload as much. Like Firefox: I hate that I have the RAM but it's still unloading webpages in the background and won't use more than 6GB ever.
I actually have 100GB of DDR4
They've got RAM! Get'em!
Programs that care about memory optimization will typically adapt to your setup, up to a point. More RAM isn't going to make a program run any better if it has no use for it.
Will disabling the swap file fix that?
If not, just mount your swap file in RAM lmao
Don't fully disable swap on Windows, it can break things :-/
I didn't know that, that used to not be the case.
Maybe it has changed again, but in the past I gave it a try. When 16 GB was a lot. Then when 32 GB was a lot. I always thought "Not filling up the RAM anyway, might as well disable it!"
Yeah, no, Windows is not a fan. Like you get random "running out of memory" errors, even though with 16 GB I still had 3-4 GB free RAM available.
Some apps require the page file, same as crash dumps. So I just set it to a fixed value (like 32 GB min + max) on my 64 GB machine.
Set swappiness to 5 or something similar, or disable swap altogether unless you're regularly getting close to max usage
In most cases, you either optimize the memory, or you optimize the speed of execution.
Having more memory means we can optimize the speed of execution.
Now, the side effect is that we can also afford to be slower to gain other benefits: ease of development (enter JavaScript everywhere, or Python) at the cost of speed, maintainability at the cost of speed, etc...
So, even though you don't always see performance gains as the years go by, that doesn't mean shit devs; it means the priority is somewhere else. We have more complex software today than 20 years ago because we can afford not to focus on RAM and speed optimization, and instead focus on maintainable, unoptimized code that does complex stuff.
Optimization is not everything.
unoptimized code that does complex stuff.
You can still have complex code that is optimized for performance. You can spend more resources to do more complex computations and still be optimized so long as you're not wasting processing power on pointless stuff.
For example, in some of my code I have to get a physics model within 0.001°. I don't use that step size every loop, because that'd be stupid and wasteful. I start iterating with 1° until it overshoots the target, back off, reduce the step to 1/10, and loop through that logic until I get my result with the desired accuracy.
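That coarse-to-fine loop might look roughly like this; the model function here is a made-up monotone stand-in for the actual physics:

```python
# Sketch of the coarse-to-fine search described above: march with a coarse
# step until overshooting the target, back off, divide the step by 10, and
# repeat until the step reaches the desired accuracy (0.001 degrees).

def model(angle_deg):
    return angle_deg ** 2  # hypothetical monotone physics response

def solve(target, step=1.0, tol=0.001):
    """Approximate the angle where model(angle) reaches target, within tol."""
    angle = 0.0
    while step >= tol:
        while model(angle) < target:   # march forward with the current step
            angle += step
        angle -= step                  # back off one step after overshooting
        step /= 10                     # refine the step size and repeat
    return angle

print(round(solve(target=2.0), 3))  # close to sqrt(2) ≈ 1.414
```

Each refinement pass costs only a handful of model evaluations, versus the thousands you'd burn sweeping the whole range at 0.001° from the start.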
Yeah, my Xperia 1 (Android 3.1) still runs fluidly with its 100 MB of RAM and storage. While my Leaf2 e-reader lags on an up-to-date LineageOS, despite having 10x more CPU and 20x more RAM.
I still remember playing StarCraft 2 shortly after release on a $300 laptop, and it ran perfectly well on medium settings.
Looked amazing. Felt incredibly responsive. Polished. Optimized.
Nowadays it's RTX this, framegen that, need an SSD or loading times are abysmal, oh and don't forget that you need 40GB of storage and 32GB of RAM for a 3-hour-long walking simulator. How about you optimize your goddamn game instead? Don't even get me started on the price tags for these things.
Software and game development is definitely a spectrum though, but holy shit, the ratio of sloppy releases is so disproportionate that it's hard to see it at times.
Absolutely. Every time I play a game from before 2016 or so it runs butter smooth and looks even better than modern games in many cases. I don't know what we're doing nowadays.
StarCraft 2 was released in 2010, and a quick search indicates the most common screen resolution was 1024x768 that year. That feels about right, anyway. A bit under a million pixels to render.
A modern 4K monitor has a bit over eight million pixels, slightly more than ten times as much. So you'd expect the textures and models to be about ten times the size. But modern games don't just have 'colour textures', they're likely to have specular, normal and parallax ones too, so that's another three times. The voice acting isn't likely to be in a single language any more either, so there'll be several copies of all the sound files.
A clean Starcraft 2 install is a bit over 20 GB. 'Biggest' game I have is Baldur's Gate 3, which is about 140 GB, so really just about seven times as big. That's quite good, considering how much game that is!
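Spelling out the pixel arithmetic above:

```python
# Checking the pixel counts quoted above (resolutions as commonly defined):
pixels_then = 1024 * 768      # 786,432 -- "a bit under a million"
pixels_4k = 3840 * 2160       # 8,294,400 -- "a bit over eight million"
print(pixels_4k / pixels_then)  # roughly 10.5x as many pixels to render
```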
I do agree with you. I can't think of a single useful feature that's been added to eg. MS Office since Office 97, say, and that version is so tiny and fast compared to the modern abomination. (In fact, in a lot of ways it's worse - has had some functionality removed and not replaced.) And modern AAA games do focus too much on shiny and not enough on gameplay, but the fact that they take a lot more resources is more to do with our computers being expected to do a lot more.
Why are you comparing the most common screen resolution in 2007 to a 4k monitor today? 4k isn't the most common today. This isn't a fair comparison.
1080p is still the most common, though 1440p is catching up very fast
BTW the demand for bigger screens and bigger resolutions is something I don't easily understand. I notice some difference between 1366x768 and 1920x1080 on a desktop, but the difference from further increase is of so little use for me I'd classify it as a form of bloat. If anything, I now habitually switch to downloading 480p and 720p instead of higher definition by default because it saves me traffic and battery power, and fits much more on a single disk easy to back up.
Pixel density is more important than resolution. Higher resolution is only useful outside of design work if the screen size matches
IMO the ideal resolutions for computer monitors are 24" @ 1080p, 27" @ 2K, and 32"+ @ 4K+. For TVs it's heavily dependent on viewer distance. I can't tell the difference between 2K and 4K on my 55" TV from the couch.
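Those pairings roughly equalize pixel density, which you can check with the standard PPI formula (diagonal pixels over diagonal inches), assuming 16:9 panels:

```python
# Pixel density (PPI) for the monitor size/resolution pairings mentioned above.
import math

def ppi(width_px, height_px, diagonal_in):
    # Diagonal length in pixels divided by diagonal length in inches.
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 24)))   # ~92 PPI for 24" @ 1080p
print(round(ppi(2560, 1440, 27)))   # ~109 PPI for 27" @ 1440p
print(round(ppi(3840, 2160, 32)))   # ~138 PPI for 32" @ 4K
```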
the main thing I noticed with a 768p monitor was gnome being unusable thanks to their poor ui density
Excel is sooo much better than it used to be in Office 97. And it's way better than any other spreadsheet software I've tried.
Speaking of, anyone know of any alternative that handles named tables the same as Excel? Built-in filtering/sorting and formulas that can address the table itself instead of a cell range?? Please?
SQL?
Seriously. If you are talking about querying tables, Excel is the wrong tool to use. You need to be looking at SQL.
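For what the parent comment is asking, even the SQLite that ships with Python gives you named tables, built-in filtering/sorting, and formulas that address the whole table rather than a cell range. The table and column names below are made up for illustration:

```python
# Sketch: SQL named tables vs. spreadsheet cell ranges, using Python's
# built-in sqlite3 module. Table and columns are invented examples.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?)",
                [("north", 120.0), ("south", 80.0), ("north", 50.0)])

# Filter + sort, addressing the table by name rather than a cell range:
rows = con.execute(
    "SELECT region, amount FROM sales WHERE amount > 60 ORDER BY amount DESC"
).fetchall()
print(rows)  # [('north', 120.0), ('south', 80.0)]

# A "formula over the whole table":
total, = con.execute("SELECT SUM(amount) FROM sales").fetchone()
print(total)  # 250.0
```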
I've been hosting grist for a while and it is quite nice. Wasn't able to move all the stuff from classic spreadsheets though
I'll check that out, thanks!
'Biggest' game I have is Baldur's Gate 3, which is about 140 GB, so really just about seven times as big. That's quite good, considering how much game that is!
Not at all. For example, Rimworld saves all the map and world data in one big XML file (which is bad btw, don't do that): about 2 million lines at 75 MB, for a 30-pawn mid-game colony.
So you see, data is not what uses space. What uses space is not properly re-using objects/textures (so-called "assets"), or even copying and repacking the same assets per level/map, because that saves dev time.
Ark: Survival Evolved, with "only" about a 100 GB requirement, was known as an unoptimized mess back then.
The Witcher 3 mod "HD Reworked Next-Gen" is barely 20 GB with 4K textures and high-res meshes. And you can't say that The Witcher 3 is not a vibrant and big open-world game.
Then the Factorio dev blog comes in and spends months optimizing the tick of one broken gear on the conveyor belt to slightly improve efficiency.
Tbf, there are saves where that efficiency increase means a lot
Comparing a 20-year-old game with FMV sequences at 1080p is certainly a take.
Yup. Worst part? We've produced people who make this justified: users and idiotic managers (it's really hard for me to estimate which of them is the bigger source of shit). And we keep producing them.
On Linux it really is noticeable
Well, until you open a browser... or five, because these days nobody wants to build native applications anymore and instead they shove webapps into electron containers.
Right now, my laptop doesn't have to run much. Just a combination of KDE, browser, emails, music player, a couple of messengers and some background services. In total, that uses about 9.5 GB of RAM. 20 years ago we would have run the same workload with less than 1 GB.
Yeah, Discord is eating 1.5GB of RAM. Actually crazy
When you become one with the penguin, though ... then you can begin to feel how much faster modern hardware is.
Hell, I've got a 2016 budget-model chromebook that still feels quick and snappy that way.
Sadly, it is not how it is for me. I've never (in last 20 years) experienced freezes that bad and that frequent as with my new beefy Linux PC.
I've got a 2007 laptop that was shitty even for its time, and it does the job perfectly as a home server with Debian and a few good open source services I want to host
But... 2016 was a decade ago. If it feels quick and snappy that way that means the post is right.
Which it kinda isn't but hey.
the point is the software is what's wrong, not the hardware. it feels snappy because it's linux, not because it's old hardware.
Except the Linux userbase has been saying that exact thing for the past ten years. So again, has Linux also degraded in sync, or, hear me out here, is this mostly a nostalgia thing that makes you forget the kludgy performance issues of the software you used when you were younger, while things have mostly gotten snappier over time across the board?
As a current dual booter I'll say that Windows and Linux don't feel fundamentally different these days, for good and ill. Windows has a remarkably crappy and entirely self-inflicted issue with their online-search-in-Start-menu feature, which sucks but is toggleable at least. Otherwise I have KDE and Win11 set up the same way and they both work pretty much the same. And both measurably better than their respective iterations 10, let alone 15 or 20 years ago.
Windows and Linux don't feel fundamentally different these days
Try Windows 11 vs. Linux on a shitty old laptop with a budget 2-core processor and 2GB of RAM. Then tell me Windows and Linux don't feel any different.
My bf bought me a brand new laptop with Win 10 preinstalled, and even after disabling or uninstalling as much as I could, it was literally like watching a slideshow. Then I installed Linux, and it...worked like you'd expect a brand new computer to work, fast and smooth. Never used Win 11 because I stopped using Windows after that.
Myyyyyeeeeh. A lightweight distro or a contemporaneous distro, sure.
If I'm running GPU-accelerated Steam, tons of tabs in Firefox and the same highly customized KDE desktop full of translucent components and extra animations, I am willing to bet they'd both chug.
Which is what the conversation is about: new software doesn't suck, it's doing more stuff.
For sure, all things being equal Linux does run lighter on RAM and VRAM, so if you're using something that is specifically memory-limited, such that Windows and Linux fall on opposite sides of overflowing the available memory, you'll definitely see better performance on Linux. But that's not an inherent issue with poorly made software having a huge performance overhead.
A decade? Try 25 years
why do anime girls have to be right all the time?
They are all actually the start of an incredibly complex dojinshi.
I sure hope the tags are wholesom... OH NO.
To prevent fictionalist comments in replies.
What Intel giveth, Microsoft taketh away.
I feel like this is Windows specific. Linux is rapid on PCs and my MacBook is absurdly quick.
PC games are software.
Unfortunately many PC games are also like this: astoundingly poorly optimized, they just assume everyone has a $750 GPU.
Proton can only do so much.
... and Metal basically can't do that much.
Look at Metal Gear Solid 5 or Titanfall 2, and tell me realtime video game graphics have dramatically increased in visual fidelity in the last decade.
They haven't really.
They shifted to a poorly optimized, more expensive paradigm for literally everyone involved; publisher, developer, player.
Everything relating to realtime raytracing and temporal antialiasing is essentially a scam, in the vast majority of actual implementations of it.
I guess the counterargument for games is load times have dramatically improved, though that's less about software development than hardware improvements.
If we put consoles in the same bracket as computers, the literally instant quick-resume feature on an Xbox (for example) feels like sci-fi.
Yeah, you kinda defeated your own argument there, but you do seem to recognize that.
You can instant resume on a Steam Deck, basically.
You can alt-tab on a PC, at least with a stable game that's well made and isn't leaking memory.
Yeah, better RAM/SSDs do mean lower loading times and higher streaming speeds/bus bandwidths, but at what cost?
You could just actually take the time to optimize things and find cleverer, less computationally expensive ways to do them, instead of just throwing more/faster RAM at the problem.
RAM and SSD costs per gig are going up now.
Moore's Law is not only dead, it has inverted.
Constantly cheaper memory going forward turned out not to be the best assumption to make.
With respect to OP's post, they say "you can't even tell the computers we are on are 15x faster...", and I reckon that quick resume etc. is an example of "you absolutely can tell that we now have extremely fast hardware" when compared to what came before, irrespective of the quality of the software.
I'm not disagreeing with you, I'm just picking apart the blanket "computers feel the same as they did a decade ago". Some computers might feel the same, and a lot of software might be unoptimised, but there's a good selection of examples where that's not the case.
App launch time can be annoyingly slow on a Mac if you're not offline or blocking the server it phones home to.
it can be the difference between one bounce or seven bounces of the icon on my end
What apps, out of interest? I'm a new Mac owner, so limited experience, but everything seems insanely quick so far. Even something like Xcode is a one-bounce on this M4 Air.
My 12? 13? Year old Dell laptop does just fine running Ubuntu. It'll probably be fine for my needs for another 3 or 4 years at least.
Mint Xfce on my 2015 laptop, compared to its previous system, was the difference between usable and waiting 10 minutes for it to even boot; things like gaming, VMs, and comically large spreadsheets (surprisingly the memory hog) were an eternal challenge on it. On my current laptop, I have the luxury of picking systems by aesthetics and features rather than by performance. And to compare, I've run even the same updates on the two laptops, as the older one still works.
Anyone opening the app menu (from the dock or Home Screen) on an iPad will tell you that it's not exclusive to Windows PCs.
And Android
phones.
Using a smartphone from 5 years ago with modern updates feels worse than the first iPhone.
whomst is jeavon
https://en.wikipedia.org/wiki/Jevons_paradox
Whenever efficiency increases consumption increases. Better steam engines mean more coal consumption. Faster cheaper RAM means more RAM consumption.
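The mechanism can be sketched with arithmetic. This is a toy model with made-up numbers (the `demand_elasticity` value is purely illustrative, not from any real dataset): an efficiency gain lowers the cost per unit of useful work, and if demand for that work is elastic enough, total resource consumption goes up rather than down.

```python
# Toy illustration of the Jevons paradox. All numbers are hypothetical.

def total_resource_use(efficiency, demand_elasticity, base_demand=100.0):
    """Resource consumed = (useful work demanded) / efficiency.

    Demand for useful work is modeled as base_demand * efficiency**elasticity:
    cheaper work (higher efficiency) induces more demand for it.
    """
    work_demanded = base_demand * efficiency ** demand_elasticity
    return work_demanded / efficiency

before = total_resource_use(efficiency=1.0, demand_elasticity=1.5)
after = total_resource_use(efficiency=2.0, demand_elasticity=1.5)  # engines got 2x better

print(f"resource use before: {before:.0f}")  # 100
print(f"resource use after:  {after:.0f}")   # ~141 -- more coal, not less
```

With elasticity below 1 the same model shows consumption falling, which is why the paradox only bites for resources people actually want more of when they get cheaper — like RAM.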
That's... not really true, and not what that link shows. Those latency tests still show then-modern devices topping the list. They're arguing that some then-modern low-end devices have more button-to-screen latency than older hardware (which they would, given he's comparing single-threaded, single-tasking bare-metal machines from the 80s spitting signals straight to a CRT against laptops with integrated graphics). And they're saying that at the time (I presume the post dates from 2017, when the testing ends), this wasn't well understood because people were benching the hardware and not the end-to-end latency factoring in the I/O... which was kinda true at the time but absolutely not anymore.
I'd get in the weeds about how much or little sense it makes to compare an Apple II drawing text on a CRT to typing in a PowerShell/Linux terminal window inside a desktop environment, but that'd be kind of unfair. Ten years ago this wasn't a terrible observation to make with the limited tools the guy had available, and this sort of post made it popular to think about latency and made manufacturers of controllers, monitors and GPUs focus on it more.
What it does not show, though, is that an Apple II was faster than a modern gaming PC by any metric. Not in 2017, and sure as hell not in 2026, when 240Hz monitors are popular, 120Hz TVs are industry-standard, VRR is widely supported and keyboard, controller, monitor and GPU manufacturers are obsessed with latency measurements. It's not just fallacious, it's wrong.
I really dislike the framing of this.
Yes, the average software runs much less efficiently. But is efficiency what the user wants? No. It is not.
How many people will tell you that they stick to windows instead of switching to linux because linux is all terminal? And terminal is quicker, more efficient for most things. But the user wants a gui.
And if we compare modern GUIs to old GUIs... I don't think modern is 15x worse.
But the user wants a gui.
Firstly, plenty of Linux instances have GUI. I installed Mint precisely because I wanted to keep the Windows/Mac desktop experience I was familiar with. GUIs add latency, sure. But we've had smooth GUI experiences since Apple's 1980s OS. This isn't the primary load on the system.
Secondly, as the Windows OS tries to do more and more online interfacing, the bottleneck that used to be CPU or open memory or even graphics is increasingly internet latency. Even just going to the start menu means making calls out online. Querying your local file system has built-in calls to OneDrive. Your system usage is being constantly polled and tracked and monitored as part of the Microsoft initiative to feed their AI platforms. And because all of these off-platform calls create external vulnerabilities, the (abhorrently designed) antivirus and firewall systems are constantly getting invoked to protect you from the online traffic you didn't ask for.
It's a black hole of bloatware.
I am not saying linux is terminal. I am saying that people tell you that linux is all terminal and that they want a gui.
Linux gui is much prettier than Windows anyway.
TVs became SmartTVs and now need the internet to turn on. The TVs need an OS now to internet to do TV.
Antenna broadcast TV seems like ancient magic.
We've deprecated a lot of the old TV/radio signal bandwidth in order to convert it to cellphone signal service.
But, on the flip side, digital broadcast signals can carry a lot more information than the old analog ones. So now I've got a TV with a mini-antenna that gets 500 channels (virtually none of which I watch). My toddler son has figured out how to flip the channel to the continuous broadcast of Baby Einstein videos. And he periodically hijacks the TV for that purpose, when we leave the remote where he can reach.
So there's at least one person I can name who likes the current state of affairs.
I always have to remind myself being able to stream audio from a cellphone while driving across a city is also a pretty crazy development.
There isn't anything fundamentally slower about using a GUI vs just text in a console. There's more to draw but it scales linearly. The drawing things on the screen part isn't the slow bit for slow programs. Well, it can be if it's coded inefficiently, but there are plenty of programs with GUIs that are snappy... Like games, which generally draw even more complex things than your average GUI app.
Slow apps are more likely because of an inefficient framework (like running in a web browser with heavy reliance on scripts rather than native code), inefficient algorithms that scale poorly, poor resource use, bad organization that results in doing the same operation more times than necessary, etc.
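That "inefficient algorithms that scale poorly" point is easy to demonstrate. Here is a minimal sketch (the data and sizes are made up for illustration): deduplicating 20k items with a list membership check is quadratic, while the same logic with a set is linear — identical output, wildly different cost.

```python
# Sketch of a common culprit: a quadratic algorithm where a linear one exists.
import time

items = [i % 5000 for i in range(20_000)]  # hypothetical data with duplicates

def dedupe_slow(xs):
    seen = []                     # list membership test is O(n) each time,
    for x in xs:                  # so the whole function ends up O(n^2)
        if x not in seen:
            seen.append(x)
    return seen

def dedupe_fast(xs):
    seen = set()                  # set membership test is O(1) on average
    out = []
    for x in xs:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

for fn in (dedupe_slow, dedupe_fast):
    t0 = time.perf_counter()
    result = fn(items)
    print(f"{fn.__name__}: {len(result)} items, {time.perf_counter() - t0:.3f}s")
```

Neither version has anything to do with how the result gets drawn on screen, which is the point: the rendering layer usually isn't where the time goes.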
The terminal is quicker. Not because the image is drawn more quickly, but because it is more efficient to do anything.
Technically true, but there's a threshold on responsiveness. If both user interfaces respond in milliseconds, it doesn't matter if one is more efficient.
Can you elaborate on that? I disagree but would like to understand why you think that. Maybe you're referring to something I wouldn't disagree with.
Dunno this paradox theory, but the impression I get is that when you're part of the process, it's harder to notice changes. But if putting the two devices side by side, trying to run the same systems, programs, etc., the difference is glaring. And from tests I did, if the software doesn't work on either of the devices, slapping a VM on the newer one to test older programs still tells quite a lot.
Uh, that's a bit off to be fair.
Our computers are 15x faster than they were about 15-20 years ago, sure...
But 1, the speed is noticeable and 2, not all the new performance is utilised 100%.
Sure, operating systems have started utilising the extra hardware to deliver the same 60-120fps base experience with some extra fancy effects (transparency with blur, for example, instead of plain transparency), but overall computer UX plateaued a good decade and a half ago.
The bottleneck is not the hardware or bad software but simply the idea of "why go faster when going this speed is fine, and we now don't need to use 15-30% of the hardware to do it but just 1-2%".
Oh and the other bottleneck is stupidity, like letting long running tasks onto the main/UI thread, fucking up the experience. Unfortunately, you can't help stupid.
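The "long tasks on the UI thread" failure mode is worth sketching concretely. This is a minimal toy with no real GUI toolkit (the names and timings are illustrative): the "UI loop" here just has to keep ticking while a slow task runs, and the fix is handing the work to a background thread instead of calling it inline.

```python
# Minimal sketch of "keep long work off the UI thread".
import threading
import time

results = []

def long_running_task():
    time.sleep(0.5)               # stand-in for disk/network/compute work
    results.append("done")

# Wrong: calling long_running_task() directly inside the UI loop would
# freeze the interface for the full half second.

# Right: hand it to a worker thread and let the "UI" keep repainting.
worker = threading.Thread(target=long_running_task)
worker.start()

ticks = 0
while worker.is_alive():
    ticks += 1                    # the UI loop stays responsive: it keeps ticking
    time.sleep(0.05)

worker.join()
print(f"UI ticked {ticks} times while the task ran; result: {results[0]}")
```

Real toolkits add a wrinkle — the worker usually can't touch widgets directly and has to post its result back to the main thread — but the blocking problem itself is exactly this simple.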
No, the bottleneck is fucking electron apps.
The problem is not so much badly written programs and websites in terms of algorithms, but rather latency. The latency of loading things from storage, sometimes through the internet is the real bottleneck and why things feel so slow.
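A toy sketch of that latency point (the 50 ms "fetch" is a stand-in for a disk or network round trip, and the numbers are purely illustrative): the computation is trivial either way, but re-fetching on every call pays the round-trip latency every time, while a cache pays it once.

```python
# Why latency, not computation, dominates: repeated fetches vs. a cache.
import time
from functools import lru_cache

def fetch_from_storage(key):
    time.sleep(0.05)              # simulated I/O latency per request
    return f"value-for-{key}"

@lru_cache(maxsize=None)
def fetch_cached(key):
    return fetch_from_storage(key)

t0 = time.perf_counter()
for _ in range(10):
    fetch_from_storage("settings")          # 10 round trips
uncached = time.perf_counter() - t0

t0 = time.perf_counter()
for _ in range(10):
    fetch_cached("settings")                # 1 round trip, 9 memory hits
cached = time.perf_counter() - t0

print(f"uncached: {uncached:.2f}s, cached: {cached:.2f}s")
```

This is also why software that re-queries the cloud for data "that could already be in memory" feels slow on any hardware: no CPU upgrade removes a network round trip.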
Even with modern SSDs, things sometimes feel slower than they were with HDDs running era-appropriate software. Windows 7 was snappy even on an HDD; Windows 11 is slow and sluggish everywhere.
That's some nonsense, though.
For one thing, it's one of those tropes that people have been saying for 30 years, so it kinda stops making sense after a while. For another, the reason it doesn't make sense is it doesn't account for modern computers doing more now than they did then.
In 2016 I had a 970 that's still in an old computer I use as a retro rocket, and I can promise you that wonderful as that thing was, I couldn't have been playing Resident Evil this week on that thing. So yeah, I notice.
And I had a Galaxy S7 then, which is still in use as a bit of a display, and I assure you my current phone is VERY noticeably faster, even discounting the fact that it's displaying 120fps rather than 60.
Old people have been going "things were better when I was a kid" for millennia. I'm not assuming we're gonna stop now, but... maybe we should.
it doesn't account for modern computers doing more now than they did then.
We know they do more, but most of that "more" is bullshit bloat and sloppy engineering, neither of which we asked for.
When I boot up Windows 11 and start no programs, somehow there's already over 8 GB of RAM consumed, the CPU shows 8% utilisation, there's 244 processes running, and it still took every bit as long to get there as my Windows 7 machine from 2011. That's what the rest of us are talking about.
When was the last time you booted a 2011 machine? Because man, is that not true.
And that's a 2016-2017 era PC.
Windows 7 didn't even have fast boot support at all. I actively remember recommending people to let their PCs sit for a couple of minutes after booting so that Windows could finish whatever the hell it was trying to do in the background faster instead of clogging up whatever else you were trying to do.
Keeping my old hardware around compulsively really impacts my perception of this whole "things were better when I was a teenager" stuff.
I very specifically said "my Windows machine from 2011" because it's still in use.
Then you're either lying about it or haven't booted a newer PC. Fast Boot was a back of the box feature for Windows 8 for a reason. It was becoming a huge meme at the time how slow Win7 was to boot.
If your 2020s PC with Windows 11 is taking 45 seconds to boot on the Windows logo like Win 7 does (as seen in the benchmarks above) then you need some tech support because something is clearly not working as expected. I don't think even my weaker Win11 machines take longer than 10 secs from boot starting to the password screen.
That may be true anyway, because the tiny hybrid laptop I'm using to write this is reporting 2-5% CPU utilization even with a literal hundred tabs open in this browser. So... yeah, either you have a knack for hyperbole or something broken.
Neither of them are fresh out the box clean installs, they are what they are.
But anyway, I've been trying to engage in good faith but you keep being obnoxious on every single post you make, so it's Block time.
Hah. You do you. I get how it'd be obnoxious to be called out, but man, it's not my fault that you chose the worst possible example for this. Like, literally the worst iteration of Windows for the specific metric you called out, in a clearly demonstrable way that a ton of people measured because it was such a meme.
You can block me, but "they are what they are" indeed.
Incidentally, this is a classic opportunity to remind people that blocking on AP applications sucks ass and the only effect it has is for the blocker to stop being able to see what the blockee is saying about them while everybody else still gets access to both. Speaking of software degradation, somebody should look into that.
It tracks that the guy I saw raging about linux more than once is defending bloat lol