> an annoyance toward these that use a jack-of-trade fat library to produce a slow app/ux
The thing is: software written that way isn't slow if you have a reasonable computer from the last ~10 years or a high-end computer from the last 15 years. My daily computers are 10 and 11 years old respectively, and my dad dailies a 12-year-old machine. Nothing is slow on these computers.
Now, would that software be slow on a 20-year-old computer? Absolutely. But the rift between what was common even just from 2009 to 2011 is massive. In that span, quad-cores took over on desktops and in enthusiast/gamer/workstation-level laptops, 16 GB memory ceilings became common, and 6 Gbit/s SATA became the norm. That's on top of really huge per-core boosts and graphics hardware becoming far more capable.
For "basic" usage, desktops up to about a decade old are fine.
More important than emoji: modern computers handle all global text encodings with few or no hiccups or exceptions. That's one of the massive improvements of the last 20 years compared to the 20 years before that.
It's less about displaying emoji correctly and more about displaying 量産型ティーン ("mass-produced teen") correctly. Emoji are just a benefit of that, by way of being part of Unicode.
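To make that concrete, here's a minimal Python sketch (the mixed-script string is an arbitrary example of my own, not anything from upthread): one string type covers Japanese, emoji, and ASCII alike, and UTF-8 round-trips all of it.

```python
# Modern runtimes treat all of Unicode uniformly: Japanese, emoji,
# and ASCII live in the same string type with the same operations.
text = "量産型ティーン 🙂 hello"        # arbitrary mixed-script example
data = text.encode("utf-8")            # what goes on disk / over the wire
assert data.decode("utf-8") == text    # lossless round trip
print(len(text), "code points ->", len(data), "bytes")
```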
> unless naturally you are one of these small number of competent commercial programmers hired for extremely small or timecounting-sensitive computers
Bad take. Working in a high-level language doesn't imply competency. As has been pointed out several times, the time you save working in a higher-level language is often traded for other kinds of work, e.g. also knowing a database system or also knowing specific business rules.
Embedded work has its own share of problems, and there's plenty of incompetence, or situations where fixing a problem would have cost too much, or where a problem wasn't worth fixing because the task got done without it. (I'm struggling to recall the source, but there's a story about a team fighting a memory leak in an embedded computer until they realized: the leak takes 17 hours to crash the system, and the whole computer will be blown to smithereens 12 hours in, because this is a space launch rocket. So they shipped it as-is and worked on something more important, because a memory leak that only becomes apparent at 17 hours, 5 hours after the computer no longer exists, isn't that big of a deal.)
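Back-of-the-envelope, with the two durations from the story and everything else assumed, that call looks like:

```python
# Hypothetical arithmetic: only the 17h and 12h figures come from the story.
crash_horizon_h = 17  # leak exhausts memory after this many hours
mission_h = 12        # computer is destroyed at this point

# Assuming a roughly constant leak rate, fraction of memory headroom
# consumed by the time the mission (and the computer) ends:
print(f"{mission_h / crash_horizon_h:.0%}")  # ~71%, never reaches 100%
```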
If you're thinking about Electron: yes, it's typically a memory hog and has at times produced massively bad software (Slack using several gigs of RAM back in 2014-15 or so is one of the best examples), but it varies enough that I'm no longer convinced "being Electron" in and of itself is the cause.
That said, there are plenty of Electron apps I don't install if I can get away with it, primarily because each one is one more thing to keep maintained. Discord is one of these for me, and I only have desktop Spotify installed on one machine.
> Profit demands that more product is sold to more people, more frequently, so whereas in the 'early days', the speed of technological advance governed the rate of growth of systems architectures and OS features, in the 'modern era', it's all about features, because features sell product - and that means the complexity of software and operating systems, much more than hardware.
This plays out differently in hardware and software.
On the hardware side, the opposite is basically true. PC hardware has been a mostly declining market for a few years as people at home move to phones, tablets, and Chromebooks. That's true of K-12 as well, and those devices are making inroads elsewhere, though less uniformly; for, e.g., information workers, it's more of an "it depends."
Modern desktop OSes and software run on incredibly old computers compared to any point in the past. I know I've said it several times, but I'm running all-current software on my 10-year-old systems and it's basically fine. As an enthusiast, I'd probably notice if I upgraded from my i5-2400 to an i5-13xxx and the attendant other modernizations; I'd expect the whole system to be 2-3x faster on average. But even my oldest computers spend most of their time waiting for me, not the other way around.
On the software side, yeah, the pressure to constantly introduce new features has been bad overall. There are a couple of different things working together to create that environment. Our overarching economic system rewards flashy things you can sell repeatedly, and selling on new functionality rather than sustainability, instead of, say, encouraging maintaining existing hardware and keeping software running longer.
Subscriptions are a sore point, but honestly, if I had my way, Microsoft would be taking my money monthly or yearly to maintain Windows and Office on existing and new hardware, and balancing "efficiency" against "new features" a bit differently.
> I'm also reminded that for all the supposition that our computers can do so much more today than they could 'back then',
I think and talk and write about this every once in a while. I think I even mentioned things like "file sync" and "live co-authoring" in a previous post in this very thread.
But also: There's a good reason why nobody does that. It was better than doing it by hand but it was miserable compared to doing it on a bigger screen and on a faster computer.
Yes. Computers literally gained more capabilities along the way. Some of those capabilities mean newer and better workflows are available. Sure, you can publish an entire metro newspaper on a 9-inch Mac, but I bet every paper that could upgrade to a Mac II with a two-page display (2PD) did so the instant they could. They probably added networking as soon as they could, etc. Along the way you get the ability to integrate color graphics directly into layout files, and although I'm not, you know, a pro typesetter, I'm 100% sure there were other things that made upgrading meaningfully important.
But to my point about hardware: if you install Adobe CC on a Windows 10 computer... is there anything you "can" do on a computer built in 2023 that you "can't" on one from 2013? The same is likely not true if you tip-top a Windows 10-capable machine from 2005, unfortunately. (2005 just because that's the oldest
That largely isn't the case between, say, 2013 and 2003, or 2003 and 1993, or 1993 and 1983. That's what I mean by The Plateau, and it's one of the reasons I'm pushing back against this idea that software has gone too far.
One of the biggest stories in computing is things that were "possible" at one point becoming "reasonable" and "affordable" at another. This dovetails with the "specialist" comment you made. In 1993, high-quality video capture on a Mac was "possible if you spend twenty thousand dollars on a tip-top machine, a storage array, and video compression hardware"; by 1999 it was "reasonable on the $1,299 computer." (To the credit of all involved: you really needed DV compression hardware, which by that point was built into cameras, but still.)
"Not everyone wants to edit DV" - yeah but that computer is also significantly better at literally everything else than what came before, and people
did want to do MP3 and the Internet.
Much of "capability" really does depend on "reasonableness" and for a lot of this stuff, if a computer "can" do a thing but not in real-time or not at the same time as other things, it's not "reasonably" capable. And, reasonableness is in the eye of the beholder. Calling back to your newspaper example: A Mac Plus was almost certainly more reasonable than either guessing about appearance on an Apple II or literally laying out a page by hand. (I imagine pasteboard techniques would've persisted in some environments too so your 9-inch Plus can work on, say, a single ad layout or a single article, which is a much easier lift than a whole page or the whole paper.)
So yeah, I think it's fair to talk about literal new capabilities. Things get backported, but they're not always good.
This goes for all of us: just because the new capabilities aren't important to you doesn't mean they aren't important to someone. (Although I think it's fair to sort these into things that are commonly wanted and beneficial, like MP3s, HD video playback, desktop compositing, fast still-image compression, etc., and things that aren't super common, like high-resolution/high-speed 3D gaming, in-memory databases, working with uncompressed video directly, six Pro Tools cards, etc.)
Those uncommon/specialty use cases are still important, and to an extent they often drive technology forward; e.g., high-refresh monitors are getting cheaper because gamers like them, and many people say they're better for average desktop work too, much like big displays, color displays, and higher-color displays were in the past.