
How Does One Explain The Difference In The Size & RAM Usage Of Modern Operating Systems?

Paralel

Well-known member
My installation of System 7.1.2 on my Blackbird takes up only ~10 MB of hard disk space and uses ~3300 KB of memory, and can do most of what I can do with a modern system. Same trackpad, same keyboard, same LCD screen. Same ability to connect to external input devices and video output. Browsing the web, on a wired or wireless connection. Accessing a CD/DVD drive. Using any given piece of productivity software. Etc... If the hardware were present, I'd likely be able to access USB devices without much more overhead in hard drive and memory usage, based on the mass storage and USB extensions Apple used in later operating systems.

My Windows 11 installation takes up ~29 GB and uses 5.3 GB of memory. How, in ~30 years, has the size of operating systems expanded ~2,900x and memory use increased by ~1,600x, when I honestly can't do that much more with this OS than I could with that one?
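(Quick back-of-the-envelope arithmetic behind those multipliers, using round decimal units; just a sketch, nothing rigorous:)

```ts
// Rough check of the growth factors above (decimal units).
const system7DiskMB = 10;   // ~10 MB on disk
const system7RamKB = 3300;  // ~3,300 KB of memory
const win11DiskGB = 29;     // ~29 GB on disk
const win11RamGB = 5.3;     // ~5.3 GB of memory

const diskGrowth = (win11DiskGB * 1000) / system7DiskMB;      // ≈ 2,900x
const ramGrowth = (win11RamGB * 1000 * 1000) / system7RamKB;  // ≈ 1,600x

console.log(`disk: ~${Math.round(diskGrowth)}x, memory: ~${Math.round(ramGrowth)}x`);
```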

Has the coding ability of the people writing this stuff really gotten that much worse? Or am I missing something?
 

chelseayr

Well-known member
@Paralel it's a rather convoluted topic, but just to try to sum up *a tiny bit* of it..
1. a lot of high-level abstraction layers (System 7 probably drew directly to the graphics hardware, while Windows 95 had to go through the DirectX API, and Windows 11 is probably much worse in that respect, not to mention the newer security-related APIs it has to work through)
2. lazy programmers using generic libraries (I'm not saying basic/shared libraries are bad, but e.g. downloading 3 MB of jQuery on every page of a website just to show a tiny floating menu bar that could easily have been done in a more honest 5 KB of direct code - see the sketch just after this list)
3. the push to drive more hardware sales, directly or indirectly, by "slowing down software"
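
Just to illustrate point 2, this is roughly the kind of "direct code" I mean for a floating menu bar - a plain-DOM sketch (the element id is made up for the example, and to be fair a single CSS `position: sticky` rule gets you most of the way these days):

```ts
// Plain-DOM sticky menu, no framework needed. Assumes an element with
// id="menu-bar" exists on the page (a made-up id for this sketch).
const menu = document.getElementById("menu-bar") as HTMLElement;
const pinAt = menu.offsetTop; // where the menu sits before any scrolling

window.addEventListener("scroll", () => {
  // Pin the menu to the top of the viewport once the page scrolls past it.
  const pinned = window.scrollY > pinAt;
  menu.style.position = pinned ? "fixed" : "";
  menu.style.top = pinned ? "0" : "";
});
```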

I don't mean to start any flame wars here, but even then there's so much more that could be mentioned on either side.
 

beachycove

Well-known member
Your post made me go and look up the Wikipedia entry for Wirth’s Law (my usual port of call when this general question comes up; it’s a sort of devotional experience for me, inducing calm amid the sense of finding that the All has just been explained). Anyway, according to Wikipedia, the trend that Wirth’s Law describes was identified as early as 1987, which, alas, would make even System 7.1 part of the problem.

My coding experience is more or less limited to bits of BASIC and AppleScript, so this is a completely amateur comment — BUT, I suspect the answer to the question has something to do with systems becoming too complex for them to be understood properly by any actual human being. Software is thus no longer able to be elegantly coded at a low level, since that would require a kind of vision, comprehension, and genius that is not strictly possible for the human mind. It once was possible, just — one thinks of Bill Atkinson, for instance — but those days have passed. Instead, bloat gets added to bloat in standard industry practices and the thinking is that newer processors will make up for complexities thus added. And so they do, so that consumers continue to consume, though the costs to them of such assumptions and industry norms are exponential.

In one of the Foundation books, I think it was, Isaac Asimov wrote about a computer that was perfect enough to have survived for centuries. It did what was required and needed no correction or improvement, since it could simply be left to get on with what it did so well for century after century.

I’d like to buy one of those.
 

Cory5412

Daring Pioneer of the Future
Staff member
The short version is: "computers do more in 2023 than they did in 1993" - hope that helps!
 

Cory5412

Daring Pioneer of the Future
Staff member
So, on a more serious note, with a bit more effort:

Without getting "too far" into it - there were "several" massive shifts in "what the average new computer could do" between 1993 and 2013 or so.

I've been using the term "The Plateau" for a number of years, and I largely mean it to codify/group all the different aspects that have made computers "boring" over the last ~10-15 years: all of that "it feels like computers don't progress as fast as they used to" feeling, and, well, the fact that for the first time in history most normal people can do everything they want to, today, with current software, on 10+ year old hardware.

Yeah, sure, in 2003 you could use a Quadra 840av; I was using one at the time! It wasn't, in retrospect, a "modern" experience by a long shot. Just off hand, the things that a 68k Mac really couldn't do:
  • Wireless (you happen to have the literal one single model that can do wireless, but it can only do it to ~1999-2003 standards)
  • playing compressed audio files (especially while multitasking, and on anything that wasn't a "flagship" in 1993-94, like a Quadra 650/800 or your PB550)
  • playing high quality video files
  • doing several things at once
  • (unless you upgrade to OS 8.x or install other software) literally typing in a document at the same time as you're moving or copying some files
  • multimedia web content such as flash videos
  • exchanging and collaborating on documents with modern office software without using a feature-incomplete intermediate format like RTF or plain text (plain-text editors with spell checkers weren't common at the time, if I remember correctly, and Markdown was a few years into the future; plus dedicated Markdown software is really a modern-era thing, even if Gruber technically invented it back in 2004)
Those were all things normal people could conceivably have wanted to do. Computing moved incredibly fast from the '70s through to the late 2000s and early 2010s, and most of the things that were coming in were things normal people wanted. Even today, like... most normal people want to be able to play H.264- and H.265-encoded content on their computers.

That's not even counting all the things people now expect their computers to do in the background. Being able to play a video or do chat or open a document isn't enough, computers are expected to stay responsive under workloads like "a hundred chrome tabs of research for an article" or "watching a video, editing a spreadsheet, playing a game, having a video chat, also recording the screen and doing some bulk video conversion/processing work in the background" or "all of that and virtual machines". Also Office documents now support live co-authoring, file syncing often happens in the background, everything is encrypted and decrypted in flight.

Microsoft in particular has been pretty good at efficiency. Windows still runs fine on HDDs - not great, but way better than macOS does, even compared to macOS versions from 5-7 years ago. It runs fine on 4GB of RAM. It runs fine on Pentium 4s.

Getting back to The Plateau, I'd largely argue the minimum viable "Good Experience" these days is a quad-core Ivy Bridge CPU from 2012, an SSD (basically any modern one), and 8GB of RAM. You can use an older CPU so long as it's a quad-core and you have a supported graphics card that can accelerate at least H.264 video. You can flex some of those bits forward and backward based on specific needs; e.g. I have a laptop from 2009 with a dual-core that runs desktop apps fine, but heavy web workloads are absolutely unacceptably slow.

Is it theoretically possible that with "better programmers" we could be doing this on computers from 1993? No. Absolutely not.
 

Cory5412

Daring Pioneer of the Future
Staff member
lazy programmers using generic libraries

Bad take. "human time" has been more expensive than "machine time" for a very long time. I genuinely question if there was literally ever a moment where this wasn't true. (maybe An Single Year in like 1962?)

It's infinitely more cost-effective for the vast majority of programming that occurs on the face of this planet to be in relatively high-level languages that let people get things running quickly and share projects among different people who may need to maintain it.

In addition, the growth of webtech as a presentation method now means line-of-business apps are no longer tied to specific client technologies. If that had happened in 1998, the net effect would be that instead of DOS cards and Virtual PC, companies would've been able to allow their employees to have Macs and fill out forms in a web browser.

Sure, Chrome requires more memory than Netscape 4, but from an economic and human perspective the benefits are near infinite.

The lever to press to have the biggest impact with programmers who write in low-level languages is to have them work on OS and service software, and that's... literally what Microsoft does. And it seems to be working - the practical system requirements for Windows and Office have been flat for fifteen years.
 

Cory5412

Daring Pioneer of the Future
Staff member
So, to pull back from being a little severe and flippant for a moment: modern computers do genuinely do a lot. It's up to you to decide whether or not that's a good thing, but most of this has been true for a long time. Windows 10 isn't really any bigger on disk than 7 or Vista are; if your system is currently using more, that may well be from the archival of an old OS version, or you may have something like shadow copies running.

Pulling back even further - I don't think it's possible to port "modern" computing to hardware from 1993. At absolute minimum, you'd need to pick and choose what you wanted to be able to do, and things like "record the screen while you live co-edit a spreadsheet, stream a movie, and have a video chat with your colleague, also there's a virtual machine in the background" just... aren't reasonable on hardware over about 10 years old. That's a lot going on and I think even a 10-year-old machine would struggle. I'm being a little unfair, because that's several video streams in and out and that's not fully normal - most people don't stream a movie while they record their own screen and/or a video chat.

Interest in video streaming over the last ~15 years, ironically, served the Zoom Era of computing pretty well, and in my experience, participating in and recording a multi-party video conference on 10 y/o hardware is reasonable, although it does use a lot of the system, so you really can't also be running a hundred tabs.

If it weren't for capitalism, I suspect The Plateau would be longer. Running uBlock Origin helps my older systems out significantly, but I don't find that, say, once you're on a 15 year old computer, avoiding Atom-descendant text editors like Visual Studio Code in favor of notepad++ actually makes that big of a difference. Similarly, I didn't find that, say, Spotify (which is also electron) was significantly worse on resources than Windows Media Player or iTunes.

There are extreme examples, though: the last time I measured it, the Word web app used about 300 megs of RAM, versus about 30 for Word 2010 through 365. (This was on Windows; the numbers are slightly different, and if I remember correctly worse, on the Mac.)

The "efficiency above all else" platforms these days really are mobile, and I've seen it floated that CPUs in use on Android and Windows on ARM being so far behind Apple's own ARM chips is part of what's holding The Plateau in place, even as Microsoft starts to, say, require TPM2 and CPUs from 2018 or newer.

If you could hypothetically hire an unlimited number of the world's most talented programmers, I guess maybe you could make "modern computing" work on hardware from like 2003, provided you were fine with giving up most of what makes modern computing what it is -- e.g. HD video, encryption on everything at all times, an unlimited number of file sync tools, some aspects of multimedia chat, some aspects of webtech-adtech, virtualization, and possibly a couple other nice-to-haves.

There's an interesting piece of tech alt-history fanfic here for whoever wants it, in terms of what computing might look like if, say, CPU and graphics horsepower had stopped developing at some point, even if other things had continued apace.

What would it look like if the fastest CPUs were ~1.0-1.2GHz PIIIs and G4s, but SATA and SSDs had become a thing? Tough to say. I suspect that we would've retained more focus on desktop software. The experience of OS X 10.4/5/6-era software on an SSD would be even better than it was on an HDD. I expect that web tech would still have evolved and would still make sense for a lot of applications, but its use in cases where it doesn't make sense would be more limited.

I suspect "computers didn't become good enough for h.264 video" has knock-on effects for things like, wel, people probably still want HD video eventually so, what? blu-ray becomes a mainstay? TVs stay at standard def in perpetuity because it's not reasonable to edit HD video on a computer from 2003, even if it has significantly better storage? Tough to say.
 

sfiera

Well-known member
Apple Color Emoji is 188MB right there. Another seven preinstalled fonts (for CJK languages) take more than 10MB each. You probably don’t need these, and in System 7 you might have been content with ����, but who would buy a modern computer without emoji support?
 

Paralel

Well-known member
Apple Color Emoji is 188MB right there. Another seven preinstalled fonts (for CJK languages) take more than 10MB each. You probably don’t need these, and in System 7 you might have been content with ����, but who would buy a modern computer without emoji support?

I guess for me emoji just aren't a make-or-break feature. I rarely use them.
 

CC_333

Well-known member
If I were to do emoji on a machine that didn't explicitly support them, I'd probably do it the old fashioned way, with periods and commas, dashes, colons and semicolons, brackets and parentheses. All the graphical ones are nice and I do use them sometimes, but I can live without them.

As for computing in general, I tried a thought experiment a couple years ago (and I may do it again sometime) where I used a highly upgraded Windows 2000 on a Pentium II (yes, Pentium II, running at a blazing fast 300 MHz) with 128 MB of RAM and a then-current copy of a Firefox-derived browser, and the results were.... quite miserable speed-wise, but it actually worked! The one cheat I'll confess to is that I used an IDE-to-mSATA adapter for more SSD-like disk speed. My experiment probably would've been impossible if not for that.

Anyway, with such miserable specs, I could nevertheless do everything I generally use a computer for, just very slowly. Going just a couple years newer to a Pentium 4 running XP and 2GB RAM, the results were still sluggish, but MUCH better overall. And an early Core2 Duo with XP or even 7 and 4 GB? Perfectly usable, if not fast.

Are newer machines better? Of course. Does this magically make old machines useless junk? It depends on how patient one is and what compromises they're willing to live with. Not all can do bulk HD video encoding or many dozens of other background tasks while simultaneously maintaining hundreds of web browser pages and playing back high bit rate compressed audio.

Can a clever hacker make do with a 1993 machine for word processing, basic web browsing and email? Probably! For all their limitations, those old machines can still be astonishingly flexible and capable when properly updated with the right software.

 

chelseayr

Well-known member
@Cory5412 mm yeah, sorry, I didn't mean lazy as in not wanting to do any lengthy job. It's more an annoyance toward those who use a jack-of-all-trades fat library to produce a slow app/UX, instead of a leaner, more relevant library for a good one (of course, bypassing the library entirely in favour of the direct UX/app layer would have been even snappier, but I can understand not wanting to go down that path these days... unless, naturally, you're one of the small number of competent commercial programmers hired to work on extremely small or timing-sensitive computers).
 

slipperygrey

Well-known member
A few things I'd add to the soup:

- Memory protection. The more you cordon off parts of the memory map for each process, the more memory you need in total!
- The rise of interpreted languages for production applications. Develop rapidly in JavaScript, Python, etc. without having to worry about memory management or system calls, but the abstraction costs a lot more CPU cycles and memory (see the sketch after this list).
- And then the next tier of languages, Java, C#, etc., with isolated runtimes and nice stuff like automatic garbage collection, which takes a lot more resources compared to coding directly against the hardware. As a side note: if everyone were still coding in assembly and C for everything, building complex and rich applications would be enormously more costly and harder to maintain/test (millions of buffer overflows).
- Security. Encrypting and decrypting files and network traffic on the fly. Do you remember when Google started mandating https back in 2017? That's when the cost of decrypting network traffic had dropped enough to justify this.
- Emulation, virtualization, containerization in general.
- Unicode and HUGE polylingual fonts with ~150k chars. Gone are the days when we had to worry about codepages and charsets. But keeping all of this in memory is costly!
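
To make the managed-runtime point a little more concrete, here's a rough Node.js sketch (figures vary by engine and version; this is only an illustration, not a benchmark): a million numbers stored as individual heap objects versus one packed typed array.

```ts
// Rough illustration (Node.js) of per-object overhead in a managed runtime.
const mb = (bytes: number) => (bytes / 1024 / 1024).toFixed(1);

const before = process.memoryUsage().heapUsed;
// One million numbers, each wrapped in its own little heap object.
const boxed = Array.from({ length: 1_000_000 }, (_, i) => ({ value: i * 1.5 }));
const after = process.memoryUsage().heapUsed;

// The same million values in a packed typed array: exactly 8 bytes each.
const packed = Float64Array.from(boxed, (o) => o.value);

console.log(`boxed objects: ~${mb(after - before)} MB on the heap`);    // tens of MB
console.log(`typed array:    ${mb(packed.byteLength)} MB of raw data`); // ~7.6 MB
```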
 

AndyO

Well-known member
The explanation is in itself quite simple: Computers were not true mass-market objects for years, and were thus the province of specialist developers and manufacturers producing products for a limited audience of users with commonly fairly specific things to do. As they became more widely accepted and the market for them expanded, the earlier chaotic manufacturing and software base shook out to becoming more stable and mature, with the result that profit-centric development became increasingly crucial in the marketplace rather than earlier solution-driven development.

Profit demands that more product is sold to more people, more frequently, so whereas in the 'early days', the speed of technological advance governed the rate of growth of systems architectures and OS features, in the 'modern era', it's all about features, because features sell product - and that means the complexity of software and operating systems, much more than hardware.

Is it a good thing? It depends on whether you want to work in multi-camera 4K video editing or 128-channel surround sound studio audio production - or indeed something equally specialist. The market in a practical sense can't sustain highly specialist needs in the same way it used to, so in effect we all pay for our systems to have the ability to manage far more complex and specialist tasks, which most users probably don't have any need for in a real sense. But when you compare the cost of a specialist video editing suite in 1990 (for example) to an M2 Mac that can do far more for far less, the cost of specialist capabilities has compressed over the years while the capabilities have expanded.

Observations that modern systems can do so much more do perhaps miss the point that most people don't need more than a small piece of today's all-things-to-all-people systems, but the complexity of systems damns many to a life of constant updates, patches and feature creep that makes actually using a system harder than it needs to be.

The problem is that the people who talk about technology may understand the technology and how it develops in great depth, but not necessarily understand much about what users want from it or why. Clearly in some instances they don't actually care.

I will say that personally, I hate the feature creep and bloat with a vengeance. I find it obstructive to practical use, sometimes downright confusing, and it is frustrating to hear 'technologists' bleating on about the wonders of modern systems development as if these things are an end in themselves.

I'm also reminded that for all the supposition that our computers can do so much more today than they could 'back then', back then a vast array of published media, including well-known daily newspapers and numerous full-color glossy magazines, was produced from start to final pre-press on compact Macs, complete with 9-inch B&W screens. Those old machines and technologies were really not that shabby.

They can't play on the internet though, as if that's a measure of anything but our obsession with an online universe of distractions and content largely obscured by advertising.

So as you'd guess, I am far less impressed with modern systems in use than I am with just what was made possible with the far less complex architectures and resources of systems in the past, the skill of software engineers squeezing the last drops of capability from hardware, and the power of users to put all that together and get their work done.
 

Oberlehrer

Well-known member
A couple of things that might or might not be correct:

- Doesn't RISC generally require larger executables? So the switch from 68k to PPC would have automatically increased code size.
- Certain modern processor features require a lot of overhead. Multithreading comes to mind.
- To allow true "plug and play", the OS has to do a lot of "heavy lifting" itself - the universal drivers need to be able to work with a ton of different hardware.
- I have the feeling that certain multi-platform modern frameworks generate large (and maybe not really efficient) code - it runs on different platforms but might not be optimized as much as code that targets just a single platform from the beginning.
 

Cory5412

Daring Pioneer of the Future
Staff member
an annoyance toward those who use a jack-of-all-trades fat library to produce a slow app/UX

The thing is: software written that way isn't slow if you have a reasonable computer from the last ~10 years or a high-end computer from the last 15 years.

My daily computers are 10 and 11 years old respectively and my dad dailies a 12 year old machine. Nothing is slow on these computers.

Now, would that software be slow on a 20-year-old computer? Absolutely, but the rift between what was common even just 2009 -> 2011 is massive. In that space you really had quad-cores take over on desktops and in enthusiast/gamer/workstation-level laptops, 16GB memory ceilings became common, and 6-gigabit SATA became the most common. That's in addition to really huge per-core boosts, and graphics hardware becoming more capable.

For "basic" usage, desktops up to about a decade old are fine.


More important than emoji: modern computers support all global text encodings with few or no troubles or exceptions. It's one of the massive improvements of the last 20 years compared to the 20 years before that.

It's less about displaying 😅 correctly and more about displaying 量産型ティーン correctly.

Emoji are just a benefit of that, by way of being part of unicode.
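
Just as a quick illustration (a throwaway sketch, nothing rigorous): the same string measured a few different ways, all of which a modern runtime handles natively, and which a code-page-era system had no vocabulary for at all.

```ts
// One string, several legitimate "lengths" - all handled natively today.
const text = "量産型ティーン 😅";

const utf8Bytes = new TextEncoder().encode(text).length; // storage as UTF-8
const utf16Units = text.length;                          // UTF-16 code units
const codePoints = [...text].length;                     // Unicode code points
const graphemes = [...new Intl.Segmenter("ja", { granularity: "grapheme" })
  .segment(text)].length;                                // user-perceived characters

console.log({ utf8Bytes, utf16Units, codePoints, graphemes });
```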


unless, naturally, you're one of the small number of competent commercial programmers hired to work on extremely small or timing-sensitive computers

Bad take. Working in a low-level language doesn't imply competency, and working in a high-level one doesn't imply the lack of it. As has been pointed out several times, the time you save working in a higher-level language is often exchanged for doing different types of things, e.g. also knowing a database system or also knowing specific business rules.

Embedded stuff has its own share of problems, and there's plenty of incompetency, or situations where it would have cost too much to fix a problem, or where a problem wasn't worth fixing because the task gets done without it. (I'm struggling to recall the details of a story where a team was trying to fix a memory leak in an embedded computer and then they were like "wait, this memory leak takes 17 hours to crash the system and the uh... whole computer will be blown to smithereens 12 hours in because this is a space launch rocket..." so they just shipped it as-is and worked on something more important, because a memory leak that becomes apparent at 17h, 5 hours after the computer no longer exists, isn't that big of a deal.)

If you're thinking about Electron: yes, it's typically a memory hog and has at times resulted in software that's massively bad (Slack using several gigs of RAM back in 2014-15 or so is one of the best examples), but that varies enough that I'm no longer convinced "being Electron" in and of itself is the cause.

Though, there are plenty of Electron apps I don't install if I can get away with it, primarily because it's one more thing to keep maintained. Discord is one of these for me; I also only have desktop Spotify installed on one machine.

Profit demands that more product is sold to more people, more frequently, so whereas in the 'early days', the speed of technological advance governed the rate of growth of systems architectures and OS features, in the 'modern era', it's all about features, because features sell product - and that means the complexity of software and operating systems, much more than hardware.

This plays out differently in hardware and software.

On the hardware side of things: the opposite is basically true. PC hardware has been mostly a declining market for a few years as people at home move to phones, tablets, and Chromebooks. That's true of K-12, and those things are making inroads elsewhere, although often less generically for, e.g., information workers - "it depends".

Modern desktop OSes and software run on incredibly old computers compared to any point in the past. I know I've said it several times, but I'm running all current software on my 10 y/o systems and it's "basically fine". As an enthusiast, I'd probably notice if I upgraded from my i5-2400 to an i5-13xxx and the attendant other modernizations. I'd expect the whole system to be 2-3x faster on average, but even my oldest computers spend most of their time waiting for me, and not the other way around.

On the software side of things - yeah, the need to introduce new features constantly has been bad overall. There are a couple of different things working together to create that environment. Our overarching economic system encourages flashy things you can sell repeatedly, and selling based on new functionality rather than sustainability, instead of, say, maintaining existing hardware and keeping software running for longer.

Subscriptions are a sore point but to be honest, if I had my way, Microsoft would be taking my money monthly or yearly to maintain Windows and Office on existing and new hardware and balancing "efficiency" and "new features" in a little bit of a different way.

I'm also reminded that for all the supposition that our computers can do so much more today than they could 'back then',

I think and talk and write about this every once in a while. I think I even mentioned things like "file sync" and "live co-authoring" in a previous post in this very thread.

But also: There's a good reason why nobody does that. It was better than doing it by hand but it was miserable compared to doing it on a bigger screen and on a faster computer.

Yes. Computers literally gained more capabilities along the way. Some of those capabilities mean newer and better workflows are available. Sure, you can publish an entire metro newspaper on a 9-inch Mac, but I bet every paper that could upgraded to a Mac II with a 2PD the instant they could. They probably added networking as soon as they could, etc etc. Along the way you get the ability to integrate color graphics directly into layout files, and although I'm not, you know, a pro typesetter, I'm 100% sure there are other things that made upgrading meaningfully important.

But to my point about hardware: if you install Adobe CC on a Windows 10 computer... is there anything you "can" do on a computer built in 2023 that you "can't" in 2013? The same is likely not true if you max out a Windows 10-capable machine from 2005, unfortunately. (2005 just because that's about the oldest hardware that will run 10.)

That largely isn't the case for, say, 2013 and 2003 or 2003 and 1993 or 1993 and 1983. That's what I mean by The Plateau and one of the reasons I'm pushing back against this idea that software has gone too far.

One of the biggest stories in computing is things that were "possible" at one time being "reasonable" and "affordable" at another time. This dovetails in with the "specialist" comment you made. In 1993, high-quality video capture on a Mac was "possible if you spend twenty thousand dollars on a tip-top machine, a storage array, and video compression hardware" and in 1999 it was "reasonable on the $1,299 computer". (To the credit of all involved: you really needed DV compression hardware which was built into cameras at that point, but still.)

"Not everyone wants to edit DV" - yeah but that computer is also significantly better at literally everything else than what came before, and people did want to do MP3 and the Internet.

Much of "capability" really does depend on "reasonableness" and for a lot of this stuff, if a computer "can" do a thing but not in real-time or not at the same time as other things, it's not "reasonably" capable. And, reasonableness is in the eye of the beholder. Calling back to your newspaper example: A Mac Plus was almost certainly more reasonable than either guessing about appearance on an Apple II or literally laying out a page by hand. (I imagine pasteboard techniques would've persisted in some environments too so your 9-inch Plus can work on, say, a single ad layout or a single article, which is a much easier lift than a whole page or the whole paper.)

So yeah, I think it's fair to talk about literal new capabilities. Things get backported, but they're not always good.

This goes for all of us: just because the new capabilities aren't important to you doesn't mean they aren't important to someone. (Although I think it's fair to categorize these into things that are commonly wanted and beneficial, like MP3s, HD video playback, desktop compositing, fast still-image compression, etc., and things that aren't super common, like high-resolution/high-speed 3D gaming, in-memory databases, working with uncompressed video directly, six Pro Tools cards, etc.)

Those uncommon/specialty use cases are still important, and to an extent they often drive technology forward, e.g. high-refresh monitors are getting cheaper because gamers like them and many people say they're better/beneficial for average desktop work, similar to how big displays, color displays, and higher-color displays were in the past.
 

Cory5412

Daring Pioneer of the Future
Staff member
TL;DR: yes, stuff is bigger. There's good reasons. It doesn't prevent The Plateau from having happened, you can daily a 10 year old computer with modern software if your needs aren't very specialist.

This wasn't in what I wrote, but workplaces still replace computers around every 5 years because it's cheaper than doing the maintenance themselves and cascading machines around an institution, plus many workplaces don't like "owning" "assets" and lease their computers. The benefit (I think this was in a draft but didn't make the final cut) is off-lease business computers are usually sold to the general public significantly cheaper than brand new computers.
 

AndyO

Well-known member
I will bow out of any further part of this, not least because the sheer torrent of the above post is such that I have no hope of working out if I agree with it or not. Likely not, I suspect, but I'm more comfortable with a discussion rather than a barrage.
 

Cory5412

Daring Pioneer of the Future
Staff member
I'm sorry that felt like a barrage! It looks like I wrote about 650 words in response to your part, which was itself about 580 words - I didn't feel like that was too big a differential. That was rewritten down from something over 2x the length - but I'm super bad at brevity and so it wouldn't surprise me if I could've done it in fewer. But - this is also a big, complicated issue, so it's not... really something you can cover in much less, either.

Ultimately, to be quite straight-up, I don't think this (or literally any vintage tech scene) is a good venue for this discussion, regardless. Most of us aren't involved in the production of software and most of us aren't even remotely close, nor do we understand the different markets. Plus, you get the political aspect which this group always handles poorly. The majority of us aren't even involved in "technology," professionally, so we're basically armchair quarterbacking.

Software could be better, I don't think anybody denies that, but I'm annoyed so many people are so excited to take cheap dunks on... what? people using a tool to get a job done? using tools you don't aesthetically like for reasons that don't matter in lots of contexts? doing something not-morally-objectionable their boss/team/job wanted?

I'm not here for it.
 

AndyO

Well-known member
It's a very complex issue, certainly, and I'll admit that I have serious problems with some of your posts simply because I can't follow the thought pattern in such an unstructured commentary - and particularly when I'd guess you come at it from a very different angle than I do, so your starting point and mine appear not even remotely the same.

It's not a big deal in a real sense, since it's not an argument that has to be won, but this is a subject that deserves to be explored because it's quite a fascinating one.

For my part, I come at this as a person who actually uses, by choice and preference, old systems every single day - yet because I am a systems manager and InfoSec specialist, also have workflows and responsibilities around and dependent on modern/current systems which extend from fax machines up through workstations and production print and digitizing equipment to server farms. With a foot in both old and new camps, I see the advantages and disadvantages in both.

I will also admit that being autistic, I have serious issues with distractions since 90% of what I do for a living requires tight and continuous focus for hours at a time, and working on modern macOS or Windows can prove exasperating and highly unproductive for that.

However, I also have been in this business since (perhaps ill-advisedly) buying an RML 380Z for my business, and having to learn how to program it... which started me off on the next 40+ years of technology, so I've seen it all.... or most of it.

But yes, I have written software, I do this work professionally, I have owned and run a quite successful business, I have taught on MBA courses, and I understand markets reasonably well because I have to in the context of some of the work I do. Which means that I understand why, for example, my employer does not lease computers and never has - it doesn't fit the capital asset program we know to be best suited to our long-term financial health, is severely cost-inefficient and far too inflexible.

What I don't do is take cheap dunks.
 