
Old Mac, Modern Internet

Cory5412

Daring Pioneer of the Future
Staff member
I think the misinformation here is that "web sites today are the same as they are in the '90s."

This could not possibly be further from the truth, at all.

 

TheWhiteFalcon

Well-known member
Case in point: during Prime Day, Amazon's website displayed its deals in a grid. Each square could add the item to your cart, show current stock levels, run a countdown timer until the sale started and ended, and refresh itself as its status changed. That's much cleaner than forcing the user to refresh the entire page, and it's no doubt easier on Amazon's servers.
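Under the hood, each of those squares is not much more than a timer plus a small background request. A minimal sketch of the idea (the endpoint and field names here are invented for illustration; this is not Amazon's actual code):

    // A rough sketch of one self-updating tile. The endpoint and field
    // names (/api/deals, stock, endsAt) are hypothetical.
    async function refreshTile(tile: HTMLElement, dealId: string): Promise<void> {
      const res = await fetch(`/api/deals/${dealId}`);
      const deal: { stock: number; endsAt: number } = await res.json();
      tile.querySelector(".stock")!.textContent = `${deal.stock} left`;
      const secondsLeft = Math.max(0, Math.floor((deal.endsAt - Date.now()) / 1000));
      tile.querySelector(".timer")!.textContent = `${secondsLeft}s`;
    }

    // Every tile polls for its own state; no full-page reload involved.
    document.querySelectorAll<HTMLElement>("[data-deal-id]").forEach((tile) => {
      setInterval(() => refreshTile(tile, tile.dataset.dealId!), 5000);
    });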

But yes, it was much more complex than just an HTML table, so therefore it was evil and should be stopped.

 

Nathan

Well-known member
I think the misinformation here is that "web sites today are the same as they are in the '90s."

This could not possibly be further from the truth, at all.
It's not misinformation. The point is not that websites as a whole are the same, but that the majority of the content presented is. Think about it for a bit.

Even Facebook is just text, images, some animated GIFs (or similar) and video. The real changes are the addition of some kinds of video/multimedia content, aspects of the formatting and interaction, and the sheer volume of content. Yet on account of some of those changes, the sites are sometimes totally unviewable/unloadable. Otherwise it's a three-column layout with CSS and some dynamic aspects. The JavaScript alone is probably the biggest burden.

At the end of the day, a somewhat less dynamic view of the basic content of the website could probably be managed by an early PowerPC, and RAM usage is probably the biggest hurdle. If you could strip the ads and avoid loading of anything other than static pictures in the news feed... We aren't talking about a switch to some kind of full-body VR web browsing here.

Case in point: during Prime Day, Amazon's website displayed its deals in a grid. Each square could add the item to your cart, show current stock levels, run a countdown timer until the sale started and ended, and refresh itself as its status changed. That's much cleaner than forcing the user to refresh the entire page, and it's no doubt easier on Amazon's servers.

But yes, it was much more complex than just an HTML table, so therefore it was evil and should be stopped.
 
How complex is it really, though, at its fundamentals? A full desktop application could do that easily.
 

Cory5412

Daring Pioneer of the Future
Staff member
One thing that's interesting about this conversation, every time it comes up, is that it almost always completely excludes the possibility that there is anything interesting on the Internet other than the web.

The web (a particular application) is the main thing on the Internet (a network) that has really passed 68k Macs by. E-mail still ultimately works the same way it did 30 years ago. IRC does, Hotline does, FTP does, Gopher does, and so on.

Some of these are "solved" problems. You can use a security tunnel, running as a gateway or proxy, to strip SSL off of connections. You can also use your own server to aggregate mail into an IMAP mailbox that a 68k mail client can access.
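For the curious, a gateway like that can be tiny. Here's a minimal sketch, assuming Node.js on the helper box and a hypothetical mail host; the old Mac speaks plaintext IMAP to it, and it speaks TLS upstream:

    import net from "node:net";
    import tls from "node:tls";

    const MAIL_HOST = "imap.example.com"; // hypothetical modern mail server

    // Accept a plaintext connection from the vintage client, then open a
    // TLS connection to the real server and pipe bytes both ways.
    net.createServer((client) => {
      const upstream = tls.connect({ host: MAIL_HOST, port: 993 }, () => {
        client.pipe(upstream);
        upstream.pipe(client);
      });
      upstream.on("error", () => client.destroy());
      client.on("error", () => upstream.destroy());
    }).listen(143); // the 68k mail client connects here, no SSL required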

I think the real question here is whether or not there's enough interest in building out infrastructure for non-web applications.

This is a personal opinion, but even back in the day, the web was really never what I found "interesting" about my vintage Macs. It's better now, in that I derive entertainment from the web today, but it was never something I did a lot of back in the day. Even back when I was using my PowerBook G4, I had a web browser open and I was on an IRC channel, perhaps downloading files from an FTP server or moving something with hotline, and getting my mail with whatever was hot for that at the time.

The suggestion that somebody might develop plain-HTML content, and that the people who want it can go read it, is probably the easiest solution to this "problem".

 

Cory5412

Daring Pioneer of the Future
Staff member
Other than the text itself? Literally everything about how web pages work has changed in the last 20 years. Multiple times. It's an entirely different platform today than it was in the late '90s.

You can't just go "oh it's text" and conclude that something's wrong with everybody who has endeavored to deliver text over a network and display it in a web browser in the past 20 years.

How complex is it really, though, at its fundamentals? A full desktop application could do that easily.
It's incredibly complicated. A sufficiently large table is likely to overwhelm a 68k processor before you even get to "how should I present this?"

That entire process is completely different than what web pages could do in the '80s. Gosh, it wasn't until the mid-2000s (2005, actually) that Ajax was introduced and started to be used to update parts of pages without reloading the whole page. I would bet that Netscape 4, IE4, and perhaps even Classilla are completely incapable of doing that. And that kind of task is something even this forum does.
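For reference, the entire trick looks something like this (a toy example; the endpoint name is made up), and it's exactly the machinery those old browsers don't have:

    // The 2005-era pattern: request a fragment in the background and
    // splice it into the live page, with no full reload.
    const xhr = new XMLHttpRequest();
    xhr.open("GET", "/api/latest-posts"); // hypothetical endpoint
    xhr.onload = () => {
      document.getElementById("post-list")!.innerHTML = xhr.responseText;
    };
    xhr.send();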

 

Cory5412

Daring Pioneer of the Future
Staff member
Plus: the further back you go, the less likely you are to have rapid application development technologies that include the ability to make calls out to web sites to get information like this. Microsoft was introducing this stuff in the early 2000s; before that, it was presumed you were going to interface directly with a database instead of going through middleware API calls on an HTTP server.

In fact, there's a big aversion these days to developing desktop software at all, since with Electron you can just wrap your app in a standalone copy of Chrome, put it in an exe or app file, and have it almost look native, which is considered "good enough" by many. (Slack, Atom, Spotify, Teams.)

At the peak of this being a Problem™ (or at least the peak so far), Slack on average people's computers was using more than a gig of RAM on its own.

For something you can do in 7.1 on an '020 with like 6 megs of RAM.

Is it sad? Yes.

Is it bad? No, not really.

Is it inconvenient if your only computer is an original LC, Color Classic, or Classic II? Yes, but let's be real, the web barely existed outside of CERN in 1990.

 

TheWhiteFalcon

Well-known member
Even Facebook is just text, images, some animated GIFs (or similar) and video.
It's pictures with millions of pixels and videos that are larger than most old Mac hard disks. Are you going to force everyone to upload only 180px images again? Is this a campaign to Make QuickTime Great Again?

 

Cory5412

Daring Pioneer of the Future
Staff member
Also text that refreshes automatically, quickly enough that some people use it as an actual chat platform, in lieu of something like IRC or AIM. And that's before you discover Facebook Messenger, the actual chat platform built into the web site.

 

Scott Squires

Well-known member
If you could strip the ads and avoid loading of anything other than static pictures in the news feed...
If you did that, your Facebook page would be completely blank. It doesn't have any static pictures or a news feed. What it has is a payload of JavaScript over 2 MB in size that makes hundreds of network requests for assets, templates, content, etc. It has its own rendering engine (called React) that builds a DOM from scratch and feeds it to the browser. The browser has to apply DOM and CSS layout rules that are orders of magnitude more complex than 3D rendering applications were in the '80s. A modern browser executes over a dozen interpreted programming and markup languages, compiles source code to machine language in real time, and has hundreds of advanced, computationally expensive APIs. Websites depend on all that functionality. If that functionality is not there, they don't work.
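To make that concrete: the HTML such a site sends is essentially an empty stub, and everything you see is built by script after the fact. A schematic sketch (all names invented for illustration, not Facebook's actual code):

    // What a client-rendered page does once its (nearly empty) HTML arrives:
    // fetch the real content as JSON, then build the entire DOM from scratch.
    async function render(): Promise<void> {
      const res = await fetch("/api/feed"); // hypothetical content API
      const posts: { author: string; body: string }[] = await res.json();
      const root = document.getElementById("root")!; // the one element in the HTML
      for (const post of posts) {
        const div = document.createElement("div");
        div.textContent = `${post.author}: ${post.body}`;
        root.appendChild(div);
      }
    }
    render(); // without a JavaScript engine, the page stays blank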

How complex is it really, though, at its fundamentals? A full desktop application could do that easily.
You are confusing "fundamentals" with how you imagine you could implement it given your chosen constraints, not with how the web is actually implemented in reality.

The web has fundamentally changed. You were ignorant, and have been informed. Now you're in denial.

If you want content that is usable on a '90s machine, you'll need to create it. Or dig it out of the internet archive.

 

Nathan

Well-known member
Some of these are "solved" problems. You can use a security tunnel, running as a gateway or proxy, to strip SSL off of connections. You can also use your own server to aggregate mail into an IMAP mailbox that a 68k mail client can access.
I wouldn't call the problem solved, although it is technically a solution. Having to run a second piece of software and an entire mail server just to get your mail is a major burden and somewhat unreasonable. You might as well just print it out on your modern computer and carry a stack of paper over to your Mac, then handwrite your responses, walk back to modern times, type them into your email, and hit send.

Other than the text itself? Literally everything about how web pages work has changed in the last 20 years. Multiple times. It's an entirely different platform today than it was in the late '90s.
I'm not debating that there has been change, but as far as I am concerned whether it's really a "platform" at all is debatable. And even if everyone agrees it's a platform, that doesn't make it a good platform or one to rely on. I'm not sure it's really that different, ultimately. HTTP is still HTTP. There's still quite a bit of reliance on the basic model of requesting a page and then doing further requests if the page says it needs something else.

It's incredibly complicated. A sufficiently large table is likely to overwhelm a 68k processor before you even get to "how should I present this?"

That entire process is completely different than what web pages could do in the '80s. Gosh, it wasn't until the mid-2000s (2005, actually) that Ajax was introduced and started to be used to update parts of pages without reloading the whole page. I would bet that Netscape 4, IE4, and perhaps even Classilla are completely incapable of doing that. And that kind of task is something even this forum does.
Uh huh, sure. The size of a table wouldn't overwhelm the processor; it would choke up the RAM/hard disk. We aren't talking about the 80s though, we are talking about the 90s. The great bulk of Macintosh computers date from the 90s, that is, everything from the Macintosh II on. What you are describing seems to be mostly a software-level issue rather than any especial limitation of the hardware. Given that the Classilla page implies you might be "upgrading" to it from Netscape 7, that Wikipedia suggests Netscape 7 was a thing in 2002, and that Ajax existed at least from 2004/2005 on, I wouldn't be surprised if it supports Ajax at least. Supporting it and handling it well may not be the same thing.

In fact, there's a big aversion these days to developing desktop software at all, since with electron you can just wrap your app in a standalone copy of Chrome and put it in an exe or app file and have it almost look native, which is considered "good enough" by many. (Slack, Atom, Spotify, Teams.)
That is both disgusting and lazy. I don't see why I (or anyone else) should have to run a browser on top of my OS just to run some software. I'd call that an unnecessary introduction of intermediate layers, and it's probably less flexible than the JVM. Also, I think it's objectively terrible regardless of what you think.

It's pictures with millions of pixels and videos that are larger than most old Mac hard disks. Are you going to force everyone to upload only 180px images again? Is this a campaign to Make QuickTime Great Again?

No and no. Depending on complexity, a screen-sized picture (say 1366x768) might only be a couple hundred kilobytes; even at 20 of them, that's only a few megabytes. It's a sizable chunk of an 80 MB hard drive, but once you've got even a gigabyte it's not that big a deal, and you aren't keeping them indefinitely. Personally I have next to no need for video while browsing the web and can go back to a modern machine if I need to watch something.
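As a rough check, assuming ordinary JPEG compression: 1366 x 768 is about 1.05 megapixels, which is roughly 3.1 MB uncompressed at 24-bit color, so on the order of 150-300 KB as a JPEG at a typical 10:1 to 20:1 ratio; twenty of those lands around 3-6 MB.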

With regard to Facebook, what I mean by 'static' is not a continuous stream so much as a fixed window onto it, one that perhaps updates to show the last so many posts, as opposed to a continuous stream that reloads as you scroll up and down. Their bloody rendering engine is a pox on humanity and completely unnecessary simply to present some data. It's just a fricking three-column layout with dynamic content. Just because it pulls a lot of assets, not to mention a ton of ads, doesn't mean that it is necessary or meaningful to do so.

You are confusing "fundamentals" with how you imagine you could implement it given your chosen constraints, not with how the web is actually implemented in reality.

The web has fundamentally changed. You were ignorant, and have been informed. Now you're in denial.

If you want content that is usable on a '90s machine, you'll need to create it. Or dig it out of the internet archive.
Err, no. I am pointing out the basic elements are highly simplistic.

The point I was making is that drawing boxes and updating the content inside of them via a network is hardly some vast burden. If the web is complicated, it's because tons of stuff has been shoehorned into it in a way it was never designed to be used. It's the equivalent of stacking a skyscraper on top of a modest three-story apartment building. A real network application can send and receive happily without dependence on long polling, Ajax, or some other pseudo-technology.

In any case, you are an asshole with your head stuck up that asshole. Just accept that what you see is not in fact reality at all, but only your viewpoint.

 

Gorgonops

Moderator
Staff member
Clearly. You're missing it by miles, 100s of miles at least, and being rather snide about it too.
Sorry, getting snide is a reflex that happens when someone is being (intentionally?) dense even after repeated attempts to enlighten them.

comprised mostly of text and images which, at a basic level, these machines are capable of rendering. Drawing text and pictures has not become any harder. Even so-called "interactive" pages are hardly a new thing; even if older browsers don't support the newer mechanisms/approaches, they did do those sorts of things then, afaik.
No. I got interrupted in the midst of typing a reply to this, and in the meantime plenty of other smart people have replied, so, well, I'll settle with "No, you're wrong, displaying this *rap has gotten way, way harder since the days when the only text formatting tags you could reliably expect to work were <center>, <b>, <i>, and <br>." It's frankly just not a sure thing anymore that the text-only content you might want to access exists as plain text bounded in <p> tags; even a plaintext browser needs a minimum understanding of HTML 4+ conventions to fish that text out of the <div> and CSS sea it lives inside.

(And, as has been noted, many websites today simply don't load *any* content unless a JavaScript engine goes and gets it. Note that websites like this are controversial for a number of reasons, a big one being accessibility to the visually impaired. In some cases those users are simply SOL; in others they may be able to use an alternative "mobile" site that degrades more gracefully, but that still realistically might require more CPU horsepower to sort through than a Motorola 68000 can offer.)

This is not at all like expecting Halo to run on an Atari 2600; it's far more akin to being surprised that an "enhanced" version of an old game won't run on the original hardware specs AT ALL even though all it does is bump the asset resolution/detail up a little bit.
You seem to be *really stuck* on the idea that the "resolution" of the modern Internet has been bumped up just a "little bit" from 1990. Considering how many times this has been explained to you I'm kinda thinking it's not going to be possible to get you to grasp the scale of this problem.

 

Gorgonops

Moderator
Staff member
In any case, you are an asshole with your head stuck up that asshole. Just accept that what you see is not in fact reality at all, but only your viewpoint.
Aaand, I think you're done here.

EDIT/MOD NOTE:

I'm not locking this thread, yet, because the problem seems to be more personal than topical, but please turn down the heat. Babyish name-calling is uncalled for.

 

Cory5412

Daring Pioneer of the Future
Staff member
In any case, you are an asshole with your head stuck up that asshole. Just accept that what you see is not in fact reality at all, but only your viewpoint.
Please refrain from using this kind of uncivil language, lest any more warnings need to be issued.

I'm not debating that there has been change, 
You do appear to, in fact, be debating that there has been change.

but as far as I am concerned whether it's really a "platform" at all is debatable. 
Funny, every web developer ever thinks of it as a platform.

HTTP is still HTTP. There's still quite a bit of reliance on the basic model of requesting a page and then doing further requests if the page says it needs something else.
HTTP is HTTP, except when it's HTTPS, but as anthon mentions in a post they were writing probably at the same time you wrote this, what an HTTPD returns when you request a page is no longer content. It is an application, which itself then uses HTTP to request content from a database via a piece of middleware.
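Schematically, the split looks like this (a toy sketch with invented names, not any real site's code): one URL hands back the application shell, and a second hands back the actual content as data.

    import http from "node:http";

    http.createServer((req, res) => {
      if (req.url === "/api/posts") {
        // the middleware layer: content comes back as JSON, not as a page
        res.setHeader("Content-Type", "application/json");
        res.end(JSON.stringify([{ author: "nathan", body: "hello" }]));
      } else {
        // the "page" itself: an almost empty shell plus the application script
        res.setHeader("Content-Type", "text/html");
        res.end('<div id="root"></div><script src="/app.js"></script>');
      }
    }).listen(8080);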

As I said somewhere, the fundamental model of how web pages function has changed.

Uh huh, sure. The size of a table wouldn't overwhelm the processor; it would choke up the RAM/hard disk. We aren't talking about the 80s though, we are talking about the 90s.
Does it really matter which part of the overall computer is overwhelmed? The technicalities of which Mac IIs were introduced in which decade don't really change the fact that an '030 or an '040 would have trouble parsing a large stream of items. What TheWhiteFalcon described isn't just your computer receiving live data about, say, eight items. I'm presuming each client computer was receiving, parsing, and then displaying a feed of information about hundreds of items on the page (even if not all were visible) at once.

Regardless, Amazon was able to quickly build this page that rendered correctly on any modern Windows, Mac, Linux, or UNIX computer, and that would not have been possible in the '90s, with any technology, because web browsers couldn't do this stuff then, and because you simply couldn't build that application quickly enough. Any attempt would ultimately look like web browsers do today.

That is both disgusting and lazy. I don't see why I (or anyone else) should have to run a browser on top of my OS just to run some software. I'd call that an unnecessary introduction of intermediate layers, and it's probably less flexible than the JVM. Also, I think it's objectively terrible regardless of what you think.
I also agree, but it's how these applications were developed, and some of them (Teams, Slack) are not "optional." (Although: you are generally given a work computer good enough to run them well.)

Err, no. I am pointing out the basic elements are highly simplistic.
These things were simplistic in 1995. They are not today.

 

Gorgonops

Moderator
Staff member
Yeah, I remembered, literally the minute after posting, that that existed, and regretted not using some other video game as an example (just what are the cool kids playing these days?) because I knew someone would drag it up.

Although, actually, maybe discussing Halo 2600 could be sort of instructive. The author of the game wrote it after being inspired by a book about the *extreme* level of creativity employed by game programmers in working around the ridiculous-by-today's-standards limitations of the 2600's hardware, and wanted to see first-hand what it was like. The result is, of course, a completely made-from-scratch creation that shares nothing but some thematic elements with the "real" Halo games. Therefore it has nearly zero similarity to the challenge you're facing when trying to write a browser for an ancient computer that can reasonably interpret the modern dynamic web. That is a challenge more akin to writing a Halo 2600 that fits into the 2600 hardware a game engine capable of interpreting all the 3D models, sounds, and other assets that the modern game engine uses, only degrading the presentation sufficiently to fit its pixel generation and sound chip limitations. (Presumably for this scenario you've either interfaced a hard disk to your 2600 or come up with a bank-switching technique for accessing several gigabytes worth of data through the 4k address window available for bank-switching on the Atari's cartridge slot.) Good luck with that! ;)

 

VMSZealot

Well-known member
I think that it helps if you have a specific use case. My specific use case isn't merely the bloody mindedness of using a 20-30 year old computer today.

I actually use my old Mac as part of the coding process for my new software - the reasons are twofold. Firstly, I'm too cheap to buy a new copy of Fontographer, and emulation doesn't work brilliantly for 'real' work with this software. Secondly, I find the old OS very human-scale and easy to get on with.

I don't have a need for internet banking or Facebook or, or, or. But if I'm on my old machine, I don't want to swap to the new one just for a spot of email or to look something up on Wikipedia, Stackoverflow or similar (including posting/reading here or downloading from Macintosh Garden). If I can get this basic functionality to work then it makes my day easier. And if it helps someone else then all the better.

Finally, I don't like the solutions which are out there already. They're unnecessarily kludgy or hard to use. Cutting my own solution also provides a welcome excuse to code, and I'd code my own breakfast if I could!

Above all though, I don't want to get all religious and warry about it. If my idea works for you then great. If it doesn't then it doesn't. There's no need to get fighty about it.

 

mactjaap

Well-known member
We now have four topics that each cover a bit of the same ground. Two are discussing what could be built, one is almost finished with a product (Mozilla for PPC), and one is offering something: legacyweb.net. These are the topics:

Internet on older macs

https://68kmla.org/forums/index.php?/topic/30760-internet-on-older-macs/

Old Mac, Modern Internet

https://68kmla.org/forums/index.php?/topic/30767-old-mac-modern-internet/

MLA Forum Access from Older Macs

https://68kmla.org/forums/index.php?/topic/30806-mla-forum-access-from-older-macs/

Building First Mozilla for PPC (and maybe eventually 68k)

https://68kmla.org/forums/index.php?/topic/30826-building-first-mozilla-for-ppc-and-maybe-eventually-68k/

I posted that I would like to test legacyweb.net and I will start a new topic for it. Maybe more people would like to test and respond.

https://68kmla.org/forums/index.php?/topic/30827-testing-the-legacywebnet-project-use-modern-web-on-older-softwarehardware/

 

VMSZealot

Well-known member
In fairness, I'm doing more than discussing. I'm writing too (and, within the next few weeks, I'll open source what I've done so that anyone who's hacked off with current solutions can help with something better).

This is more than idle chat. This is happening.

LegacyWeb seems like a nice idea - but it either doesn't work (from what I've read) or is tricky to configure. It also relies on stunnel, which isn't the easiest software to use.

Mozilla for PPC already exists, of course, in the form of Classilla - but it's slow, and PPC only. I aim to fix this by using a helper machine (RPi) to get a fast, 68k solution working. Others may baulk at the idea of a solution which isn't running entirely on the Mac - but I think I have to accept that times (and security standards) have moved on and our old machines need a little help.

 

bigmessowires

Well-known member
Here's my cheat sheet for the efforts being discussed. Please correct any errors.

Denodster is trying to build Mozilla for PPC, then will attempt to make a 68K version of that. 

Agent_js03 is developing something called LegacyWeb. It uses a Raspberry Pi as a proxy (if that's the right term). Secure email is accomplished through stunnel. Some type of unspecified proxy software filters and simplifies web pages on the fly.

VMS Zealot is developing a very similar project with no name (Old Mac, Modern Internet thread). It also uses a Raspberry Pi as a proxy, but the approach is slightly different. For this project, the RPi *is* the server instead of acting as a man in the middle. 

And then mactjaap has MacIPgw, which has existed for some time. It can run on an Orange Pi, and it contains proxy components with a similar purpose to the others: stunnel for email, an HTTP 1.1-to-1.0 proxy, and a web rendering proxy.

The efforts from Agent_js03 and VMS Zealot seem very similar to me. Maybe they can combine their efforts?

 

techknight

Well-known member
As far as applications in the browser go, I dunno, I kinda got used to that. I use Google Sheets faithfully now because of the editing and live updates. Works great for schedules and progress at work. I keep a bunch of stuff on Sheets for this. And I keep backup copies just in case something happens.

And Google Docs/Forms is good for siphoning information at tradeshows.

 