
Old Mac, Modern Internet

Nathan

Well-known member
Well, good luck.

I remember trying to get my 6100 on the web. I seem to recall that I got it connected and that, at the time (~2006-2007), I could connect to some sites with Netscape 4.08, but SSL issues pretty much broke anything with a login or HTTPS. Of course that poor thing's HD won't boot now (I thought it was dead but recently discovered it's readable if I boot some other way).

 
How long does it take to build?
 
On a G3 with the source and IDE on a RAM disk, it takes about 25 minutes. On a PowerTower Pro 225 with everything on the hard drive, it takes about 45 minutes to an hour. On a 9500/132 it takes over 2 1/2 hours.
 
The moral of the story: Don't try this on your 6100/66.

Wow, that's a chunk of time.

 

VMSZealot

Well-known member
…and the reason I'm targeting a Raspberry Pi is that I want my solution to work for all Macs, no matter how limited. PowerPC, even the least powerful, is way too high a bar. I want this to work on everything from the original 68000 up.

 

Nathan

Well-known member
For practical purposes you should probably target a 16 MHz 68030 and up. I really doubt that the original Macintosh models, with an 8 MHz 68000 and maxing out at 4 MB of RAM, are even going to be possible unless you feel it's worth presenting a pure text page. There are also more 68030 and 68040 models than there are 68000 or 68020 ones.

Apparently it's been done before: http://www.keacher.com/1216/how-i-introduced-a-27-year-old-computer-to-the-web/ You might consider starting with their solution and adding on, since it looks like they probably had to strip out a lot of stuff to make it usable for that Mac.

 

Scott Squires

Well-known member
I'm not any kind of expert, but I've done enough web development and graphics programming to guess what the challenges are in making a layout engine.
You hit a number of the major reasons why a modern browser for vintage machines isn't possible. I almost posted a rant when I read "Arranging pictures, showing text, and loading pictures/text from other sites alone really shouldn't be that expensive." Modern web rendering uses massive amounts of computation.

I really don't understand this desire to use antique computers for modern tasks. Trying to use a computer for a task that is 1000+ times out of its league is frustrating and a waste of time. Email would be fine, and can be made to work with the right proxy. Viewing modern web sites is just crazy. That is like suggesting that Grand Theft Auto V should be ported to run on a SNES.

What would be neat to see is if people created a few sites that were designed for 1990s browsers. Then people using their 1990s computers could visit those sites and get an experience contemporary to the time.

 

Nathan

Well-known member
You hit a number of the major reasons why a modern browser for vintage machines isn't possible. I almost posted a rant when I read "Arranging pictures, showing text, and loading pictures/text from other sites alone really shouldn't be that expensive." Modern web rendering uses massive amounts of computation.
Does it need to though? That's the question that seems worth asking. I'm sure there are lots of people on here who'd trade a little flexibility in layout/rendering for the site actually working on their old machine. The point is that the actual things most web sites are doing on a fundamental level should be manageable on a much older machine. Wasting available computational resources is always much easier than being careful to minimize unnecessary usage.

 

Gorgonops

Moderator
Staff member
Back in the late 1990s, when the Pentium III came out, Intel ran some ads crowing some nonsense about how the Pentium III was going to transform your Internet experience because, well, reasons. For a long time my jaded response was that clearly the way that was going to work was that the increased power of the PIII inside your computer would be smart enough to just make up Internet content for you instead of having to wait for it to trickle through your modem. (Since, really, at the time the limiting factor for rendering web pages was mostly the time it took to download them, even on relatively low-end computers, i.e., something like a 133 MHz Pentium or even a fast 486.)

What I've come to realize lately, though, is effectively what I was saying there is true now: your CPU *is* essentially "making up" a lot of what you're seeing on the web today, in the sense that it's showing you programmatically-generated rather than static content. The reason a browser tab rendering a modern dynamic website takes up literally hundreds of megabytes of RAM and sucks all the CPU it can get is because there are so many active, moving, computing elements churning away in both the foreground and background that the "server-side" alternative to it would essentially be to have to push a live interactive video stream to your computer. Strictly speaking doing so might well be less CPU taxing than running all that javascript, but from a bandwidth (and to a lesser degree, server-side computation) standpoint that would be unworkable for the sort of Internet connections most people have.

Yes, you can certainly argue that on some level we don't *need* all that "fluff" (some of which we actively don't want, like all the behavior tracking and animated ads, etc.), that Facebook would be just as (if not more) useful to you if all it needed was a statically rendered HTML 3.2-compliant page with standard HTML <form> tags (and, heck, let's say <table> tags instead of CSS), but there simply isn't any economic reason for the industry to give that to you. People are drawn to flashy things, companies want greater control of their IP and the presentation thereof, advertisers want deeply embedded tools for tracking and monetization... the deck is pretty thoroughly stacked here.

 

Scott Squires

Well-known member
Does it need to though?
Absolutely.

The point is that the actual things most web sites are doing on a fundamental level should be manageable on a much older machine.
I think the problem is that you don't understand what most websites are doing on a fundamental level. Thousands of man-years have been spent optimizing web browsers for performance. The only way you will make them faster is by ripping out features wholesale. When you start doing that, websites will stop working at all.

You can't get websites to work on '90s computers by changing the browser. You have to change the content.

Which brings us to an alternative to creating sites designed for '90s computers. You could transform a page into content that a '90s computer could display. I didn't mention it before because I think creating '90s sites would be better long-term. If you try to make your transformation code generic enough to work on many sites, it will be really bad at it and hardly ever work in a satisfactory way. If you make transformation code for specific sites, then you have to keep it up to date whenever that site changes the way it functions. Ain't nobody got time for that.
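To make the "generic transformation" idea concrete, here's a minimal sketch of what such a downgrader might look like: strip a modern page down to the small set of tags a 1990s browser understands, discarding scripts, styles, and modern attributes. This is my own illustration using only Python's standard library; the tag whitelist is an assumption, not a spec, and a real transformer would need far more (CSS-to-layout handling, image recompression, charset fixes).

```python
# Hypothetical sketch of a generic page "downgrader" for vintage browsers.
# Keeps a whitelist of 1990s-era tags, drops everything else (stdlib only).
from html.parser import HTMLParser

# Assumed whitelist of tags an HTML 3.2-era browser can render.
KEEP = {"html", "head", "title", "body", "p", "a", "img", "br", "hr",
        "h1", "h2", "h3", "ul", "ol", "li", "table", "tr", "td", "th",
        "b", "i", "pre", "form", "input", "blockquote"}
# Elements whose contents should be discarded entirely.
DROP_CONTENT = {"script", "style", "noscript", "svg"}

class Downgrader(HTMLParser):
    def __init__(self):
        super().__init__(convert_charrefs=True)
        self.out = []
        self.skip_depth = 0  # > 0 while inside a dropped element

    def handle_starttag(self, tag, attrs):
        if tag in DROP_CONTENT:
            self.skip_depth += 1
        elif self.skip_depth == 0 and tag in KEEP:
            # Keep only the attributes old browsers care about.
            kept = [(k, v) for k, v in attrs
                    if k in ("href", "src", "alt", "name", "type",
                             "value", "action", "method")]
            attr_text = "".join(f' {k}="{v}"' for k, v in kept)
            self.out.append(f"<{tag}{attr_text}>")

    def handle_endtag(self, tag):
        if tag in DROP_CONTENT:
            self.skip_depth = max(0, self.skip_depth - 1)
        elif self.skip_depth == 0 and tag in KEEP:
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        if self.skip_depth == 0:
            self.out.append(data)

def downgrade(html: str) -> str:
    """Return a stripped-down version of the page for a vintage browser."""
    p = Downgrader()
    p.feed(html)
    return "".join(p.out)
```

As the post says, the catch is exactly this whitelisting: a modern page whose content lives inside JavaScript-rendered elements comes out empty, which is why per-site transformers (or purpose-built '90s sites) tend to win.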

 

denodster

Active member
You can't get websites to work on '90s computers by changing the browser. You have to change the content.
Personally I'm fine with the whole crappy HTML-only experience I get in Netscape 4; I just would like to be able to load sites that are now HTTPS-only, such as 68kMLA. I think our machines could handle HTTPS if there were a 68k browser out there that knew how to do it on the modern web.

 

james_w

Well-known member
I'm currently working on building the first release of Mozilla for PPC Mac. Based on what I'm looking at, it had a build target of a PPC Mac running Mac OS 7.6. The 68k C++ seems to be present in the source, as the maintainers were asking for someone to devise a 68k build of Mozilla, but no one ever stepped up. My thinking is that if I can get it to build on a PPC Mac (in SheepShaver) I can start to strip down features until I'm left with just a bare-bones browser. At which point I would try to come up with a build process for 68k Macs. From there we could implement modern SSL support if possible, or at least patch up a few web standards.

I've been documenting my efforts here: http://andrew.colchagoff.com/netscape I'm currently blocked by a lack of a single build tool, but I think I've got a CD on its way from eBay that will supply the tool I need.
Loving this idea - I'll be following closely.

I have mountains of Apple Developer CDs but sadly they're in storage until my house is finished being renovated - sometime in the Autumn.

In the meantime, good luck!

Edit: oh and if you haven't already - please start a thread that I can set notifications on :)

 

mactjaap

Well-known member
Cool, this building of Mozilla for PPC ... and 68k.

I will follow news about this development closely. Maybe a separate post?

 

Unknown_K

Well-known member
I think getting 68k machines online is a lost cause outside of FTP. I am happy we have TenFourFox for late-generation PPC machines.

 

Nathan

Well-known member
I don't see why it should be. There's no reason they shouldn't be able to handle/open some pages according to a given machine's capabilities. Pages with text only or with a handful of images should be quite doable. http://textfiles.com/ for example really should be loadable even on 68k machines. TenFourFox is great, that's true.

There's also SSH potentially w/MacSSH, although like many programs it's probably somewhat out of date. 

 

Unknown_K

Well-known member
A simple website made in 1999 would work on a 68k machine, but those sites are long gone. Heck, browsing on a much newer single-core x86 machine sucks as well.

I use my old gear for what it was good at, and leave newer machines for modern tasks.

 

VMSZealot

Well-known member
I do SSH on my SE/30 (can't remember which program; I'll let you know later).

With regard to working email on an old Mac, I'm working on it (first 'sprint'). Other solutions exist, but mine will be the easiest and most flexible (or what's the point!?). As to why, well the old Macs, and particularly the compact Macs, are particularly well suited to the way I work. I'm not great at multitasking, and these old machines force the user to focus. Well, that and I'm too cheap to buy a new version of Fontographer - so I still use my old Macs anyway.

For internet, I see no reason why an old Mac - even one with a 68k CPU - shouldn't browse the modern web (other than for tasks like internet banking, of course!). A proxy server can handle HTTPS and forward HTTP to the requesting Mac, can convert images to a lighter, easier-to-process format, and maybe even ensure that a lighter CSS is specified (I'll have to think some more on this). Obviously, JavaScript (where available) should be eschewed with great prejudice.
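The HTTPS-handling half of that proxy idea can be sketched in a few dozen lines. This is my own minimal illustration, not anyone's actual project: the vintage Mac speaks plain HTTP to the proxy, and the proxy re-fetches the resource over modern HTTPS with its up-to-date TLS stack. The port number and names are assumptions, and image recompression/CSS lightening are omitted.

```python
# Minimal sketch of an HTTPS-terminating proxy for a vintage browser.
# The old machine is configured to use this host as its plain-HTTP proxy;
# the proxy upgrades each request to HTTPS and relays the response body.
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def upgrade_url(path: str) -> str:
    """Rewrite the proxied request's http:// URL to https://.

    A browser configured to use a proxy sends the full URL in the
    request line, e.g. "GET http://example.com/ HTTP/1.0".
    """
    return path.replace("http://", "https://", 1)

class DowngradeProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        target = upgrade_url(self.path)
        try:
            with urllib.request.urlopen(target, timeout=30) as resp:
                body = resp.read()
                self.send_response(200)
                self.send_header("Content-Type",
                                 resp.headers.get("Content-Type", "text/html"))
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
        except Exception as exc:
            self.send_error(502, f"Upstream fetch failed: {exc}")

if __name__ == "__main__":
    # Point the old Mac's browser at this host, port 8080, as its HTTP proxy.
    HTTPServer(("0.0.0.0", 8080), DowngradeProxy).serve_forever()
```

A real deployment would also need to rewrite any https:// links in returned HTML back to http:// so the old browser keeps routing through the proxy.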

I'm also pondering putting up old sites that I have archived - just for the fun of having the 1990s online again!

 

Nathan

Well-known member
Cool. I don't have anything that old, and as my 6100 is out of commission I don't have anything really old to try at the moment.

On the matter of email, something that uses POP and SMTP is probably the best choice, although many servers will require SSL/TLS (Gmail does, apparently) unless you run a separate older/custom server and give yourself your own email address (such a system would probably become a spammer target if it looks reasonably legitimate to other mail servers).

Yeah, I agree. Anything that's text and pictures and essentially read-only should be browsable, and being able to read/post on forums would be nice. Really what you want with JavaScript is a mechanism for ditching anything relying on non-standard JS or libraries like jQuery so that basic JavaScript is still supported. Someone really should port Dillo (https://www.dillo.org/) to the older Macs, if that's even doable, and maybe add JS support (http://duktape.org/) somehow.

Throwing up some old/simpler sites would be neat to look at at least.

Browsing the web is no more a 'modern task' (the first web browser appeared in 1990) than playing audio/video, recording music, word processing, sending email (in use since the 1970s), playing games, or really anything else. People mostly do exactly the same things they've been doing since the late '80s and early '90s, just with more powerful hardware and fancier software.

 

VMSZealot

Well-known member
The code for email is now far advanced, although currently it only supports IMAPS for incoming and serves up POP to the vintage machine. Time permitting, I should have it going by the end of the month (I don't have much time to work on it).

Outgoing, which I haven't done much on yet, might be SMTP all the way or SMTP to IMAP. It depends on what works best for iCloud mail and Gmail.
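The IMAPS-in, POP-out bridge described above can be sketched roughly like this. This is my own illustration of the general shape, not the poster's actual code: TLS is terminated on the modern side with `imaplib`, and the messages are then handed to the vintage client over plaintext POP3. The hostname and credentials are placeholders, and the POP3 socket-serving loop is omitted; only one POP3 response builder is shown.

```python
# Sketch of an IMAPS-to-POP3 mail bridge for a vintage Mac client.
# Incoming half: fetch mail from a modern provider over IMAPS (TLS is
# handled here, not on the old machine). Serving half: build POP3
# (RFC 1939) replies for a local plaintext front end.
import imaplib

def fetch_inbox(host: str, user: str, password: str) -> list[bytes]:
    """Return the raw RFC 822 bytes of every message in INBOX."""
    messages = []
    with imaplib.IMAP4_SSL(host) as imap:  # TLS terminated on the bridge
        imap.login(user, password)
        imap.select("INBOX", readonly=True)
        status, data = imap.search(None, "ALL")
        if status == "OK":
            for num in data[0].split():
                status, parts = imap.fetch(num, "(RFC822)")
                if status == "OK":
                    messages.append(parts[0][1])  # raw message bytes
    return messages

def pop3_list_response(messages: list[bytes]) -> bytes:
    """Build the multi-line reply to a POP3 LIST command (RFC 1939)."""
    lines = [b"+OK %d messages" % len(messages)]
    for i, msg in enumerate(messages, start=1):
        lines.append(b"%d %d" % (i, len(msg)))  # message number and size
    lines.append(b".")  # multi-line POP3 replies end with a lone dot
    return b"\r\n".join(lines) + b"\r\n"
```

The appeal of this split is exactly what the post describes: all the modern crypto lives on the bridge machine, while the vintage Mac only ever sees a protocol its 1990s mail client already speaks.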

Come to think of it, I could probably synchronise Claris Organiser to iCloud with some crafty code - make 68k Mac iCloud compatible! But I'm getting ahead of myself…

 

Gorgonops

Moderator
Staff member
Browsing the web is no more a 'modern task' (first web browser in 1990) than playing audio/video, recording music, word processing, sending email (email in use since 1970s), playing games or really anything else. People mostly do exactly the same things they've been doing since the late 80s and early 90s just with more powerful hardware and fancier software.
I'm not clear what your point is here. Yes, these are all "tasks" that we've been doing forever, but in every case the functionality available to the user inside the software to do these things has *massively* increased. The first web browser from 1990 didn't even have the ability to display images inline (Mosaic, circa 1993, was the first there). Heck, it was several years before HTML (and the HTTP protocol) even contained a mechanism for submitting <form> data, i.e., the web was by definition *completely* non-interactive(*) in 1990.

(* "interactive" defined in this case as "being able to process input from the user". Certainly there are broader definitions of interactive software that would say that simply navigating hypertext qualifies because the user is able to dynamically choose what they're viewing, but, well... no.)

In short, your argument seems to boil down to saying that because Halo is a video game and people have been playing "video games" since the 1970s, you should be able to run Halo on your Atari 2600, and that makes *zero* sense on anything but an irrational, emotional, "cry for help" sort of level.

 

Nathan

Well-known member
I'm not clear what your point is here. Yes, these are all "tasks" that we've been doing forever, but in every case the functionality available to the user inside the software to do these things has *massively* increased. The first web browser from 1990 didn't even have the ability to display images inline (Mosaic, circa 1993, was the first there). Heck, it was several years before HTML (and the HTTP protocol) even contained a mechanism for submitting <form> data, i.e., the web was by definition *completely* non-interactive(*) in 1990.

(* "interactive" defined in this case as "being able to process input from the user". Certainly there are broader definitions of interactive software that would say that simply navigating hypertext qualifies because the user is able to dynamically choose what they're viewing, but, well... no.)

In short, your argument seems to boil down to saying that because Halo is a video game and people have been playing "video games" since the 1970s, you should be able to run Halo on your Atari 2600, and that makes *zero* sense on anything but an irrational, emotional, "cry for help" sort of level.
Clearly. You're missing it by miles, hundreds of miles at least, and being rather snide about it too.

What I'm getting at is that many Macs could access the internet in the 1990s and maybe even the early 2000s, and that the basic content has not intrinsically changed. It is still comprised mostly of text and images which, at a basic level, these machines are capable of rendering. Drawing text and pictures has not become any harder. Even so-called "interactive" pages are hardly a new thing; even if older browsers don't support the newer mechanisms/approaches, they did do those sorts of things then, AFAIK.

Frankly, I could probably have used my 6100 running 7.6 with Netscape 4 in 2006 to log in to Gmail, at least with the basic HTML interface, if only it weren't for the issue of expired security certificates and the basic HTTPS/SSL problem.

MacWeb (https://en.wikipedia.org/wiki/MacWeb) from 1996 supposedly supported web forms.

Ultimately it should be possible to have a browser on at least some of these machines that can show the text and pictures it can handle and skip the things it can't. As it is, you may not be able to load anything at all, even the text or pictures that should be within its capacity.

This is not at all like expecting Halo to run on an Atari 2600; it's far more akin to being surprised that an "enhanced" version of an old game won't run on the original hardware at all, even though all it does is bump the asset resolution/detail up a little, or being frustrated that some website totally refuses to work on a browser a couple of versions behind.

It's about the basic numbskullery that a website today, comprised of essentially the same elements as one from the 1990s, potentially won't work on a browser from then simply because there aren't any up-to-date security certificates, or because it can't load a video, or because someone has gone way, way overboard with JS and, more importantly, with some trendy library that won't work in every browser now, much less in one that's 10 or more years old.

 