
Comparing 1990's Computing Technology

Compare these two boards:

- both are Moto '030

- both designed in 1990, built in 1991

Look at the sizing difference. I realize the Amigas had a lot more custom chips, but still, you can clearly see CBM (Commodore Business Machines) was nowhere near the tech level of Apple even in 1990.

The CBM boards are mostly single layer, with an internal layer for voltage.


 

 
A Classic II makes a pretty lousy basis for comparison to an Amiga 3000; the Classic II is a *far* more minimalist machine, so even if they were produced at the same "technological level" the Amiga would be bigger. Also: the Amiga probably counts as a 1989 design; it was introduced for sale in June 1990, and the 1990 date on the board probably reflects the last board revision before production. The Classic II wasn't for sale until *October 1991*, so realistically there might be more than two years separating the design stages of these systems. (It's possible Apple and Commodore had different criteria for what date got silk-screened on the motherboard.)

A better comparison would be a system like, I dunno, the Mac IIci. Compared to that system the Commodore board is still much "busier", but you need to factor in that the Amiga:

A: Is still a more complicated machine, with some architectural oddities like the distinction between "chip" and "fast" RAM. (The two types are even in physically different packages on the Amiga board; I'm curious why it uses 44256s for fast RAM, but there may be some technical justification.) There's a small allocation sketch at the end of this post showing how software actually sees that split.

B: The Amiga has *many* socketed chips on it. The IIci doesn't have any. Sockets add bulk.

The gross size of the motherboards is roughly the same, so if you factor in how much more capable (or at least, how much more complex) the Amiga is, the designs look roughly "equally competent". Looking at the mainboards for other 1990 introductions from Apple, like the IIsi and IIfx, really about the most you can say is that Apple was a year or two in front of Commodore in really embracing surface mount over through-hole. If you look at Commodore's 1992 models, like the 4000/1200/600, they're all surface mount. The whole industry was switching over around the 1988-1991 timeframe, so... nothing to see here?
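
For what it's worth, here's a minimal sketch (mine, not anything from this thread) of how that chip/fast split looks from the software side, assuming the standard exec.library AllocMem()/FreeMem() calls and the MEMF_* flags from <exec/memory.h>:

```c
/* Illustrative only: requesting chip RAM vs fast RAM under AmigaOS. */
#include <exec/types.h>
#include <exec/memory.h>
#include <proto/exec.h>

int main(void)
{
    /* Anything the custom chips must reach (bitplanes, audio samples,
     * floppy DMA buffers) has to be requested as chip RAM explicitly. */
    UBYTE *bitplane = AllocMem(320UL * 256 / 8, MEMF_CHIP | MEMF_CLEAR);

    /* Ordinary program data can live in fast RAM, which only the CPU
     * sees, so chipset DMA never steals cycles from it.               */
    UBYTE *scratch = AllocMem(64UL * 1024, MEMF_FAST | MEMF_CLEAR);

    if (scratch)  FreeMem(scratch, 64UL * 1024);
    if (bitplane) FreeMem(bitplane, 320UL * 256 / 8);
    return 0;
}
```

(In practice you'd usually ask for MEMF_ANY for general data so the program still runs on a chip-RAM-only machine, which is exactly the kind of wrinkle a Mac programmer never had to think about.)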

 
Yeah, you would have to compare the Macintosh II(*) to the Amiga 3000. If you want a better comparison to the Classic, look at the Amiga 600 system board.

On the whole you would have a hard time convincing me that Apple was ahead of Commodore in 1992; Amigas were what Macs should have been in 1992.

 
There is no comparison. Comparing a Classic II to an Amiga 4000 is like comparing a 1980s VW Rabbit to a 1980s Lamborghini Countach. There is no comparison between the two other than being cars, or in this case, '030 computers. Apple failed where Commodore succeeded in what a 32-bit data/address bus should have been able to do.

 
But you kinda miss my point:

In 1990 Apple COULD build the Classic II MB. The Amiga 3000 was the pinnacle of CBM's technique; CBM could not have built anything close to that compact form.

 
But you kinda miss my point:

In 1990 Apple COULD build the Classic II MB. The Amiga 3000 was the pinnacle of CBM's technique; CBM could not have built anything close to that compact form.
You are comparing a desktop pro-level product to a consumer compact.

You are missing a few Amigas in your comparison, like the compact Amiga 1200 & 600. Even the Amiga 500 has a small system board. Also, the Amiga actually had a sound chip and graphics, oh, and color.

By the 1990s Apple was already behind everyone other than PC clones.

 
The Amiga hardware did many things that m68k Macs couldn't.

First, the memory was socketed ZIP because you could get static column memory which, when paired with the m68030, was faster than regular memory.

Second, the video hardware could do many, many things which the Mac video hardware couldn't (think Newtek Video Toaster with the video slot).

Third, the SCSI was top of the line. It could do real 32 bit DMA (it had special circuitry to interface it as fast as possible to the 32 bit memory bus) and could even do synchronous mode. The Amiga 3000 was the machine used for benchmarking SCSI in that time. Real world SCSI transfer with the m68030 could be in excess of 5 MB/sec. No m68k Mac could ever come close.

Fourth, they had their own 32 bit expansion bus which was capable of real DMA, bus mastering and memory expansion. NuBus was not quite as fast, although real-world throughput for both Zorro-III and NuBus was about the same.
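
And on the Zorro point, a rough sketch (my own, with the usual hedging) of what autoconfig buys you in software, assuming expansion.library's FindConfigDev() and the ConfigDev structure from <libraries/configvars.h>; error handling kept to a minimum:

```c
/* Illustrative only: list the Zorro boards that autoconfigured at boot. */
#include <exec/types.h>
#include <libraries/configvars.h>
#include <proto/exec.h>
#include <proto/expansion.h>
#include <stdio.h>

struct Library *ExpansionBase;

int main(void)
{
    struct ConfigDev *cd = NULL;

    ExpansionBase = OpenLibrary("expansion.library", 0);
    if (!ExpansionBase) return 20;

    /* -1 acts as a wildcard for both manufacturer and product. */
    while ((cd = FindConfigDev(cd, -1, -1)) != NULL)
        printf("manufacturer %d, product %d: %lu bytes at %p\n",
               (int)cd->cd_Rom.er_Manufacturer, (int)cd->cd_Rom.er_Product,
               (unsigned long)cd->cd_BoardSize, cd->cd_BoardAddr);

    CloseLibrary(ExpansionBase);
    return 0;
}
```

No jumpers, no address switches: every board told the OS where it was and how big it was, which is part of what made third-party RAM and DMA cards so painless on that bus.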

 
You are comparing a desktop pro-level product to a consumer compact.

You are missing a few Amigas in your comparison, like the compact Amiga 1200 & 600. Even the Amiga 500 has a small system board. Also, the Amiga actually had a sound chip and graphics, oh, and color.

By the 1990s Apple was already behind everyone other than PC clones.

The 600 and 1200 came later, as AGA machines. This was the ECS time-frame.

Perhaps a comparison between the A500 (still a 68000) and the Classic is more apt. Or, as people have suggested, the CI and the A3000. But an Amiga 500 MB and a Classic (not Classic II) would still show what I am talking about. Apple was just many generations ahead of CBM. Maybe it's a West Coast vs. East Coast thing.

I happen to have a roasted CI here; I'll remove the MB and photo it next to the A3000.

I'm well aware of _WHAT_ the Amiga could do in 1990, as compared to the Mac. I worked in a shop that sold them and the Toaster. The OS is a whole other story, and I'm aware of what the AmigaOS was capable of in 1990.

Besides all this, you can look at the composition of the board and see: the A3000 board is more like something you'd see 10 years before, while the Classic II is nothing like, say, an SE board. The A3000 resembles an SE MB.

 
By the time Commodore came out with the A3000 in the 1990s, it was undergoing Chapter 11 in the USA and dividing up the company in Europe. It was their mistake getting rid of Jack Tramiel, who held the company together and ran it with an iron fist. Remember, when Commodore got rid of Jack, he scooped up a dying company named Atari and brought it back to life just to get even with the company that he started and grew but that bit him in the ass. If he had been more serious about it, the Atari ST would have been a contender against the Amiga. He had already beaten the Mac with the ST, the first under-$500 (and later $300) machine that could do nearly everything a Mac could do.

In short, if Jack Tramiel had stayed with Commodore, it would have had multi-layered boards like the Mac and Atari ST. But Commodore had to go cheap in order to survive the 1990s, and without Tramiel. The Amiga and the ST had sound and graphics chips the Mac did not. They both had memory footprint space that the Mac wished it had. Both the Amiga and the ST at the time could emulate a Mac hands down while running their own software at the same time, at half the price of a Mac! The Mac could not do that at double the price!

So: better sound, better graphics, faster I/O, a bigger memory footprint, on the same CPU at the same speed without crippling it, at half the price. Which is the better machine?

Again, if Jack Tramiel had stayed with Commodore, that A3000 board would be 75% smaller, with multi-layered boards. He was Jobs with a better understanding of the technology. But he was not there by the time the A3000 came out, and Commodore was a dying company by then, so they had to build cheap and save money by eliminating R&D. With that in mind, now think: which one is "Better"?

There is no better unless you want to contrast and compare. But compare what? Apple is better because ______. Commodore is better because _______. Atari is better because ________. Until you can fill in the blanks, this is just talk, and talk is cheap.

 
Talk is cheap, here is mine: :)

1. Apple is better because it runs apps that are useful (for me), + iPhone

2. Commodore is better because the OS kicked butt and was amazing

3. Atari was better because I still love the 800XL

 
MassiveRobot - ROTFLMAO!!! I got to give you that one! You are so Right!

 
Perhaps a comparison between the A500 (still a 68000) and the Classic is more apt. Or, as people have suggested, the CI and the A3000. But an Amiga 500 MB and a Classic (not Classic II) would still show what I am talking about. Apple was just many generations ahead of CBM. Maybe it's a West Coast vs. East Coast thing.
 
Uhm. No. The Amiga 500 is a 1987 design versus 1990, so you're already comparing Apples to oranges again. Compare an A500 motherboard with a Macintosh SE and your thesis pretty much falls apart. About the most you can say is the Mac SE's board is more compact but, likewise, it *mostly* does less. (*Video and sound systems are far less sophisticated on the SE, but it does include a SCSI chip, while a mass storage controller was an add-on for the A500.) Both consist of a handful of LSI chips, mostly through-hole, tied together with DIP-package 74xx logic and PALs. About the only thread you have to hold onto here is that Commodore didn't bother doing a "500CR" switching from through-hole to surface mount in 1990; going to surface mount waited until 1992.
 
Besides all this, you can look at the composition of the board and see: the A3000 board is more like something you'd see 10 years before, while the Classic II is nothing like, say, an SE board. The A3000 resembles an SE MB.

And, yet again, the question to be answered is: so what? Surface mount certainly has its advantages over DIP; it tends to be more reliable and allows somewhat more sophisticated automated assembly techniques, but a computer built either way can work equally well. Using a less sophisticated assembly technique does *not* make the A3000 a less sophisticated computer than a contemporary Mac. The best explanation for why the Amiga stuck with the less modern assembly techniques undoubtedly comes down to financials: Commodore was seriously struggling by 1990 and probably didn't have the money to retool their assembly lines, while Macs sold in much larger volume with *huge* profit margins, making it trivial for Apple to push some of those profits back into constantly redesigning the same old computer to save a few cents here and there during manufacturing. (And make no mistake, there was absolutely *nothing* innovative about the Classic/Classic II other than how far they pushed the envelope in the field of cynical cost-cutting.)
 
Anyway, I guess if kicking a dead dog makes you feel better, have at it, but I'm still totally missing the point.
 
Commodore:

  • Understood the value of hardware acceleration for multimedia right from the start.
  • Embraced third-party hardware expansion from the get-go.
  • Had true preemptive multitasking and a well-documented high-efficiency kernel.
  • Had only basic memory management, with a tendency to fragment.
  • Provided only a rudimentary UI library, leading to Unix-like inconsistency and competing standards.
  • Gave us Lemmings.


Apple:

  • Did almost as well with multimedia in software.
  • Opposed hardware expansion at first and never really warmed to it.
  • Had only cooperative multitasking (see the event-loop sketch after these lists).
  • Had an inventive Resource Manager used pervasively throughout the system.
  • Placed a strong emphasis on good UI design and consistency across applications.
  • Gave us Myst.


Atari:

  • Hahaha, get real.
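
Since the cooperative-vs-preemptive point keeps coming up, here's a bare-bones sketch (mine, not from anyone's post) of the Mac side of it, assuming the classic Toolbox WaitNextEvent() call from <Events.h>. The rest of the system only gets a turn while an application is parked inside this loop:

```c
/* Illustrative only: a classic Mac Toolbox main event loop.           */
/* Other programs run only while this one waits in WaitNextEvent();    */
/* nothing preempts an application that stays busy between calls.      */
#include <Events.h>

static Boolean gDone = false;

void EventLoop(void)
{
    EventRecord event;

    while (!gDone) {
        if (WaitNextEvent(everyEvent, &event, 10, nil)) {
            switch (event.what) {
            case mouseDown: /* dispatch the click     */ break;
            case keyDown:   /* dispatch the keystroke */ break;
            case updateEvt: /* redraw the window      */ break;
            default:                                     break;
            }
        }
        /* Everything between calls runs to completion; no scheduler steps in. */
    }
}
```

On the Amiga, exec's scheduler takes the CPU away from a task whether it cooperates or not, which is why one busy-looping program didn't lock out everything else the way it did on a classic Mac.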

 
And, yet again, the question to be answered is: so what?
I so agree.



Atari:

  • Hahaha, get real.
Atari was the first to have 256 colors in 1979 and boosted the video game industry to a new level of interactivity, standardizing game I/O conventions still used today. Computers with 256-color capabilities did not come out until at least 1986 with the PC EGA graphics card, and just as late on Macs.

I know; I did video games as First Star's #1 Commodore programmer from '82 to '87, but I had to work on all three: Apple, Atari and Commodore. I also wrote a standardized joystick routine which made game programming easier. Instead of writing the game around the joystick, my routine allowed the game to be written first and the joystick I/O to be added later. It also shrank the code by kilobytes, back when most systems gave you less than 32K to work with, max.
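
Not your actual routine, obviously, but for anyone curious, the decoupling idea reads roughly like this in a few lines of C; read_joystick_hw() is a hypothetical stand-in for whatever machine-specific register read gets supplied per platform:

```c
/* Illustrative sketch only: keep the game logic ignorant of the joystick
 * hardware so the machine-specific read can be dropped in later for each
 * target (C64, Atari, Apple II, ...).                                    */

#define JOY_UP    0x01
#define JOY_DOWN  0x02
#define JOY_LEFT  0x04
#define JOY_RIGHT 0x08
#define JOY_FIRE  0x10

/* Hypothetical per-machine hook, written last, once per platform. */
extern unsigned char read_joystick_hw(void);

/* The only input call the game itself ever makes. */
static unsigned char poll_input(void)
{
    return read_joystick_hw();
}

void move_player(int *x, int *y)
{
    unsigned char joy = poll_input();

    if (joy & JOY_LEFT)  (*x)--;
    if (joy & JOY_RIGHT) (*x)++;
    if (joy & JOY_UP)    (*y)--;
    if (joy & JOY_DOWN)  (*y)++;
}
```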

 
The 600 and 1200 came later, as AGA machines. This was the ECS time-frame.

Perhaps a comparison between the A500 (still a 68000) and the Classic is more apt. Or, as people have suggested, the CI and the A3000. But an Amiga 500 MB and a Classic (not Classic II) would still show what I am talking about. Apple was just many generations ahead of CBM. Maybe it's a West Coast vs. East Coast thing.

I happen to have a roasted CI here; I'll remove the MB and photo it next to the A3000.

I'm well aware of _WHAT_ the Amiga could do in 1990, as compared to the Mac. I worked in a shop that sold them and the Toaster. The OS is a whole other story, and I'm aware of what the AmigaOS was capable of in 1990.

Besides all this, you can look at the composition of the board and see: the A3000 board is more like something you'd see 10 years before, while the Classic II is nothing like, say, an SE board. The A3000 resembles an SE MB.


The Amiga 600 came out 8 months after the Classic II. While the Amiga 3000 was 2 years before the Classic II, you are comparing to the wrong generation of Amigas. Also, my point about the Amiga's performance had to do with the fact that most of that extra functionality was in custom chips, so yes, it will take more room.

 
Atari was the first to have 256 colors in 1979 and boosted the video game industry to a new level of interactivity, standardizing game I/O conventions still used today. Computers with 256-color capabilities did not come out until at least 1986 with the PC EGA graphics card, and just as late on Macs.
Technically the Atari 8-bits had a 128 color palette, not 256, but nonetheless they did have pretty much the best color palette on the home computer market for several years. While I respect that the C64 edges the Atari 800 in several technical respects (the sprite hardware is more flexible, the sound chip is superior... although the Atari had a faster CPU and the ANTIC/GTIA combination for video had some interesting tricks up its sleeve, so it's by no means a walkover), I've always thought that the Atari 800 ports of games tend to be "prettier" than the C64 versions. But, yeah, the first Apple computer to be as "colorful" as the 1979 Atari 800 was 1986's Apple IIgs. (Technically Commodore's "264"/Plus/4 had a similar palette to the Atari 800, but it was an utter failure when it came to market in 1984.)

Atari:

  • Hahaha, get real.
MIDI. STs were really, really good for that. Some anal-retentive synth freaks use them to this day.

The other thing the Atari ST was good for was pointing out just how comically overpriced Apple's B&W Macs were ($799 list price for a 512k mono system with a single floppy vs. *drum roll*... $2795 for a Fat Mac).

One could sort of make the case that Jack Tramiel ultimately managed to torpedo *both* Commodore and Atari in the end by making each company a victim of its own success at cutting prices (and profits) down to the naked bone. If you concentrate too hard on marketing your products as "cheap alternatives" to an established premium competitor, you run the risk of simply having your products perceived as cheap knockoffs regardless of whether they may actually be technically superior. (In at least some regards, anyway.) The other problem, of course, is that if you don't make enough profit to plow some money back into R&D you're eventually going to hit a wall; it's interesting that both Atari and Commodore basically managed about 2 1/2 generations of their platforms before they were crushed. That is, they both started with the initial, exciting 16-bit model, followed by a "workstation-class" 68030 machine, and then... a few years of stagnation followed by a last-gasp run of machines that were cheaper rehashes/remixes of the previous two generations. (Yes, we could have a huge argument about whether the Amiga 4000 was a cheap rehash of the 3000 or a big leap forward at this point, but it's inarguable that the standard configuration actually included an *inferior* CPU to the 3000. Same goes for the Atari Falcon vs. the TT.) Motorola giving up on the 680x0 series of CPUs sure didn't help; Apple had the resources to follow in the footsteps of the UNIX vendors and jump to a RISC platform, while Atari and Commodore were just screwed. It's at *that* point that Commodore lost the plot. And let's be clear on this: it's not because they had bad engineers, it's all about bad management.

 
Ok, so I'm comparing apples to oranges. It's still at least interesting to look at, to examine what was going on in 1990 and see two completely different paths to the same result: a home computer built for the masses.

 