
Mac+ OS floppy request

Bunsen

Admin-Witchfinder-General
So... how bout that local sports team huh?

Glad to hear you got your floppies, Interceptor2. Good luck finding Alchemy.

And Ensoniq, you say? 1992? I'm guessing ... original EPS? Or was the 16+ out by then?

 

tomlee59

Well-known member
I don't read the term "dual opposed heads" as 4 heads at all. I read it as two heads on opposite sides, which I believe is how the Twiggy drive operated and the reason it had two cutouts on opposite sides of the disk from each other. So I would propose it is a matter of semantic inference.
Yes -- thanks, Mac128, for correctly interpreting what I wrote, despite my unintended efforts at obfuscation. :)

Those drives were certainly odd-looking, as were the floppies they used. I had a chance to buy a box of broken Twiggy drives back in the late 1980s. In a rare moment of self-restraint, I passed up the opportunity. Whenever I see how much these things sell for on eBay, I kick myself.

 

Interceptor2

Active member
The Ensoniq Mirage used single-sided floppies in a custom 400k format, which can be read using appropriate software under DOS/Windows; again, I believe it uses a standard WD floppy disk controller chip, something like a WD1550. The Mirage was released in 1984.
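
If anyone fancies poking at a dump of one of those disks from the PC side, slicing a raw image into tracks is simple enough. A minimal Python sketch; the geometry constants and the filename are placeholders, not the verified Mirage layout, so adjust to suit:

```python
# Slice a raw floppy dump into tracks. The geometry values below are
# placeholders, NOT the verified Mirage layout - adjust to suit.
TRACKS = 80            # assumed track count
TRACK_BYTES = 5120     # assumed bytes per track (80 x 5120 = 400k)

def read_tracks(path):
    """Yield (track_number, data) tuples from a raw disk image."""
    with open(path, "rb") as f:
        for t in range(TRACKS):
            data = f.read(TRACK_BYTES)
            if len(data) < TRACK_BYTES:
                break  # short image: stop at the last full track
            yield t, data

for t, data in read_tracks("mirage_dump.img"):  # hypothetical filename
    print(f"track {t:2d}: {len(data)} bytes")
```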

The EPS "Classic" was relased 1988 then the EPS16+ in 1990, it just so happens mine is a 1992 version. BTW it uses a 68000 + 68440

I know only specific floppy drives work with Ensoniq kit, and as far as I know all of those also work on DOS/Windows machines.

Methinks maybe all the Motorola crowd were in cahoots, sharing and helping each other in developing what must have been cutting-edge tech.

I like Mac128's comment: "..I would point out that Apple is not the only computer company that spent years pounding its head against a wall trying to secure a monopoly with proprietary technology.." - yes indeed. As a tech-based design company it is always hard to know when to stop working on a project/concept when there are potentially huge profits to be made. I know of many, usually commercial people, who chase that elusive patent onto which they want their name.

 

Gorgonops

Moderator
Staff member
I like Mac128's comment: "..I would point out that Apple is not the only computer company that spent years pounding its head against a wall trying to secure a monopoly with proprietary technology.." - yes indeed. As a tech-based design company it is always hard to know when to stop working on a project/concept when there are potentially huge profits to be made. I know of many, usually commercial people, who chase that elusive patent onto which they want their name.
There's a gift in knowing where your talents lie. Apple in the early 1980s had a serious problem: based on their early success with the Apple ][, they thought they were "smarter" than everyone else, and that if they chose to do something they would not only succeed but be the "leaders of the industry". If you strip off all the rose-colored nostalgia we project backwards onto them, it's pretty obvious that they were not, and did not.

Name a major product of Apple's, other than the Apple ][ and its variants, that came out of Apple's labs between 1979 and 1984 and was an actual "success" on the market. (And the Mac itself didn't exactly set the world on fire, for that matter. It generated a lot of buzz but didn't actually sell that well.) It's a darn short list. They certainly had their interesting "research projects", a.k.a. the Lisa, but when it came to nuts-and-bolts engineering of a marketable product they were *terrible*. Apple's engineering approach as epitomized by the Apple ][ was best described as analogous to Madman Muntz's "Muntzing", i.e., taking a product and cutting out features and circuitry to make it as simple and as cheap as possible. (E.g., the Apple ][ producing a color display without according-to-Hoyle "proper" NTSC circuitry, and the Disk ][ using a "standard" disk drive mechanism with all the frills and "unessential" circuitry ripped out and compensated for in software.) Doing that certainly takes "talent", but that approach is poorly suited to engineering a complete, complex product from scratch. As Captain Spock once noted: "As a matter of cosmic history, it has always been easier to destroy than to create." Substitute "simplify" for "destroy".

Apparently Apple must have been compensating for its history of "Muntzing", because the Apple /// and the "Twiggy" drives are classic examples of products which are highly over-engineered and *more* complicated than necessary to provide their level of functionality, which is at best only a small percentage gain over that of a "generic" product. The 400/800k floppy drives in the Macintosh are an example of the same mindset... to make it "better" they made Sony's drive *more complicated and expensive*, not less, and the result was an insignificant increase in capacity and increased costs for the user. Would it have made *any difference* in the long-term success of the Macintosh if they'd simply bought off-the-shelf floppy drives from Sony and replaced the IWM with a Western Digital WD1773? (Or simply adjusted the IWM to work with single-speed drives. Commodore made GCR drives that varied the number of sectors per track without varying the spindle speed. Clearly Commodore had better engineers than Apple.) Not a whit. Consumers wouldn't have cared at all, and the Mac would have been cheaper.
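
To put rough numbers on that Commodore comparison: both formats got their extra capacity from zone recording, but the Mac 400K drive varied the *spindle speed* across five 16-track zones, while the 1541 kept the motor at a constant 300 RPM and varied the bit rate instead. A quick Python sketch using the commonly published zone tables:

```python
# Zone layouts: (number_of_tracks, sectors_per_track, bytes_per_sector)
# Mac 400K GCR: 5 speed zones x 16 tracks; the drive changes spindle
# speed per zone so the bit rate at the head stays roughly constant.
mac_400k = [(16, 12, 512), (16, 11, 512), (16, 10, 512),
            (16, 9, 512), (16, 8, 512)]

# Commodore 1541 GCR: constant spindle speed (300 RPM); the controller
# changes the bit *rate* per zone instead of the motor speed.
c1541 = [(17, 21, 256), (7, 19, 256), (6, 18, 256), (5, 17, 256)]

def capacity(zones):
    """Total formatted capacity in bytes for a list of zones."""
    return sum(tracks * sectors * size for tracks, sectors, size in zones)

print(f"Mac 400K: {capacity(mac_400k):,} bytes")   # 409,600 = 400 KB
print(f"1541:     {capacity(c1541):,} bytes")      # 174,848 ~ 170 KB
```

Same trick, but Commodore's version needed no variable-speed motor.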

If you're going to innovate (and innovation is good), concentrate on places where there are significant gains to be made and real benefits to be provided to the user. Turning inward and engineering Rube Goldberg solutions that provide little or no positive benefit and only work to lock customers into your overpriced and proprietary product is... frankly, pointless. It might increase your per-unit profits to set yourself up as the only source of parts for your product, but by definition it limits your market to people gullible enough to get suckered into buying into your boarded-up little universe. (Unless you plan to go into the business of selling/licensing your proprietary innovation to all comers on reasonable terms so you can *create* the "industry standard". Which is something Apple has *never done*. When they make a proprietary product they keep it on the reservation and guard it so jealously that the world has no choice but to look somewhere else. The same thing happened to IBM with Microchannel and it cost them control of the PC marketplace.)

Anyway, that was my point about "innovation". Apple wasn't "innovating" with floppy drives; they were self-flagellating, trying to prove to the universe that they could innovate in the 1980s the way they did in the 1970s. They pretty much faceplanted on that one, sorry. They produced a perfectly usable product in the end, but that was mostly due to Sony's engineering succeeding despite having Apple's failure grafted onto it.

 

Mac128

Well-known member
Unless you plan to go into the business of selling/licensing your proprietary innovation to all comers on reasonable terms so you can *create* the "industry standard". Which is something Apple has *never done*. When they make a proprietary product they keep it on the reservation and guard it so jealously that the world has no choice but to look somewhere else.
One word: FireWire. More recently: FaceTime. "*Never*"? Your bias is showing. :beige:

 

Osgeld

Banned
FireWire is Apple's name for the IEEE 1394 High Speed Serial Bus. It was initiated by Apple (in 1986) and developed by the IEEE P1394 Working Group, largely driven by contributions from Apple, although major contributions were also made by engineers from Texas Instruments, Sony, Digital Equipment Corporation, IBM, and INMOS/SGS Thomson (now STMicroelectronics).
Although two notable items since the 1970s isn't really a super track record.

 

techknight

Well-known member
May not be a good track record, but it's better than, for example, what I've got: none...

Pretty much the same with just about everyone here: has anyone developed anything that became an industry standard? I haven't. Maybe a couple of you here and there have, but certainly not every single member actively posting/reading here. That's why I feel that someone coming up with something that becomes an industry standard, good track record or not, is still an achievement.

 

Interceptor2

Active member
In 1990 I worked for a tech company where we were designing and building graphics workstations used in aerospace; serious-money ones. Our rivals were the likes of the Sun SPARC workstation, and the Mac got mentioned a number of times. I can see now why the Mac IIfx was such a hit; that SuperMac graphics card and the speed were quite something. Alas, as the PC started to get better and cheaper, our products died.

Before that I worked designing colour monitors. I did a multisync while the other dev team were working on a better one. One day a salesman brought in an NEC multisync, all ready and working. It killed off our product there and then, as customers wanted that product right away. Ours was much more adaptable (CRT types and case styles), but it went into decline all the same.

Yes, it's hard to create an industry standard, or it was then, as the market changed so fast. Besides, nowadays the Far East has the technical ability, the manufacturing, and the ability to work as a team, which we can't match.

 

techknight

Well-known member
I hated working on multisync monitors. They were a real PITA to troubleshoot, probably because I didn't understand how they worked too well, and still don't.

The interesting thing was that the ViewSonics with the two separate horizontal paths had an HV output and a horizontal output on two separate circuits, not an all-in-one common circuit. One would run for a while without an issue, then one day it would just take out the horizontal transistor. Replace it, and it was fine again for another couple of weeks.

Hated them. Glad they're gone...

 

Interceptor2

Active member
Multisyncs can be fun. I would give tutorials to the test guys (post-manufacture) showing how they worked and what to look for if they didn't work. I think it was a little easier for us, techknight, as these were our own designs, so we knew them inside out.

Design is fun too; two things stick in my mind. With the multisync concept, be careful with a low line rate: should the HT remain the same, you get very high voltages, and usually something went boom. Well, fizz-boom actually.
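
To see why, picture the simplest possible model of the horizontal output stage: current ramps up through the yoke/flyback inductance for the whole scan period, then that energy comes back out as a voltage pulse during the (fixed) retrace time. Halve the line rate at the same HT and you roughly double the peak current, and so the pulse. A toy Python sketch with purely illustrative component values:

```python
# Toy linear-ramp model of a horizontal output stage.
# di/dt = V/L during scan, and the retrace pulse amplitude is
# roughly L * I_peak / t_retrace. Component values are illustrative.
HT = 130.0        # supply voltage, volts (typical monitor B+ ballpark)
L = 1.3e-3        # effective yoke/flyback primary inductance, henries
t_retrace = 5e-6  # retrace time, seconds (held constant)

def flyback_pulse(line_rate_hz):
    """Approximate retrace pulse amplitude at a given line rate."""
    t_scan = 1.0 / line_rate_hz - t_retrace   # time spent ramping
    i_peak = HT * t_scan / L                  # current ramp: I = V*t/L
    return L * i_peak / t_retrace             # V = L * di/dt at retrace

for rate in (31_500, 15_700):                 # VGA-ish vs TV-ish rates
    print(f"{rate/1000:.1f} kHz line rate -> ~{flyback_pulse(rate):,.0f} V pulse")
```

Run it and the lower line rate gives more than double the pulse voltage, which is exactly where the fizz-boom comes from if the HT isn't reduced to match.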

A design engineer was measuring beam current using a rather crude but effective method (I'll spare you the full details), except his natural reaction was to grab hold of the ammeter to change the range. He forgot it was sitting at 25 kV. Not a pretty sight; he survived, but it was a really nasty experience.

 

techknight

Well-known member
Yeap, a rather shocking experience. I wasn't paying attention when I was a kid: I had a monitor apart, tinkering with it while it was working and on, and I laid my arm right across the anode. WEEEEEEEEEEEEEEEEEE... let's see, I woke up lying on the floor. I don't remember what happened between the time I laid my arm on the anode and the time I woke up on the floor.

 

Paralel

Well-known member
Nothing probably happened. You were most likely just unconscious on the floor. It's possible you had a self-correcting cardiac arrhythmia (if you were hit during the wrong part of the repolarization phase) or a petit mal seizure during that time (hard to say; postictal behavior probably looks quite similar to the state of consciousness following a high-voltage shock), but obviously no serious neuro/cardiac damage, otherwise you probably wouldn't be here posting, unless your parents found you very shortly afterward.

 

techknight

Well-known member
No, I was perfectly fine.

But that was the first and last time I messed with flyback voltages while it was ON... I've been zapped by charged CRTs on many occasions, but that's just a "tickle" compared to what happened when it was live. I've been zapped by electricity so many times growing up that now, when I come in contact with line voltage, I hardly feel it. I just feel a tickle and go, "oh crap." lol.

When I was probing voltages in a projector power supply, my pinky slipped while holding the probe and hit the 450 V boost voltage. Weee, I felt that one, but it really wasn't too bad compared to that flyback event.

 

Paralel

Well-known member
It sounds like a pair of thin, but sufficiently insulating, elbow-length rubber gloves would be a good investment :)

 

techknight

Well-known member
ROFL. That, and one of those sticks linemen use on telephone poles. I forget the exact term for it though; slips my mind.

 

Interceptor2

Active member
Actually this is an option (rubber gloves, that is), provided they are suitable. Many people here might be home users who just need to make a few adjustments to the yoke, and rubber gloves do add a degree of protection. :)

I find that as you concentrate on the screen (even with a mirror) while fiddling with the yoke, it becomes all too easy to make contact with some HT. To make it more risky, these computers are getting old and the insulation is deteriorating, making an unexpected flashover more likely. HT leakage from the flyback transformer, tripler, or even the anode lead can happen at any time. I'm pleased to say that the anode cap is a really good fit and only comes away if not installed correctly in the first place. And as you all know, a CRT will recover a charge (dielectric absorption) when left with no anode cap connected. }:)

The rubber glove technique does get used in industry. I have seen "professional" gloves: really thick, certified ones guaranteed up to a specified voltage. Somewhat distracting, I must admit, and using a pair in a development lab would... well, get attention.

 