But on the 800k per 3.5" floppy issue, my Ensoniq sampler keyboards (1992) all format to 800k which can be read in a PC.
Are you sure they hold a full 800K and not 720K? The PC standard was 720K formatted.
IBM was very conservative about the number of sectors they crammed onto a track (nine 512-byte sectors for double density, 18 for "high density") because they wanted to ensure reliability. It's quite possible to fit more onto a disk with MFM by shrinking the gaps between sectors. Several formatting programs (like 2M and FDFORMAT) were available to create higher-capacity disks usable under DOS, and in fact both IBM and Microsoft used higher-capacity nonstandard formats for software distribution (see the links for DMF and XDF on the 2M page).
Cramming one more sector per track gives you a disk that matches Apple's 800K format (on an 80-track disk; on a stock 40-track PC drive you'd get 400K, again matching Apple's single-sided drives). I'll grant you can make the case that Apple's variable-sector-count format "spreads the data more evenly" across the disk and thus would tend to be more reliable when reading the inner tracks, but the market has spoken as to whether the additional cost for the small bump in capacity/reliability was worthwhile.
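For anyone who wants to check the arithmetic, here's a quick sketch. The geometries are the standard published ones (Apple's 800K GCR format uses five zones of 16 tracks at 12/11/10/9/8 sectors per track); nothing here is measured from real media.

```python
SECTOR = 512  # bytes per sector, both MFM and Apple GCR

def mfm_capacity(tracks, sides, sectors_per_track):
    """Capacity of a fixed-sector-count MFM format, in bytes."""
    return tracks * sides * sectors_per_track * SECTOR

# Stock IBM double density: 9 sectors/track, 80 tracks, 2 sides -> 720K
assert mfm_capacity(80, 2, 9) == 720 * 1024

# One extra sector per track -> 800K, matching Apple's figure
assert mfm_capacity(80, 2, 10) == 800 * 1024

# Stock 40-track PC drive at 10 sectors/track -> 400K
assert mfm_capacity(40, 2, 10) == 400 * 1024

# Apple's zoned 800K GCR format: (tracks_in_zone, sectors_per_track)
apple_zones = [(16, 12), (16, 11), (16, 10), (16, 9), (16, 8)]
per_side = sum(t * spt * SECTOR for t, spt in apple_zones)
assert 2 * per_side == 800 * 1024   # double-sided Mac 800K disk
assert per_side == 400 * 1024       # single-sided 400K drive

print("all capacity figures check out")
```

So the extra sector per track really does land on exactly the same 819,200 bytes as Apple's zoned scheme; the zoning just distributes them differently across the radius.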
Twiggy drives were brilliant conceptually but poorly executed. I doubt there would have been a lawsuit, as "Twiggy" was merely their nickname. Their official name was FileWare, and they were to be released as the UniFile and DuoFile for the Apple ///, but they failed so miserably on the Lisa that they went away quickly. Had they worked, they would have been a boon in 1983, supporting 871K on a 5.25" drive well ahead of the industry standard.
IBM chose 40-track 5.25" drives when it designed the PC because they were the "conservative" choice. 77- and 80-track 5.25" floppy drives existed in 1981 but were considered "finicky" because the narrower head was more sensitive to alignment issues. (Keep in mind we're playing dirty pool here in the first place by comparing a 1981-vintage standard to a 1984 model.) However, Commodore (8050 and 8250 drives), Tandy (Tandy 2000), and a number of other manufacturers used them (and they were available from third parties for many other machines), and with "standard" formats they held 720-800K on the same double-density media used in standard drives. FileWare was by no means a vast improvement on existing technology; it was just another case of Apple wanting to do their "own thing".
(Undoubtedly the reason "Twiggy" was so weird was to make it patentable, thus allowing Apple to force manufacturers to take a license and give them a cut of every disk drive and media sale. It's not because its bizarre layout was actually technically superior.)
As also pointed out, the PC/AT introduced 1.2MB HD 5.25" drives in 1984, but they were incompatible with the existing DD disks, and IBM was plagued with compatibility problems, particularly accidental data corruption and damaged disks (the AT also suffered from a high number of defective hard drives). Perhaps that was the death knell of the 5.25" drive?
There were indeed issues with the 1.2MB floppies, and in retrospect they might not have been the best idea. It wasn't that the technology itself was fundamentally flawed; it was that users couldn't be trained to use them reliably. (And really, you can't blame them.) IBM would have been wise to configure DOS to disallow writing to double-density disks in the high-density drives, and it was also an unfortunate omission that the drives themselves had no way to detect the density of the inserted media, allowing the user to format low-density media as high-density and end up with a disk that would essentially erase itself over time.
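The missing guard is a one-line check. Here's a hypothetical sketch of what such a policy looks like; the `Density` enum and `request_format` function are invented for illustration (real DOS and the BIOS INT 13h interface exposed nothing like this, which is exactly the complaint).

```python
from enum import Enum, auto

class Density(Enum):
    """Invented names for the two 5.25" media classes."""
    DOUBLE = auto()  # 360K-class DD media, lower coercivity
    HIGH = auto()    # 1.2MB-class HD media

def request_format(media: Density, wanted: Density) -> str:
    """Hypothetical policy: refuse an HD format on DD media."""
    if wanted == Density.HIGH and media == Density.DOUBLE:
        # DD media can't reliably hold HD flux densities; the
        # recording degrades until the disk "erases itself".
        return "refused: media is double-density"
    return "formatting"

print(request_format(Density.DOUBLE, Density.HIGH))
print(request_format(Density.HIGH, Density.HIGH))
```

Of course the check is only possible if the drive can actually sense the media type, which is what the 5.25" drives (and, bizarrely, IBM's later PS/2 3.5" drives) lacked.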
Really, I only brought up the 1.2MB floppy to point out that 400K on one side of a disk was nowhere near "state of the art" for 1984. IBM's behavior with regard to floppy drives was pretty deplorable, really. After going to 3.5", IBM for some bizarre reason decided to omit the density sensor from the custom drives used in the PS/2 series, despite it being standard issue across the rest of the industry. (This let users shoot themselves in the foot exactly the same way they did with the 1.2MB disks; users of clone systems didn't have this problem.) If there's a *real* lesson here, it seems to be that conservative industry-wide standards in the long run are "superior" to the overpriced proprietary solution that *any* single company puts out. The proprietary solutions may have some edge-case technical advantages, but if they're locked to a single brand they'll never achieve the penetration and longevity of off-the-shelf products. Look at the guts of a MacBook today and it's clear that Apple has finally at least *partially* figured that out. ;^b