The Pro Audio Spectrum 16 NuBus's YMF262 FM synthesizer...

Mr. Ksoft

Well-known member
Heh, talk about a lazy design. They seem to have told someone "Take this ISA card and make it work on NuBus". Some x86 ASM samples for the mixer can be found in this zip: http://www.symphoniae.com/soundcard/MediaVision/SDK/PASSDK_201.zip

Check the PAS\SUBS\MIXER directory.
Sadly these aren't as helpful as they could be. All the example code is meant to be used against the MVSOUND.SYS driver - the actual low-level communication is hidden inside that precompiled driver. This is even called out in the Developer's Toolkit Reference (Ch. 20 - Mixer Programming Essentials):
You can program the mixer using either the text string, command line interface or the low-level software API. [...] A direct hardware API is not provided.

So that leaves us needing to figure out how to communicate with the mixer ourselves, or I can see if there's something in the driver interface that replicates the MVSOUND.SYS calls. On the plus side, the include files in the devkit do at least identify the correct I/O addresses and which bits correspond to which mixer channels. It's just a matter of determining how the communication works - we only have two possible single-byte addresses for the mixer (0xB88 for the main one and 0x78B for the "Parallel Interface Audio Mixer", whatever that is), so you probably have to twiddle some bits to tell it which channel to affect and use the rest of the bits to send a value. Still not totally sure about that part, but I'll keep studying things.
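To make the bit-twiddling idea concrete, the kind of access I have in mind looks roughly like this. It's purely a guess at this stage, since the real protocol is hidden in MVSOUND.SYS - slotBase, the idea that 0xB88 survives as an offset in slot space, and the channel/level bit split are all my assumptions, not anything out of the devkit:

/* Speculative sketch only - the real mixer protocol is not known yet.
   Assumes the card's ISA-style registers are memory-mapped into its
   NuBus slot space at their familiar offsets, and that the channel
   number and level share one byte. All of that is guesswork. */
static void SetMixerLevelGuess(unsigned long slotBase, unsigned char channel, unsigned char level)
{
    volatile unsigned char *mixer = (volatile unsigned char *)(slotBase + 0xB88);

    *mixer = (unsigned char)((channel << 5) | (level & 0x1F));   /* guessed bit layout */
}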
 

Mr. Ksoft

Well-known member
That old OSS code was extremely helpful. I sat down with it and annotated it to get an idea of what was going on, then re-implemented the bare minimum.

And guess what? For the first time in over 30 years, an audible note came out of the card. It's done - we have access to the chip. The missing link was realizing that the FM channel's mixer volume has to be raised, since the driver initializes it to zero.

I am going to clean my code up and post it up as a proof of concept if others would like to try their hand - it turns out it's all pretty basic once you know what slot your card is in, and how to manipulate the mixer. Next is the long road to create some actual software. First stop is probably a basic VGM player by recycling the playback code from my MS-DOS player VGMSlap. I'd like to do something a lot more "native" than just a text-based app but I haven't written GUI apps on Classic MacOS before, so I expect it will be some time until I'm up to speed on that.
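Until I post the cleaned-up code, here's roughly what playing a test note involves. The WriteOPL helper and the 0x388/0x389 offsets are my assumptions about how the FM chip's address/data ports show up in the card's slot space (they match the ISA card's AdLib-compatible ports); the register values themselves are the classic "first sound" sequence from the old AdLib programming guide:

/* Assumed helper: poke one OPL3 register through the card's slot space.
   The 0x388/0x389 offsets mirror the ISA address/data ports and are an
   assumption here; the dummy reads stand in for the chip's register-write
   settling delay. */
static void WriteOPL(unsigned long slotBase, unsigned char reg, unsigned char val)
{
    volatile unsigned char *addrPort = (volatile unsigned char *)(slotBase + 0x388);
    volatile unsigned char *dataPort = (volatile unsigned char *)(slotBase + 0x389);
    short i;

    *addrPort = reg;
    for (i = 0; i < 6; i++)  (void)*addrPort;   /* crude settling delay */
    *dataPort = val;
    for (i = 0; i < 35; i++) (void)*addrPort;   /* crude settling delay */
}

/* The classic AdLib "first sound": set up channel 0's modulator and
   carrier, then key the voice on. */
static void PlayTestNote(unsigned long slotBase)
{
    WriteOPL(slotBase, 0x20, 0x01);  /* modulator multiple = 1 */
    WriteOPL(slotBase, 0x40, 0x10);  /* modulator level about 40 dB down */
    WriteOPL(slotBase, 0x60, 0xF0);  /* modulator attack fast, decay long */
    WriteOPL(slotBase, 0x80, 0x77);  /* modulator sustain/release medium */
    WriteOPL(slotBase, 0xA0, 0x98);  /* F-number, low byte */
    WriteOPL(slotBase, 0x23, 0x01);  /* carrier multiple = 1 */
    WriteOPL(slotBase, 0x43, 0x00);  /* carrier at full volume */
    WriteOPL(slotBase, 0x63, 0xF0);  /* carrier attack fast, decay long */
    WriteOPL(slotBase, 0x83, 0x77);  /* carrier sustain/release medium */
    WriteOPL(slotBase, 0xB0, 0x31);  /* key on, block 4, F-number high bits */
}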
 

demik

Well-known member
Mr. Ksoft said:
And guess what? For the first time in over 30 years, an audible note came out of the card. It's done - we have access to the chip.
Damn impressive work! Player Pro module when? 😋
 

Mr. Ksoft

Well-known member
demik said:
Damn impressive work! Player Pro module when? 😋
Hah! That's actually an interesting idea...

As promised, here is my example program with source. Project file is for CodeWarrior Pro 4 (last 68k version) and uses the Metrowerks Standard Libraries and the Universal Headers, which are both part of the default install.

The included compiled application is hardcoded to Slot D since that's where my PAS16 is right now - if you need a different slot, change nubusSlot in the code and recompile. (When I make an actual meaningful program it will allow selection, or maybe detection if I can find a good way to do that - probably by probing for the DeclROM.)
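For the detection idea, the Slot Manager might already get most of the way there. Something like the sketch below is what I have in mind - untested, it leans on SReadInfo behaving the way Inside Macintosh: Devices describes it, and it only finds occupied slots rather than specifically a PAS16:

#include <Slots.h>
#include <Errors.h>

/* Sketch: return the first NuBus slot ($9-$E) holding a card that the
   Slot Manager initialized successfully, or -1 if none is found.
   A real detector would also walk the board sResource to confirm the
   card is actually a PAS16. */
static short FindFirstCardSlot(void)
{
    short slot;

    for (slot = 0x09; slot <= 0x0E; slot++) {
        SpBlock     sp;
        SInfoRecord info;

        sp.spSlot   = (SInt8)slot;
        sp.spResult = (long)&info;
        sp.spExtDev = 0;

        if (SReadInfo(&sp) == noErr && info.siInitStatusA == 0)
            return slot;        /* a card is present and initialized */
    }
    return -1;
}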

Turns out the whole SwapMMUMode() thing was a red herring - it works fine without it, so I removed it. I did pull in some logic to calculate the correct memory address from the NuBus slot number and whether the system is running in 24-bit or 32-bit mode, and I think that should be sufficient. The 24-bit path is untested since I run 7.6.1, which is always in 32-bit mode.
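For reference, the address math itself is tiny and boils down to roughly this (SlotBaseAddress is just a name for this post, not necessarily what's in the attached source; the 24-bit branch is exactly as untested as mine):

#include <OSUtils.h>    /* GetMMUMode(), true32b */

/* Where a NuBus slot's space shows up to the CPU:
   32-bit mode: standard slot space for slot $s starts at $Fs00 0000.
   24-bit mode: each slot gets a 1 MB window at 24-bit address $s0 0000. */
static unsigned long SlotBaseAddress(short slot)
{
    if (GetMMUMode() == true32b)
        return 0xF0000000UL | ((unsigned long)slot << 24);
    else
        return (unsigned long)slot << 20;
}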

I'll probably also put this on GitHub, if I can figure out how to handle the resource forks for the project files.
 

Attachments

  • PAS Poke w Source.sit
    77.9 KB · Views: 2

NJRoadfan

Well-known member
Resource forks should be stored in AppleDouble format. Use the OS X standard (._Filename), which specifies an entry for FinderInfo and the Resource Fork (in that order). It works with Netatalk out of the box, and macOS should handle it as well.
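For reference, the header on those ._ files is tiny. It's roughly this layout (big-endian on disk, per the AppleSingle/AppleDouble spec; the struct names are just for illustration):

/* AppleDouble "._" file layout. Magic 0x00051607 marks AppleDouble
   (0x00051600 would be AppleSingle); each entry descriptor points at
   where that entry's data sits in the file. */
struct ADEntryDescriptor {
    unsigned long entryID;      /* 9 = Finder info, 2 = resource fork */
    unsigned long offset;       /* byte offset of the entry's data */
    unsigned long length;       /* length of the entry's data in bytes */
};

struct ADHeader {
    unsigned long  magic;       /* 0x00051607 */
    unsigned long  version;     /* 0x00020000 */
    char           filler[16];  /* unused, zero */
    unsigned short numEntries;  /* 2 for the FinderInfo + resource fork case */
    /* numEntries ADEntryDescriptor records follow, then the entry data */
};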

Also... you could port Adlib Tracker 2. :p
 

Mr. Ksoft

Well-known member
I've completed a "proof of concept" VGM player, so we now have our first piece of "useful" OPL3 software for the Mac. It's a very quick and dirty port of snippets of code from my DOS player with minor changes to graft it onto a 68k Mac environment.

I've attached a SIT with the executable, source, and sample VGMs to test with. The same limitations as the previous test program apply (hardcoded slot mapping), but instructions on how to change it and recompile are included. The readme outlines the rest of the quirks.

I do want to do a proper GUI player, as I've mentioned, but I'll probably start from scratch instead of the huge mess of copied code I have now. 😅 The biggest obstacle at this point is timing - I am using the Extended Time Manager to run the VGM tick, but VGMs are based on a 44100 Hz sample rate, or about 22.67 microseconds per sample. The most precise timing you can get with the ETM is 20 microseconds, but that's pretty much unattainable without completely overwhelming the CPU. I cut it down to 1/20th of the VGM's native resolution (2205 Hz, or 453.51 microseconds per tick), though this too can easily be changed in the source. Most VGMs don't take advantage of the full 44100 Hz granularity, and the resulting difference in sample-delivery timing is too small to notice. Performance-wise, this compromise works on my system - but I'm also on an overclocked Quadra 800 at 44 MHz.
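For anyone curious, the Time Manager side boils down to roughly the following. This is a sketch rather than a copy of what's in the archive - I use a global TMTask here so the callback doesn't have to fish its task pointer out of register A1, which is how the 68k Time Manager actually delivers it:

#include <Timer.h>

#define TICK_MICROSECONDS  454L   /* ~2205 Hz; a negative PrimeTime count means microseconds */

static TMTask        gVGMTask;
static volatile long gPendingTicks = 0;

/* Runs at interrupt time, so it only counts ticks and re-arms itself.
   (It ignores the A1-based task pointer and uses the global instead.) */
static void VGMTick(void)
{
    gPendingTicks++;
    PrimeTime((QElemPtr)&gVGMTask, -TICK_MICROSECONDS);   /* re-arm; drift-free with InsXTime */
}

static void StartVGMTimer(void)
{
    gVGMTask.tmAddr     = (TimerUPP)VGMTick;   /* cast is fine for classic 68k builds */
    gVGMTask.tmWakeUp   = 0;
    gVGMTask.tmReserved = 0;
    InsXTime((QElemPtr)&gVGMTask);             /* install as an extended (drift-free) task */
    PrimeTime((QElemPtr)&gVGMTask, -TICK_MICROSECONDS);
}

/* The main loop then drains gPendingTicks and feeds that many ticks'
   worth of VGM commands to the OPL3. */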

I'm also still figuring out how to properly yield CPU to the OS without compromising playback - I'm used to working in a single-tasking environment like DOS, or on a powerful modern system with enough CPU grunt to make up for my code quality. If you accidentally click out of the window, you'll be hard-pressed to get focus back to stop playback. This will be extremely important once I make the UI more interactive.
I think I'm supposed to use WaitNextEvent, and I had some scaffolding for that (based on some example code) that I tried and then took out. It may make more sense once I'm using "native" OS calls for input instead of the Metrowerks pseudo-console stuff. I found that even with focus on the app, it made the Time Manager task noticeably less stable, which suggests the Time Manager probably isn't the ideal way to drive playback unless you're completely taking over the system. Plus, if you hold a menu open, for instance, the program stops running entirely while the TM task keeps firing, so the player "catches up" on all the accumulated VGM commands at once afterward.
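For the record, the shape I think the loop is supposed to take is the usual cooperative one - again just a sketch, not what's in the attached build, and it reuses the gPendingTicks counter from the timer sketch above:

#include <Events.h>

/* Minimal cooperative loop: hand time back to the OS between VGM ticks.
   The third argument to WaitNextEvent is the sleep value in ticks
   (1/60 s); 0 asks for control back as soon as possible. */
static void RunPlayerLoop(void)
{
    Boolean     done = false;
    EventRecord evt;

    while (!done) {
        if (WaitNextEvent(everyEvent, &evt, 0, nil)) {
            switch (evt.what) {
                case keyDown:
                    if ((char)(evt.message & charCodeMask) == 'q')
                        done = true;            /* hypothetical "stop playback" key */
                    break;
                default:
                    break;                      /* menus, updates, etc. go here eventually */
            }
        }

        while (gPendingTicks > 0) {             /* drain ticks queued by the TM task */
            gPendingTicks--;
            /* ...process one tick's worth of VGM commands here... */
        }
    }
}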

Learning is constant - and I have a lot of catching up to do!
 

Attachments

  • LazyVGM.sit
    1.6 MB · Views: 0