
Compact Mac Video Timings Needed for SuperVideo Input Parameters

Which numbers did you come up with by trial and error and are they any different from mine?
I fudged the horizontal sync pulse and back porch widths slightly from what it says in the devnote, adjusting them to match the specs in that Nerdhut article:
 

HSYNC-Frequency
22.25kHz, 45µs period, 18.45µs low 59% PWM duty cycle

VSYNC-Frequency (Refresh rate)
60.15Hz, 16700µs period, 180µs low 99% PWM duty cycle

DATA
15.6672MHz, 512 pixels in roughly 32.8µs



I made the vertical and horizontal sync frequencies match the quoted specs for the monitor exactly when using the specified dot clock rate. (With the calculator you specify the dot clock, and the hsync/vsync frequencies are calculated from the other items you enter.) I.e., I made the sync pulse "3" pixels instead of two, because the two pixels that *seem* to be specified in the developer notes are only 2/3rds of what Nerdhut observed, and I adjusted the back porch down by a few pixels to compensate and make everything else line up.

It's very likely the monitor would work fine with either my/Nerdhut's settings or the 2/178 that's in the devnote, so I recommend not obsessing over it. The devnote's settings make for a pretty short sync pulse and reduce the effective vertical framerate slightly compared to what the published specs say it's "supposed" to be, but only by a fraction of a percent.
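As a sanity check on those numbers, here's a quick back-of-the-envelope sketch of my own. It assumes the commonly quoted Compact Mac totals of 704 pixels per line (512 active + 192 blanking) and 370 lines per frame (342 active + 28 blanking); those totals are my inputs, not something from the calculator:

```python
# Sanity check of the Compact Mac CRT timings quoted above.
# Assumed totals (active + blanking): 704 pixels/line, 370 lines/frame.
DOT_CLOCK_HZ = 15_667_200   # 15.6672 MHz pixel clock
H_TOTAL_PIXELS = 704        # 512 active + 192 blanking
V_TOTAL_LINES = 370         # 342 active + 28 blanking

hsync_hz = DOT_CLOCK_HZ / H_TOTAL_PIXELS   # ~22,255 Hz, i.e. ~22.25 kHz
vsync_hz = hsync_hz / V_TOTAL_LINES        # ~60.15 Hz
line_period_us = 1e6 / hsync_hz            # ~44.9 us per line
active_us = 512 * 1e6 / DOT_CLOCK_HZ       # ~32.7 us of active video

print(f"hsync {hsync_hz / 1000:.2f} kHz, line period {line_period_us:.1f} us")
print(f"vsync {vsync_hz:.2f} Hz, active video {active_us:.1f} us per line")
```

If the calculator spits out those same frequencies for your inputs, the totals you fed it are internally consistent with the specs above.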

Have you actually worked out the analog pieces you're going to need to make this happen? As I mentioned earlier, the compact Mac's driver board doesn't take standard TTL vsync/hsync signals. There's an old article floating around (I'm sure you'll find a link if you search enough) where someone turned a compact Mac into a DOS MP3 player by driving the CRT from a VGA feature connector; I believe that article described the difference between the signals. My only vague recollection is that, at the very least, you'll need an inverter on one of the lines.

 
Have you actually worked out the analog pieces you're going to need to make this happen?
Not yet, just made up worksheets on the copier at work when I was bored. One for each question, less the interlacing checkbox.

 1. Decide how many horizontal and vertical pixels the monitor should display - GIVEN

 2. Find the horizontal period, and the horizontal blanking period. - SPECS

 3. Determine the required pixel rate and oscillator frequency.

 4. Determine the pixel rate divider value. - ???

 5. Determine the time required to display one horizontal unit of pixels. - SPECS

 6. Determine the value for the Horizontal End Sync Field. - SPECS

 7. Determine the value for the Horizontal End Blank Field. - SPECS

 8. Determine the value for the Horizontal Start Blank Field. - SPECS

 9. Determine the value for the Horizontal Total Field. - SPECS

10. Determine the value for the Vertical End Sync Field. - SPECS

11. Determine the value for the Vertical End Blank Field. - SPECS

12. Determine the value for the Vertical Start Blank Field. - SPECS

13. Determine the value for the Vertical Total Field. - SPECS

14. Set "Interlace" Check Box if applicable -  NA

Gonna work through the formulas by the numbers, using the high-precision specs rather than the round-offs floating around in the docs. I figure my numbers will likely help you and Bolle on the Linux front? I like your suggestion, but as I said it's far beyond my comfort zone. ISTR something about an eight-pixel grouping being a reference unit/variable in the formulas, and something hazy about a word being fed serially through the system/buffering sequence somewhere. Dunno, when I run across it again I'll post it. Might be that "one horizontal unit of pixels" thing above?
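To make the eight-pixel-grouping idea concrete, here's a sketch of how the worksheet's horizontal fields could fall out if you count in 8-pixel units. The ordering (sync, then back porch, then active video, then front porch) and the porch split are my assumptions for illustration only, not numbers from the SuperVideo docs:

```python
# Counting the horizontal timing in 8-pixel "units", since an eight-pixel
# group seems to be the reference unit in the formulas. The porch/sync split
# below is illustrative only, NOT taken from the docs.
PIXELS_PER_UNIT = 8
h_sync_px, h_back_px, h_active_px, h_front_px = 24, 152, 512, 16  # 704 total

def units(px):
    # Field values have to land on an 8-pixel boundary to be representable.
    assert px % PIXELS_PER_UNIT == 0, "not on an 8-pixel boundary"
    return px // PIXELS_PER_UNIT

h_end_sync    = units(h_sync_px)                            # sync pulse ends
h_end_blank   = units(h_sync_px + h_back_px)                # active video starts
h_start_blank = units(h_sync_px + h_back_px + h_active_px)  # active video ends
h_total       = units(h_sync_px + h_back_px + h_active_px + h_front_px)

print(h_end_sync, h_end_blank, h_start_blank, h_total)  # 3 22 86 88
```

Note that a 24-pixel pulse comes out as "3" in these units, so a "3" in a calculator field may well mean 8-pixel units rather than pixels. That's a guess on my part.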

As I mentioned earlier, the compact mac's driver board doesn't take standard TTL vsync/hsync signals. There's an old article floating around which I'm sure you'll find a link to if you search enough where someone turned a compact Mac into a DOS MP3 player by driving the CRT from a VGA feature connector,
Haven't gotten that far. Bolle's got the GS Neck Board under control and some notions about hooking everything up as I understand it. But I do remember that article. GIZMODO? I'll be looking for it! THX

 
Thanks much, I do remember that hack. Only had a bit more than an hour to search fairly blindly, not knowing technical descriptives offhand to refine the search string.

Bolle has the whole shebang to buzz out in terms of where the Xceed card sends what through where in that crazy harness. My best guess is that the strength of the signal fed to the GS neck board might be the combined RGB lines wedged into the circuit without ever being fed through the A/B?

Dunno, that's above my pay grade. I'm just hoping this early, programmable Mac board can sync at lower frequencies than later cards. Quick look at:

3. Determine the required pixel rate and oscillator frequency.

. . . makes me think that the clock of the custom monitor is specified in the formulas for the inputs, and that the frequency the Control Panel spits out will be a commonly available part. The board's selection of PLLs would torque a standard frequency into something that works at whatever oddball frequency is needed to drive a custom monitor? Dunno, still above my pay grade. Would that be analogous to the more flexible timing setups of the cards you're suggesting for driving the internal display from a Linux PC? :mellow:
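For what it's worth, the way those clock-synthesizer PLLs usually work can be modeled as a brute-force ratio search. Everything below (the 14.318 MHz reference crystal, the M/N/P register ranges) is hypothetical, just to show the principle of torquing a standard frequency toward an oddball target:

```python
# Toy model of a PLL clock synthesizer: f_out = f_ref * M / (N * 2**P).
# Reference crystal and M/N/P ranges are made-up examples, not the card's.
def best_pll(f_ref_hz, f_target_hz):
    best = None
    for m in range(1, 256):          # feedback divider
        for n in range(1, 64):       # input divider
            for p in range(0, 4):    # post-divider (power of two)
                f = f_ref_hz * m / (n * 2 ** p)
                err = abs(f - f_target_hz)
                if best is None or err < best[0]:
                    best = (err, m, n, p, f)
    return best

# How close can a common 14.318 MHz reference get to the Mac's 15.6672 MHz?
err, m, n, p, f = best_pll(14_318_180, 15_667_200)
print(f"M={m} N={n} P={p} -> {f / 1e6:.4f} MHz (off by {err:.0f} Hz)")
```

With ranges like these, the search lands within a small fraction of a percent of the target, which is why cards with real frequency synthesizers don't need a crystal per video mode.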

 
Looks to me like the "horizontal" problem with the two hacks we're referencing is that those signals are generated by the logic board, derived from the system clock there, and therefore missing when the A/B and neck board/CRT are used as a display sans mobo.

We'll need to either jury-rig something to take those signals from the lobotomized mobo of the host Mac, hijacking its components as an external monitor for my IIcx, or provide a makeshift solution as they did?

Looking in another PDF, I read that the Classic Mac series (128k-Plus) A/B differs from the SE/SE30 A/B. No fun, as I was aiming this kluge at a spare Plus. :-/

edit: YAY! For the first time ever I think, crucial illustrations for the next phase of a project rolled directly over at a page break! [:)]

 
Last edited by a moderator:
Xceed-Harness-00.JPG

This diagram from the manual I can grok! My guess is that the single wire between card and neck board carries the grayscale intensity and the rest is for timing? Nice that neither the patent drawing nor the manual illustration includes the complexity of doing external monitor support; that's a big plus for me. KISS = KIAIS - Keep It All Internal Stupid

Xceed-Patent-Harness.JPG

This one I do not, so I'll do a markup of the pretty one in AI for future reference. Looks like it is indeed VOUT connected directly to the GS neck board from the VidCard. Silly of me to think in terms of RGB bundling; that's backwards. You'd need that for going from B&W output to an RGB display at a usable brightness level, which isn't applicable here; the VidCard will ONLY do GS (and maybe B&W for compatibility?) if and when.

 
Out of curiosity, how hard would it be to take the Macintosh II Video Card documentation, twerk its hardware spec to support GS/30 resolution, tweak the driver for its output and cut the NuBus legs from beneath TOBY to put such an abomination onto the PDS of the SE/30? I don't think I'm projecting too far from results unachieved or even attempted thus far, but it appears that much of the information we need is coming together nicely for the undertaking?

edit: not necessarily talking about building Frankenstein's monster out of a Toby card directly so much as recreating that model on a breadboard or PCB.

edit 2: revisiting the Toby docs as inspiration for a new, simplified 8-bit GS framebuffer card for the 68030 PDS?

Dunno, gonna keep running data and reference material down for that project and documenting it in this thread for the if and maybe when of a next phase.

 


 
My guess it that the single wire between card and neck board carries the grayscale intensity and the rest is for timing?
Yes. As I believe has been thoroughly explored in previous threads, the substitute neckbeard, er, neck board from Micron substitutes its own analog-proportional intensity feed for the digital-only amplification stage on the native analog board, and for that it only needs to override a single line. This was their alternative to having to replace the analog board.

My guess as to how the card actually operates, given that according to the wiring harness it's taking *in* the Hsync and Vsync signals from the Mac motherboard and *then* feeding them into the connector on the analog board, is that it essentially "genlocks" to the timing signals output from the Mac instead of generating its own Hsync and Vsync signals when it's operating in parasitic mode. That's not a sure thing; it *could* just have the inputs from the motherboard pass through a MUX so they can be switched in and out. But if I had to put money on it I'd go with the genlock theory, if for no other reason than that I doubt those outputs can be shut off (IE, the internal video timing chain definitively disabled) on the motherboard, and it might create noise/interference problems to have two separate clocks running at the same frequency but out of phase inside such a small box. Presumably there *is* a MUX onboard the card that allows it to either feed through the motherboard video (via a suitable amplifier) to the modified neckboard or to switch it out and sub its own.

On the driver side I'm guessing that when the Micron is running in internal mode it commits some sort of rudeness to delete the internal video from the list of available video devices when it's in control, so the original B&W framebuffer doesn't show up as a phantom headless viewport. This override of course does not happen when the card is configured to run an external display.

Remember the important takeaway from the DOS/MP3 build: The "Hsync" signal from the Mac motherboard isn't actually a sync signal, it's a drive signal. The monitor actually cannot "sync" to anything because it doesn't have an internal oscillator. The guy doing the build admitted his solution of tacking in some oneshots was kind of a hack. This is *another* reason to think that the Micron card is actually using the hsync/h-drive from the Mac motherboard instead of generating it on its own.
 

Dunno, that's above my pay grade. I'm just hoping this early, programmable Mac board can sync at lower frequencies than later cards.


Toby and friends can actually sync lower than this because they support NTSC interlaced output.

The board's selection of PLLs would torque a standard frequency into something that works at whatever oddball frequency would be needed to drive a custom monitor? Dunno, still above my pay grade.


This is another reason why it's bad to myopically concentrate on this one video card, because it's a proprietary beast that you apparently don't have full documentation for. You keep sticking the same screenshot of it up on the screen, do you even have the card plugged into something so you can *play* with putting different numbers in it and see how it affects the output? I honestly do not know how to interpret what it needs for a crystal. The 24mhz it shows in that screen shot is a little bit low for the pixel clock of standard 640x480@60hz VGA, which is 25.175 mhz, so it's probably *not* the pixel clock, at least not exactly? You *can* put together a 640x480 mode with a 24mhz pixel clock by tweaking the margins smaller than usual, but it's going to be a little bit out of spec for a normal VGA monitor and might need some tweaking of the h/v size knobs.

Maybe what that control panel does is when you give it a resolution and a crystal speed it tries to fudge the rest of the numbers to come up with something that works? Again, I certainly have never played with it so I have *no idea* how it works, if it autofills any of the fields based on what you input on other fields, etc.
 

Would that be analogous to the more flexible timing setups of the cards you're suggesting for driving the internal display from a Linux PC? :mellow:


Basically all VGA cards made since the mid-1990s have *extremely* flexible timing PLLs that can be adjusted to essentially arbitrary frequencies from NTSC on up. This happened because it became pretty clear pretty fast that slapping on a separate crystal for every pixel clock was not going to be scalable. Original 1987 VGA cards have two crystals on them, a 25.175mhz and a 28.322mhz one, and all the video modes from 320x200 up to 640x480 (and the 720x400 text mode) are derived from simple divisions of those crystals. (And the original VGA always used the same horizontal sync value; it managed this by running at either 60 or 70hz depending on the number of lines in the video mode. Semi-interesting trivia: the only mode at which the original VGA runs at 60hz is the 640x480 graphics mode; in text and CGA/EGA backwards-compatibility modes it's 70hz.)
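To illustrate the crystal-division scheme, here are a few of the original VGA's pixel clocks expressed as divisions of its two crystals. The mode-to-clock mapping is from memory, so treat it as approximate:

```python
# Original VGA pixel clocks as simple divisions of its two crystals.
# Mode-to-clock mapping is from memory and meant as illustration.
XTAL_25 = 25_175_000   # 25.175 MHz crystal
XTAL_28 = 28_322_000   # 28.322 MHz crystal

pixel_clocks = {
    "640x480@60 graphics": XTAL_25,        # the one true 60 Hz mode
    "320x200 (mode 13h)":  XTAL_25 // 2,   # half dot clock, fat pixels
    "720x400@70 text":     XTAL_28,        # 9-pixel-wide characters
}
for mode, clk in pixel_clocks.items():
    print(f"{mode}: {clk / 1e6:.3f} MHz")
```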

Your SuperMac card apparently has both PLLs *and* swappable crystals, which makes it weird. Unless you have complete documentation as to how it works and how much "fudge" the PLLs give you in selecting crystals, I don't really know how to even guess what the right value would be, other than that it's maybe in the ballpark of 15.6 mhz. Another worry might be that the software simply won't accept lower values than 640x480 even if the hardware is capable. Have you tried it?

At this point I would almost suggest your best course of action is to go find a really old Multisync monitor, like an NEC Multisync 3D or similar, if you can find one that works, so you can experiment at will with sub-VGA signal timings. One of those ancient multisyncs that can handle CGA/EGA/VGA/etc. should sync up with a simulated "I'm driving a Macintosh monitor" timing set and would let you tweak a modeline you're confident produces a picture before you hack the hardware.

 
Thanks for the background info and guesses as to how the Xceed card works. It must indeed be doing some low-level nasty to come up as the startup screen. One of the tidbits I picked up in research is that it outputs slightly less than(?) the full 342 lines as a way to throw the built-in video subsystem off its stride. Plays hob with compatibility IIRC.

This is another reason why it's bad to myopically concentrate on this one video card, because it's a proprietary beast that you apparently don't have full documentation for. You keep sticking the same screenshot of it up on the screen, do you even have the card plugged into something so you can *play* with putting different numbers in it and see how it affects the output? I honestly do not know how to interpret what it needs for a crystal.


Not myopic at all; the only reason I've used the same graphic is that I have it on hand and reference the required inputs from it. The numbers are whatever were in the existing profile I was modifying. That's how you do a custom setup: edit an existing setup and save it under another filename, in this case the ill-fated attempt at 1600x1200? [:P] I do have the complete docs and am in the process of sharing them with all. With any luck I'll have them scanned into a really ugly PDF on the printer at work and it will manage to email them to me tonight without dropping dead in its tracks.

This is an interesting piece of hardware from a time when no hard-and-fast standards had achieved hegemony; a unique opportunity presented itself. I've had it on hand for many years and have long wanted to put it through its paces. If it works, great; if not, that's fine too; it's a learning experience. But there's no way I'm going to be able to go the VGA/Linux PC route, too much learning curve and too little time remaining for banging away at stuff that's not really fun for me.

Docs coming up real soon now! :approve:

 
I found the tidbit about 341 lines rather than the Compact Mac's 342 lines; it was on gamba:

I have been using a Micron Colour 30 with the GrayScale 30 internal adapter
in my SE/30 for the last 6 months. It actually works as expected.
The only problem I have found is the way Micron has chosen to solve a
problem with stupid programs. Some programs start by checking the screen
size, and if it's 512 * 342 the program decides it must be 1-bit black/white;
the programs do not check to see if the screen is grayscale. Micron
chose to set the screen size to 512 * 341 to force these programs to check
the screen more closely.

This unfortunately gives a few other problems, with some programs. I have
had programs complaining that they could not run, because the program
needed at least a 512 * 342 screen. Others decide that if the screen isn't
512 * 342 it must at least be 512 * 384 (Apple 12" colour) and start up
with a dialog, where all the buttons have been placed outside the screen;
most of these only need an OK, that can be reached by pressing the return
button, but I have seen a few where I had to press the reset button to get
my machine back.


Can anyone confirm this?

 
Note the part about the card needing to read setup info from disk before it can bring video online. That limits this experiment to testing whether anything at all can drive the Compact Mac CRT as a 512x342 grayscale display from a NuBus Mac. Before that, the first baby rollover will be to get the card to display that resolution driving one of my very capable 21" multisync CRTs. The MAG has an LCD readout listing Mac resolutions/frequencies; can't wait to try that one out if and when.

 
So, mystery solved, the crystal frequency has to be the pixel clock. The 24mhz 640x480 in your screenshots *is* a nonstandard mode.

I think you have all the information you need to make educated guesses about all the other fields. If you have a "very capable" multisync monitor handy then by all means you can start shopping for a crystal. (Although it is worth verifying that your monitor actually supports hsync frequencies below VGA's 31khz, many later multisyncs have that as the floor.)

 
It's thanks to you and @BadGoldEagle that I was able to get so much information together and even gain a bit of understanding of how it all fits together. Both the MAG and the Radius have 5-BNC inputs, so I'd imagine they'll handle just about anything thrown at them. The Mag has BNC/VGA/MAC inputs, but the Radius with just BNC/VGA will do resolutions the MAG reports as out of sync. We'll see what happens, thanks much for the help! :approve:

 
You got me curious about the TPD choices. I thought I'd seen composite somewhere; looks like it's combined with HD. But what's the significance of the PrecisionView 2150's sync range switch? Wondering if COMP.HD is where I should be connecting the SE Radius TPD card's coax connection for the next SE/TPD playtime session?

PrecisionView2150-inputs-sync.JPG




 
You know, if you really want to get your hands dirty, here's a lame suggestion: the only thing the monitor really cares about is the vertical and horizontal frequencies and the widths of the sync pulses. It doesn't actually care about the dot clock (within reason, anyway). If your video card has a 24MHz crystal installed, you could just fake up a video mode that otherwise fits your monitor but has a different horizontal resolution. For laughs I went back to the modeline generator and, since 24MHz is about half again larger than 15.6MHz, diddled around with the numbers for a 768x342 pixel mode. Came up with this:

image.png

Which results in this:

image.png

Xfree modeline:

Modeline "768x342"   24   768 790 794 1078   342 342 346 370  -hsync

Obviously this mode would result in non-square pixels on a 4x3 CRT but I think there's pretty good odds it would be *displayable* on a compact mac monitor. (hsync/vsync frequencies are within tiny fractions of a percent of correct and the hsync/vsync pulse widths are also very close to specifications.) If you want to try messing around with this before locating a 15.6mhz crystal, well, there you go.
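For the record, the arithmetic behind that "tiny fractions of a percent" claim falls straight out of the modeline numbers; this is my own check, but it's the same math any modeline calculator does:

```python
# Check the "768x342" modeline: clock 24 MHz, htotal 1078, vtotal 370.
# Modeline field order: clock, hdisp hsyncstart hsyncend htotal,
#                       vdisp vsyncstart vsyncend vtotal.
dot_clock_hz = 24_000_000
h_disp, h_sync_start, h_sync_end, h_total = 768, 790, 794, 1078
v_disp, v_sync_start, v_sync_end, v_total = 342, 342, 346, 370

hsync_hz = dot_clock_hz / h_total    # ~22.26 kHz vs. the 22.25 kHz spec
vsync_hz = hsync_hz / v_total        # ~60.17 Hz  vs. the 60.15 Hz spec

print(f"hsync {hsync_hz / 1000:.2f} kHz, vsync {vsync_hz:.2f} Hz")
```

Both frequencies land within about 0.04% of the monitor's published specs, which is well inside the tolerance of any CRT's adjustment range.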

 
OMG, that is just so ugly as to be beautiful on a 9" CRT. By displayable, do you mean from my card in a NuBus Mac, or something that might be done as an alternate resolution on a GS card? Thinking the former. Might there be enough play in the vertical adjustment to square the pixels up into letterboxed wide-aspect-ratio goodness without releasing the magic smoke?

No joy on the lower-frequency crystals; I've only got the 64MHz crystal for the big SuperMac monitors. I wonder how much crystals go for from parts suppliers; they're all over eBay in 15.6MHz modules for whatever.

 
OMG that is just so ugly as to be beautiful on a 9" CRT. By displayable, do you mean from my card in a NuBus Mac or something that might be done as an alternate resolution on a GS Card
I was assuming your card, if it were so equipped with the 24mhz crystal that's in your screenshot. If you don't actually have said crystal, then I guess so much for that idea.

As I've mentioned before, using a newer card that actually *does* have a programmable frequency generator for the pixel clock would totally eliminate these problems. It's actually kind of pathetic that the Supermac card only has a single crystal on it. I mean, I guess I'm kind of confused again; can you even change resolutions without changing crystals? (The manual kind of implies no?) At least contemporary SuperVGA cards that lacked programmable clock generators compensated by slapping more crystals onto the card where they could all be accessible at once. (It's kind of comical, actually, how many cans you'll find on really old VGA cards. Check this bad boy out. It's not even a particularly high-end card.)

[Attachment: photo of an old VGA card bristling with crystal cans]


 