
Gorgonops

About Gorgonops

Profile Information

  • Gender
    Not Telling

Profile Fields

  • OCCUPATION
    Board-Certified Kvetcher

  1. I kind of thought it went without saying/repeating that if you actually want to feed *anything* from your SuperMac card into the compact monitor you'll need something like a Micron neck board (or requisite hacks to the original analog board) *and* the necessary hacks to the horizontal drive outputs. If you have a sufficiently flexible multisync monitor lying around you can *prototype* the mode on that, but ultimately you're nowhere without doing some actual soldering. Find the data sheet for a CRT similar to the one in a compact Mac and figure out what the drive voltage is for the intensity pin leading to the electron gun, and from there look for a high-speed analog op-amp that can convert the 0-0.7 V range that's standard for VGA-era analog monitor inputs into what the tube wants. Presumably that is what the Micron board has on it, unless said op-amp is on the card itself.
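To put rough numbers on that amplifier (back-of-the-envelope Python only; the 40 V drive swing below is a made-up placeholder, substitute whatever the data sheet for your actual tube says):

# Rough amplifier requirements for boosting a 0-0.7 V analog video signal
# up to a compact Mac CRT's video drive level. The 40 V swing is an ASSUMED
# placeholder -- use the drive voltage from your tube's data sheet.
vin_swing = 0.7          # volts, standard VGA-era analog video level
vout_swing = 40.0        # volts, assumed tube drive swing (check the data sheet!)
dot_clock = 15.6672e6    # Hz, compact Mac pixel clock

gain = vout_swing / vin_swing
video_bw = dot_clock / 2          # crude estimate: video bandwidth ~ half the dot clock
gbw_needed = gain * video_bw      # gain-bandwidth product the op-amp has to cover

print(f"voltage gain needed: ~{gain:.0f}x")
print(f"video bandwidth:     ~{video_bw / 1e6:.1f} MHz")
print(f"op-amp GBW needed:   ~{gbw_needed / 1e6:.0f} MHz or better")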
  2. I was assuming your card was equipped with the 24 MHz crystal that's in your screenshot. If you don't actually have said crystal then I guess so much for that idea. As I've mentioned before, using a newer card that actually *does* have a programmable frequency generator for the pixel clock would totally eliminate these problems. It's actually kind of pathetic that the SuperMac card only has a single crystal on it. I mean, I guess I'm kind of confused again; can you even change resolutions without changing crystals? (The manual kind of implies no?) At least contemporary SuperVGA cards that lacked programmable clock generators compensated by slapping more crystals onto the card where they could all be accessible at once. (It's kind of comical, actually, how many cans you'll find on really old VGA cards. Check this bad boy out. It's not even a particularly high-end card.)
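For what it's worth, here's a toy sketch of why a programmable clock synth beats soldered crystals (generic divider math against the 14.318 MHz reference crystal you'll typically find on a PC video card, not a description of any particular chip):

# Search for an integer N/M divider pair that approximates the compact Mac
# dot clock from a 14.31818 MHz reference -- the kind of thing a programmable
# clock generator does on the fly instead of needing a dedicated crystal.
ref    = 14.31818e6   # Hz, common reference crystal on PC video cards
target = 15.6672e6    # Hz, compact Mac dot clock

err, n, m = min(
    (abs(ref * n / m - target), n, m)
    for n in range(2, 128)
    for m in range(2, 128)
)
print(f"N={n}, M={m} -> {ref * n / m / 1e6:.4f} MHz (off by {err / 1e3:.1f} kHz)")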
  3. You know, if you really want to get your hands dirty here's a lame suggestion: the only thing the monitor really cares about is the vertical and horizontal frequencies and the widths of the sync pulses. It doesn't actually care about the dot clock (within reason, anyway). If your video card has a 24 MHz crystal installed you could just fake up a video mode that otherwise fits your monitor but has a different horizontal resolution. For laughs I went back to the modeline generator and, since 24 MHz is about half again larger than 15.6 MHz, diddled around with the numbers for a 768x342 pixel mode. Here's the XFree86 modeline I came up with: Modeline "768x342" 24 768 790 794 1078 342 342 346 370 -hsync Obviously this mode would result in non-square pixels on a 4:3 CRT, but I think there's pretty good odds it would be *displayable* on a compact Mac monitor. (The hsync/vsync frequencies are within tiny fractions of a percent of correct and the hsync/vsync pulse widths are also very close to specification.) If you want to try messing around with this before locating a 15.6 MHz crystal, well, there you go.
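Quick sanity check on where that modeline lands (plain arithmetic; the "spec" numbers are just the usual published compact Mac figures):

# Work out the sync frequencies implied by the 768x342 @ 24 MHz modeline:
# Modeline "768x342" 24 768 790 794 1078 342 342 346 370 -hsync
dot_clock = 24e6    # Hz
htotal    = 1078    # pixels per scanline, including blanking
vtotal    = 370     # lines per frame, including blanking

hsync = dot_clock / htotal   # horizontal scan rate
vsync = hsync / vtotal       # vertical refresh rate

print(f"hsync: {hsync / 1e3:.2f} kHz (compact Mac spec: 22.25 kHz)")
print(f"vsync: {vsync:.2f} Hz   (compact Mac spec: 60.15 Hz)")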
  4. So, mystery solved, the crystal frequency has to be the pixel clock. The 24 MHz 640x480 in your screenshots *is* a nonstandard mode. I think you have all the information you need to make educated guesses about all the other fields. If you have a "very capable" multisync monitor handy then by all means you can start shopping for a crystal. (Although it is worth verifying that your monitor actually supports hsync frequencies below VGA's 31 kHz; many later multisyncs have that as the floor.)
  5. Yes. As I believe has been thoroughly explored in previous threads, the substitute neckbeard, er, neck board from Micron substitutes its own analog-proportional intensity feed for the digital-only amplification stage on the native analog board, and for that it only needs to override a single line. This was their alternative to having to replace the analog board. My guess as to how the card actually operates, given that according to the wiring harness it's taking *in* the Hsync and Vsync signals from the Mac motherboard and *then* feeding them into the connector on the analog board, is that it essentially "genlocks" to the timing signals output from the Mac instead of generating its own Hsync and Vsync signals when it's operating in parasitic mode. That's not a sure thing, it *could* just have the inputs from the motherboard pass through a MUX so they can be switched in and out, but if I had to put money on it I'd go with the genlock theory, if for no other reason than I doubt that those outputs can be shut off (IE, the internal video timing chain definitively disabled) on the motherboard, and it might create noise/interference problems to have two separate clocks running at the same frequency but out of phase inside such a small box. Presumably there *is* a MUX onboard the card that allows it to either feed through the motherboard video (via a suitable amplifier) to the modified neck board or to switch it out and sub in its own.
On the driver side I'm guessing that when the Micron is running in internal mode it commits some sort of rudeness to delete the internal video from the list of available video devices when it's in control, so the original B&W framebuffer doesn't show up as a phantom headless viewport. This override of course does not happen when the card is configured to run an external display.
Remember the important takeaway from the DOS/MP3 build: the "Hsync" signal from the Mac motherboard isn't actually a sync signal, it's a drive signal. The monitor actually cannot "sync" to anything because it doesn't have an internal oscillator. The guy doing the build admitted his solution of tacking in some one-shots was kind of a hack. This is *another* reason to think that the Micron card is actually using the hsync/h-drive from the Mac motherboard instead of generating it on its own. Toby and friends can actually sync lower than this because they support NTSC interlaced output.
This is another reason why it's bad to myopically concentrate on this one video card: it's a proprietary beast that you apparently don't have full documentation for. You keep sticking the same screenshot of it up on the screen; do you even have the card plugged into something so you can *play* with putting different numbers in it and see how it affects the output? I honestly do not know how to interpret what it needs for a crystal. The 24 MHz it shows in that screenshot is a little bit low for the pixel clock of standard 640x480@60 Hz VGA, which is 25.175 MHz, so it's probably *not* the pixel clock, at least not exactly? You *can* put together a 640x480 mode with a 24 MHz pixel clock by tweaking the margins smaller than usual, but it's going to be a little bit out of spec for a normal VGA monitor and might need some tweaking of the h/v size knobs. Maybe what that control panel does is, when you give it a resolution and a crystal speed, it tries to fudge the rest of the numbers to come up with something that works?
Again, I certainly have never played with it so I have *no idea* how it works, if it autofills any of the fields based on what you input on other fields, etc.
Basically all VGA cards made since the mid-1990s have *extremely* flexible timing PLLs that can be adjusted to essentially arbitrary frequencies from NTSC on up. This happened because it became pretty clear pretty fast that slapping on a separate crystal for every pixel clock was not going to be scalable. (Original 1987 VGA cards have two crystals on them, a 25.175 MHz one and a 28.322 MHz one, and all the video modes from 320x200 up to 640x480, plus the 720x400 text mode, are derived from simple divisions of those crystals. And the original VGA always used the same horizontal sync rate; it managed this by running at either 60 or 70 Hz depending on the number of lines in the video mode. Semi-interesting trivia: the only mode at which the original VGA runs at 60 Hz is the 640x480 graphics mode; in text and CGA/EGA backwards-compatibility modes it's 70 Hz.) Your SuperMac card apparently both has PLLs *and* swappable crystals, which makes it weird. Unless you have complete documentation as to how it works and how much "fudge" the PLLs give you in selecting crystals I don't really know how to even guess what the right value would be, other than maybe it's in the ballpark of 15.6 MHz. Another worry might be that the software simply won't accept lower values than 640x480 even if the hardware is capable. Have you tried it?
At this point I would almost suggest your best course of action is to go find a really old multisync monitor, like an NEC Multisync 3D or similar, if you can find one that works, so you can experiment at will with sub-VGA signal timings. One of those ancient multisyncs that can handle CGA/EGA/VGA/etc. should sync up with a simulated "I'm driving a Macintosh monitor" timing set and would let you tweak a modeline you're confident produces a picture before you hack the hardware.
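To show what I mean by "tweaking the margins smaller" for a 24 MHz 640x480 mode, here's the rough arithmetic (untested numbers, just illustrating the trade-off):

# How much horizontal blanking is left if you run 640x480 at roughly VGA scan
# rates from a 24 MHz dot clock instead of the standard 25.175 MHz.
std_clock  = 25.175e6
std_htotal = 800                            # standard VGA 640x480 horizontal total
std_hsync  = std_clock / std_htotal         # ~31.47 kHz

alt_clock  = 24e6
alt_htotal = round(alt_clock / std_hsync)   # hold hsync constant, shrink the total

print(f"standard blanking: {std_htotal - 640} pixels")
print(f"24 MHz blanking:   {alt_htotal - 640} pixels "
      f"(htotal {alt_htotal}, hsync {alt_clock / alt_htotal / 1e3:.2f} kHz)")
print(f"refresh with the usual 525-line vertical total: "
      f"{alt_clock / alt_htotal / 525:.2f} Hz")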
  6. Here it is, and it includes a schematic for fixing the horizontal drive pulse. http://members.optusnet.com.au/eviltim/macmp3/macmp3.htm Again, note that this doesn't solve the 1-bit digital input problem for the video information.
  7. I fudged the horizontal sync pulse and back porch widths slightly from what it says in the devnote, because adjusting them to match the specs in that Nerdhut article made the vertical and horizontal sync frequencies match up with the quoted specs for the monitor exactly when using the specified dot clock rate. (With the calculator you specify the dot clock, and the hsync/vsync frequencies are calculated based on the other items you specify.) IE, I made the sync pulse "3" pixels instead of two because the two pixels that *seem* to be specified in the developer notes are only 2/3rds of what's observed by Nerdhut, and I adjusted the back porch down by a few pixels to compensate and make everything else line up. It's very likely the monitor would work fine with either my/Nerdhut's settings or the 2/178 that's in the DevNote, so I recommend not obsessing over it. The DevNote's settings make for a pretty short sync pulse and reduce the effective vertical frame rate slightly compared to what the published specs say it's "supposed" to be, but only by a fraction of a percent.
Have you actually worked out the analog pieces you're going to need to make this happen? As I mentioned earlier, the compact Mac's driver board doesn't take standard TTL vsync/hsync signals. There's an old article floating around, which I'm sure you'll find a link to if you search enough, where someone turned a compact Mac into a DOS MP3 player by driving the CRT from a VGA feature connector; I believe that article described the difference between the signals. My only vague recollection is that at the very least you'll need an inverter on one of the lines?
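If it helps to see those pulse widths as time rather than pixel counts, the conversion at the compact Mac dot clock is trivial:

# Convert hsync pulse widths from pixels to microseconds at 15.6672 MHz,
# mostly to show how little difference 2 vs. 3 pixels actually makes.
dot_clock = 15.6672e6            # Hz
pixel_time = 1.0 / dot_clock     # seconds per pixel

for pixels in (2, 3):
    print(f"{pixels}-pixel hsync pulse: {pixels * pixel_time * 1e6:.3f} us")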
  8. I'm kind of confused as to why you're looking in the LC II devnote for anything related to this if your goal is to set up a Compact Mac-compatible display mode? The Compact Mac screen does *not* run at the same scan rates as the 12" 512x384 monitor. If you're serious about trying to pull this off, IE, converting a B&W Mac monitor to run off of a more conventional video card, I have a suggestion for you: put this SuperMac video card out of your head for a while, because it's actually a pretty lousy thing to be starting with. Instead of having a programmable timing generator like you'll find on more modern cards it requires swapping oscillators to implement new video modes, and of course its tool to set up said modes isn't by any means a standard thing, so you're going to have a lot of trouble finding anyone able to help you translate specs into its lingo.
I'd suggest starting with a rotgut PC with a simple VGA card (almost anything will do; ATI cards have always been a good choice from a compatibility perspective), slap Linux or FreeBSD on it, and once you've determined how you're electrically going to interface the VGA card to your compact Mac monitor (an exercise for the reader; remember that in addition to either swapping the neck board out or radically hacking the analog board so it will accept an analog grayscale signal, you'll have to add some circuitry to make it accept standard sync signals; that has been discussed in a few other threads) you can try using a standard modeline calculator to come up with a video mode that works.
Based on the values in the "important numbers" box on this page I used this modeline calculator to come up with the following XFree86 modeline: Modeline "512x342" 15.6672 512 526 529 704 342 342 346 370 -hsync Some of the input values were arrived at by trial and error, but what the calculator says they add up to looks like a pretty good fit for what we know. Once you have this working with something like a standard video card then perhaps you can figure out how to translate a modeline into what your SuperMac's control panel wants and also know what crystal you need to buy.
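If you'd rather see where those totals come from than trust a web calculator, the arithmetic in the other direction is short (the spec numbers are the published compact Mac ones):

# Derive the 512x342 modeline's totals from the published compact Mac specs:
# 15.6672 MHz dot clock, 22.25 kHz horizontal, 60.15 Hz vertical.
dot_clock    = 15.6672e6
target_hsync = 22.25e3
target_vsync = 60.15

htotal = round(dot_clock / target_hsync)             # pixels per line incl. blanking
vtotal = round(dot_clock / htotal / target_vsync)    # lines per frame incl. blanking

print(f"htotal = {htotal} (the modeline uses 704)")
print(f"vtotal = {vtotal} (the modeline uses 370)")
print(f"works out to {dot_clock / htotal / 1e3:.2f} kHz, "
      f"{dot_clock / htotal / vtotal:.2f} Hz")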
  9. Another thing to think about here: regardless of whether or not pulling that little fake declaration ROM off the board in an SE/30 would make it "not see" the display hardware, you have to remember that the actual display hardware would be there regardless. IE, there would still be address decoders sitting on the bus enabling the SE/30's VRAM and whatever control ports it presents inside the SE/30's slot area, *unless* there is some sort of mux/buffer that has to be explicitly enabled to allow that hardware to work, and the control port for *that* would also likely reside somewhere inside the slot $E area. That stuff is *electrically* going to conflict with any other PDS device you try to put there. The fact that the SE/30 displays "simasimac" instead of a blank screen when it's broken in such a way as to prevent it from starting is a clear indication that the hardware in there is completely "hard-coded" and doesn't require any software initialization to come on and start pushing pixels out the door. (Video cards that require programming something like a CRTC chip *don't* display anything if the CPU isn't starting unless the chip's registers are set to come on in a working base config; this is the case with the display chips in some old home computers, but not so much anything more modern.) That is also a really big strike against any idea that it's possible to "soft-disable" it.
  10. From a link in the last link: (Unfortunately the corruption that wipes out a few of the values appears to be in the original PDF for the Classic II Developer Note, at least the first version I stumbled across. But you can actually copy-paste the text out of that area if you have the PDF and see some of what's cut out.)
  11. Gorgonops

    Color Classic doesn't boot off external HD

    Neither BasiliskII nor Mini vMac are particularly good for this because they both always treat virtual disks as if they were gigantic floppies. Therefore you can't run the tools necessary to create a valid disk/partition label, which you'll need to have on the SD card. (It is barely possible to pull this off in BasiliskII if you can convince it to treat your SD card as a generic SCSI device, because this *will* expose it to BasiliskII in such a way that it "looks like" a SCSI drive to Drive Setup, but that's really picky to make work and it usually only works at all under Linux.) If you can't get SoftMac to run then you can probably do it in MAME, as outlined here: http://www.savagetaylor.com/2018/12/26/setting-up-your-vintage-classic-68k-macintosh-using-mame/ Just fair warning, MAME is really unpleasant and arcane to set up compared to any other Mac emulator. The last time I played with it the Mac emulation was also horrendously buggy. (As late as 2015 it couldn't even boot System 7. I guess that's fixed? Hard to tell because the documentation is so bad.)
  12. Gorgonops

    Is an inverter bi-directional: 74LS04 specifically?

    If the original LC *does* pass the FC3 code to the expansion connector because it expects cards to select on different ranges depending on whether it's in 24 or 32 bit mode, and all subsequent LC machines don't, then that's an annoying anomaly. It would *imply* at least that a card manufacturer that wanted to make a card that worked in all LC models would have to make hardware that groks and responds properly to both methods, even if the low-address decoding is really only needed on the one model. That is one thing that has me a little confused, and makes me wish the original LC DevNote was out there in the wild. (Maybe the equivalent information is in one of the "Inside Macintosh" volumes.)
    The original 68020 Mac II shipped with that proprietary chip in the MMU socket that apparently provided some subset of the MMU's mapping capabilities to switch between 24 and 32 bit addressing mode. Given how the "D..." manual talks about "the computer hardware translates 24-bit addresses" and doesn't specifically call out the original Mac II as an exception, I'd assume that if all it says about decoding is true then it must be capable enough to handle that... and based on that I naturally assumed the LC must include the equivalent functionality built into the chipset somewhere. But, I dunno, maybe it just doesn't? I guess you could technically get away with it, since it has that hard 10MB RAM ceiling, if you were careful about where you put everything else... with one exception? If the original LC actually supports the full 16MB normal slot space then you'd *have* to at least have some kind of switching to keep the ROM and RAM from being shadowed into said slot space when running in 32 bit mode. Maybe there's something in the chipset that suppresses chip select for all motherboard devices when the address bus is in $FE00:0000-$FEFF:FFFF? (My uneducated guess as to why the original LC/LC II don't have A28-A30 is that it thwarts using the 256MB SuperSlot spaces to make a RAM expansion board or other peripherals that would violate the "cheap and cheerful" designation of the LC? It's the only good reason I can think of.)
    Given that what you've got there is a network card that almost certainly only occupies the 1MB minor slot's worth of memory and I/O then, yeah, it might make life easier, because you "only" need to fake out the 32 bit $E enable on the PDS card to respond to your chosen alternate slot. In experimenting with the LC what did you read on the address bus when it was looking at the declaration ROM?
  13. Gorgonops

    Is an inverter bi-directional: 74LS04 specifically?

    Last broken record post, I swear. From the LC II devnote:
  14. Gorgonops

    Is an inverter bi-directional: 74LS04 specifically?

    Okay, if it's verified then I guess you know what you're doing. (I tried looking for specific discussion of that in the other thread, I really did, and honestly I just couldn't follow it through all the interstitial pictures of wiring looms. Maybe I need more coffee.) I'll just note one last time that my concerns are based on passages like this: (Page 133 of D..3rd ed:) This *is* in the NuBus section of the manual, but there are passages in the PDS section (that I think I've quoted before) that made me think that this translation still happens with PDS cards. But, hey, if I'm wrong I'm wrong. Do you have a reference that shows PDS cards actually need to decode both the standard *and* minor slot spaces? I assume the A28-A30 decoding issue is why you have the PLD in there. Are the "A22LC" and "A26" outputs from that chip essentially what's going to be used for device enable for the LC card?
  15. Gorgonops

    Is an inverter bi-directional: 74LS04 specifically?

    My point is that I'm not entirely convinced you are hitting the correct address lines. The Apple manual reads an awful lot like the MMU takes care of presenting a 24 bit view of the 32 bit normal slot space, so that you don't need to have two separate sets of decode on the card. And if that is actually true then inverters on A20-22 are not going to change the address decoding in a useful way; to modify the 16MB slot space you should be playing with A24-26. And worse, if you really think there is going to be a problem with a conflict in 32 bit mode, you can't get away with ignoring it, because it also reads to me like the machine is going to be in 32 bit mode during hardware initialization when it's looking for declaration ROMs, etc., regardless of what the control panel setting for the OS is. I also got the distinct feeling looking at the third edition of the "Designing..." Manual's LC PDS section that there are some wrinkles about how chip select works for LC PDS that are different enough from 030 PDS that you're going to need to address them too. (You did notice how the LC PDS is missing A28-30, right?) But, hey, maybe I have no idea.
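To put concrete numbers on which address lines actually pick the slot (plain arithmetic, nothing assumed about the LC chipset; slot $C is just an arbitrary example of an alternate slot):

# Which address bits differ between a card decoded at slot $E and the same
# card answering at slot $C?  Compare the base addresses in both modes.
def differing_bits(a, b):
    diff = a ^ b
    return [bit for bit in range(32) if diff & (1 << bit)]

# 32-bit standard slot space: $Fs00 0000, 16 MB per slot
print("32-bit:", differing_bits(0xFE000000, 0xFC000000))   # -> [25], i.e. A25
# 24-bit minor slot space: $s0 0000, 1 MB per slot
print("24-bit:", differing_bits(0x00E00000, 0x00C00000))   # -> [21], i.e. A21

In other words the slot number sits on A24-A27 in the 32 bit space and on A20-A23 in the 24 bit space, which is why inverting A20-22 only helps if the card is genuinely being addressed through the 24 bit window.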