
Mac "flicker free" oddball frequency resolutions to True60HzVGA by active conversion

Trash80toHP_Mini

NIGHT STALKER
I'm going to say it again.
You don't need to go on and on about it. Ugly business-grade multisync panels are not yet in short supply, got it, enough already. We covered that way up at the top. One day, when those high-hour LCDs fade away, we may well need such an active converter, as all the CRTs will have perished as well. But that need won't come up any time soon, that's clear enough.

This is an offhand technical discussion and very interesting in the details so far, more of a feasibility study, most certainly not a market study.

I like my three big CRTs; they go with the collection. The UltraSharps are convenient, but do not fit in with my collection outside of the G4/OS9 graphics setup. Nor do the KDS/Radius panels I've collected from the Nineties, for that matter, but they do look a lot better than the Dells when I'm playing with one next to the IIfx and big Radius TPD.

Active conversion to HDMI displays would be aimed at working Classic Macs into a primary workstation in the future, where styling won't matter so much. Two or three Macs on a shelf, rotating in and out of the collection and sharing a main display, keyboard, and mouse, would be handy for Classic Mac playtime. ATM a Dell Multisync (1600x1200 20") is the menu screen for my primary display, and I can play with the IIsi that lives under the desk, or any of several other Macs in rotation, on it.

 

Trash80toHP_Mini

NIGHT STALKER
For the most part I basically agree that this is a solution looking for a problem.
Agreed, somewhat disappointed at that, but I'll agree.

You can't just "analog-ishly" pitch frames onto the floor to resolve frame-rate mismatches. Remember, what's being transferred in the video signal at any given moment in time is the color/intensity of a single pixel *somewhere* in the 2D grid that makes up the screen, and that feed is completely continuous.
That feels like too much of an oversimplification of what I was trying to say. The video signal is a stream of those values as you describe, each right on the dot clock. Gate valving the higher frequency stream of the first four frames to match the display's expected input over four consecutive frames keeps that stream steady. The fifth frame gets dropped when it's time to draw the first of the next four frames and so on.

IOW, you're analog-ishly retarding each and every pixel signal to match phases over four full scanline sequences, and then jumping to the sixth scanline sequence of the input to start the first frame of the next four on the display end, as there is no time remaining for that fifth frame to be drawn/converted/whatever?

That's muddy as all get out as well, maybe it's backwards? :blink:

edit: a gate valve is used to lower fire hydrant water pressure levels to garden hose water pressure levels. 75FPS being the hydrant and 60FPS being the garden hose.
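edit 2: putting actual numbers on the hydrant/garden-hose picture (my own back-of-envelope sketch, nothing to do with any real hardware): 75Hz to 60Hz is an exact 5:4 ratio, so keeping 4 of every 5 input frames lands exactly on the target rate.

```python
from fractions import Fraction

def kept_frames(input_frames, in_hz=75, out_hz=60):
    """Indices of input frames that survive the 4-of-5 drop."""
    ratio = Fraction(out_hz, in_hz)   # reduces to 4/5 for 75 -> 60
    keep, group = ratio.numerator, ratio.denominator
    # keep the first `keep` frames of every `group`, drop the rest
    return [i for i in range(input_frames) if i % group < keep]

# one second of input: 75 frames in, 60 kept, 15 on the floor
kept = kept_frames(75)
assert len(kept) == 60
```

The same arithmetic also shows why 66Hz or 67Hz into 60Hz is so much uglier: Fraction(60, 67) doesn't reduce at all, so there's no tidy small repeating drop pattern.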

 
Last edited by a moderator:

Gorgonops

Moderator
Staff member
There isn't any way to do what you're describing because, think about it: a gate valve controls *pressure*, which is roughly analogous to voltage. The electronic analog is a simple resistor. The problem here is *frequency*, IE, we need to reduce the number of phase transitions per second for an output relative to its input without losing any data except for discrete packets we choose to throw out. There's no way to do that I can think of without a "memory", and further I'm having trouble thinking of any analog memory systems (like, I dunno, mercury delay lines) that allow the output to be "clocked" at a different rate than the input.
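To make the "memory" requirement concrete, here's a toy model (a Python sketch of my own, not any real converter): a single-frame buffer sitting between a 75Hz input and a 60Hz output. The buffer is the memory; frames that get overwritten before the output side reads them are the ones thrown on the floor.

```python
def simulate(seconds=1, in_hz=75, out_hz=60):
    """Toy single-frame-buffer rate converter. Returns (shown, dropped)."""
    buffer = None        # the one frame of "memory"
    shown = dropped = 0
    # interleave write events (input side) and read events (output side)
    events = [(i / in_hz, "in", i) for i in range(seconds * in_hz)]
    events += [(j / out_hz, "out", j) for j in range(seconds * out_hz)]
    for _, kind, n in sorted(events):
        if kind == "in":
            if buffer is not None:
                dropped += 1     # overwritten before it was ever displayed
            buffer = n
        elif buffer is not None:
            shown += 1           # output side reads and clears the buffer
            buffer = None
    return shown, dropped

shown, dropped = simulate()      # one second: 60 shown, 14 dropped,
                                 # 1 frame still sitting in the buffer at the end
```

A real scan converter needs at least a line or a frame of RAM for exactly this reason; double-buffering avoids tearing, but the bookkeeping is the same.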

 
Last edited by a moderator:

NJRoadfan

Well-known member
The OSSC has applications on more modern systems as well. It supports sync-on-green, composite sync, and H+V sync sources, so it has utility in that area too. The scaler function can be disabled if one wants 1:1 output on the digital side. Regarding refresh rates, the "mismatch" is rarely a problem with displays. You won't get judder/tearing from the mismatch (67Hz into a 60Hz native display) simply because a vintage Mac can't output a solid 67 or 75 frames per second via its onboard video chip anyway (remember that this was the era of postage-stamp-sized 15fps video).

 

Cory5412

Daring Pioneer of the Future
Staff member
One day, when those high hour LCDs fade away,
They're still being made TODAY, except now with LED backlights. For example: https://store.hp.com/us/en/pdp/hp-prodisplay-p17a-17-inch-5:4-led-backlit-monitor-(energy-star) (disclaimer: I haven't used this particular display personally; I have a Dell 19-incher from 2014 with most of the same feature set)

The supply of vintage Macs will dry up before the supply of these displays does.

Like I said, the majority of vintage Macs (even the majority of modular 68k Macs) will output in a format even relatively cheap TVs can handle. The two modular Macs that are currently in a questionable state on this front can have a nubus display adapter installed.

This is largely a solved problem.

 

Gorgonops

Moderator
Staff member
Regarding refresh rates, the "mismatch" is rarely a problem with displays. You won't get judder/tearing from the mismatch (67Hz into a 60Hz native display) simply because a vintage Mac can't output a solid 67 or 75 frames per second via its onboard video chip anyway (remember that this was the era of postage-stamp-sized 15fps video).
So far as I'm aware, (most?) regular LCD monitors/TVs refresh their LCDs at the same rate as the input, so the only circumstances under which tearing or judder would appear would be with a buffering converter. You certainly wouldn't see it with the OSSC, since its whole selling point is that there's no buffering; it's designed with minimizing any and all lag as the primary success criterion.

Here's an example of an FPGA-driven converter targeting old-school systems that does frame-rate conversion:

https://www.serdashop.com/MCE2VGA

It uses the oft-forgotten 400 lines@70Hz timings that the original VGA cards used for text and EGA high-res modes. The modes it's using this to display were originally at 50 or 60Hz. It apparently produces very pretty output, but even it gets trashed by that certain subset of gamers that obsesses about *any* delays or video artifacts, even ones that the human eye really can't readily pick out or that only become apparent in certain edge cases. (Diagonal scrolling of playfields, etc.)

Like I said, the majority of vintage Macs (even the majority of modular 68k Macs) will output in a format even relatively cheap TVs can handle. The two modular Macs that are currently in a questionable state on this front can have a nubus display adapter installed.
I think the main sticking point here is that what's desired is some magic glue that will work with *any* old Mac video card, including obscure high-end ones that came out of the box pre-configured to work with *very* specific monitors. It's not an *impossible* ask, but it may be a product with a target market measured in the single digits. (Frankly if your interest is running ancient productivity software that benefits from massive pixel counts you'd be far better off running that on a newer computer than the original 68k platform. Unlike games most of that stuff is decently tolerant of running either on a newer Power Mac with more-modern video hardware or fully in emulation.)

 
Last edited by a moderator:

Cory5412

Daring Pioneer of the Future
Staff member
With all that said, I'm curious to see whether you can distill what you think the problem is.

And, don't just list a bunch of 75Hz video modes, because LCD monitors do fine at 75Hz; they do fine with 832x624 and 1152x870, and most of the Macs that can output those resolutions can go down to 60Hz anyway. Is there a particular video card that can't? Why can't that video card just be replaced with one that can? (You linked to the 24AC video card above, which is a multiscan video card and will be able to do the 60Hz version of every resolution it can do, just as a Quadra or Power Mac's onboard video can.)

As the CRTs all die out, I think that it's just going to be a reality that we aren't going to get 640x870 portrait display mode.

I think the main sticking point here is that what's desired is some magic glue that will work with *any* old Mac video card, including obscure high-end ones that came out of the box pre-configured to work with *very* specific monitors.
Possibly, if so, you've managed to say in a single sentence what I haven't been able to extract out of five or six posts.

If this is true, then my suggestion here is that using those video cards is likely going to have to die along with those displays. It seems to me like the better solution in this scenario is to replace those video cards with something a little more generic, that was itself multisync, such as the cited 24AC video card, or the hypothetical NuBus HDMI output video card that has been discussed. Both are also solutions for systems like the IIci (and IIsi?) that didn't originally allow for a "VGA" mode -- but, again, I'm reasonably sure most midrange business LCDs will accept that signal with no trouble.

I should dig out my IIsi and give it a whirl with my P1914S and my U1504FP.

In the case of the IIci/IIsi in particular, where 640x480 is very close to the to the top capability of the video system, the OSSC seems relevant, especially if someone has a display that for some reason doesn't accept 67Hz input. My question is, what's cheaper: 135 Euro (excl. VAT) for the OSSC or $120USD for a brand new display that will work. (At worst, there's the 'Elite' version for $160, which also includes a USB hub and DVI and DisplayPort connectivity, but I suspect the basic version will work.)

///

Of course, we could also do what the video game people are doing and maintain+repair the CRTs we have. But I know that will be tough: shipping CRTs around, finding someone willing to repair them, and then either funding it well enough that that person can do it professionally, or not burning them out by surrounding them with people's semi-broken CRTs while they work a regular job.

 

Trash80toHP_Mini

NIGHT STALKER
Already acknowledged that. Enough already, that doesn't keep it from being interesting or possibly leading to more useful applications of the concept.

There isn't any way to do what you're describing because, think about it: a gate valve controls *pressure*, which is roughly analogous to voltage. The electronic analog is a simple resistor.
That didn't sound right after some thought, so I researched "Analog Delay Circuit" and came up with some interesting tidbits:

https://en.wikipedia.org/wiki/Analog_delay_line

https://en.wikipedia.org/wiki/Bucket-brigade_device, the concept of which led to the development of the CCD.

A well-known integrated circuit device around 1980, the Reticon SAD-1024[2] implemented two 512-stage analog delay lines in a 16-pin DIP. It allowed clock frequencies ranging from 1.5 kHz to more than 1.5 MHz. The SAD-512 was a single delay line version. The Philips Semiconductors TDA1022[3] similarly offered a 512-stage delay line but with a clock rate range of 5–500 kHz. Other common BBD chips include the Panasonic MN3005,[4] MN3007[5] and MN3205,[6] with the primary differences being the available delay time.
Here I'm curious about modifying the single BNC output of my SE to something more stable for the several multisync LCDs and CRTs I've recently tested with mixed results. Also wondering about application to early Full Page Display cards, if they're analog interfaces, and any other proprietary analog interface for a dedicated display.

A digital version of the same might bring the multitude of proprietary DE-9 TTL-interfaced cards into phase with true VGA adaptation, or a Mac sync rate that can be handled by available LCDs?

The frequency mismatch between the Radius Full Page Display card generations might be another application. I know of at least one member lucky enough to have a Display and a mismatched card. Putting any FPD card's output onto an LCD would be very cool in general.

 
Last edited by a moderator:

Gorgonops

Moderator
Staff member
That didn't sound right after some thought, so I researched "Analog Delay Circuit" and came up with some interesting tidbits:
I mentioned analog delay lines in my previous reply, and I will repeat: those are not the droids you're looking for. These devices introduce delay in signals; they do not change their frequency. Frequency is what you're trying to change if you're attempting to take pixels output at dot-clock X and massage them to run at dot-clock Y. The closest digital analog to a bucket-brigade device is a dynamic shift register; both are essentially a type of memory that can only retain a value by cycling the bits through it, and the bits can only be read off at the same frequency as they were written in. (*)

(* Okay, that's not *entirely* true, I think in theory you *might* be able to fill a bucket brigade device and then clock it down or up and play it out at an alternate rate, I don't know enough about the devices to know for sure, but the sample length actually stored in them is very short, and that's at audio frequencies. These antique devices simply will not work for video applications; there's a reason why they've been almost entirely replaced with digital sampling and RAM.)

Also, just for the record, what "didn't sound right"? You explained it yourself that a "Gate Valve" controls water pressure, and the electrical analogy to water pressure is voltage. Again, pixels are *transitions* in voltage. A plumbing analogy to that might be vibration from water hammer or rapidly slamming the valve open and shut again? Sorry, but I don't know what plumbing apparatus exists to take *vibration* from a pipe full of water and magically change it so it's vibrating at the same amplitude but at a different frequency. (IE, preserving the information embedded in the vibration but "de-clocking" it from the source.)
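To underline the delay-vs-slowdown distinction, here's a toy model of a bucket-brigade device as a clocked shift register (a Python sketch based on the general description quoted earlier, not any particular chip). Each clock tick pushes one sample in and the oldest sample out; the output is delayed by N stages but emerges at exactly the rate it went in.

```python
from collections import deque

class BucketBrigade:
    """Toy N-stage bucket-brigade / analog shift register model."""
    def __init__(self, stages):
        self.stages = deque([0.0] * stages, maxlen=stages)

    def clock(self, sample):
        """One tick: new sample in on the right, oldest sample out."""
        out = self.stages[0]          # the "bucket" at the end of the line
        self.stages.append(sample)    # maxlen discards that oldest bucket
        return out

bbd = BucketBrigade(3)
# the input waveform comes back intact, just 3 ticks late --
# delayed, not slowed
outs = [bbd.clock(s) for s in [1, 2, 3, 4, 5]]
assert outs == [0.0, 0.0, 0.0, 1, 2]
```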

 
Last edited by a moderator:

Trash80toHP_Mini

NIGHT STALKER
There isn't any way to do what you're describing because, think about it: a gate valve controls *pressure*, which is roughly analogous to voltage. The electronic analog is a simple resistor.
The part about the electronic analog being a resistor was what didn't sit right after some reflection. As I understand it, they limit current output, where I'm looking at retarding a signal to a lower frequency.

The problem here is *frequency*, IE, we need to reduce the number of phase transitions per second for an output relative to its input without losing any data except for discrete packets we choose to throw out.
Chucking that extra frame onto the floor seems difficult to do analog-ishly.

There's no way to do that I can think of without a "memory", and further I'm having trouble thinking of any analog memory systems (like, I dunno, mercury delay lines) that allow the output to be "clocked" at a different rate than the input.
Maybe I read the bit about the ICs mentioned above, or something like them, as being up to the job of retarding a single pixel drawn at whatever frequency so it hits the screen a specified interval later than it otherwise would, effectively reducing the frequency of the video stream.

I was thinking about something like a bucket brigade retarding each B&W pixel blip from an output stream generated by the likes of my TPD/SE, slowing it to hit the screen at a 60Hz interval rather than at its native, shorter 75Hz interval. After drawing 4 frames of 5, a counter would start the process over again from frame 6 generated by the TPD card with frame 5 hitting the floor. That's a rough example, dunno the freq of that particular card, doesn't matter for the purposes of illustration.

Everything is timed at the Dot Clock of the card; every tick on that 75Hz would be retarded to 80% so that 4 of every 5  .  .  .

Again I'm sure I've overlooked the memory problem, but maybe not?

@Cory5412 the reason for listing those 75Hz resolutions was that 4-of-5 frame 60Hz/75Hz multiple. That's far simpler than trying to convert 66Hz to 60Hz along the lines of the stepdown approach.

 
Last edited by a moderator:

Gorgonops

Moderator
Staff member
The part about the electronic analog being a resistor was what didn't sit right after some reflection.
Well, technically yes, a resistor isn't the *optimal* way to reduce voltage because technically it restricts current, not voltage, and thus the voltage drop over a resistor is dependent on the current draw. (And it's also terribly inefficient at high currents, converting most of the input power to heat.) In real life you'd want to use a more sophisticated power regulator. "Resistor" was simply shorthand for "a flow limiter" or "kink in the hose" because in the (yes, flawed in certain respects, especially in how it ignores magnetic fields) "Water Analogy" for electricity PRESSURE EQUALS VOLTAGE.

In any case this is completely irrelevant to the underlying point I was making, which is that frequency is NOT equal to pressure.

 
Last edited by a moderator:

Trash80toHP_Mini

NIGHT STALKER
Heh, I'm still trying to piece that post together. Quoting across a page break like that is maddening.

edit: just [:)] signed off on the blather above, fire away!

 
Last edited by a moderator:

Gorgonops

Moderator
Staff member
I was thinking about something like a bucket brigade retarding each B&W pixel blip from an output stream generated by the likes of my TPD/SE, slowing it to hit the screen at a 60Hz interval rather than at its native, shorter 75Hz interval.
NO.

Now, for the third time: delay lines *delay*, they do not *slow*. Perhaps that's an un-intuitive distinction, but it's actually pretty simple: If I pick up the phone and call you from a mile away and that conversation goes through a direct trunk line there will be essentially zero *delay* in the call; you will hear my voice with probably less than a millisecond of delay between the encoding on the microphone end and the reproduction in the speaker. Now let's pretend I call you on a telephone that relays my voice via a laser pointed at one of the reflectors the Apollo astronauts left on the moon. That's almost a four second round trip for light, so there will be a *significant* delay in you hearing the sound of my voice. (If you were looking at me through binoculars you'd see my mouth moving like you were watching a *very* badly dubbed movie.) But, and this is the important part, the pitch of my voice will not change. It is delayed, but not slowed.

Again I'm sure I've overlooked the memory problem, but maybe not?


You clearly have. Think about what you're saying here and ask yourself how the delay line, which *delays* every pixel from arriving at the monitor by, say, the 20% you're bandying about here, also manages to *stretch* that pixel by the same 20% so the waveform is the same, but slower; and then think about what happens to each subsequent pixel as they pile up at the start of your waveform-stretching-with-no-memory-to-buffer creation.
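The pile-up is easy to put numbers on (my own arithmetic, toy figures): if sample k arrives at k/75 of a second but isn't due to leave until k/60 of a second, the count of samples stuck waiting inside the device grows without bound, which is exactly the unbounded "memory" the scheme pretends not to need.

```python
def backlog(k, in_hz=75, out_hz=60):
    """Samples that have arrived but not yet been emitted when sample k arrives."""
    arrival = k / in_hz                      # when sample k shows up
    emitted_by_then = int(arrival * out_hz)  # samples already played out
    return k - emitted_by_then

assert backlog(75) == 15      # after 1 second: 15 samples queued
assert backlog(750) == 150    # after 10 seconds: 150 -- it never stops growing
```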

 
Last edited by a moderator:

Trash80toHP_Mini

NIGHT STALKER
"Water Analogy" for electricity PRESSURE EQUALS VOLTAGE. But that is completely irrelevant to the underlying point I was making, which is that frequency is NOT equal to pressure.
My plumbing analogies always leak. The bucket brigade thing threw me for a loop; one of the images I'm playing with is a gear-driven sprinkler head. :blink:

 

Trash80toHP_Mini

NIGHT STALKER
I believe you! [:)]

Now, for the third time: delay lines *delay*, they do not *slow*. Perhaps that's an un-intuitive distinction, but it's actually pretty simple  .  .  .
Got it, sort of, but "pitch" staying the same appears to conflict with this statement.

.  .  .  ask yourself how the delay line which *delays* every pixel by, say, the 20% you're bandying about here, from arriving at the monitor also manages to *stretch* that pixel by the same 20% so the waveform is the same, but slower, and then think about what happens to each subsequent pixel as they pile up at the start of your waveform-stretching-with-no-memory-to-buffer creation.
You appear to be saying the pitch of the pixel changes? ISTR something about applying filters before and after using a delay to keep a signal consistent, but I'm too tired to read it over again ATM.

Do you want the wave form to be the same if you're trying to match dot clock frequencies in a B&W video signal stream? It was a case of high or low on the dot and phosphor persistence in the CRT took up the slack along with brightness and contrast controls to get an acceptable image  .  .  .  did it matter how long or short the spike might be so long as it's centered on the dot?

Apologies, too tired to think very straight ATM, but still curious.

 
Last edited by a moderator:

Gorgonops

Moderator
Staff member
You appear to be saying the pitch of the pixel changes?
This only refers to the hypothetical machine you've built in your head that somehow accomplishes what real delay lines don't do. It's a thought experiment to imagine exactly how you're supposed to change the pitch of a signal without either storing it and playing it back at a different rate (ie, requiring a memory of some kind) or doing some kind of brutal lossy periodic resampling. (Which is what, for example, toy electronic voice changers do, but those techniques are not applicable to raster video, at least not if it's running at a different frame rate.)

 
Last edited by a moderator:

Trash80toHP_Mini

NIGHT STALKER
OK, I come up with crazy ideas for all kinds of crap, most of 'em don't pan out and some bomb out completely like this one, but every once in a while I set some wheels to turning. Learning is good. Thank you!

@Cory5412 about that "VGA Adapter" moniker. IIRC I've got some in packages that say "Macintosh Video Adapter." I'll have to check; that would be a better term, more in tune with the MacVDO manual I posted, but many say "Mac to VGA adapter" or something similar, like the "Belkin Mac-VGA" product manuals someone posted. What a mess.

 