DVI-out to HDMI to TV on Radeon Mac Edition PCI, OS 9


Does anyone have any experience using the DVI port on an ol' Radeon to output to a TV? I used a cheapo generic DVI-to-HDMI adapter, but I'm not sure that's what's at fault. I've heard reports (though I can't dredge them up at the moment) that the original Radeon had some issues with its DVI port (no output, etc.).

 

I used to have it hooked up via good ol' VGA, but that's not practical now that it's on the opposite side of the room from the TV (I'm using a tested 25-foot HDMI cable these days).


Just for sanity's sake, have you checked to make sure the DVI output works on a computer monitor with a DVI port? Just to reduce the number of variables, etc.

 

At least at the moment I'm lacking the Google-fu to find any answers, or even to formulate the right question, so I'll settle for saying that I think you might be having a real problem that I personally never found the answer to. I got my first HD TV back in 2004, a rear-projection Sony WEGA; it was the first TV in its model line to have both an ATSC tuner and an HDMI port (the previous ones had DVI). One of the first things I did with that TV was buy a DVI->HDMI adapter cable and try hooking it up to various computers, and something I vaguely remember is that some of the oldest video cards I tried simply didn't play ball with it.

What video modes are you using over VGA? (My TV didn't have VGA so that wasn't an option.) Is the computer picking up the correct native resolutions, etc? I think one reason the WEGA didn't work with really old video cards is it would *only* list standard HDTV modes as options in its EDID resolution list and said cards would wig out at that at boot time even though they were technically able to display those resolutions later.


Early Radeons and later versions need different DVI-to-HDMI adapters. I tried to use an original adapter from an HD2600XT on an HD4650 and got no output, but a new adapter worked fine.


DVI to HDMI *should* be just a passive wiring adapter. I've never heard of there being "different" versions...

 

Okay, this page says that ATI may have done something weird and nonstandard to enable audio over DVI(?) that depends on a nonstandard adapter plug. But I 100% guarantee that's not going to be an issue on any Radeon card that works with OS 9. (I mean, there are "early" Radeon cards and there are *EARLY* Radeon cards. ATI was making Radeon cards for seven years before the 2600XT came out.)

 

I'm trying to remember now if my TV worked when connected to a Titanium Powerbook. I have vague memories of having issues with that but it working fine on an Aluminum model...


There is no "standard" way to do audio over DVI. A number of vendors came up with hacks to do it (apparently for the sole purpose of supporting proprietary DVI-to-HDMI adapters) but, yes, an ancient Radeon card is never going to do it. When using a DVI->HDMI cable back in the day I always kept an RCA to mini-jack audio cable plugged in to take care of that.

Again, my vague guess is that, assuming the DVI port on the card works with monitors (it's not a flashed card, is it? All bets might be off there), there's something about the EDID information that the card just isn't grokking.
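
If anyone wants to check what the set is actually advertising, the EDID isn't hard to pull apart by hand. Here's a rough sketch (assuming a Linux box with the TV attached so the kernel exposes the raw blob under /sys/class/drm/, or a dump saved from the get-edid tool; the connector path in the comment is just an example) that decodes the first detailed timing descriptor, which is normally the TV's preferred/native mode:

# Rough sketch: pull the preferred (first) detailed timing out of a raw EDID
# dump, e.g. /sys/class/drm/card0-HDMI-A-1/edid on a Linux box.  The path is
# an example -- any 128+ byte EDID blob will do.
import sys

def preferred_timing(edid: bytes):
    if len(edid) < 128 or edid[:8] != bytes.fromhex("00ffffffffffff00"):
        raise ValueError("doesn't look like an EDID block")
    dtd = edid[54:72]                       # first detailed timing descriptor
    pixclk = int.from_bytes(dtd[0:2], "little") * 10_000   # pixel clock in Hz
    if pixclk == 0:
        return None                         # descriptor isn't a timing
    hact   = dtd[2] | ((dtd[4] & 0xF0) << 4)
    hblank = dtd[3] | ((dtd[4] & 0x0F) << 8)
    vact   = dtd[5] | ((dtd[7] & 0xF0) << 4)
    vblank = dtd[6] | ((dtd[7] & 0x0F) << 8)
    refresh = pixclk / ((hact + hblank) * (vact + vblank))
    return hact, vact, refresh

if __name__ == "__main__":
    with open(sys.argv[1], "rb") as f:      # e.g. a dump saved by get-edid
        timing = preferred_timing(f.read())
    if timing:
        print("TV claims native %dx%d @ %.2f Hz" % timing)
    else:
        print("first descriptor isn't a timing -- unusual")

If the only modes in there are HDTV timings an old card's firmware has never heard of, that would line up with the "wigs out at boot" behaviour described above.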


I have a machine with a Radeon X800 (is that early enough? :) ) using a DVI-to-HDMI adapter that came with a much newer HD3870, with no problems on some random Vizio 1080p HDTV. It's a 100% passive adapter, and DVI-to-HDMI cables date back to the port's creation. Audio over DVI on newer cards is carried the same way as it is over HDMI: it's embedded in the video signal (HDMI has no dedicated audio pins). A 25 ft HDMI cable could be a problem, though, as signal integrity degrades with cheap cables on longer runs.
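
To spell out the "passive" point: every signal on a single-link DVI-D connector has a one-to-one counterpart on an HDMI Type A plug, so the adapter really is just wire. A rough signal-level sketch (names only, no pin numbers, and treat it as an illustration rather than a wiring reference):

# Signal-level view of a passive single-link DVI-D <-> HDMI Type A adapter.
# Names only, no pin numbers -- check a real pinout before wiring anything.
DVI_TO_HDMI = {
    "TMDS Data0 +/-":             "TMDS Data0 +/-",
    "TMDS Data1 +/-":             "TMDS Data1 +/-",
    "TMDS Data2 +/-":             "TMDS Data2 +/-",
    "TMDS Clock +/-":             "TMDS Clock +/-",
    "DDC clock/data (EDID, I2C)": "DDC clock/data (EDID, I2C)",
    "+5V / ground":               "+5V / ground",
    "Hot plug detect":            "Hot plug detect",
}

# What HDMI adds that DVI has no wire for.  Note there is still no dedicated
# audio pin; audio rides inside the TMDS video stream as data islands.
HDMI_ONLY = ["CEC", "Utility/HEAC (later HDMI revisions)"]

# What DVI has that HDMI drops.
DVI_ONLY = ["Second TMDS link (dual-link DVI)", "Analog RGB/sync (DVI-I only)"]

if __name__ == "__main__":
    for dvi_signal, hdmi_signal in DVI_TO_HDMI.items():
        print(f"{dvi_signal:30s} -> {hdmi_signal}")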

15 hours ago, NJRoadfan said:

Audio over DVI on newer cards is carried the same way as it is over HDMI: it's embedded in the video signal (HDMI has no dedicated audio pins).

Sure, but technically speaking, if your video card has a DVI port that supports audio with a passive adapter, it's really just an HDMI port with a DVI connector on it. (Well, okay, it may also do dual link.) HDMI wasn't ratified as a standard until late 2002, which postdates most OS 9-compatible video cards, so... yeah, audio is not going to be in play here.

 

(Some googling suggests the HD 2000 series, circa 2007, was the first Radeon that supported audio on the DVI port, and the Wikipedia article suggests the HDMI dongle for those may have had some proprietary-ishness to it.)


The Amazon reviews of the Radeon 9200 Mac Edition mention, back in 2006, that the DVI port doesn't work with HDMI TVs: https://www.amazon.com/ATI-Radeon-9200-Video-100-436011/dp/B0002F8MJY

 

The oldest DVI-equipped card I have is a 1999 Matrox Marvel G400-TV with the official DVI daughtercard add-on. Granted, it's in a PC and not made by ATI, but I should see if it works with HDMI TVs.


I want to say that my TV worked with the DVI port on an Aluminum Powerbook (Radeon 9600)... actually, the guy in that review mentioning HDMI also says it worked on his Powerbook, so I guess it must have.

 

I'm trying to remember specifically what I tried back in the day that failed; I'm pretty sure I had a Radeon 7500 and several different models of Geforce (ranging from 2 to 4MX... or 5000-something, maybe? It definitely did not work with the Geforce 2; its DVI port hardly worked with monitors) still in play on the PC side, and I *may* have tried it on a Titanium G4, but I honestly can't remember. (I ended up leaving it connected to the oldest card that worked in a junk PC set up to run emulators. It was mind-blowingly awesome back in the day to play N64 Super Smash Brothers hacked for widescreen, using PS2 controllers instead of those atrocities the original hardware came with... anyway.)


While ATI was early with the HD series cards that had audio built in, Nvidia was very late to the game and required a two-wire cable running from the digital audio out on your motherboard to the card to get audio with HDMI adapters.


If you really *need* audio to go over HDMI to your TV (instead of running an analog wire separately) you can get active DVI->HDMI converters that include an audio encoder. (This one is analog, but there are also units that accept coaxial/optical digital streams... and others that have a USB plug hanging off them to drive an embedded 2-channel sound card.) I'm curious how well those work in practice.

Currently I have an HDMI->DVI adapter in service but it's going the other way, allowing me to drive an old 18" 1280x1024 computer monitor with a Raspberry Pi velcroed to the back. I of course had to mess with the boot settings to get the Pi to output audio over the RCA jack instead of HDMI.


It only matters if you are trying to build a set-top media machine and don't want to mess with multiple cables and inputs. I prefer ATI cards for this because they have the hardware HD video decoder built in (newer Nvidia cards do as well, but I use old hardware).


I have had this problem with several old video cards. Looking into it, DVI and HDMI are not as compatible as most people think. For one thing, HDMI TVs tend to default to the Y′CbCr colour space and treat RGB as the exception, while DVI sources output RGB and Y′CbCr support is optional.
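
For anyone curious what that mismatch means at the signal level, here's a rough sketch of the RGB-to-Y′CbCr conversion an HDTV expects to be applied to HD video (BT.709 coefficients and limited-range levels are assumptions on my part, just for illustration; a passive adapter obviously can't do any of this, which is the point):

# Sketch of the RGB -> Y'CbCr conversion used for HD video
# (BT.709 coefficients, "limited range" quantisation).
# Inputs are 8-bit full-range R'G'B' as a DVI source would send them.
KR, KB = 0.2126, 0.0722          # BT.709 luma coefficients
KG = 1.0 - KR - KB

def rgb_to_ycbcr(r8: int, g8: int, b8: int):
    r, g, b = r8 / 255.0, g8 / 255.0, b8 / 255.0
    y  = KR * r + KG * g + KB * b
    cb = (b - y) / (2 * (1 - KB))        # -0.5 .. +0.5
    cr = (r - y) / (2 * (1 - KR))
    # quantise: Y' gets 16-235, Cb/Cr get 16-240 centred on 128
    return (round(16 + 219 * y),
            round(128 + 224 * cb),
            round(128 + 224 * cr))

print(rgb_to_ycbcr(255, 255, 255))   # white -> (235, 128, 128)
print(rgb_to_ycbcr(0, 0, 0))         # black -> (16, 128, 128)
print(rgb_to_ycbcr(255, 0, 0))       # red   -> (63, 102, 240)

If the TV assumes the numbers coming in over HDMI are already Y′CbCr while the card is sending plain RGB, you get exactly the kind of garbage or black screen people describe.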

 

Some old video cards had an S-video connector with a higher number of pins; with the correct adapter (example: https://www.thinkcables.com/products/01779.html) you could hook the card up to a TV over component video. Audio would be separate.


The color space mismatch could well be the problem. It would be interesting to test the machine using the DVI-to-HDMI cable on a modern computer monitor with both HDMI and DVI inputs, since that would almost certainly support the RGB color space. It's also possible that even if the TV supports RGB but defaults to YCbCr, it could be advertising EDID info that the firmware/driver for a really ancient card doesn't understand, making it bail.


This is true; an HD monitor (not a TV) should work fine with it. I tried hooking my Mac Mini up to a couple of now-older HDTVs and it did not work (just got a black screen). I wish there were an easy workaround for this, since I want to use it with my video capture card, which SEEMS to suffer from the same problem.

