
ATI Radeon 7000 PCI - Dual or Single Head?

Trash80toHP_Mini

NIGHT STALKER
I'm only driving one LCD off the DVI port, so it doesn't really matter, but I'm curious about this just the same. The spec didn't seem to be present in the manual PDF for the Mac version, which leads me to believe it's a single-head card.

The only dual-head cards I've got for the QuickSilver and MDD are a pair of AGP Nvidia GeForce4 MX cards (VGA and DVI connector variants), but the second port on both versions is that accursed, proprietary ADC port. This renders it useless without an expen$ive ADC to DVI converter if you're not running an Apple display from the Iron Age.

That said, any opinions about the ATI Radeon 7000 PCI? I'm using it as a primary/toolbar display for AI8, so the 2D/3D acceleration aspects aren't really an issue, but knowing its foibles pre-purchase would be helpful.

 

Gorgonops

Moderator
Staff member
The "official" retail Radeon 7000 Mac Edition is a dual-head card with VGA and DVI (and S-Video, for that matter) ports. However, note that there do exist R7000-based Mac cards with only a single port populated, both third-party and from Apple. (One of the optional video cards for the G5 Xserves was an R7000 board that only has a VGA port; I have two of these stuck in my B&W. Sadly, the circuit board has the traces for the other ports; they and the supporting components just aren't populated.)

but the second port on both versions is that accursed, proprietary ADC port. This renders it useless without an expen$ive ADC to DVI converter if you're not running an Apple Display from the Iron Age.

I was about to say that the ADC video port converter shouldn't cost that much. I knew the adapter to use an ADC *monitor* on a DVI *card* was ridiculous, but going the opposite direction just requires a simple wiring adapter. Sheesh, though, they seem to have gotten really thin on the ground over the last decade, because I can't find one for anything less than an utterly absurd price.

Wait, here's a guy on eBay who has them BIN for $19. The GeForce4 MX is a significantly faster card than the Radeon 7000, so it's probably a better way to go.

 

Trash80toHP_Mini

NIGHT STALKER
Yeah, faster is better in most cases, but having a decent PCI VidCard around for general use has a quality all its own. I don't run any games, just graphics and CAD/CAM apps under 9.2.2, so speed isn't really a factor. Going the other way also requires a power supply for the monitor, and that was expensive! That's a whole lot of trouble to go through just to run one of the few displays afflicted with ADC.

$19 for the adapter vs. $30 for the card. Thanks for that link, G. I'll have to mull it over tonight.

 

Trash80toHP_Mini

NIGHT STALKER
Hold on a second! What the HECK is DVI-D? Maybe I should go for the card that has all the pins?

[Attached image: eBay listing photo of the DVI connector]


 

Cory5412

Daring Pioneer of the Future
Staff member
A Radeon 7000 card should have a DVI-I connector so that you can use DVI to VGA adapters. I don't know if there are any cards that don't, at least not anything older than, say, three years or so.

The things that mostly have DVI-D are monitors; usually they'll have all of the pin holes, but they'll only be wired for digital input. That is probably the case with most Apple adapters and monitors (in particular, the DVI to ADC adapter and any Apple monitor featuring DVI input).

 

Gorgonops

Moderator
Staff member
So, it's odd they put that picture in there, because if you look, it's not a picture of the plug on the adapter; it's a picture of the business end of a DVI *cable*. The picture of the *adapter* shows it has all the DVI-D pins present; the missing pins in the picture above indicate that it's only a "single-link" cable instead of a "dual-link" cable. Perhaps the poster just wanted to make clear that the adapter doesn't accept DVI cables that have the *analog* pins present.

DVI-D just means "Digital DVI". As mostly pointless as it is, there's also a standard for "DVI-A", which is analog (VGA) signals carried over a DVI plug. Here, maybe this will help:

[Attached image: DVI connector pinout diagram showing the DVI-I, DVI-D, and DVI-A variants]


What it boils down to is that this adapter does not carry through the stuff in pink and won't accept cables with the pink pins present. The only reason this would matter to you is if you intended to stack a "DVI-I" (i.e., "DVI Integrated": both digital and analog present) to VGA adapter on after this to run an analog monitor. But if you want to run an analog monitor, you'd be better off getting a direct ADC to VGA cable in the first place.
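If it helps, the compatibility logic above can be sketched as a toy script (all the names here are my own, not from any real library): treat each DVI connector variant as the set of signal groups it carries, and note that a chain of adapters only passes through the groups that *every* stage carries. That's why a digital-only ADC adapter can't feed a DVI-to-VGA adapter downstream.

```python
# Toy model of DVI connector variants and the signal groups each carries.
# "digital-link2" is the second TMDS link found only on dual-link connectors.
DVI_VARIANTS = {
    "DVI-A": {"analog"},                                   # analog (VGA) only
    "DVI-D single": {"digital-link1"},                     # digital, one link
    "DVI-D dual": {"digital-link1", "digital-link2"},      # digital, two links
    "DVI-I single": {"analog", "digital-link1"},           # integrated
    "DVI-I dual": {"analog", "digital-link1", "digital-link2"},
}

def signals_through(chain):
    """Return the signal groups that survive a chain of connectors/adapters:
    only the groups every stage carries make it to the far end."""
    return set.intersection(*(DVI_VARIANTS[stage] for stage in chain))

# Card's DVI-I port -> digital-only ADC adapter -> DVI-to-VGA adapter:
chain = ["DVI-I dual", "DVI-D single", "DVI-A"]
print(signals_through(chain))  # -> set(): no analog reaches the VGA adapter
```

Stacking a VGA adapter after the digital-only stage yields the empty set, which is the whole problem in one line.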

(In related auctions I see another brand ADC-DVI adapter for sale, and it likewise lacks the analog pins, so perhaps they don't exist.)

As to whether this adapter or a second PCI card is a better idea, well, under OS X I'd say dual-heading on the GeForce is a better idea because PCI video cards are distinctly second-class citizens under OS X. Under OS 9, well, probably doesn't matter. Although if you intend to put a Radeon into a box with a different brand card as the primary I'd sort of wonder if you're asking for driver problems you won't have if you're just using one card.

 

Gorgonops

Moderator
Staff member
DVI-D just means "Digital DVI". As mostly pointless as it is there's a standard for "DVI-A" which is analog (VGA) signals carried over a DVI plug. Here, maybe this will help:
Just replying to myself for laughs because, well, I thought it was pretty funny at the time: I *have*, once in my life, actually seen a monitor that had a DVI plug on it that *only* had the analog pins present. It was an incredibly cheap knockoff brand widescreen monitor/TV combination that really only supported VGA, but to make it *sound* better they stuck a DVI shell connector on it and included a DVI-A cable in the box, I guess assuming that nobody would notice.

 

Trash80toHP_Mini

NIGHT STALKER
LOL! Now I have to decide if I want to get the card I can use in the rest of the PCI Macs in less than a week or wait for the slow boat from China to land the adapter at my doorstep. ::)

Thanks for pointing out the whackdoodle assortment of pics of that dongle, G.

Thanks all for the info, I was heading out to do a few errands and come back home to all the answers!

p.s. speaking of errands: the kid at CrapShack sent me to Lowe's Home Improvement to get the continuity tester they no longer carry. :lol:

 

Trash80toHP_Mini

NIGHT STALKER
I just ordered yet another dongle for my Macs. I'm hoping going that route will simplify switching between displays as main screen/expanded desktop. Someday I need to do a count of all the dongles and adapters I've amassed over the years.

I hear dongles are the next new thing for the iLeet MBP crowd. :lol:

 

Trash80toHP_Mini

NIGHT STALKER
The GeForce4MX is a significantly faster card than the Radeon 7000 so it's probably a better way to go.
While buttoning up the VidCards/dongles/adapters/cables mess to wait out delivery of that new dongle, I found a PCI card that might work in the meantime:

PNY GeForce FX5200 DDR 256MB VCGFX522PPB PCI VGA Video Card #GF05200PUD25G

The bar code lettering starts out: M103 so I'm hoping it's the Mac version. Any way to tell besides installing it and firing it up?

 

Cory5412

Daring Pioneer of the Future
Staff member
I can't find any references online to a PNY GF5200 for Macs, so you've probably struck out and bought yourself a pretty mediocre low end GPU for a PC from ~2003-2004. It won't hurt to pop it in and try, though.

 

Gorgonops

Moderator
Staff member
Allegedly it's fairly straightforward to flash rotgut gf5200 cards with a Mac firmware. However, beware that these cards are only good for OS X, they're essentially useless under 9.x.

 

Compgeke

Well-known member
The PNY FX5200 PCI cards with the two VGA ports are especially easy to flash. I've done it a few times myself on them and a BFG AGP card.

 

Trash80toHP_Mini

NIGHT STALKER
Yep, that's the one, THX. But the very thought of flashing a VidCard has always been anxiety-producing for me.

.  .  .  beware that these cards are only good for OS X, they're essentially useless under 9.x.
Well, that answers that in my case. After trying the very first release of OS X, I decided that it was slower running my OS 9 production software and that I didn't like it. Upgrading/borking my production processes and converting those apps that would be compatible under X would be cost-prohibitive and time-consuming.

I must have bought it for the ubuntuBox to achieve a triple-headed work environment for learning to use the GIMP.

 

Cory5412

Daring Pioneer of the Future
Staff member
Beside the main point, but:

After trying the very first release of OSX, I decided that it was slower running my OS.9 production software and that I didn't like it.
Actually that's an interesting question.

10.0 and 10.1 are widely regarded as being horrible; they were almost more of a proof of concept than something you could actually use. Did you revisit that over time or keep with OS 9?

OS 9 was pretty practical for a surprisingly long time (Apple had to release a new Power Macintosh specifically for it in 2003), but depending on what you were doing, if you were still using it in, say, 2006, that might have been weird.

 

bunnspecial

Well-known member
I flashed a PNY FX5200 a while back for use in a B&W with a G4 upgrade. The one I have is dual VGA and has 256 MB of VRAM. I have the Quadro equivalent of this card, which is dual DVI, but I haven't gotten around to flashing it.

nVidia cards are a bit of a pain to flash since they have to be done in a PC and you need exactly the correct version of NVFlash. Too old or too new and it most likely won't see the card, or if it does see it, it won't allow you to flash it. ATI cards are much easier in this respect since they can be flashed in OS X.

As a bit of fun trivia on the different DVI standards: there's actually one Apple ADC monitor that doesn't work at all with the A1006 DVI-ADC adapter, and that's the 17" ADC CRT. The ADC port is, for all intents and purposes, a DVI-I port with USB and power added. The A1006 only passes the DVI-D signal, and the CRT requires the DVI-A signal. I've been told that there are third-party adapters that will work with the CRT, but I've never bothered since I have a good ADC card for any computer I would use one with (my main one in active service is on a Cube with a GeForce 3).

 

Trash80toHP_Mini

NIGHT STALKER
OS 9 was pretty practical for a surprisingly long time (Apple had to release a new Power Macintosh specifically for it in 2003), but depending on what you were doing, if you were still using it in, say, 2006, that might have been weird.
LOL! I'm happily playing around in AI8 tonight under 9.2.2 on the Quicksilver, how's that for crazy? :blink:

I think it was 2004 that I shut down the sign shop and moved south. OS 9 and my apps were going strong at that point; I was still cutting vinyl lettering from layouts done in AI on the DA. If I'd stayed in the business and gotten into imaging on vinyl, I probably would have made the move to OS X when my first primary app dropped support for OS 9, or when I first needed one that was an OS X-only release.

Thankfully that never happened and I'm cozy in my little OS9 graphic arts bubble IRL and using a dedicated Firefox NetStation under Win7 to interface with this surreal world.

p.s. I'm slowly prepping the 2003 MDD/OS9 Special Edition for the time when the QS might pack it in. I believe that backup machines should be upgrades waiting in the wings. :)

 