You're a confusing kinda guy lol. Extensions ON, extensions OFF, wax ON, wax OFF.
It's simple: Nvidia card = Nvidia extensions ON, ATI extensions OFF.
I think I've got it, but you won't like it.
From Wikipedia:
…the DVI connector includes pins for the display data channel (DDC). A newer version of DDC called DDC2 allows the graphics adapter to read the monitor's extended display identification data (EDID). When a source and display are connected, the source first queries the display's capabilities by reading the monitor EDID block over an I²C link. The EDID block contains the display's identification, color characteristics (such as gamma value), and table of supported video modes. The table can designate a preferred mode or native resolution. Each mode is a set of CRT timing values that define the duration and frequency of the horizontal/vertical sync, the positioning of the active display area, the horizontal resolution, vertical resolution, and refresh rate.
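To make that a little less abstract, here's a rough sketch in Python of what pulling the preferred mode out of that 128-byte EDID block looks like. The byte offsets follow the standard EDID 1.3 layout, but the function name and structure are mine, not anything out of a real driver:

```python
# Minimal sketch: pull the preferred/native mode out of a 128-byte EDID block.
# Byte offsets follow the standard EDID 1.3 layout; the block itself would be
# whatever the graphics card reads back from the monitor over DDC/I2C.

def parse_preferred_mode(edid: bytes):
    if len(edid) < 128 or edid[:8] != b'\x00\xff\xff\xff\xff\xff\xff\x00':
        raise ValueError("not a valid EDID header")
    if sum(edid[:128]) % 256 != 0:
        raise ValueError("EDID checksum failed")

    # The first detailed timing descriptor (bytes 54-71) normally holds the
    # preferred mode -- the one the table "designates" per the quote above.
    dtd = edid[54:72]
    pixel_clock_khz = int.from_bytes(dtd[0:2], 'little') * 10
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)   # low 8 bits + high 4 bits
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return h_active, v_active, pixel_clock_khz
```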
Problem One: Your 27" monitor needs dual-link DVI to display its full resolution. The card cannot supply that.
Additional confusing issue: The card has a full set of pinholes, so you CAN plug a dual-link cable into it, though the "extra" pins receive no signal.
Problem Two: You would *think* (note the italics) the monitor would just adapt itself to the incoming signal and scale it to a viewable size, but evidently it won't.
Problem Three: The EDID data sent over DDC (yeah, I know…arrgh!) also communicates the color resolution and such. So, since the monitor and Nvidia card aren't talking, the card keeps resetting itself to the "fallback" default: 640 x 480 @ 60 Hz, 256 colors. That's why the display goes black.
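Put another way, the logic inside the card amounts to something like this. It's a sketch only — the read_edid stand-in and the structure are my guesses, not Nvidia's actual driver code, and it reuses the parse_preferred_mode sketch from above:

```python
# Sketch of the handshake-or-fallback behaviour described above.
# "read_edid" stands in for whatever the driver really does over DDC;
# the fallback numbers are the safe defaults mentioned in the post.

FALLBACK_MODE = (640, 480, 60)   # width, height, refresh -- the 256-color safe default

def pick_mode(read_edid, parse_preferred_mode):
    """Use the monitor's advertised preferred mode, or retreat to the
    VGA-safe default if the DDC/EDID handshake fails."""
    try:
        h_active, v_active, _clock_khz = parse_preferred_mode(read_edid())
        return (h_active, v_active, 60)   # refresh simplified to 60 Hz for the sketch
    except Exception:
        # No answer (or garbage) over DDC: the card can't trust anything the
        # monitor claims, so it falls back to the lowest common denominator --
        # and if the monitor won't sync to that, you get a black screen.
        return FALLBACK_MODE
```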
Remember, this all started just because you want to remove one monitor from your desk. AND don't forget, you had NO trouble with the 24" single-link monitor. I have a 27" Samsung "Syncmaster 2770" that accepts the single-link signal from my stock ATI card in my G4 and scales it to full size.
I do NOT, however, use it for film work. The pic, while generally OK, would quickly drive me nuts for lack of fine detail.
I have no idea why your monitor won't "talk" to what is a standard DVI signal and at least display something.
It may (this is a guess) speak only DDC2 - which would be stupid, but it wouldn't surprise me either. This is info NOT supplied in the manual (naturally), but it would be typical of hardware-engineer thinking, because after all, what kind of idiot would try to use something as old and obsolete as OS9 on a G4 in 2019? A real pro would certainly have only the latest and greatest, right?
At the end of the day, honestly, if all you need to do to make everything work well for you is make space for two monitors (or find one that is more versatile), you're ahead of the game.