
Author Topic: DVI video jitter on Radeon 9800 pro 64 MB  (Read 10808 times)

darthnVader

  • Moderator
  • 512 MB
  • *****
  • Posts: 683
  • New Member
Re: DVI video jitter on Radeon 9800 pro 64 MB
« Reply #20 on: February 25, 2025, 06:49:40 AM »

I am running a G4 Quicksilver 933 with an ATI Radeon 9000 Pro 64 MB card over DVI; the monitor is a Samsung SyncMaster 244T, which never had any issues before.

On highest resolution (and also native Mac stuff) I get this annoying jitter as shown here: https://youtu.be/dnoSrzu9zDg?feature=shared

I tried different digital/analog DVI cables and also installed the latest drivers from 2002.

This only happens on the highest resolution but disappears when I switch to the next lower resolution.

I ran Apple hardware test completely and it found video memory to be ok.

Anybody ever seen this? Can it be the power supply?

Likely a bug in the interaction between the display's advanced EDID and the 'ndrv' of the Radeon 9000.

Such is the sad life and support cycle of GFX cards, and their lack of documentation.

If users had run into this issue during the Radeon 9000 Pro's support cycle, ATI would have issued some type of hotfix and the issue would have been 'resolved', within the limitations of the hardware.


EDID data once lived on a writable flash chip, so the display manufacturer could also issue hotfixes for these kinds of display issues when OEMs or aftermarket vendors like ATI/AMD and nVidia made them aware of a bug in the EDID.

I've used tools on Windows that could decode the EDID data so it could be edited and flashed back to the display to resolve specific issues.
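I can't speak for exactly what those Windows tools did internally, but the first step they all perform is the same: parse the raw 128-byte EDID block. A minimal sketch of that decode, working only from the public EDID block-0 layout (header signature, manufacturer ID, first detailed timing descriptor, checksum):

```python
# Minimal EDID block-0 decode: header check, manufacturer ID, pixel clock,
# and checksum. Offsets follow the standard 128-byte EDID base block.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def manufacturer_id(edid: bytes) -> str:
    # Bytes 8-9 are a big-endian word packing three letters
    # into three 5-bit fields, where 1 = 'A'.
    word = (edid[8] << 8) | edid[9]
    return ''.join(chr(((word >> s) & 0x1F) + ord('A') - 1)
                   for s in (10, 5, 0))

def checksum_ok(edid: bytes) -> bool:
    # All 128 bytes of a block must sum to 0 mod 256.
    return sum(edid[:128]) % 256 == 0

def pixel_clock_khz(edid: bytes) -> int:
    # The first detailed timing descriptor starts at byte 54;
    # its pixel clock is a little-endian word in units of 10 kHz.
    return ((edid[55] << 8) | edid[54]) * 10
```

Feed it a Samsung block and `manufacturer_id` should come back `'SAM'`; the 244T's native 1920x1200 mode would show up as a pixel clock of roughly 154000 kHz in the first detailed timing descriptor.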

When you are dealing with a display that was not in common use while the OS or graphics card was still in its active support cycle (say, one decades newer), such issues are rather common.

LCDs normally have pretty good whitepapers available on the web. If you can't find reliable info online, pulling the display apart to see which LCD panel was used normally turns up the panel's whitepaper, so we can understand what the actual timings and gating of the display are.

Then EDID-override tools like SwitchRes become useful stopgaps that can fix the issue in the desktop environment.

A real fix would be in the EDID or the 'ndrv', but those are more complex. Advanced EDID is pretty well documented, and once you've read the LCD's documentation and decoded the EDID, a fix may be available if the chip that holds the EDID is flash-programmable.
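One practical detail if the chip is flashable: any byte you edit invalidates the block checksum, so it has to be recomputed before writing the block back, or the host will reject the EDID as corrupt. A sketch of that fix-up (the offset being edited here is just an example, byte 54 of a standard base block):

```python
def patch_edid(edid: bytes, offset: int, value: int) -> bytes:
    """Return a copy of a 128-byte EDID block with one byte changed
    and the block checksum recomputed."""
    block = bytearray(edid)
    block[offset] = value
    # Byte 127 is chosen so the whole 128-byte block sums to 0 mod 256.
    block[127] = (256 - sum(block[:127]) % 256) % 256
    return bytes(block)
```

The actual write-back is display-specific (usually over the DDC I2C bus, if the EEPROM isn't write-protected), so I've left that part out.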

If it is not, or the issue is not with the EDID, then you are stuck with SwitchRes, or with disassembling the code from the 'ndrv' and trying to patch it at the middleware level.

When Open Firmware boots, it reads the graphics card's ROM and executes the code there, reading the display's EDID and trying to find the correct timings to set a proper resolution. OF will honor a resolution stored in NVRAM if one exists, and just pick the best mode it can find if one does not.
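That selection logic, as I understand it, boils down to a two-step rule. A rough sketch (the mode tuples and function name are mine, purely illustrative; the real code is Forth in the card ROM):

```python
def choose_boot_mode(nvram_mode, edid_modes):
    """Honor an NVRAM-stored mode if the display claims to support it,
    otherwise fall back to the largest mode the EDID advertises.
    Modes are (width, height, refresh) tuples."""
    if nvram_mode in edid_modes:
        return nvram_mode
    return max(edid_modes, key=lambda m: (m[0] * m[1], m[2]))
```

Which is also why a stale NVRAM mode, or a buggy parse of an advanced EDID like the 244T's, shows up this early: the wrong timing is picked before the OS is even loaded.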

If an 'ndrv' is in the card's ROM, OF loads it as a property in the device tree, but OF itself cannot do anything with that data. It passes the data to the Mac OS at boot, and on New World Macs, the Mac OS ROM file in the System Folder and the System suitcase execute the code in the 'ndrv'.

This happens very early, during that blank grey screen between the Happy Mac and the OS splash screen. The OS will also check the disk's /System/Extensions folder for an updated disk-based 'ndrv' and use that if it matches the graphics card and its encoded date is newer than the encoded date of the 'ndrv' in ROM.
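The ROM-versus-disk pick is just a name match plus a date comparison. Sketched out (field names and the card name here are hypothetical placeholders, not the real 'ndrv' record layout):

```python
def pick_ndrv(rom_ndrv, disk_ndrvs):
    """Prefer a disk-based 'ndrv' only when it matches the card
    and carries a newer encoded date than the copy in ROM."""
    best = rom_ndrv
    for d in disk_ndrvs:
        if d["name"] == rom_ndrv["name"] and d["date"] > best["date"]:
            best = d
    return best
```

This is also the escape hatch for hotfixes: ship a newer 'ndrv' in Extensions and it silently wins over the buggy one burned into the card ROM.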

The 'ndrv' also polls the EDID data, and all of this produces the final display data the OS can use: your list of resolutions and so on.

Hotfixes can also be made in other parts of the graphics card ROM, but rarely are, because that code mostly deals with the GPU, the VRAM, and the very early display detection. EDID-based display detection is standard stuff, and as long as there are no bugs that prevent entering OF, fixes at this level are very rare, though they do happen.

But the support cycle of a graphics card they stopped selling and supporting when I was a young man is long past, and ATI/AMD has not been forthcoming with whitepapers for many of its older GPUs.

This is why Open Hardware is needed. We need at least a basic overview of the CRTC functions of the GPU, and the 3D shit would not hurt either.

Now sometimes copyright and NDAs are an issue, and back in the day, more so. Graphics card companies would buy tech they did not own and use it on the card, and if they agreed to an NDA or copyright terms on it, things get murky even decades later as to which specs for a GPU they can release and which they cannot. On top of that, when AMD bought ATI, what exactly did they buy, and what do they own?

If a 3rd party made some of the tech, and that 3rd party went out of business 15 years ago, who do you contact about an NDA or copyright release so AMD can share the whitepapers, which may or may not still exist, and may or may not have been an agreed part of the sale of ATI to AMD?

Is any of that complex enough for you?

 
Logged

darthnVader

Re: DVI video jitter on Radeon 9800 pro 64 MB
« Reply #21 on: February 25, 2025, 07:23:16 AM »

I wasn't talking about the connector. I was talking about how Apple implemented ADC through the video card.
The ADC power connector on the motherboard frequently moved. In many cases, you cannot move an ADC-equipped card between generations of G4/G5 and have ADC continue to work.
They used undocumented pins in the AGP slot to enable extra features for ADC; those same pins later became documented in a newer version of AGP, which is why we have to tape off certain pins on some cards.
The connector, as you said, was perfectly fine. The way they haphazardly implemented it was not, and was certainly a contributor to the extreme dearth of aftermarket cards with ADC.

Hoo boy…
1) The ADC connector moved on the G5 cards because they were 8x AGP and not intended to be used in a G4 anyway, and because the AGP slots themselves expanded to accommodate dual-link cards.
2) The only reason you tape off pins is to let an 8x AGP card operate in a Mac's 4x AGP slot. That's just a hack that hard-cores like us use to run the "cool" 8x AGP G5 cards (mostly Radeon 9600 or 9800) at 4x in older G4s, to get more memory for games, since they can't really utilize 8x AGP anyway.
3) Apple caused the, as you say, "extreme dearth" of ADC itself, beginning with the 30" Cinema Display and later monitors requiring more power and overtaxing the computers' PSUs. No card was ever made with two ADC ports for the same reason.
I don't think you can really call any ADC-equipped card "aftermarket", since ADC was proprietary and never appeared on anything other than Macs. Any card you find with an ADC connector was OEM, either stock or a BTO option, on a late B&W, Cube, QS, MDD or G5.

Apple used some of the then-unused pins on AGP 2x and 4x to drive the USB of the ADC display. I've often wondered whether that modification of the standard was allowed; if it was not, then those slots are not really AGP.

At any rate, because of this, plugging an older AGP/ADC card that used those pins into a G5's AGP Pro slot would result in a fire.

Not because of the power connector part or the little tab at the end that was moved, but because of the AGP pins themselves.

Apple did not want G5s burning down your house because you put an older AGP 2x/4x ADC card into your new G5.

So they moved the tab as a way to prevent users from suing them.

AGP was supposed to be backward and forward compatible; Apple's deviation from AGP was outside the norm and bit end users on the ass.

But I have put a PC Rage Pro 1x AGP card into the 8x AGP slot of a G5, and there is no electrical issue.

The only reason the G5s had ADC was that people who upgraded their Pro towers didn't always want to upgrade what was, at the time, an expensive display.

Otherwise, Apple would have left it out entirely.

And what happened to MDX??

Were there ever any 3rd party MDX cards?

ADC-modified AGP was arguably better than plain AGP, but why deviate from the standard just because Apple did not want to run a cable from the logic board to the ADC card, or put the USB lines into the power tab?

Lots of things like this happened when Apple was a computer company; now they are a phone company that, oh by the way, also has a hobby of making Macintoshes.