Mac OS 9 Lives
Mac OS 9 Discussion => Hardware => Video Cards, Monitors & Displays => Topic started by: gert79 on January 02, 2024, 11:36:26 AM
-
I am running a G4 Quicksilver 933 with an ATI Radeon 9000 Pro 64 MB card over a DVI cable; the monitor is a Samsung SyncMaster 244T, which never gave me issues before.
At the highest resolution (and also with native Mac stuff) I get this annoying jitter, as shown here: https://youtu.be/dnoSrzu9zDg?feature=shared
I tried different digital/analog DVI cables and also installed the latest drivers from 2002.
This only happens on the highest resolution but disappears when I switch to the next lower resolution.
I ran the full Apple Hardware Test and it found the video memory to be OK.
Anybody ever seen this? Can it be the power supply?
-
#1. I'm not an "I know everything about every little thing about Mac graphics cards and a zillion different video monitors" guy…… BUT:
I'm pretty sure you're looking for a boogeyman where there isn't one.
No, It's NOT the power supply.
It does NOT look like jitter.
It does look kinda like what happens when you input a low-res graphic to a very hi-res system and all of the detail that's not there to begin with has to be interpolated (read: made up out of thin air) by the display system.
You didn't bother to specify what "the highest resolution" and the "next lower resolution" you tried actually were, as reported by the Monitors control panel.
BUT
I suspect you are simply looking at an odd rez where that specific card output just won't work with the Samsung.
You know, odd, unexplainable stuff happens all the time with tech, and even more so with 20+ year-old tech.
You want to make yourself crazy? Install SwitchRes, get some real test patterns/images and you can spend a day charting out exactly what happens at each and every possible pairing. Then the next day, you can ask yourself why you did that because that's time you don't get back.
Or maybe someone else actually knows something about this oddity that I don't because… GOTO #1
-
I'm inclined to agree with the above. That looks like display distortion. IF you KNOW FOR A FACT that card, cable and monitor are all in good working order and that they support whatever resolution you are on... then I'd lean towards a hardware issue with the card, BUT you would need to do some testing.
First, if you have another display, try that (anything, even a TV).
Second, if you have another cable, try that.
Third, try another video card at that same resolution: an old PCI card, the original GeForce from the Mac itself, whatever. See if it works.
Narrow things down.
-
It's the cable. OS9 gets really touchy about cable quality at higher resolutions. I've had cables that work fine in OSX do exactly this same thing at the same resolution in OS9. You need top-quality cables for stable high-res use in OS9.
-
Thank you guys for all the answers.
The issue on my end is that I don't have many parts to swap; I really only have that one monitor, one card and two different cables.
Also, I am physically not in front of the machine right now, so sorry I cannot give exact resolutions and just say "highest" and "next lower".
I will look into finding a better-quality cable to solve this issue; so far I have only seen blurry images from cheap 15-pin VGA cables.
-
I'm running an MDD 1.42 with a Radeon 9000, hooked up to two 1920x1200 24" monitors. I had to go through a *lot* of VGA and DVI cables before I got ones that worked at that resolution without the kind of issues you described. "Heavy duty" thick shielded cables, usually with ferrite cores at one or both ends, seemed to be the most reliable.
-
I'm running an MDD 1.42 with a Radeon 9000, hooked up to two 1920x1200 24" monitors. I had to go through a *lot* of VGA and DVI cables before I got ones that worked at that resolution without the kind of issues you described. "Heavy duty" thick shielded cables, usually with ferrite cores at one or both ends, seemed to be the most reliable.
You know, this makes me think (I know…rarity) about the way people always rag on Apple for their proprietary nothing-else-fits marketing. I also have an MDD and multiple 1920x1200 Cinema Displays. I have never had a cable issue or similar with those damn ADC cables everyone loves to hate.
I'm just sayin'……
-
I always felt that the reason ADC wasn't successful had more to do with its poor hardware implementation than with any inherent disadvantage of the connector itself.
-
I always felt that the reason ADC wasn't successful had more to do with its poor hardware implementation than with any inherent disadvantage of the connector itself.
ADC itself is an attempt to get rid of the power supply in the monitor in favor of Apple's own design solutions. Which, of course, is an absolutely idiotic idea: letting +12 volts at that kind of current flow through the motherboard, the AGP slot and the multilayer PCB, right next to the low-voltage traces on the board. In itself it is no different from DVI, and I have to give Apple credit for using DVI rather than an analog VGA signal, which I suspect would otherwise not have been stable in the vicinity of the monitor's high supply currents.
-
I always felt that the reason ADC wasn't successful had more to do with its poor hardware implementation than with any inherent disadvantage of the connector itself.
Actually, there was/is nothing "wrong" with the connector at all. It's simply DVI plus USB plus power plus soft-power switching, all in one connector.
ADC itself is an attempt to get rid of the power supply in the monitor in favor of Apple's own design solutions. Which, of course, is an absolutely idiotic idea: letting +12 volts at that kind of current flow through the motherboard, the AGP slot and the multilayer PCB, right next to the low-voltage traces on the board. In itself it is no different from DVI, and I have to give Apple credit for using DVI rather than an analog VGA signal, which I suspect would otherwise not have been stable in the vicinity of the monitor's high supply currents.
No, ADC was a "cleanup" solution to remove cables and create an even-Grandma-can-do-it plug-and-play system. The kind of integration Apple has always pursued that just happens to have the additional advantage (for Apple) of creating minor incompatibilities that gently encourage users to stay within the Apple ecosystem. They have gotten away with that by always producing leading-edge (although expensive) products. Right after ADC appeared, Belkin and a dozen other companies immediately produced all of the little adapters and cables etc. needed to interface with other stuff. I probably have 2 or 3 of those genuine Mac power bricks Apple made themselves so people could connect those Geewhiz Cinema Displays to non-ADC-equipped computers… even (horrors) PC's.
Actually, it's 25 volts… not 12, and that current was directed through a separate connector on Mac-only cards; it's a very short hop from the PSU connector to the extra tab by the AGP connector, and it's routed away from data lines… an advantage you get when you are the designer/manufacturer. But you already know that…?
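And just to put rough numbers on why the higher voltage matters (the display wattages here are ballpark guesses for illustration, not specs I looked up): the same wattage at 25 volts means far less current through that tab than it would at 12 volts. A quick back-of-the-envelope in Python:

# Ballpark illustration only: current needed to feed an ADC display at
# different supply voltages. The wattage figures are assumptions for the
# example, not measured or quoted specs.
def current_amps(watts: float, volts: float) -> float:
    return watts / volts

for display, watts in [("~60 W Cinema Display", 60.0), ("~90 W Cinema HD", 90.0)]:
    print(f"{display}: {current_amps(watts, 25.0):.1f} A at 25 V "
          f"vs {current_amps(watts, 12.0):.1f} A at 12 V")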
As for VGA, ALL of the ADC-equipped graphics cards, starting with the G4 AGP, always had a second connector… either VGA or DVI. So you were never forced to buy a new monitor with your new Mac but, again, "gently encouraged" to, because you could simply use that "old" port for your "old" monitor and poof! you have duals!
So I say: "Incompatible? HAH!… This ain't nothin'… remember back when ALL Apple monitors ran at totally unique freqs and rez? That meant NO Apple monitors connected to PC's and NO PC monitors connected to Apple computers. THOSE were the good old days!"
-
So I say: "Incompatible? HAH!… This ain't nothin'… remember back when ALL Apple monitors ran at totally unique freqs and rez? That meant NO Apple monitors connected to PC's and NO PC monitors connected to Apple computers. THOSE were the good old days!"
Wait a minute. What about those little boxes with DIP switches that allowed non-Apple monitors to be connected to Macs? I was happily running a Mitsubishi 21" behemoth with my 9500 with the help of such a box. Others were using ViewSonic and similar stuff. Or are you talking about something very old? Before the mid '90s or so.
-
and NO PC monitors connected to Apple computers.
Yeah, I probably still have a few dozen brand-new adapters from the DB15-male-to-VGA era (DIP switches set both the resolution and frequency); they are very small square adapters and are still available on Amazon!!??
https://www.amazon.com/Adapter-DB15-Male-Female-Switches/dp/B0016OC1J2
WTF, I never imagined these would still be sold in 2024.
Oh, one last thing about ADC: it also provided remote tower power-on via the touch-sense button on the front of the Cinema Display, which was so damn cool and way ahead of its time, and apparently not possible 20 years later on any new computer since the alien technology has been lost.
-
Now - an attempted veer back on topic…
Test different cables OR acquire & test another
low-cost, used DVI video card. Same problem
persists? Side-eye the Samsung Syncmaster.
And/or live with a lower resolution. ;)
-
Now - an attempted veer back on topic…
Test different cables OR acquire & test another
low-cost, used DVI video card. Same problem
persists? Side-eye the Samsung Syncmaster.
And/or live with a lower resolution. ;)
Yes indeed. I will admit I'm looking at a SyncMaster as I type this.
Wait a minute. What about those little boxes with DIP switches that allowed non-Apple monitors to be connected to Macs? I was happily running a Mitsubishi 21" behemoth with my 9500 with the help of such a box. Others were using ViewSonic and similar stuff. Or are you talking about something very old? Before the mid '90s or so.
Yup, very old. First came the little 9" lunchbox Macs with NO external video, then came the II's: the IIx, IIci, etc. ALSO no video unless you could find a crazy expensive compatible video card. THEN came the Quadras and such, and there were finally enough Macs with the DB-15 display connector that it was worth making those little DIP adapters.
-
OK, I realized I cannot work at such a high resolution anyway, so there's no harm in running it at a lower resolution.
As this ATI card also has an ADC port and, from what I have seen, dual monitors are possible, is there any kind of adapter from ADC to DVI or even to VGA?
I really googled a lot but could not find such an adapter.
Or, as a last resort, is it possible to dismantle a DVI cable and pick off some pins from the ADC connector to get at the VGA signal?
This is only an idea, as the pins on these connectors seem to be the same, just in a different housing and layout.
-
Your video shows what happens when an LCD panel is unable to synchronize to the incoming signal. The color data for a particular pixel on screen comes through the cable at a specific time. If the "SyncNovice" is too early or too late sampling that data, it will appear displaced one pixel to the left or right from where it should be. The timing error relative to each pixel continually changes, which is why the image visually swims like that.
On a lower resolution, the sample frequency (called the dot clock) is lower, meaning the color data are further apart in time. Therefore, the same timing error as before can still recover the correct pixels since it falls within a wider window of time.
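To put rough numbers on that (the pixel clocks below are approximate standard VESA/CVT values I'm assuming for illustration, not anything measured on the 244T), here's a quick sketch of how much timing slack the monitor's sampler has at different resolutions:

# Rough illustration: the higher the dot clock, the narrower the window
# the monitor has to sample each pixel in. Pixel clocks are approximate
# VESA/CVT figures and are assumptions for this example.
modes = {
    "1920x1200 @ 60 Hz (CVT reduced blanking)": 154.0e6,
    "1680x1050 @ 60 Hz (CVT)": 146.25e6,
    "1280x1024 @ 60 Hz (VESA DMT)": 108.0e6,
}

for name, dot_clock in modes.items():
    pixel_period_ns = 1e9 / dot_clock
    print(f"{name}: one pixel lasts ~{pixel_period_ns:.1f} ns, "
          f"so the sampling error has to stay within roughly "
          f"+/- {pixel_period_ns / 2:.1f} ns")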
LCD monitors from almost 20 years ago typically do not perform as well as when they were new because of choices made in their construction. One problem in particular was some of the capacitors commonly used, which because they were very small, tend to dry out more quickly than larger types. As they dry out, they lose the ability to stabilize the voltages inside the monitor, which introduces more noise and reduces the timing accuracy of the color sampling.
-
Actually, there was/is nothing "wrong" with the connector at all. It's simply DVI plus USB plus power plus soft-power switching, all in one connector.
I wasn't talking about the connector. I was talking about how Apple implemented ADC through the video card.
The ADC power connector on the motherboard frequently moved. In many cases, you cannot move an ADC-equipped card between generations of G4/G5 and have ADC continue to work.
They used undocumented pins in the AGP slot to enable extra features for ADC; those same pins then became documented in a later version of AGP, which is why we have to tape off certain pins on cards.
The connector, as you said, was perfectly fine. The way they haphazardly implemented it was not, and was certainly a contributor to the extreme dearth of aftermarket cards with ADC.
-
I wasn't talking about the connector. I was talking about how Apple implemented ADC through the video card.
The ADC power connector on the motherboard frequently moved. In many cases, you cannot move an ADC-equipped card between generations of G4/G5 and have ADC continue to work.
They used undocumented pins in the AGP slot to enable extra features for ADC; those same pins then became documented in a later version of AGP, which is why we have to tape off certain pins on cards.
The connector, as you said, was perfectly fine. The way they haphazardly implemented it was not, and was certainly a contributor to the extreme dearth of aftermarket cards with ADC.
Hoo boy…
1) The ADC moved on the G5 cards because they were 8xAGP and not intended to be used on G4 and such anyway, and because the AGP slots themselves expanded to accommodate dual-link cards.
2) The only reason you tape off pins is to enable an 8xAGP card to operate in a Mac 4xAGP slot, and that's just a hack that hard-cores like us use to make the "cool" 8xAGP G5 cards (mostly Radeon 9600 or 9800) run at 4xAGP to work in older G4's to get more memory for games since they can't really utilize 8xAGP anyway.
3) Apple caused the, as you say, "extreme death" of ADC itself beginning with the 30" Cinema Display and later monitors requiring more power and overtaxing the computer PSU's. No card was ever made with two ADC ports for the same reason.
I don't think you can really refer to any ADC-equipped cards at all as "aftermarket" since ADC was proprietary, never appearing on anything other than Macs. Any card you find with an ADC connector was OEM, either stock or optional on BTO, in either a late B&W, Cube, QS, MDD or G5.
-
One problem in particular was some of the capacitors commonly used, which because they were very small, tend to dry out more quickly than larger types. As they dry out, they lose the ability to stabilize the voltages inside the monitor, which introduces more noise and reduces the timing accuracy of the color sampling.
The same applies to some of our "vintage" video cards as well... as I have replaced capacitors on several GPUs that exhibited noise, shudder, etc. It's easier to replace a video card (and/or some of its capacitors) than to attempt exploratory internal monitor surgery. Have also replaced well-aged heatsink paste and/or thermal adhesive on some GPUs too. ;)
-
I downgraded the card to an ATI Rage Pro, for temperature reasons, and now I get a proper 1920x1080 signal at 75 Hz from a 15-pin VGA cable. There is a bit of color bleeding, even with a high-quality cable, but the cheap Acer monitor I use lets me fix that somewhat in its settings. SVP displays nicely fullscreen without any pixel mashing or stretching, woohoo!
-
I am running a G4 Quicksilver 933 with an ATI Radeon 9000 Pro 64 MB card over a DVI cable; the monitor is a Samsung SyncMaster 244T, which never gave me issues before.
At the highest resolution (and also with native Mac stuff) I get this annoying jitter, as shown here: https://youtu.be/dnoSrzu9zDg?feature=shared
I tried different digital/analog DVI cables and also installed the latest drivers from 2002.
This only happens on the highest resolution but disappears when I switch to the next lower resolution.
I ran the full Apple Hardware Test and it found the video memory to be OK.
Anybody ever seen this? Can it be the power supply?
Likely a bug in the interaction between the advanced EDID of the display and the 'ndrv' of the Radeon 9000.
The sad life and support cycles of GFX cards and their lack of documentation.
If users had run into this issue during the support cycle of the Radeon 9000 Pro, then some type of hotfix would have been issued by ATI and the issue would have been 'resolved' within the limitations of the hardware.
EDID data, once upon a time, was stored on a writable flash chip, so the manufacturer of the display could also issue hotfixes for this type of display issue when the OEM, or aftermarket vendors like ATI/AMD and nVidia, made them aware of an issue with the EDID.
I've used tools on Windows that could decode the EDID data so it could be edited and flashed back to the display to resolve specific issues.
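For anyone curious what those decoders are actually looking at, here's a minimal sketch (my own Python illustration, not one of those Windows tools; the field offsets are just the standard 128-byte EDID 1.3 layout) that validates the header and checksum and pulls out the preferred detailed timing:

# Minimal EDID decode sketch: reads a 128-byte EDID dump from a file and
# prints the vendor ID and the preferred detailed timing descriptor.
import struct
import sys

def decode_edid(edid: bytes):
    # Fixed 8-byte header and a checksum over the whole 128-byte block
    assert edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00", "bad EDID header"
    assert sum(edid[:128]) % 256 == 0, "bad EDID checksum"

    # Manufacturer ID: three 5-bit letters packed into bytes 8-9 (big-endian)
    m = struct.unpack(">H", edid[8:10])[0]
    vendor = "".join(chr(((m >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))

    # First detailed timing descriptor (bytes 54-71) is the preferred mode
    d = edid[54:72]
    dot_clock_khz = struct.unpack("<H", d[0:2])[0] * 10
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    return vendor, dot_clock_khz, h_active, v_active

if __name__ == "__main__":
    vendor, clk, h, v = decode_edid(open(sys.argv[1], "rb").read())
    print(f"{vendor}: preferred mode {h}x{v}, dot clock {clk / 1000:.2f} MHz")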
When you are dealing with a display that was not in common use while the OS or graphics card was still in its active support cycle (if it's decades newer, say), then such issues are rather common.
LCDs normally have pretty good whitepapers available on the web. If you can't find reliable info on the web, pulling the display apart and seeing which LCD panel was used normally yields the whitepaper for that panel, so we can understand what the actual timings and gating of the display are.
Then EDID-override tools like SwitchRes become useful stopgaps that can fix the issue in the desktop environment.
A real fix would be to the EDID or the 'ndrv', but those are more complex. Advanced EDID is pretty well documented, and once you read the documentation for the LCD and decode the EDID, a fix may be available if the chip that contains the EDID is flash-programmable.
If it is not, or the issue is not with the EDID, then you are stuck with SwitchRes or disassembling the code from the 'ndrv' and trying to patch it at the middleware level.
When Open Firmware boots, it reads the graphics card ROM and executes the code there, reading the display EDID and trying to find the correct timings of the display to set a proper resolution. OF will try to honor a resolution in NVRAM if one exists, and just pick the best mode it can find if one does not.
If an 'ndrv' is in the card's ROM, it will load it as a property in the device tree, but OF cannot do anything with that data. It passes the data to the Mac OS on boot, and on New World Macs, the Mac OS ROM file in the System Folder and the System suitcase will execute the code in the 'ndrv'.
This happens very early, at that blank grey screen between the Happy Mac and the OS splash screen. The OS will also check the Extensions folder on disk for an updated disk-based 'ndrv' and use that if it matches the graphics card and its encoded date is newer than the encoded date of the 'ndrv' in ROM.
The 'ndrv' also polls the EDID data, and all this gives you the final display data the OS can use: your list of resolutions and such.
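As I understand it, the rule for that disk-based override boils down to something like this (a conceptual sketch with made-up names, not Apple's actual code):

# Conceptual sketch of the ROM-vs-disk 'ndrv' selection described above.
from dataclasses import dataclass

@dataclass
class Ndrv:
    card_name: str      # which graphics card the driver matches
    encoded_date: int   # build date encoded in the driver; higher = newer
    source: str         # "ROM" or "disk"

def pick_ndrv(rom_ndrv: Ndrv, disk_ndrvs: list) -> Ndrv:
    """Use a disk 'ndrv' only if it matches the card and is newer than ROM."""
    for candidate in disk_ndrvs:
        if (candidate.card_name == rom_ndrv.card_name
                and candidate.encoded_date > rom_ndrv.encoded_date):
            return candidate
    return rom_ndrv

# Hypothetical example: a newer Radeon 9000 'ndrv' on disk wins over the ROM one.
rom = Ndrv("ATY,RV250", 20020301, "ROM")
disk = [Ndrv("ATY,RV250", 20020815, "disk")]
print(pick_ndrv(rom, disk).source)   # -> disk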
Hotfixes can also be made to the other parts of the graphics card ROM, but rarely are, because those parts mostly deal with the GPU and VRAM and that very early display detection. EDID-based display detection is standard stuff, and as long as there are no bugs that prevent entering OF, fixes are very rarely made at this level, though sometimes they are.
But the support cycle of a graphics card they stopped selling and supporting when I was a young man is long past, and ATI/AMD has not been forthcoming with whitepapers for many of its older GPUs.
This is why Open Hardware is needed. We need at least a basic overview of the CRTC functions of the GPU, and the 3D shit would not hurt either.
Now, sometimes copyright and NDAs are an issue, and back in the day even more so. The graphics card companies would buy tech they did not own and use it on the card, and if they agreed to an NDA or copyright terms on that, things get murky even decades later as to which specs for a GPU they can release and which they cannot. To add to this, when AMD bought ATI, what did they buy, and what do they own?
If a 3rd party made some of the tech, and that 3rd party went out of business 15 years ago, who do you contact about an NDA or copyright release so AMD can share the whitepapers, which may or may not still exist and may or may not have been an agreed part of the sale of ATI to AMD?
Is any of that complex enough for you?
-
I wasn't talking about the connector. I was talking about how Apple implemented ADC through the video card.
The ADC power connector on the motherboard frequently moved. In many cases, you cannot move an ADC-equipped card between generations of G4/G5 and have ADC continue to work.
They used undocumented pins in the AGP slot to enable extra features for ADC; those same pins then became documented in a later version of AGP, which is why we have to tape off certain pins on cards.
The connector, as you said, was perfectly fine. The way they haphazardly implemented it was not, and was certainly a contributor to the extreme dearth of aftermarket cards with ADC.
Hoo boy…
1) The ADC moved on the G5 cards because they were 8xAGP and not intended to be used on G4 and such anyway, and because the AGP slots themselves expanded to accommodate dual-link cards.
2) The only reason you tape off pins is to enable an 8xAGP card to operate in a Mac 4xAGP slot, and that's just a hack that hard-cores like us use to make the "cool" 8xAGP G5 cards (mostly Radeon 9600 or 9800) run at 4xAGP to work in older G4's to get more memory for games since they can't really utilize 8xAGP anyway.
3) Apple caused the, as you say, "extreme death" of ADC itself beginning with the 30" Cinema Display and later monitors requiring more power and overtaxing the computer PSU's. No card was ever made with two ADC ports for the same reason.
I don't think you can really refer to any ADC-equipped cards at all as "aftermarket" since ADC was proprietary, never appearing on anything other than Macs. Any card you find with an ADC connector was OEM, either stock or optional on BTO, in either a late B&W, Cube, QS, MDD or G5.
Apple used some of the unused pins on AGP 2x and 4x to drive the USB of the ADC display. I've often wondered whether that modification of the standard was allowed, and if it was not, then those slots are not really AGP.
At any rate, because of this, plugging an older AGP/ADC card that used those formerly unused pins into a G5's AGP Pro slot would result in a fire.
Not the power connector part, that little tab at the end that was moved, but the AGP pins themselves.
Apple did not want G5s burning down your house because you put an older AGP 2x/4x ADC card into your new G5.
So they moved the tab as a way to prevent users from suing them.
AGP was backward and forward compatible; Apple's deviation from AGP was outside the norm, and it bit end users on the ass.
But I have put a PC Rage Pro 1x AGP card into the 8x AGP slot of a G5, and there is no electrical issue.
The only reason the G5s had ADC was that people who upgraded their Pro towers didn't always want to upgrade what was, at the time, an expensive display.
Otherwise, Apple would have left it out entirely.
And what happened to MDX??
Were there ever any 3rd party MDX cards?
ADC-modified AGP was better than AGP, but was the deviation from the standard worth it, just because Apple did not want to run a cable from the logic board to the ADC card, or put the USB onto the power tab???
Lots of things Apple did back when they were a computer company; now they are a phone company that, oh by the way, also has a hobby of making Macintoshes.