    XFX GeForce 6800 XT AGP [PV-T42K-UDE3] – recapping and mods

    Aaaaand… we have more retro-PC nostalgia repairs again.
    This time it's with an XFX GeForce 6800 XT AGP video card. The XFX model number is PV-T42K-UDE3.

    As with many early XFX cards, crap caps are the main issue here. I bought and recapped the first one back in 2017. I've been meaning to post about it for a long time, but never got around to it, because the card needed a better heatsink/cooler before I could fully test the recap (more on that later.) In the meantime, I acquired a 2nd one of these cards not too long ago (in February.) Not that it makes much difference: both cards came with exactly the same crappy GSC/Evercon caps bulged, despite being from different sellers on eBay. Here are pictures of the 1st card, as it came to me:




    And for those who want to see higher-res pictures, here is the 2nd card:
    https://www.badcaps.net/forum/attach...1&d=1595709723
    https://www.badcaps.net/forum/attach...1&d=1595709723
    https://www.badcaps.net/forum/attach...1&d=1595709723

    Probably no need to say it so many times now… but what the heck: GSC/Evercon caps are _GARBAGE_
    Here is what a 1500 uF Evercon ME and a 1000 uF Evercon ME from the 1st card showed on the cap tester:


    - Yup, clearly failed. The caps from the 2nd card were much the same (slightly higher ESR and slightly lower capacitance, with one 1500 uF cap being completely open.) But you know what – at least they still show some capacitance, despite the high ESR. I can't say the same about Sacon FZ, though, which almost always go completely or nearly open-circuit, with just a few picoFarads of capacitance remaining. One would think that with time (and name changes) GSC/Evercon/Sacon would improve quality a bit. Instead, they went the other way and made their caps even more crap. LOL?

    Anyways, moving on to the recap… there are 5 major power rails on this video card to be aware of:
    1) 12V rail from PSU for GPU VRM high-side
    2) 3.3V rail from PSU/motherboard AGP connector for RAM VRM high-side
    3) GPU V_core
    4) RAM V_dd
    5) GPU V_tt

    I marked each one of these on the following cap diagram:


    Note that there are some free/unpopulated cap spots that could also be filled in, if desired. For those who can't view the pictures or just want “numbers” to work with, here is a list of the caps connected to each rail.

    **** 12V rail from PSU for GPU VRM high-side:
    CE1 (SMD-only): unpopulated, 8 mm dia.
    CE2 (SMD-only): 16V, 470 uF, 8 mm dia. SMD, not sure of brand
    CE3 (SMD-only): unpopulated, 8 mm dia.

    **** 3.3V rail from PSU/motherboard AGP connector for RAM VRM high-side:
    C89 (SMD-only): unpopulated, 6.3 mm dia.
    CE31 (SMD-only): unpopulated, 8 mm dia.
    CE16 / CE17 (SMD and through-hole, 8 or 10 mm dia.): 16V, 470 uF, 8 mm dia. SMD, unsure of brand

    **** GPU V_core:
    CE4 / CE25: Evercon ME, 6.3V, 1500 uF, 10x13 mm
    CE5 / CE26: Evercon ME, 6.3V, 1500 uF, 10x13 mm
    CE6 / CE27: Evercon ME, 6.3V, 1500 uF, 10x13 mm
    CE32 / CE34: Evercon ME, 6.3V, 1000 uF, 8x13 mm
    CE33 / CE35: unpopulated
    * Note: all of the above cap spots can accommodate both SMD and through-hole caps in 8 mm and 10 mm diameter.

    **** RAM V_dd:
    CE22 / CE30: unpopulated
    CE23 / CE28: Evercon ME, 6.3V, 1500 uF, 10x13 mm
    CE24 / CE29: Evercon ME, 6.3V, 1500 uF, 10x13 mm
    * Note: all of the above cap spots can accommodate both SMD and through-hole caps in 8 mm and 10 mm diameter.

    **** RAM V_tt:
    CE21 (SMD-only): 16V, 470 uF, 8 mm dia., SMD, not sure of brand

    So for a minimal but complete recap, one would need 6x caps, rated for 4V minimum and 1200 to 2200 uF. For the GPU V_core, even 2.5V caps can be used (and I did that, as I will show later on.) To keep things simple, though, something like Panasonic FS (P/N: EEUFS0J152) or Rubycon ZLH (P/N: 6.3ZLH1200MEFC10X12.5) would probably work fine. And if you don't mind sacrificing a PCI slot below the AGP slot to make the card take two slots, you can go with taller 16-20 mm caps. On that note, something along the lines of Rubycon ZLH (6.3ZLH2200MEFC10X20), Panasonic FM (EEUFM0J152__), Panasonic FR (EEUFR0J222__), Panasonic FS (EEUFS0J202__), United Chemicon KZE (EKZE6R3E__152MJ20S), and United Chemicon KZH (EKZH6R3E__222MJ20S) would all work fine. All of these are currently widely available. Using polymer caps is another option.
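    The selection criteria above can be written out as a quick filter. A minimal sketch (Python; the helper name is mine, and only the fully-specified part numbers from the list above are included - always verify against current datasheets):

```python
# Minimal-recap cap filter, per the criteria above: at least a 4 V
# rating and 1200-2200 uF. Candidates are part numbers named in the
# post; double-check specs against current datasheets before buying.
candidates = [
    ("Panasonic FS", "EEUFS0J152", 6.3, 1500),
    ("Rubycon ZLH", "6.3ZLH1200MEFC10X12.5", 6.3, 1200),
    ("Rubycon ZLH", "6.3ZLH2200MEFC10X20", 6.3, 2200),
]

def fits_minimal_recap(volts, uf, min_volts=4.0, lo_uf=1200, hi_uf=2200):
    """True if a cap meets the minimal-recap spec from the post."""
    return volts >= min_volts and lo_uf <= uf <= hi_uf

ok = [part for _series, part, v, uf in candidates if fits_minimal_recap(v, uf)]
print(ok)  # all three qualify
```

    (As noted above, the GPU V_core spots will tolerate even 2.5V caps, so the 4V floor is the conservative "works everywhere" figure.)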

    Given that this card still actually worked even with the bulged Evercon ME caps, it's clear that it's not too picky about cap specs.

    With that, I did a test recap on the 1st card by replacing all of the caps with United Chemicon KZG 6.3V, 820 uF caps I had pulled from a motherboard. (Note that Chemicon KZG, especially the above 6.3V, 820 uF caps, are known to have problems when not in use. Mine tested OK, so I only put them in to test the card's stability.)
    https://www.badcaps.net/forum/attach...1&d=1595709723

    The recap worked well, so I revised it slightly with some salvaged Sanyo SEPC 4V, 1200 uF polymer caps, among a few other changes:


    That recap worked fine too, and I left the 1st card as-is ever since. Why? Because the stock cooler, like on most cards from that era, was not adequately cooling the card and I didn't want to test any further with it. It wasn't until I completed a modded cooler for the 2nd 6800 XT that I finally did a full set of stress tests.

    Anyways, that is all regarding the recap on this XFX GeForce 6800 XT.

    Before leaving, here is a GPU-Z screenshot, as always:


    - Captain Junk Repairs signing out... for now.
    Attached Files

    #2
    The Not-So-Cool Cooler on the XFX GeForce 6800 XT AGP

    Right. So this is the reason I didn't want to fully stress the 1st recapped 6800 XT card: the cooler.


    Yeah, it looks clean, and the fan spun freely with no issues. The thermal compound was also in fair condition (as I saw later when I pulled the cooler off.) However, the moment I removed the top cover on the cooler to clean it, I knew it probably wouldn't be enough for the TDP of this card (which is… well, hard to pin down, as different places online show different values, so I'll leave that for a later discussion.) My hunch turned out to be correct.




    With an average room temperature of 83°F (approx. 28°C), and no side panels installed on my test PC (i.e. case temperature = room temperature), the GPU core was already IDLING at 57°C. And this was with the cooler cleaned and the fan running at 12V (100% duty), as the default cooling profile in the card's BIOS dictates. As such, the XFX 6800 XT AGP is extremely loud - it sounds like a jet engine. Despite this, the cooling absolutely sucked: with full GPU load, it took no more than 10-15 seconds for the GPU core to reach 65°C. Had I left it running, it would likely have climbed to 80°C or possibly even higher. In fact, those are more or less the “normal” running temperatures for many cards in the GeForce 6800 series.

    Now, I'm sure many might wonder who keeps their house/room temperature above 80°F / 27°C, right? However, those temperatures are actually not too far off from what you might see in a closed case with mediocre ventilation (and back in the days when this card came out, many cases didn't have good ventilation.)

    So let's see how the card does with good ventilation then… or rather, with low room temperatures. As mentioned, I bought the 2nd XFX 6800 XT in February of this year and tested it shortly after the recap. At that time of the year, I let the room temperatures dip to as low as 63°F / 17°C. The 2nd 6800 XT came with an XTremely dusty cooler:

    Of course, I didn't test it like that. The nice thing about the XFX 6800 XT is that you don't have to remove the cooler from the card to clean the fins. This is good in case one wants to test the card with the stock thermal compound, to see how good/bad it is. In my case, it was terribly dry:


    Side note: the cooler on this card is used not only for the GPU core, but also the RAM chips and AGP to PCI-E bridge chip. Talk about cheap!

    So I cleaned the 2nd card, as shown below, by taking off and washing the heatsink along with wiping the fan clean with a moist rag.
    https://www.badcaps.net/forum/attach...1&d=1596138648
    Notice the flaking on the heatsink? Yep, that is the plated steel coating falling off, probably due to the dust and air moisture mixing and causing the coating and the aluminum below it to oxidize. And YES, folks, this is an ALL-aluminum heatsink. Not that there is anything wrong with aluminum or steel plating. But look how thin the whole cooler is. Depending on where you get your information online, some places say the 6800 XT puts out as little as 50W, and others as much as 70W. Whichever the case, such a thin cooler just can't cope with that kind of TDP. And my low room temperature tests confirm this:

    At 63°F / 17°C ambient, the card idled a little lower at 51°C (only 6°C lower than at the high room temperature), and that is with fresh thermal compound (the old one was dry and performing even worse.) Once I kicked the load up to 100% with the ATI Tool synthetic benchmark, it took maybe an extra 10-15 seconds (compared to the first run at 83°F / 28°C room temperature) for the GPU core to pass the 65°C mark again. But the fact is, the card ran way too hot again. Given that these ambient temperatures were about 10°C lower than they are now in the summer, and that the core would likely have climbed to 80-85°C at the hot room temps, one can expect this card to run somewhere in the 70-75°C range with low room temperatures (and good case cooling.)

    Of course, this is still a dismal result, in my opinion. Knowing that the GeForce 6 series was already at the very beginning of nVidia's bumpgate troubles, anything over 60°C on the core is likely not good for the card in the long term.

    Sure, the card has already outlasted its useful life. However, it still bugs me how manufacturers get away with using such crap coolers. More than anything, though, it was the LOUD noise this stock cooler made. So I had to do something to improve the cooling on this card - if not the temperatures, then at least the noise.

    This is where the TDP discussion comes into play. The GeForce 6800 Ultra and GT, based on the NV45 made on the older 130 nm process, consume about 81 and 67 Watts, respectively, as shown here:
    https://www.techpowerup.com/gpu-spec...800-ultra.c720
    https://www.techpowerup.com/gpu-spec...e-6800-gt.c130

    The XFX 6800 XT AGP cards, however, are based on the NV42. This GPU core is made on the slightly newer and smaller 110 nm process. It also has only 12 ROPs (vs. 16 for the GT and Ultra) and usually runs at a lower core clock. All of this should help lower the power consumption. However, the NV42 also has fewer pixel and vertex shaders (8 and 4, respectively) than the GT and Ultra (which have 16 and 6.) Shaders do add to the power consumption, but with fewer of them, the video card has to “work harder” and thus may not always end up consuming less power. One example of that is the GeForce 7900 GS vs. 7900 GT, with the GS consuming 1 Watt more, despite having fewer/cut ROPs, TMUs, and pixel/vertex shaders. All in all, power consumption is mostly down to the core clock, core voltage, and memory bus width. Thus, with the 6800 XT still being a 256-bit memory bus GPU running at a clock speed similar to the GT, I expected its power consumption to be similar to the GT's (likely a bit lower, due to the NV42 being made on 110 nm), though still no less than 50W TDP (if not closer to 60W.)
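    To make that comparison easier to eyeball, here is a small sketch (Python) collecting the numbers quoted above; the Ultra/GT wattages are the TechPowerUp figures linked earlier, and the 6800 XT figure is my guesstimate range, not an official TDP:

```python
# Spec numbers as quoted above (TechPowerUp for the Ultra/GT TDPs).
# The 6800 XT TDP is a guesstimate from the post, not an official figure.
specs = {
    "6800 Ultra": {"core": "NV45", "nm": 130, "rops": 16, "pixel": 16, "vertex": 6, "tdp_w": 81},
    "6800 GT":    {"core": "NV45", "nm": 130, "rops": 16, "pixel": 16, "vertex": 6, "tdp_w": 67},
    "6800 XT":    {"core": "NV42", "nm": 110, "rops": 12, "pixel": 8,  "vertex": 4, "tdp_w": None},  # est. ~50-60 W
}

for name, s in specs.items():
    tdp = f"{s['tdp_w']} W" if s["tdp_w"] else "~50-60 W (guesstimate)"
    print(f"{name}: {s['core']} @ {s['nm']} nm, {s['rops']} ROPs, "
          f"{s['pixel']}/{s['vertex']} shaders, {tdp}")
```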

    So what do we have for cooler options? Some of you who've seen my other cooler mod threads probably know already what I will be using. Stay tuned, though, and you will see how this one turned out.
    Attached Files
    Last edited by momaka; 07-30-2020, 02:18 PM.



      #3
      Re: XFX GeForce 6800 XT AGP [PV-T42K-UDE3] – recapping and mods

      Can't wait.



        #4
        Re: XFX GeForce 6800 XT AGP [PV-T42K-UDE3] – recapping and mods

        Originally posted by momaka:
        Evercon ME
        ME=¿Me encanta?=¡Buen intento, imbéciles!

        Translation: I love it?=Nice try, assholes!
        Last edited by RJARRRPCGP; 08-01-2020, 12:20 PM.



          #5
          XFX GeForce 6800 XT AGP with modded cooler [WIP]

          I don't think this is going to be a surprise really, but my (practically) never-ending supply of Xbox 360 rev2 (1 copper heatpipe) CPU heatsinks has come in handy once more. Since the 2nd 6800 XT had a pretty dusty cooler and probably ran very hot for its last owner, I figured this would be a good card to experiment on (after all, who knows how much life that GPU has left.) Thus, I wouldn't feel too bad if I damaged it by mistake. This didn't happen, though. The new Xbox 360 CPU heatsink fit just right with two machined aluminum pieces to fit the screw hole pattern on the board.

          But I wasn't done here. Since the NV42 GPU in this card is a newer, natively PCI-E chip, the card has an AGP to PCI-E bridge, as shown in the pictures in my previous post above. There was no way I could have the Xbox 360 heatsink also cover the AGP to PCI-E bridge chip, so I had to put a separate, Northbridge-sized heatsink on it. With that done, the prototype heatsink mod was underway.



          One thing worth noting is that the AGP to PCI-E bridge chip die is very narrow. To balance the NB heatsink on it, I used a piece of 14-AWG steel wire, cut into a V-shape, which provided spring-loaded tension over the center of the die. The steel wire was held on one side by another V-shaped piece of staple wire, soldered to two negative cap terminals on the board. On the other side, the steel wire was held down with a screw (at least temporarily, for testing purposes.)

          Testing the card with an 80 mm fan going at full blast (12V) towards the Xbox 360 heatsink resulted in the following temperature graph.

          Temperatures looked very promising, but the 80 mm fan at full speed was still a little loud for my taste (though already much better than the stock cooler). More surprising was the AGP to PCI-E bridge: despite having a much larger heatsink now, it still ran very hot (no temperature diode, so no “scientific” value, but the heatsink felt very hot to the touch… I'd guesstimate the die was running probably 50-60°C, if not more.)

          Thus, more revisions were needed. Also, I wasn't a fan of the fan (terrible pun, I know) not being mounted on the card. For once, I wanted to finish a cooler mod where the fan is actually held onto the card and fully functional. Not only that, but the XFX 6800 XT can control the fan voltage on its built-in fan connector.

          And then I had an idea: that aluminum bracket on the card seemed like a very convenient support for mounting a fan. Since I was already stepping into… no, swimming deep in… ghetto-modding territory (and because the Xbox 360 rev2 CPU cooler needs a medium/high-pressure fan), I opted to use one of my ghetto-mod staples: one half of an Xbox 360 dual 70 mm fan assembly. The only thing I dislike about those is having to cut away all of the excess plastic, as they tend to be unnecessarily bulky otherwise. It's probably hardly worth the time and effort, knowing how little a new cheapo 70 mm fan costs. But I already have a bunch of these sitting in a box doing nothing, so I might as well use them. And they are good Delta and Nidec fans (in this case, a Delta AUB0712HH, rated for a not-so-shabby 0.4 Amps @ 12V.) Not to mention they go well with the whole Xbox 360 junk parts theme, right?

          One screw later, and this was the (preliminary) result of the fan mounted to the bracket:


          Don't mind the masking tape with the hand-drawn “High Voltage” mark on it – I was simply re-using a piece of tape from when I changed a light switch in the kitchen the day before; it had temporarily covered the hole overnight, just so that no one would touch in there by accident.
          The idea of the tape on the cooler was to direct airflow only through the HS and to provide additional support for the fan, since it was held with only 1 screw in 1 corner. You can see the bridge chip HS also received a small 4010 fan. I left the long wires on it for testing, so I could try the fan at different voltages and see how much air the bridge HS needs to stay cool. With this particular fan (Sunon MagLev, IIRC), 5V was enough. The 70 mm fan on the GPU heatsink, on the other hand, was connected to the fan connector on the video card, so I could vary the speed/PWM with RivaTuner.

          With all of that done above, time for testing once again.

          Result: not bad!

          The above GPU temperatures are with 100% GPU load @ 80°F / 27°C room ambient. The 70 mm fan speed was set manually through RivaTuner @ 90% PWM (about 7.1V at the fan connector) with the GPU under load, and @ 70% PWM (about 5.2V) with the card idling. The 70 mm fan wasn't exactly quiet under load, but it was not much louder than the 70 mm fan on the stock socket 939 cooler in my test PC, and definitely a big improvement over the stock fan. After all, the stock cooler on the 6800 XT had the core IDLING at 57°C, whereas here we see the peak temperatures level off at 58°C under full GPU load - cool! At idle, the fan at 70% PWM was quiet overall, especially for hardware from that era.

          All in all, the Xbox 360 rev2 CPU cooler was a good fit for this card's TDP, as I expected (which I'm guesstimating is around 60-65 Watts just for the GPU core.) It's only a shame how many slots it takes – 4, to be exact, with the AGP slot included in the count. That's a lot! Luckily, though, the quirky PCB layout XFX used on this card places the GPU more towards the back. This pulls the cooler toward the back of the card too, actually leaving enough space for certain “short” PCI cards (i.e. half-height cards whose PCB ends right after the PCI slot connector) to still fit. As such, despite the bulky heatsink mod, I can still fit certain PCI WiFi cards (or a modem, LOL!) So this cooler mod really takes more like 3.5 slots' worth of space. It's still a lot, but let's be honest: on the more modern AGP boards, who uses all, or even more than a few, of their PCI slots? On a full-sized ATX mobo, this card will fit fine.
          Attached Files



            #6
            XFX GeForce 6800 XT AGP with modded cooler [FINAL VER]

            Now to make it look neater, perhaps…
            Cardboard? -check
            Hot glue? -check
            … plus ~1 hour of time…
            https://www.badcaps.net/forum/attach...1&d=1596644601
            It's a fan shroud!

            Test fitted:


            Better than the masking tape, I think.

            I also swapped the bridge chip HS and fan with a different set, as I usually save those 4010 fans for motherboards with hot-running MCPs, where I also could run into clearance issues if the MCP is near/below the PCI-E or AGP connector.

            The “new” bridge HS fan came from one of those oldschool 5.25” bay “coolers”. Remember those? They were from the era of beige cases that didn't have much cooling or venting. I got this 5.25” bay cooler in a junk box of PC parts left out for trash collection. I never used it, because the 2x 40 mm fans were both connected to 12V and way too loud… not to mention, the fans were already grinding. Even after cleaning and oiling the fans, I figured I'd probably never use such a thing, so I pulled the 2x 40 mm fans for side projects and saved the cooler bay as a spare 5.25” cover.

            Another thing I noted while doing the temperature tests earlier was how hot the RAM chips ran – probably in the mid or high 50's °C, based on touch. Looking at the PCB, it's as if XFX left mounting holes particularly for RAM heatsinks, so I took advantage of those.
            https://www.badcaps.net/forum/attach...1&d=1596644601

            Using some aluminum scraps from a bathroom mirror frame project I did a while back while renovating one of our bathrooms, the RAM chips got their heatsinks:


            Basically, three screws were used to hold three wood dowel “nuts” (hand-made, of course) onto the PCB, right where those mounting holes were. On each wood dowel nut, I put one or two holes for 14 AWG steel wire - same type as the one used to hold down the bridge chip. Then, I made more V-shaped steel tension “springs”, and used those to keep downward pressure on each of the RAM heatsinks. For heat transfer, I reused the original RAM thermal pads between the aluminum RAM heatsinks and the RAM chips. Also, I removed the screw that (temporarily) held the bridge chip steel “spring” wire and instead routed the wire through one of the wood dowel nuts. This essentially makes my entire cooling mod removable and reversible (though it is a bit cumbersome to install and remove.)

            A quick test showed all temperatures were still the same, more or less.

            Actually, the highs improved by about 1-2°C, as the temperature graph above was taken at approximately 84°F / 29°C room ambient, and yet the GPU core temperature stayed mostly at 57°C. The RAM sinks also appeared to work - or at least they got very hot, just like the bare RAM chips did, so it seems they are pulling heat away from the RAM chips. But with no direct airflow, I'm not sure how effective they are at cooling. Perhaps the one RAM heatsink behind the GPU works better, as the 70 mm fan pushes quite a bit of air through the main heatsink. I intentionally left a tiny bit of space between the shroud walls and the heatsink so that some air could also cool the rest of the card, including the RAM heatsinks.

            Making some last touches, the final version is this:




            Yup, the fan shroud received black latex paint. I ordered a small custom sample jar from the HW store instead of spray paint, as I find spray doesn't work well for everything and tends to flake off metal. In contrast, semi-gloss latex paint sticks very well to metal and can even cover/seal sanded rust spots without them returning.

            Also, through earlier experiments, I found the bridge chip fan cooled the HS well enough when running at 7V (+ wire on 12V, - wire on 5V), so I soldered its wires directly to the PCB for that voltage. I didn't connect it to the main fan connector, because at 5V it wasn't pushing enough air to cool the bridge chip even with the card idling. Additionally, I wasn't sure how much more current the card's fan regulator could take. The new 40 mm fan doesn't draw much current, but those 70 mm Delta fans are fairly power-hungry, being rated for 0.4 Amps. So I didn't want to push the card's fan controller any more than needed.

            Finally, I changed a few things on my recap job:
            - The GPU V_core is now filtered by 2x Nichicon HZ 6.3V, 2200 uF caps (H06xx date codes), 1x Sanyo SEPC 2.5V, 1200 uF cap, and 1x Nichicon SMD 6.3V, 1000 uF cap
            - The RAM Vdd is filtered by 1x Rubycon MFZ 6.3V 2700 uF and 1x Sanyo SEPC 2.5V, 1200 uF cap.
            So basically, all rails have slightly more capacitance (and much better ESR) than they did originally.
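            That "slightly more capacitance" claim is easy to sanity-check against the cap lists from post #1 (a quick Python sketch; remember that the paired designators, e.g. CE4 / CE25, are alternate SMD/through-hole footprints for the same physical spot):

```python
# Total filtering capacitance (uF) per rail, before vs. after the recap.
# Paired designators (e.g. CE4 / CE25) are one physical cap spot.
orig_vcore = 3 * 1500 + 1 * 1000     # stock Evercon ME caps
new_vcore = 2 * 2200 + 1200 + 1000   # Nichicon HZ + Sanyo SEPC + Nichicon SMD

orig_vdd = 2 * 1500                  # stock Evercon ME caps
new_vdd = 2700 + 1200                # Rubycon MFZ + Sanyo SEPC

print(orig_vcore, "->", new_vcore)   # 5500 -> 6600 uF
print(orig_vdd, "->", new_vdd)       # 3000 -> 3900 uF
```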

            With all of that done, I think I finally reached a “complete-enough” state with this card, at least for now. It's recapped and runs cool…er. What else is there?
            .
            .
            .
            .
            .
            Actually, if I have to be particular (and I always am, for better or for worse ), there are a few things left.
            #1 is that when the card boots, the default fan speed is always set at 100% PWM, regardless of load or temperatures. This being the GeForce 6 series, there is no temperature-controlled fan feature. Instead, fan speed is based on the performance state (IIRC.) In 2D mode the fan runs at one speed, in 3D/performance mode another, and when throttling/overheated yet another. Given the above tests, I think it would be reasonable to set the fan PWM to 70% for 2D/desktop mode and 95% for 3D/performance mode. Most likely, I'll need to flash a custom BIOS on the card for that.

            To anyone else thinking of attempting this mod: please note that the PWM control on the XFX GeForce 6800 XT card is NOT linear at all. As I found through experimentation, PWM under 50% drops the voltage significantly below 5V, and as such, my 70 mm Delta fan turned off. Also, there is a jump between 97% and 98%: at 97%, the voltage at the fan connector is still around 7.1-7.3V, but at 98% or more PWM, the fan voltage jumps to nearly 12V. So be careful what values you set.
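            The measured points can be summarized like so (Python sketch; the voltages are my own measurements on this one card, so treat them as illustrative only, and the "usable range" helper is just my rule of thumb):

```python
# PWM duty (%) -> approx. fan-header voltage, as measured on my card.
# The response is NOT linear: below ~50% duty the voltage drops well
# under 5 V (my 70 mm Delta fan simply stopped), and there is a big
# jump between 97% and 98%.
measured_volts = {
    70: 5.2,   # idle setting used earlier
    90: 7.1,   # load setting used earlier
    97: 7.2,   # still ~7.1-7.3 V
    98: 11.8,  # jumps to nearly 12 V
}

def duty_is_usable(duty_pct):
    """Avoid the sub-50% dead zone and the 98%+ near-full-voltage jump."""
    return 50 <= duty_pct <= 97

for duty in sorted(measured_volts):
    print(duty, measured_volts[duty], duty_is_usable(duty))
```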

            Luckily, I also found that the ForceWare driver version I was using (270) supports temperature readout and fan control in MSI Afterburner. This allowed me to move away from manual fan control in RivaTuner and force a custom fan profile in MSI Afterburner instead… more precisely, this:

            And that gave pretty much the same exact idle and load temperatures as before.

            The #2 thing that could use a change is the 2D and 3D state clocks. Right now, the card runs at 350 MHz on the core and 1000 MHz (DDR) on the RAM in all states. That is absolutely not needed when the card is idle. And even in 3D mode, I find there is almost no difference in game performance whether the core runs at 350 MHz or is under-clocked to 300 MHz. In fact, I also tried OC-ing the core to 475 MHz, and it hardly made a difference in any of the games I tested. The only slight difference was in temperatures: @ 300 MHz, the core load temps dropped by 1-2°C, while at the 475 MHz OC, the core touched 60-61°C at times - not bad, given the OC.

            I was also able to push the card all the way to 500 MHz on the core without any voltage adjustments at all, and it was perfectly stable. This makes me wonder how much headroom there actually is on V_core at the stock 350 MHz clocks. If there is a lot, perhaps I can under-volt V_core a little and reduce GPU temperatures further (I guess we can call that #3 on my list of stuff to do.) As for the RAM, I tried only under-clocking it. It is Qimonda/Infineon, so I didn't push my luck with an OC. Under-clocking the RAM likewise made no difference in performance, at least until I lowered the clocks to about 750 MHz. Below that, a few older games that rely more on fast RAM did show lower FPS numbers. So I'll probably leave the RAM clocks as they are.

            All in all, though, these 3 items are not a big deal, so we will see when I try any of them out. Until then, that is all.
            Attached Files



              #7
              Re: XFX GeForce 6800 XT AGP [PV-T42K-UDE3] – recapping and mods

              That is an absolutely beautiful cardboard origami fan shroud! The Japanese will be proud of your origami work!

              As for modding the BIOS, you can dump it here as an attachment for me to take a look at for potential BIOS modding ideas. Be warned, though, and don't get your hopes up: I noticed it's a bridge-chip card, and as you know, nvflash doesn't work for flashing on bridge-chip cards. I tried to flash my AGP 6600 GT cards, and it didn't work because of the bridge chip. Dumping the vBIOS works fine, but not flashing.



                #8
                Re: XFX GeForce 6800 XT AGP [PV-T42K-UDE3] – recapping and mods

                Originally posted by ChaosLegionnaire:
                That is an absolutely beautiful cardboard origami fan shroud! The Japanese will be proud of your origami work!
                LOL, thanks!

                Originally posted by ChaosLegionnaire:
                As for modding the BIOS, you can dump it here as an attachment for me to take a look at for potential BIOS modding ideas.
                Attached at bottom of post.

                Originally posted by ChaosLegionnaire:
                Be warned, though, and don't get your hopes up: it's a bridge-chip card, and nvflash doesn't work for flashing on bridge-chip cards. I tried to flash my AGP 6600 GT cards, and it didn't work because of the bridge chip. Dumping the vBIOS works fine, but not flashing.
                Oh yeah, I think I recall you telling me this before.
                Well, that's a bummer... and maybe also a good thing, as it means I won't have much else to do on this card. It is worth mentioning, though, that I also have the PCI-E version of this card, albeit with the NV41 GPU. Core and RAM clocks are the same on both. So perhaps with a physical BIOS IC swap, I could take the AGP card's BIOS chip, solder it onto the PCI-E card, flash it there, then move it back onto the AGP card? (And then one has to wonder if it's worth the work, just for some silly fan speed corrections, lol.)

                On a different note, I discovered there may be an issue with the XFX 6800 XT - or at least the 2nd AGP card (the 1st one still has the original cooler). The issue is stutter / terrible frame times in 3D mode. Just look at the frame time vs FPS (framerate) in the following two screenshots:
                https://www.badcaps.net/forum/attach...1&d=1596863014
                Here you can see FPS is at 79 and frame time at 27.7 ms, which is clearly not right, because 27.7 milliseconds corresponds to around 36 FPS.

                But it gets worse.
                https://www.badcaps.net/forum/attach...1&d=1596863014
                Two seconds later, and the frame time shoots up to 52.9 ms, or approximately what would be expected from 19 FPS. But the counter is showing 80 FPS!
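                Since FPS and frame time should simply be reciprocals of each other, the mismatch in those two screenshots is easy to sanity-check (quick Python sketch):

```python
# fps = 1000 / frame_time_ms; the two on-screen readouts should agree.
def fps_from_frametime_ms(ms):
    return 1000.0 / ms

print(round(fps_from_frametime_ms(27.7)))  # ~36, not the 79 the counter showed
print(round(fps_from_frametime_ms(52.9)))  # ~19, not the 80 shown
```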

                Not only that, but even with V-sync On for nearly-constant FPS, I was still getting micro-stutter:
                https://www.badcaps.net/forum/attach...1&d=1596863014
                Basically, the frame times were still choppy and spiky, despite the FPS being steady at 57-59 FPS. And it's not a CPU resource issue, because with V-sync on, you can see the CPU usage was now below 100%.

                So basically, I was getting micro-stuttering every few seconds in every game I tried (though some were worse than others - HL2, for example.) In the above 2 shots it was Colin McRae Rally 4, simply because I know this game gets excellent performance on similar cards, like my PCI-E 6600 LE, PCI-E 6800 XT, and 7300 GT AGP - all with the same ForceWare 270.61 XP drivers. The fact that an FX5200 64-bit card gave smoother frame times in HL2 and CMR4 confirms, I think, that there is an issue.

                So obviously, there is something going on under the hood of this card that's not quite right, but I can't figure out what yet. I tried the card on 3 different PCs (an AMD Athlon 64 3200+ s939 OC'd to 2.5 GHz [my primary video card test PC], an AMD Athlon 64 3200+ s754, and an AMD Athlon 64 3400+ s754.) The first two PCs I had already tested with other video cards - both nVidia and ATI. Although I never cleared the old drivers on them when switching between cards, that has generally never been an issue. Thinking it might now be the issue, I put together another PC - the Athlon 64 3400+ - which had never had any other video card installed. And to confirm that what I was seeing wasn't a driver issue, I started with a much older nVidia driver (61.21, released on 05/14/2004.) The issue persisted, so I moved up to ForceWare 90, then 175, and finally back to 270. Regardless of version, the micro-stutter was identical and persistent every time.

                Searching online for similar stories, I see this also happened to a few people with much newer video cards (we're talking about GTX 1000 and 2000 series here, along with AMD RX 5000 series.) So the only thing I can conclude right now is that this issue may be due to a failing GPU or RAM.

                I suppose when I make another modded cooler for the 1st XFX 6800 XT and test it, that's when I might see if that one has the same problem or not.


                  #9
                  Re: XFX GeForce 6800 XT AGP [PV-T42K-UDE3] – recapping and mods

                  strange. not sure what causes these framedrops and microstuttering. are u running the game with the on screen osd for seeing the frame rate, vid mem usage, clocks etc.? on the agp amd cards im using with the bridge chip, i found that using the osd causes a frame rate drop. turning off the osd stops the frame rate drop. so i set a hotkey to toggle the osd on/off quickly to take a quick peek at what the monitoring info is.

                  but other than that, u can check to see if agp acceleration is enabled or if fast writes is enabled or not. nvidia made many different versions of the 6800xt. in the vbios of yours, in nibitor, it identifies itself as an agp 6800 gs with the br02 bridge chip instead, so it could be the driver disabled certain features for compatibility reasons and u need to re-enable them manually if they dont cause issues or lockups.

                  also, i took a quick peek at the vbios with nibitor and i was sad to see there isnt any voltage table, so no undervolting in the vbios. u can only change the performance levels to 2 and run the clocks and fan speeds with 2d and 3d settings.

                  so if u wanna volt mod, unfortunately, pencil or potentiometer mod on the gpu vrm controller chip are the only two ways on that card...

                  EDIT: also, i dont recommend swapping the vbios chip onto an agp card for flashing because the vbios firmware could contain extra firmware data for the bridge chip and/or the vbios data is different between the two pci-e nv42 chip and the agp nv41 chip. so if u flash the pci-e 6800xt vbios onto a native agp 6800xt, it *might* become inoperable on the next reboot because it has the wrong pci-e vbios on it now. u have to desolder the vbios chip and solder it back on the 6800xt br02 to see if it works and then solder back the native agp 6800xt vbios chip to get the native agp 6800xt card back. quite troublesome and a lot of work and it may not work, resulting in lots of desoldering and resoldering to see if it works or not.
                  Last edited by ChaosLegionnaire; 08-09-2020, 10:24 PM.



                    #10
                    Re: XFX GeForce 6800 XT AGP [PV-T42K-UDE3] – recapping and mods

                    Originally posted by ChaosLegionnaire View Post
                    strange. not sure what causes these framedrops and microstuttering. are u running the game with the on screen osd for seeing the frame rate, vid mem usage, clocks etc.? on the agp amd cards im using with the bridge chip, i found that using the osd causes a frame rate drop. turning off the osd stops the frame rate drop. so i set a hotkey to toggle the osd on/off quickly to take a quick peek at what the monitoring info is.
                    Yup, I had the OSD On (and I did toggle it On and Off without any difference.)
                    Don't think that's the issue, though. My HS-modded GeForce 7300 GT AGP didn't suffer from these issues, and neither did my x800 or x1950 (all with bridge chips.) In fact, I turned on MSI Afterburner and the OSD specifically because before that, I had only Fraps running to check the FPS. But when I noticed that the FPS in Fraps was OK and the game was still stuttering, that's when I pulled out MSI Afterburner.

                    Originally posted by ChaosLegionnaire View Post
                    but other than that, u can check to see if agp acceleration is enabled or if fast writes is enabled or not.
                    Yes, checked and tested with Fast Writes both enabled and disabled. Didn't make a difference.

                    Originally posted by ChaosLegionnaire View Post
                    also, i took a quick peek at the vbios with nibitor and i was sad to see there isnt any voltage table, so no undervolting in the vbios. u can only change the performance levels to 2 and run the clocks and fan speeds with 2d and 3d settings.
                    Yeah, I'm not surprised at all about the lack of a voltage table, since the XFX 6800 XT doesn't have one of those "smart" PWM ICs with I2C bus control. It's got just a plain old synchronous buck PWM IC (UX1), same as the RAM VRM PWM IC (UX2.) So for voltage control, I'd probably need to do a hardware mod and play with the feedback resistor network (which, in all honesty, I prefer over SW/BIOS flashing, as it's not possible to "brick" the card permanently, unless you really goof up and set a voltage so high that it destroys the GPU itself.)
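                    For anyone curious what "playing with the feedback resistor network" means in numbers: a typical dumb buck controller regulates so that the divided-down output matches an internal reference, i.e. Vout = Vref * (1 + Rtop/Rbottom). All the values below are made-up examples; the actual Vref and divider resistors on this card would have to come from the PWM IC's datasheet and measuring the board:

```python
# Feedback divider math for a generic synchronous buck controller.
# ALL values are hypothetical illustrations, not measured from the card.
def buck_vout(vref, r_top, r_bottom):
    """Vout = Vref * (1 + Rtop/Rbottom) for a standard FB divider."""
    return vref * (1 + r_top / r_bottom)

def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

vref = 0.8                      # assumed internal reference, volts
r_top, r_bot = 1000.0, 2000.0   # assumed divider, ohms -> 1.2 V stock

stock = buck_vout(vref, r_top, r_bot)

# Undervolt example: tack an extra resistor in parallel with the top
# leg to shrink Rtop, which lowers Vout slightly.
modded = buck_vout(vref, parallel(r_top, 10000.0), r_bot)
print(f"stock: {stock:.3f} V, modded: {modded:.3f} V")
```

                    The pencil/potentiometer mods mentioned above do the same thing physically: graphite or a trimmer changes one leg of that divider, shifting Vout a little at a time, which is why it's hard to brick anything as long as you measure before powering the GPU.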

                    Originally posted by ChaosLegionnaire View Post
                    so if u flash the pci-e 6800xt vbios onto a native agp 6800xt, it *might* become inoperable on the next reboot because it has the wrong pci-e vbios on it now. u have to desolder the vbios chip and solder it back on the 6800xt br02 to see if it works and then solder back the native agp 6800xt vbios chip to get the native agp 6800xt card back. quite troublesome and a lot of work and it may not work, resulting in lots of desoldering and resoldering to see if it works or not.
                    Exactly.
                    I don't mind the soldering part, though. In fact, this is why I'd rather go that route, as it seems a lot less likely to end up with a bricked card.

                    Either way, I'm trying to finish so many other old projects right now that I may not get to the BIOS and V-modding on the 6800 XT anytime soon.
