The Radeon 9800 Pro To Radeon 9800XT Mod Guide!

Discussion in 'Reviews & Articles' started by Adrian Wong, Apr 13, 2004.

  1. BridgeBoy

    BridgeBoy Newbie

But you can't find out if you have a 128MB card with an XT PCB until you mod it and see whether temperature monitoring becomes active, right? And after reading your guide, I understood we should get at least about a 7% performance increase, right?
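For reference, that ~7% figure lines up with the clock difference between the two cards. A quick back-of-the-envelope check, assuming the standard retail reference clocks (these numbers are an assumption, not taken from the thread itself):

```python
# Reference clocks (MHz) -- standard retail defaults, stated as an assumption.
pro_core, pro_mem = 380, 340   # Radeon 9800 Pro
xt_core, xt_mem = 412, 365     # Radeon 9800 XT

core_gain = (xt_core - pro_core) / pro_core * 100
mem_gain = (xt_mem - pro_mem) / pro_mem * 100

print(f"Core clock gain:   {core_gain:.1f}%")   # ~8.4%
print(f"Memory clock gain: {mem_gain:.1f}%")    # ~7.4%
```

Since real-world performance rarely scales linearly with clock speed, a roughly 7% overall gain from a 7-8% clock bump is a plausible ballpark.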

    I'm just happy to mod it for the fun of it; I've never modded anything before and I will be building a new A64 dedicated gaming rig (for overclocking) within the near future so I am getting "my feet wet" with this mod.

    And also; I think you may have updated your guide since I last read it. I thought you had recommended this BIOS regardless of the manufacturer of the card:

    GigaCube.9800XT.128MB.Samsung.E-die.bin

But now I see the update from Redman that says to make sure to use the right one for your card's manufacturer, which in my case is ATI, so I should use this one, right?:

    ATI.9800XT.128MB.Samsung.E-die.bin

    Is that correct?


By the way, this is how I'm going to attempt to cool it sufficiently... let's see if I can attach this picture correctly. Hmmm... I guess I need to edit it down to less than 150KB to display a picture? Or is there another way to do it?
     
  2. Chai

    Chai Administrator Staff Member

Nope. You can use a BIOS from any manufacturer. The most important thing is that you must flash the correct BIOS for your onboard RAM: Hynix or Samsung. The manufacturer's ID doesn't matter at all.

    No, as long as it's an XT PCB, 99% of them will have temperature monitoring.
     
  3. BridgeBoy

    BridgeBoy Newbie

    Ok, thanks for that clarification! :)

Regarding the XT PCB thing though, how do I tell if mine has the XT PCB? I didn't see a way to tell from your guide, except for actually modding the card and then seeing if the temp monitoring option appears in the control panel. Is there another way to tell before I flash the BIOS?

Also, how do I insert a picture into my post? Do I have to reduce it to less than 150KB and then attach it? Or is there another way?
     
  4. Chai

    Chai Administrator Staff Member

    You can check the 256MB section for the XT PCB clarification. Page 5 I think.

You can either attach it, but it must be 150KB or below, or you can host it elsewhere and use the {img}http://www.url.com{/img} tag.

Replace {} with [].
     
  5. BridgeBoy

    BridgeBoy Newbie

Ok, well, I'm still not getting how to identify the PCB; sorry for my ignorance on this detail. Unless you are saying the only 128MB versions that have the XT PCB are the ones with that stainless-steel-looking boxy cooler? I guess I don't even know what the PCB is, so I can't figure out what I am looking for in order to identify it.

    Thanks for the clarification on how to get a pic on here... ;)
     
  6. BlessidThug

    BlessidThug Newbie

Hi BridgeBoy, the only sure way is to take the heatsink/fan off the GPU. Then you can wipe it off and look at the core to see if it says R360.

     
  7. BridgeBoy

    BridgeBoy Newbie

Hi, yes, I have done that and I know I have an R360 core... but whether I have an XT PCB or a Pro PCB is the question. What is a PCB anyway? What do the letters stand for; is it an acronym?

    If you read Chai's comments in the last few posts, or read the guide, he says that only some of the 128MB 9800 Pro's have XT PCB's. I know I have the R360 stamped on the core, but I don't know how to verify what kind of PCB I have.
     
  8. BlessidThug

    BlessidThug Newbie

BB, the XT Printed Circuit Board (PCB) has 8 little chips (4 rows of 2) in the top left-hand corner. It has the memory chips (black squares) running along the top and right-hand side of the heatsink. Look back for that picture.

     
  9. BridgeBoy

    BridgeBoy Newbie

Ahhh, ok, then no, I don't have those eight chips; I have the ones with the heatsinks on them. Thanks for the clarification!

But I do have the R360 core. So will I still get anything worth mentioning in performance by flashing to a 9800 XT BIOS? Of course, I am still struggling to get the core to clock above 396MHz using ATI Tool to find the max core. I have two extra 3.5" case fans doubled up in line with each other right in front of the stock cooler and fan, and I also lapped the bearing area of the cooler with 400->600->800->1000->1500->2000 grit sandpaper. It looks pretty mirror-like. The core is a perfect mirror too, by the way, not black like I have seen in some pictures. I then used just a little heatsink compound from Radio Shack.

I wonder if I should just try clocking it to 412 and 365, then run Aquamark03 and the other programs mentioned in the guide and see how it does, and forget about getting ATI Tool to find the core max? After all, ATI Tool keeps resetting the core clock back down when it identifies the tiniest of artifacts that the human eye cannot even detect.

I assume that when the guide says to run those programs, you are to simply look at the screen and see if you notice any artifacts with the naked eye. Is that what I am supposed to do?
     
  10. shank69xo

    shank69xo Newbie

    :lol: :lol:

Yesss. I got my 9800 Pro 256MB last week, and to my luck it's an XT in disguise. I flashed with the XT BIOS and it is running sweetly. My 3DMark03 score jumped from 6300 to 7800. Thanks Chai for this wonderful mod.
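For what it's worth, the jump reported above works out to nearly a 24% gain; a quick check (hypothetical snippet, scores taken from the post):

```python
# 3DMark03 scores reported in the post above.
before, after = 6300, 7800

gain_pct = (after - before) / before * 100
print(f"3DMark03 improvement: {gain_pct:.1f}%")  # ~23.8%
```

That is well beyond the raw clock-speed difference, which suggests the XT BIOS changed more than just the clocks on this particular 256MB card.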
     
  11. BlessidThug

    BlessidThug Newbie

    BB, I think that the first thing that you should do is get a new cooler. At $15, you cannot go wrong with the VGA Silencer v.3. Then you would know that you are not running into an overheating problem. For me, the ATI Tool was too forgiving and took my card way higher than it could safely go. Keep at it, you're doing well.

     
  12. shank69xo

    shank69xo Newbie

UPDATE: Okay, yesterday everything ran great with the 9800XT BIOS. Now today I am getting artifacts, real bad. I have the VGA Silencer mentioned in the above post, and I also have BGA RAM sinks on the card. It runs at a steady 57 degrees when I am doing nothing, and when I am gaming it runs at 77 degrees. The cooler is running on high. The temps have not changed since yesterday, but there are artifacts now. I had to flash back to my original BIOS.

    Now get this: I overclocked the 9800 Pro to run at XT speeds and I get no artifacts, but the card is running at 83 degrees when gaming, and my benchmark score went down 300 points. Anyone have any suggestions as to what I can do? I want to run the XT BIOS on this card. It is, after all, an XT in disguise.
     
  13. Chai

    Chai Administrator Staff Member

Weird. Did you try underclocking after you flashed to the XT BIOS?
     
  14. swatX

    swatX Newbie

I flashed my 16x16MB 9800 Pro to XT without any problem and ran it at Pro speeds because I didn't have good cooling, so it ran great. Then I flashed it back to the original BIOS.

Nice guide, though. I used ATIWinFlash; works great. Thanks.


By the way, how much does a 16x16MB 9800 Pro flashed to XT go for? $280?
     
  15. Chai

    Chai Administrator Staff Member

    What do you mean by that? :think:
     
  16. BridgeBoy

    BridgeBoy Newbie

Ok, here's an update on my experiences. By the way, Chai, the link to this thread from your guide appears not to be working for some reason; you may want to check it out. I also posted this in another forum (Rage3D), so I am just copying and pasting it here. I also have a link to pictures I have hosted of the different stages of the cooling modifications I made; just look for the link at the bottom of this post. Again, I am just copying and pasting here, so it may seem a little out of context, but please feel free to let me know any ideas or suggestions you may have:

    Ok, I have a question concerning my first modification and overclock of a graphics card. I have a 128MB ATI-built Radeon 9800 Pro with the R360 core, Samsung 8E-GC2A memory, on the non-XT PCB. I have flashed to a 128MB XT BIOS with this:

    http://www.bricard.net/files/GigaCu...msung.E-die.bin

I have replaced the stock GPU cooling with a Thermaltake Volcano 10+ copper heatsink, which you can see here:

    http://www.thermaltake.com/coolers/volcano/si/a1671.htm

    EDIT: See the link at the bottom of this post for pictures of my mod.

I have lapped the heatsink using 400->600->800->1000->1500->2000 wet/dry sandpaper with water and dish soap. I modified it to fit the PCB's stock mounting holes and attached it with small bolts and nuts, using nylon washers to keep them away from the PCB.

    I have also mounted heatsinks to all eight memory chips. I used gold heatsink grease/compound on the GPU and memory chips....a very thin layer.

Here's the thing: at the new XT default speeds of 412/365 I can run Aquamark03 and 3DMark03 with no apparent artifacts (I scored about 39,000 and 6,900 respectively, I think). However, I cannot play DOOM 3 or run rthdribl without severe artifacts everywhere. ATI Tool only wants to downclock the core to about 400MHz or so before it stops detecting artifacts. However, it will let the memory go over 380MHz with no artifacts.

When I open the case and touch the copper cooler on the GPU, it is cool or mildly warm at best, even while all the artifacting is happening. Same with the heatsinks on the memory chips. The back of the PCB opposite the GPU is also cool; if the copper heatsink did not have good contact, the GPU would certainly be heating up the PCB on the back side, I would think. So it "seems" to be cool, yet I'm getting artifacts everywhere in rthdribl and DOOM 3.

    Oh yeah, and of course, I do not get temperature monitoring or the overdrive feature because I do not have the XT PCB with the appropriate temperature monitoring components.

Now, I was able to play DOOM 3 last night for about six hours straight when I downclocked to 395 core and left the memory at 365; very stable with no artifacts. So this would suggest that I am NOT getting sufficient cooling at the higher core clock rates, BUT, to the touch, the card seems plenty cool, especially if I leave my case cracked open and feel the heatsinks right when the GPU is being taxed hard by one of these benchmark programs. I also have the copper heatsink lapped pretty well and torqued tightly to the board with bolts (nylon washers too) until I can no longer see any slivers of light between it and the "shim" (the rectangular strip of metal that surrounds the GPU in the middle). So I believe this is evidence that I have good contact with the GPU; wouldn't you think?

Although the stickied guide on this page called "Video Card Cooling for Noobs and Newcomers!" says, "This shim is usually slightly higher than the gpu surface," mine appeared to be just the opposite: the GPU appears to be slightly HIGHER than the shim. That would make sense to me; it would let you tighten the heatsink down until it bears on the shim, ensuring a snug fit against the GPU without letting you crush it. Having the GPU LOWER than the shim makes no sense at all to me, since then you would never get good contact with the GPU. Now, I'm no expert, but it just makes sense that the designers would put the GPU, say, 0.5mm higher than the shim, with the shim there as insurance against crushing the GPU and as a wider foundation to stabilize the heatsink.

Now, like I said, I have never done this before and this is my first time, but I am inclined to think that removing the shim is a bad idea. I think it serves the much-needed purpose of protecting the GPU, AND it would also seem to ensure just the right amount of pressure and contact with the GPU. BUT, I continue to have this artifact problem while I read about dozens and dozens of other people overclocking their cards WAY ABOVE the stock XT speeds of 412/365 without any problems... so I don't know what to think.

So I am here, hat in hand, asking for help from anyone experienced who may know what my problem is. My thought is that removing the shim altogether would be a bad idea and unnecessary. Having a 285-gram copper weight perched above a half-inch-square GPU doesn't sound very safe; it should have the wider foundation of the shim to stabilize it. But that is just what I am thinking, and this is my first time. My only other thought is that maybe I need to shave down the copper heatsink where the rectangular shim makes contact so that it can contact the GPU a little better? But, like I said, if I had to bet, I'd bet that the elevation relationship between the shim and the GPU is already perfectly matched by design, with the GPU ever so slightly higher than the shim. So my "guess" is that shaving or sanding down the heatsink where it contacts the shim will not help anything, and will only make it riskier to apply too much pressure to the GPU.

    And yes, I know that the lightweight stock aluminum cooler did not contact the shim at all, because it had a small raised portion right where the GPU was, which makes the shim meaningless with the stock cooling solution. So my guess is that the shim was designed for a bigger, flat-bottomed heatsink, and the marketed stock cooler came later, after the boards were already designed.

    For the record, here is the rest of my relevant system information:

Dell Dimension 8250; 2.4 GHz P4 w/ 533 MHz FSB, 512 KB cache
    1-Gig PC1066 RDRAM
    128MB ATI Radeon 9800 Pro running at 395/365 flashed with: "GigaCube.9800XT.128MB.Samsung.E-die.bin BIOS"
    SB Audigy 2 (all updated drivers)
    Omega Cats 2.5.67 (4.8's)
    DirectX 9.0c
    Windows XP SP2 (completely updated)

    Please see this link for pictures of my mod throughout the process:

    http://community.webshots.com/scripts/misc.fcgi?action=invitePickup&uri=album/184867254Uabeib

    Just click on "No Thanks, Just Show Me the Photos" when you get there.

    Any replies will be greatly appreciated!!!!
     
    Last edited: Sep 8, 2004
  17. BridgeBoy

    BridgeBoy Newbie

I'm having trouble editing my post above without it inserting crazy characters all through it, but it seems the link for the BIOS I mentioned must have gotten corrupted. Anyway, it is the GigaCube 128MB BIOS for Samsung memory that Chai recommended in his guide. I will try to link it again here:

    http://www.bricard.net/files/GigaCube.9800XT.128MB.Samsung.E-die.bin
     
  18. shank69xo

    shank69xo Newbie

Yes, I did underclock it, kind of. What I did was edit the XT BIOS to run at 400/365. That is the fastest the card would run without artifacts, which I don't understand at all, because it is an XT in disguise. Anyway, the flash went fine, and Windows recognized it as an XT. So I played Doom 3 for 12 hours straight, after doing a messload of benchmarks with 3DMark03. All went well until it locked up in the middle of Doom 3 and I had to reboot. When I did, my PC would not recognize the graphics card anymore. I tried to put the original BIOS back on the card, but flashrom gives me the error "Adapter 0 not found". I tried a few other flash programs too, with the same results. I was lucky enough to still have my receipt, and was able to get a new one.

Now my question is, do you think my edited BIOS killed the card, or was it just a bad card? I would think if it was my edited BIOS, it would have messed up right off the bat, not 12+ hours later. I got a new card today, and I am going to see if this one runs at XT speeds; hopefully it does. But if it doesn't, can I just underclock the XT BIOS using ATI Tool, instead of hacking an XT BIOS to run at the speeds my card can handle?

    Thanks Chai for responding too.
     
  19. Chai

    Chai Administrator Staff Member

To summarize both questions posted above: yes, you have already hit the limit of your card. I face the same problem myself; no matter what cooling I use, my card will not do anything above 400MHz. This is a very normal issue. Maybe that's why these XT cards were rebadged as Pros... Who knows...
     
  20. shank69xo

    shank69xo Newbie

Okay, my dead card could only reach 400 core. I got a replacement today and have been messing with it for about 2 hours now. This new Pro, the exact same model as the one that just died, runs great at XT speeds of 412/365. No artifacts whatsoever in 3DMark, Doom 3, or the ATI Tool artifact scan. And the temp is lower too; after gaming for an hour, it is only 64 degrees. So I am about to flash it with the XT BIOS. I will post results after a while.

I think you were right about my first card reaching its max, and like yours, my max core was 400. I never tried running the memory any higher than 365, though. So I am just wondering: do you think my edited BIOS killed the first card, or was it probably just a bad card?

Well, I'm off to flash this new card with the XT BIOS now. Thanks Chai for the help and input thus far.
     