NVIDIA GeForce To Quadro Soft-Mod Guide

Discussion in 'Reviews & Articles' started by Dashken, May 9, 2008.

  1. generalRage1982hrv

    generalRage1982hrv ARP Reviewer

    Why do Quadro GPUs cost 3x more than regular cards?
    Put it this way:
    the memory on those cards is better (more bandwidth than on regular cards), they carry twice as much DDR, and the hardware is tuned to run much faster than a regular card
    and yet you say
    yeah, only the Ultra can match the 4600's performance, but
    if you are planning to work on, let's say, graphics, would you pay $1,000-$2,000 for a GPU like the 4600? Nah, I don't think so.
    You would rather get a high-end card like the 8800 GTX and convert it to a Quadro for $300-$500, so it does the same thing but faster than a regular card like the GTX.

    Nah, people just weren't thinking about it,
    maybe because of this CPU & GPU race these days;
    most reviews were chasing news about who is making what inside the industry.
    RivaTuner has had that option since the G80 cards were released, but no one was interested in putting it on the web (well, maybe because NVIDIA sues everyone who plays with their hardware and software).
     
  2. BluesmanI

    BluesmanI Newbie

    Well, correct me if I'm wrong, but after reading the technical specs of both the GeForce 8800 GTX and the FX 4600, I came to the conclusion that they are completely the same. So a modded 8800 GTX should normally be as powerful as the FX 4600, not the 8800 Ultra as you mentioned.

    And by the way, their price is not that much higher because of the greater use of memory parts. The price is mostly defined by the professional, certified drivers you get with a Quadro.
     
  3. generalRage1982hrv

    generalRage1982hrv ARP Reviewer

    Uh, so you must be an electronics engineer and a programming guru.
    Well, if you believe that, give us solid proof (like flashing your GPU with a Quadro BIOS).
    Thank you.


    P.S. I was trying to tell you what works and what doesn't; you need to experiment a little if you want to show me something.
    I have tried it, and I have seen both the FX 4600 and the GTX: there is a slight difference on the PCB.
     
    Last edited: Jul 31, 2008
  4. boonelee23

    boonelee23 Newbie

    Missing NVStrap tab on GeForce 8800 GT

    Hi,
    Just bought my Gigabyte GeForce 8800 GT 512MB, which uses the G92 chip. Unfortunately, I am unable to see the NVStrap tab in my RivaTuner. Can anyone help me, please?
    Thanks

    BooneLee
     


  5. generalRage1982hrv

    generalRage1982hrv ARP Reviewer

    Hi boonelee,
    when the drivers are the issue, RivaTuner won't work correctly.
    As far as I can see, you are using beta drivers;
    reinstall with an older driver series, but a WHQL one.
     
    Last edited: Aug 5, 2008
  6. Chai

    Chai Administrator Staff Member

    I don't think you can see the NVStrap tab in Vista, right?
     
  7. dude2

    dude2 Newbie

    Can you still use a G80 8800GTS to emulate a FX 4600?

    It is quite a learning experience reading "NVIDIA GeForce To Quadro Soft-Mod Guide Rev. 3.0". I bet it opens a door for a lot of beginners who would like to try something but dared not to in the past.

    I had a problem at the early phase though:
    The latest NVIDIA Quadro FX 4600 driver pack that I downloaded has no nvdisp.inf file; it only includes an nv4_disp.inf file, and in that file none of the G80 8800-series cards are listed. Does this mean THE END OF QUADRO EMULATION?
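For reference, a driver will only offer itself for a card whose PCI device ID appears in the INF. The fragment below shows the usual shape of such entries; the section and install names here are illustrative, not copied from a real driver pack, and only the device-ID/name pattern matters:

```ini
; Hypothetical nv4_disp.inf fragment -- section and install names are illustrative
[NVIDIA.Mfg]
%NVIDIA_G80.DEV_019E.1% = nv4_WSApps, PCI\VEN_10DE&DEV_019E   ; Quadro FX 4600

[Strings]
NVIDIA_G80.DEV_019E.1 = "NVIDIA Quadro FX 4600"
```

If the GeForce device IDs are absent from both sections, that driver pack will simply never match those cards at install time, which is what the poster observed.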

    Another question:
    What is not clear to me is how to identify the COMPARABLE GeForce card for each Quadro counterpart. In the example, multiple G80 GeForce cards such as the 8800 GTX, 8800 GTS, and 8800 Ultra are listed alongside the G80 Quadro FX 5600 and FX 4600. How do you know which GeForce card can emulate a specific Quadro FX card without causing any hardware problems? For example, can you use a G80 GeForce card such as the 8800 GTS 320MB to emulate an FX 4600 when the two cards have different memory bus widths, 320-bit versus 384-bit?

    Also, even if an 8800 GTS can emulate a Quadro FX 4600 without causing any hardware trouble, it can only improve performance in a D3D application like 3ds Max, not in an OpenGL application like Maya, right?

    Hope these questions are not too far off...:roll:
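As a side note, the raw memory bandwidth behind the 320-bit versus 384-bit question can be worked out from the bus width and the effective memory clock. A minimal Python sketch, using the commonly published reference clocks for these cards (treat the exact figures as approximate):

```python
def mem_bandwidth_gbs(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

# Approximate reference specs: (bus width in bits, effective GDDR3 clock in MHz)
cards = {
    "GeForce 8800 GTS 320MB": (320, 1600),
    "GeForce 8800 GTX":       (384, 1800),
    "Quadro FX 4600":         (384, 1400),
}

for name, (bus, clk) in cards.items():
    print(f"{name}: {mem_bandwidth_gbs(bus, clk):.1f} GB/s")
```

On these reference clocks the 320-bit GTS actually lands close to the 384-bit FX 4600 in raw bandwidth, since the Quadro runs its memory slower; the bus width alone does not decide the match.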
     
  8. generalRage1982hrv

    generalRage1982hrv ARP Reviewer

    Hi dude2,
    which driver do you want to use?
    Here is my list of modded drivers;
    they work with any card:
    http://forums.techarp.com/graphics-...e-drivers-all-gpus-including-mobile-ones.html

    The FX 4600 is the slowest of the series, so you can compare it with the GTS cards.
    There are differences between all the cards, but this is the closest match you can get for your card.
    It will boost your card by a few percent in Maya, but as you know, OpenGL-based programs generally run best on AMD/ATI cards, mostly because they are better at OpenGL than NVIDIA.
    It all comes down to driver programming: NVIDIA doesn't want to support OpenGL properly, mostly because their programming sucks in that area.
    I don't want to play with driver cracking, so we don't get any legal threats from NVIDIA.
    If you want drivers like that, check guru3d.com; they post all that kind of stuff.
    One more thing: I think NVIDIA should release a driver today that supports PhysX on G80 cores.
    Well, it would be fun to edit that one. LOL
     
    Last edited: Aug 5, 2008
  9. dude2

    dude2 Newbie

    How to pick the right Geforce card and perf boost issues

    Any card that is not as expensive but can be dual-used for gaming and for preliminary drawings and animations. I just did not see the 8800s, or GeForce cards in general, listed in the official Quadro FX 4600 169.96 driver pack's nv4_disp.inf file, and that alarmed me: is this the end of FX 4600 emulation?
    What is the difference between the 177.xx series drivers posted for download at your link and the official NVIDIA 169.96_quadro_winxp2k WHQL driver? :confused:

    Can you use any G80 GeForce card, such as the 8800 GTS 320MB, to emulate an FX 4600 when the two cards have different memory bus widths, 320-bit versus 384-bit? Or can only the 384-bit 8800 Ultra or GTX emulate the Quadro FX 4600?!
    How close is close enough to avoid hardware problems? :roll:

    I guess Quadro-specific hardware features such as OpenGL logic operations may not be carried over to the GeForce cards, which makes the emulation a tough task. Isn't it?
    However, I am impressed by the performance benchmark of a soft-modded 8800 GTS 320MB in 3ds Max in your latest report. Does the 3ds Max performance boost come from the Quadro MAXtreme driver, or would the boost be as impressive with Discreet's OpenGL driver? Would there really be any performance boost in Maya from this soft-mod?

    BTW, do you think the ATI HD 3850 or HD 3870 would be a good dual-use card? How well do they match up to a FireGL? :rolleyes:
     
  10. generalRage1982hrv

    generalRage1982hrv ARP Reviewer

    You can emulate any G80-series card as a Quadro.

    The difference between those two drivers is not that small;
    it all comes down to driver optimizations, right?
    Test them both to see which one is better for you.

    I have almost never played with ATI cards; I mostly did hard mods,
    but never software ones.
     
  11. Hi

    I just acquired this GeForce NVIDIA_G92.DEV_0611.1 = "NVIDIA GeForce 8800 GT", but I can't soft-mod it into an NVIDIA_G92.DEV_061A.1 = "NVIDIA Quadro FX 3700" because the low-level system tweaks window shows only the Fan tab. Am I doing something wrong?
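In essence, the NVStrap soft-mod works by changing the PCI device ID the driver sees, so that the Quadro entry in the INF matches instead of the GeForce one. A small illustrative sketch of that ID swap: the 0611 → 061A pair comes from the INF strings quoted above, while the two G80 pairs are the commonly listed IDs and should be verified against your own nv4_disp.inf:

```python
# Sketch of the ID swap NVStrap performs: the driver identifies the card by its
# PCI device ID, so presenting a Quadro ID makes the Quadro INF entry match.
# The 0611 -> 061A pair is quoted from the INF strings above; the G80 pairs are
# commonly published IDs and should be checked against your own nv4_disp.inf.
GEFORCE_TO_QUADRO = {
    "0611": "061A",  # GeForce 8800 GT (G92)   -> Quadro FX 3700
    "0191": "019E",  # GeForce 8800 GTX (G80)  -> Quadro FX 4600
    "0194": "019D",  # GeForce 8800 Ultra (G80)-> Quadro FX 5600
}

def quadro_target(geforce_dev_id):
    """Return the Quadro device ID NVStrap would report for this GeForce ID."""
    dev_id = geforce_dev_id.upper()
    if dev_id not in GEFORCE_TO_QUADRO:
        raise ValueError(f"no known Quadro counterpart for device ID {dev_id}")
    return GEFORCE_TO_QUADRO[dev_id]

print(quadro_target("0611"))  # 061A
```

If the NVStrap tab is missing entirely (as in this post), the ID never gets overridden in the first place, so the Quadro driver has nothing to match against.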
     
  12. dude2

    dude2 Newbie

    A follow-up message posted on Autodesk

    Beyond emulating and applying Quadro drivers on GeForce cards, I believe establishing what the hardware limitations of GeForce cards are is of essential importance. Therefore, I posted a message on Autodesk's forum.

    Please refer to :arrow: h_t_t_p://area.autodesk.com/index.php/forums/viewthread/15350/
     
  13. generalRage1982hrv

    generalRage1982hrv ARP Reviewer

    As far as I can see, you think the Quadro driver should boost the GPU to the max? Nah, you are wrong.
    It just boosts 3ds Max and similar programs.
    Games are a different thing; you will lose 5-15% performance there, mainly because a driver is code that tells your GPU what to do, and in this case it is tuned for 3ds Max and similar programs (not games).
    So people, stop asking stupid questions; I already gave the answer.
    If you are trying to show yourself smarter than others, you are doing it in the wrong place.
    Your ignorance is showing that you are ...........

    PEOPLE, THINK ABOUT WHAT THE POSTS ARE FOR.
    FIRST READ THEM ALL AND YOU WILL FIND YOUR ANSWER THERE.
    SO NO MORE ASKING THE SAME QUESTIONS, OK?
     
    Last edited: Aug 8, 2008
  14. dude2

    dude2 Newbie

    I was searching for an answer on the hardware limitations of GeForce cards for the following reasons:

    (1) Some articles revealed the soft-mod limitation of the GeForce 7000, 8000, and 9000 series:

    >>
    GF 6800 was the last one that could be softmodded to Quadro FX 4000. I did it. Sure, it was faster in SPECheatYouTest in OpenGL, but under regular work in Max in D3D there was no speed difference. Not because the softmod failed (since it didn't, and SPECheatYouTest will certify that in its cheating OpenGL methodology), but because there is no real difference in the cards themselves - hence the mod was even possible.
    After it, nvidia locked the softmodding out, but by then, no one should really have cared. D3D had long been a standard 3ds Max path, and it worked fully with all the recent cards - GeForce 256 (the very first GeForce) and up.<<
    extracted from: h_t_t_p://area.autodesk.com/index.php/forums/viewthread/15350/

    (2) For GeForce and Quadro, if the D3D part is not different and the OGL part is hardware-locked, then how many OGL functions are locked out in GeForce?

    Bearing these two reasons in mind, and setting the emulation aside for a while, I started researching how much difference in performance and functionality there is between Quadro and GeForce cards.

    Below is what I have gathered so far:

    [Difference on D3D and OGL between GeForce and Quadro]

    *nVidia's email response -> Users can generally use an 8800 GT for 3ds Max or Maya type drawings or animations, and GeForce cards do support the same D3D and OpenGL functions as the Quadro FX series, but both NVIDIA and the CAD vendors strongly recommend against doing this.

    *Autodesk's 3ds Max/Maya web site -> GeForce users may experience various refresh, display and stability problems and inadequate performance.

    *One experienced user's input -> There is no difference in practice in D3D 3ds Max; OGL has not been fully exercised in practice.


    Need to find:

    1. from NVidia:
    How does the hardware part that NVIDIA locked out affect the functionality of GeForce cards? Could it affect only OGL but not D3D?

    2. from AutoDesk:
    What are those various refresh and display problems? Can Autodesk elaborate on these problems? Are these problems common and reproducible on all GeForce cards? Or, does the “may experience” imply VARIOUS REFRESH AND DISPLAY PROBLEMS MAY NOT SHOW UP ON HIGH END GEFORCE CARDS?

    Anyone on this forum is more than welcome to add his/her findings or correct any errors found before I sum up my questions and send them to NV and AD. :cool:
     
    Last edited: Aug 13, 2008
  15. Adrian Wong

    Adrian Wong Da Boss Staff Member

    As far as I know, the limitation is not in the hardware but the driver itself. The GeForce and Quadro cards are functionally the same AFAIK. The only differences would be the labels and of course, the BIOS (or any other identifier) that NVIDIA uses to discriminate between a true Quadro and a GeForce.
     
  16. dude2

    dude2 Newbie

    Technical Brief "NVIDIA Quadro vs. GeForce GPUs"

    NVidia customer support only told me that...
    >>users can generally use 8800 GT on 3ds Max or Maya type drawings or animations and GeForce cards do support the same D3D and OpenGL functions just like Quadro FX series, but both NVIDIA and the CAD vendors strongly recommend against doing this.<<

    However, recently when I asked NVidia about how updated the Technical Brief "NVIDIA Quadro vs. GeForce GPUs" is, NVidia admitted that Tech Brief is to be updated to reflect the change. Based on that document though, there are some h/w functionality differences.

    May I know where you have learned that they are only different on labels and BIOS?
     
  17. generalRage1982hrv

    generalRage1982hrv ARP Reviewer

    Well, as I see it, you are asking Autodesk and NVIDIA staff about some things.
    As I told you before, the difference is not that big. Why? Mostly because I have had that card in my hands and ran some tests with it.
    It is the same as the GTX. Why the same as the GTX? The PCB and its parts are 99% identical; the only differences are the GDDR3 chips and a few resistors (different resistor values, mostly because of the different GDDR3 chips and voltage regulators).

    Before posting any more questions,
    you first need to tell me who you are and why you are asking all of this b*****.
    If you are trying to make a point or to show us what you are
    aiming at, tell us without any b*****.
     
    Last edited: Aug 14, 2008
  18. dude2

    dude2 Newbie

    know it too well and eager to share?

    If anyone has a GeForce card, has tested it thoroughly, and knows the difference between GeForce and Quadro, you are welcome to share that knowledge and let us know the difference, even if it is only 1 or 2 percent.

    The devil is in the details; that is why I am asking.
     
  19. generalRage1982hrv

    generalRage1982hrv ARP Reviewer

    As I see it, you didn't check anything, and you are just making trouble. LOL
    dude2, first read the guide completely;
    then even your low perception would see that users did test it, LOL,
    and that users posted their results. :haha: :haha:
     
  20. PsYkHoTiK

    PsYkHoTiK Admin nerd

    Hey, let's mind the language there. :mrgreen:

    Keep posts clean... :beer: :arp:
     
