Nvidia says the Intel CPU is "dead"

Discussion in 'Processors, Motherboards & Memory' started by generalRage1982hrv, Apr 28, 2008.

  1. generalRage1982hrv

    generalRage1982hrv ARP Reviewer

    As we all know, Nvidia has been talking about making a CPU for some time now.
    If they do make one, it will hit the market hard,
    because ATI/AMD could be crushed by it and lose the war against Intel and Nvidia.
    The first evidence of Nvidia's work towards CPUs is the G80 - a core that can be reprogrammed and used like a CPU (the Russians proved that when they used 8800 GTX cards in SLI for cracking).
    So it would not be strange to see Nvidia in the CPU world.
    As some of us know, Intel is working on 8-core and 16-core CPUs, and some of them are due to launch at the end of this year.
    Would that be enough if Nvidia comes out with one?
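
    A rough sketch of the kind of GPGPU repurposing being described, assuming nothing about what those 8800 GTX rigs actually ran: every shader thread tests a different candidate key against a target value. toy_hash() and the target are invented placeholders, not any real cracking code.

    [code]
    // Hedged sketch: each GPU thread tries one candidate key.
    // toy_hash() is a made-up stand-in for a real hash function.
    #include <cstdio>
    #include <cuda_runtime.h>

    __device__ unsigned int toy_hash(unsigned int key) {
        key ^= key >> 13;                    // deliberately trivial mixing
        key *= 0x5bd1e995u;
        key ^= key >> 15;
        return key;
    }

    __global__ void search(unsigned int target, unsigned int base, int *found) {
        unsigned int key = base + blockIdx.x * blockDim.x + threadIdx.x;
        if (toy_hash(key) == target)         // ~1M keys tested per launch
            *found = (int)key;               // last writer wins; fine for a sketch
    }

    int main() {
        int h_found = -1, *d_found;
        cudaMalloc(&d_found, sizeof(int));
        cudaMemcpy(d_found, &h_found, sizeof(int), cudaMemcpyHostToDevice);

        unsigned int target = 0x9e3779b9u;   // arbitrary placeholder target
        search<<<4096, 256>>>(target, 0u, d_found);
        cudaMemcpy(&h_found, d_found, sizeof(int), cudaMemcpyDeviceToHost);

        printf("candidate key: %d\n", h_found);   // -1 if nothing matched
        cudaFree(d_found);
        return 0;
    }
    [/code]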
     
  2. The_YongGrand

    The_YongGrand Just Started

    How about the x86 license? They'd need to get that from the Intel guys too... :shock:
     
  3. generalRage1982hrv

    generalRage1982hrv ARP Reviewer

    Well, if they are acting like this, maybe they already have it.
     
  4. Mac Daddy

    Mac Daddy Pickin' Da Gitfiddle

    Yes, I read this before work. Cheers for the link, Bro, and it's pretty scary, isn't it? Intel is strong and I don't think they will topple, but this is going to hurt an already-suffering AMD big time :think:

    NVIDIA is a HUGE company and I don't think they are just blowing hot air on this one ...
     
  5. peaz

    peaz ARP Webmaster Staff Member

    The thing is... looking at how blardy hot and power-hungry the current GPUs are... do you really want to run one as your CPU? It's just crazy. Even IBM's Cell architecture is a sounder technology, and it's highly scalable.
     
  6. Zenphic

    Zenphic Newbie

    Hahah, Nvidia sure is daring, but I doubt they would make huge waves in the CPU world if they ever ventured into it.
     
  7. ZuePhok

    ZuePhok Just Started

    Dude, throw a copy of MS Word on it and the G80 will run like a dog. A CPU is damn good at branch prediction, and current-gen x86 chips are OOO (out-of-order) designs.
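
    A hedged illustration of that point, not a benchmark - everything below is made up. The kernel is full of data-dependent branches, which force a G80-class warp (32 threads sharing one instruction stream) to execute both paths with threads masked off in turn; the host-side loop is a serial dependency chain with nothing for thousands of shader threads to do, while a branch-predicting, out-of-order CPU just churns through it.

    [code]
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void divergent(const int *in, int *out, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;
        // Neighbouring threads pick different paths, so the warp runs
        // both the 'if' and the 'else' body, one after the other.
        if (in[i] % 2 == 0)
            out[i] = in[i] / 2;
        else
            out[i] = 3 * in[i] + 1;
    }

    // The same flavour of work on the CPU: each iteration depends on the
    // previous one, so there is no data parallelism to exploit; an OOO core
    // simply keeps predicting the branch and racing ahead.
    int collatz_steps(int x) {
        int steps = 0;
        while (x != 1) {
            x = (x % 2 == 0) ? x / 2 : 3 * x + 1;
            ++steps;
        }
        return steps;
    }

    int main() {
        const int n = 1024;
        int h_in[n], h_out[n];
        for (int i = 0; i < n; ++i) h_in[i] = i + 1;

        int *d_in, *d_out;
        cudaMalloc(&d_in, n * sizeof(int));
        cudaMalloc(&d_out, n * sizeof(int));
        cudaMemcpy(d_in, h_in, n * sizeof(int), cudaMemcpyHostToDevice);
        divergent<<<(n + 255) / 256, 256>>>(d_in, d_out, n);
        cudaMemcpy(h_out, d_out, n * sizeof(int), cudaMemcpyDeviceToHost);

        printf("GPU step for 27: %d, CPU collatz_steps(27): %d\n",
               h_out[26], collatz_steps(27));
        cudaFree(d_in); cudaFree(d_out);
        return 0;
    }
    [/code]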
     
  8. generalRage1982hrv

    generalRage1982hrv ARP Reviewer

    Take a look at a GPU core.
    It is more complex than a CPU core, and it has much higher bandwidth (flow of data) and much higher throughput, because the shader processors don't each work separately the way CPU cores do.
    They work linked together, and that gives more raw power and number-crunching than CPU cores.
    It would not be hard to use the old G80 core as an extension to the CPU, and that would add a lot more number-crunching... simply more power.
    So if they use that technology in their cores, it will kill everything in its way.
    OOO?????
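
    For what it's worth, something close to the "G80 as an extension to the CPU" idea already exists in CUDA. A minimal sketch, with arbitrary sizes and values: the CPU stays in charge of the program and hands one big, uniform chunk of arithmetic to the GPU, whose shader processors all apply the same operation to different elements.

    [code]
    #include <cstdio>
    #include <vector>
    #include <cuda_runtime.h>

    // y[i] = a * x[i] + y[i] -- one thread per element, all in lockstep.
    __global__ void saxpy(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;                       // ~1M elements
        std::vector<float> x(n, 1.0f), y(n, 2.0f);

        float *d_x, *d_y;
        cudaMalloc(&d_x, n * sizeof(float));
        cudaMalloc(&d_y, n * sizeof(float));
        cudaMemcpy(d_x, x.data(), n * sizeof(float), cudaMemcpyHostToDevice);
        cudaMemcpy(d_y, y.data(), n * sizeof(float), cudaMemcpyHostToDevice);

        // CPU keeps the control flow; GPU does the bulk math.
        saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, d_x, d_y);

        cudaMemcpy(y.data(), d_y, n * sizeof(float), cudaMemcpyDeviceToHost);
        printf("y[0] = %.1f (expected 5.0)\n", y[0]);

        cudaFree(d_x); cudaFree(d_y);
        return 0;
    }
    [/code]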
     
  9. The_YongGrand

    The_YongGrand Just Started

    Oh goodness, looks like we can't simply look down on nVidia's offering this time. They seem to be getting powerful and could threaten Intel's position as well.

    On the other hand, I love the idea of budget systems - they usually appeal to folks who use the computer as a web-surfing, music-jukebox and movie-theatre machine. :mrgreen:

    But the strange thing is that they are going with VIA processors? They might not be fast, but they might run at ultra-low voltages, or have very low heat output (needing only a small heatsink). :mrgreen:
     
  10. Adrian Wong

    Adrian Wong Da Boss Staff Member

    IMHO, the industry may eventually move to ray tracing. Our April Fool joke (Tech ARP - Ray Tracing To Debut In DirectX 11) is based on ongoing work at Intel on that aspect. In fact, I foresee them pushing for a shift to ray-tracing in the future.

    That would really ruin NVIDIA's plans... although I did speak to an NVIDIA guy who said that NVIDIA is also "looking" into ray-tracing... :think:

    If that's true... then future GPUs will be a lot more like CPUs than current GPUs are.
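
    A hedged sketch of why ray tracing pushes in that direction: after the first bounce, neighbouring rays head off in unrelated directions, so each ray follows its own chain of data-dependent branches. The one-sphere scene, the fake reflection and the four-bounce cap below are all made up for illustration, and as written the trace runs on the host.

    [code]
    #include <cstdio>
    #include <cmath>

    struct Vec { float x, y, z; };

    __host__ __device__ Vec sub(Vec a, Vec b) { return {a.x-b.x, a.y-b.y, a.z-b.z}; }
    __host__ __device__ float dot(Vec a, Vec b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    // Ray vs. sphere at 'centre' with radius 'r': hit distance, or -1 on a miss.
    __host__ __device__ float hit_sphere(Vec origin, Vec dir, Vec centre, float r) {
        Vec oc = sub(origin, centre);
        float b = 2.0f * dot(oc, dir);           // dir assumed normalised
        float c = dot(oc, oc) - r * r;
        float disc = b * b - 4.0f * c;
        return disc < 0.0f ? -1.0f : (-b - sqrtf(disc)) * 0.5f;
    }

    // Follow one ray through up to 4 bounces. Every iteration branches on the
    // result of the previous one - the control flow a branch-predicting CPU
    // handles well and a lockstep shader array struggles with once rays diverge.
    __host__ __device__ int trace_bounces(Vec origin, Vec dir, Vec centre, float r) {
        int bounces = 0;
        for (int depth = 0; depth < 4; ++depth) {
            float t = hit_sphere(origin, dir, centre, r);
            if (t < 0.0f) break;                 // ray escaped the scene
            origin = {origin.x + t*dir.x, origin.y + t*dir.y, origin.z + t*dir.z};
            dir = {-dir.x, -dir.y, -dir.z};      // stand-in for a real reflection
            ++bounces;
        }
        return bounces;
    }

    int main() {
        Vec eye{0, 0, 0}, dir{0, 0, 1}, centre{0, 0, 5};
        printf("bounces: %d\n", trace_bounces(eye, dir, centre, 1.0f));
        return 0;
    }
    [/code]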
     
  11. peaz

    peaz ARP Webmaster Staff Member

    Still, the way a GPU is designed today, it's just not meant to work as well as a general-purpose CPU. A GPU does very good stream processing, pushing through loads of operations of pretty much the same kind.

    That's exactly the opposite of a CPU, which, as Phok noted, needs a lot of branch prediction in order to perform well.
     
  12. Adrian Wong

    Adrian Wong Da Boss Staff Member

    Yup.. which means if the industry switches to ray tracing... NVIDIA will have to change tack real fast.
     
  13. generalRage1982hrv

    generalRage1982hrv ARP Reviewer

    That is true, guys.
    But what if they come to the CPU market with a strong CPU?
    What will happen to AMD????
     
  14. Adrian Wong

    Adrian Wong Da Boss Staff Member

    AMD? They will have to really buck up... or sink. :(
     
  15. generalRage1982hrv

    generalRage1982hrv ARP Reviewer

    It would be bad to see them fall.
    If that happens, we will see a massive rise in the prices of computer parts, and that is wrong.
    Only rich people would be able to buy new stuff.
     
    Last edited: Apr 30, 2008
  16. Zenphic

    Zenphic Newbie

    I agree that GPU cores have a lot of potential, but I doubt that the technology can be easily adapted to a CPU. After all, AMD does own ATI, and if going from GPU to CPU were easy enough, AMD should have made some advances by now.
     
  17. Adrian Wong

    Adrian Wong Da Boss Staff Member

    Yeah, let's just hope they buck up real quick. :mrgreen:
     
  18. Adrian Wong

    Adrian Wong Da Boss Staff Member

    Well, even if the industry does transition to ray tracing, it will take many years. In the meantime, games are still rendered by rasterisation.

    If AMD knew that the future would be ray tracing for sure, I don't think they would have bothered merging with ATI. :mrgreen:
     
  19. generalRage1982hrv

    generalRage1982hrv ARP Reviewer



    Here is some interesting stuff for you guys:
    >>>Nvidia's David Kirk on CUDA, CPUs and GPUs<<< (read the full interview here)
     
    Last edited: May 1, 2008
  20. Mac Daddy

    Mac Daddy Pickin' Da Gitfiddle

    I gave this a quick read and bookmarked it to read more thoroughly later .. THX for the post Bro :D
     
