Posted: 9/2/2015 1:28:03 AM EDT
For the last couple of days the GPU war has gotten really hot over DX12 benchmarks. To make a long story short, AMD is beating NVIDIA in the DX12 and Vulkan APIs. AMD's under-$300 cards (290X) are beating NVIDIA's top of the line, like the 980 Ti ($700) and Titan ($1,000+), in DX12/Vulkan. I don't think there is an easy way out for NVIDIA. NVIDIA's next generation will come out next year, but the Asynchronous Compute/Shaders tech belongs to AMD. AMD's Mantle API work is now integrated into DX12 and Vulkan, which take full advantage of AMD's Asynchronous Compute/Shaders.
Also keep in mind that all the consoles use AMD GPUs with Asynchronous Compute/Shaders. The Xbox One will get DX12, and the major AAA titles will use Asynchronous Compute/Shaders in the coming years. It gives a huge performance boost to older GPU tech. In a nutshell: NVIDIA will beat AMD in DX11; AMD will beat NVIDIA in DX12. You can google more on the subject if you like. https://www.reddit.com/r/pcgaming/comments/3j1916/get_your_popcorn_ready_nv_gpus_do_not_support/ http://www.overclock.net/t/1569897/various-ashes-of-the-singularity-dx12-benchmarks http://wccftech.com/amd-r9-290x-fast-titan-dx12-enabled-3dmark-33-faster-gtx-980/
[#1]
So, any best-bang-for-the-buck AMD cards out there using this tech?
Txl
[#2]
Quoted: So, any best-bang-for-the-buck AMD cards out there using this tech? Txl View Quote
All AMD cards have used this tech since the Xbox One/PS4 generation. All the consoles use AMD GPUs with Asynchronous Compute/Shaders. The PS4 has been using Asynchronous Compute/Shaders in its games, which made the Xbox One the weaker console; Microsoft built DX12 to level the playing field. DX12 and Vulkan have AMD's Mantle work integrated into them to take full advantage of AMD GPUs. The GPU market is going to get bloody in the coming days. Newegg is seeing a high rate of NVIDIA card returns since the story broke. Google to find more on the subject, but keep in mind all sides have a high stake in the next-generation APIs.
[#4]
NVIDIA did a lot of work to make their cards perform well with DX11. They focused heavily on DX11; AMD did not.
The 290X is a competitor to the 780. Also, wccftech is not a reliable source; everything from them should be taken with a huge grain of salt, as much of their information is incorrect.
[#5]
Quoted: (the OP's post, quoted above) View Quote
Meh. I'm sure future driver updates will make up the difference on existing products. I doubt AMD will hold up against the upcoming Pascal GPUs from NVIDIA. And AMD's performance increases under DX12 still don't change the higher power consumption and temperatures of AMD cards versus their NVIDIA counterparts.
[#6]
Quoted: (post #5 above) View Quote
The problem is hardware. It can't be fixed with driver updates. AMD has at least a four-year lead on the technology.
Tech sites aren't sure whether Pascal will have Asynchronous Compute/Shaders. NVIDIA would have to completely change its architecture in Pascal to create its own version of Asynchronous Compute/Shaders. There's no easy way for NVIDIA to get out of this; they invested a lot of money and time in DX11. http://www.overclock.net/t/1569897/various-ashes-of-the-singularity-dx12-benchmarks/1870 They did like I did at first: they looked at the documentation. According to the documentation, Maxwell 2.0 supports Async Compute/Shaders through HyperQ. It is very hard to find documentation on Maxwell 2 because NVIDIA has published no white paper; this is why I used the Kepler papers, and WCCF did the same. The problem is that Beyond3D's test has shown that Maxwell 2 isn't using mixed mode at all. It behaves like Maxwell instead, and Maxwell doesn't support mixed mode. The Oxide developer stated that, as far as he knew, Maxwell couldn't do async. The controversy has grown, rather than shrunk, with the tests Beyond3D are running, because their findings now point toward Maxwell 2 being able to handle compute loads asynchronously (32 compute) but not graphics plus compute (31 compute + 1 graphics). This is what is most interesting thus far; I don't think WCCFtech has grasped just what is happening right now. As for a software scheduler, I also mentioned this in relation to Kepler. One of the big changes between Fermi and Kepler was the removal of the hardware scheduler. This is why Kepler, Maxwell, and Maxwell 2 use less power (not because they're designed more efficiently per se). By placing the scheduler in software, you can fine-tune it through the driver. This is why NVIDIA has far more leg room under DX11 to fine-tune the driver and derive a boost in performance. GCN, on the other hand, relies on a hardware scheduler.
A hardware scheduler is better for DX12, because the API is closer to the metal, leaving less room for shader replacements and other forms of driver intervention. What we have with Kepler/Maxwell/Maxwell 2 are cards fine-tuned for DX11. What we have with GCN are cards fine-tuned for Vulkan, Mantle, and DX12. Now, while NVIDIA's Maxwell/Maxwell 2 cards can support more DX12 features (the same was true of the GeForce 6800 Ultra), that doesn't necessarily translate into better performance. The NVIDIA architectures lack the compute parallelism now unlocked by the new APIs. For the first DX12 titles this might not be too much of a problem, but Pascal would need to be a completely revamped architecture on this front. We saw NVIDIA take the first steps in that direction with Maxwell/Maxwell 2. AMD, on the other hand, is set to strike with an architecture that will further extend its lead in this area. We can't speculate as to whether Greenland or Pascal will be better, but we can note that AMD needs far fewer architectural changes to derive strong Vulkan and DX12 performance, while NVIDIA needs a huge overhaul of its architecture to achieve the same result.
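The mixed-mode distinction above can be made concrete with a toy timing model (purely illustrative, not vendor code; all numbers are hypothetical): if a GPU can only run its graphics and compute workloads back to back, a frame costs the sum of the two, while an idealized hardware scheduler that overlaps them pays only for the longer one.

```python
# Toy model of the async compute benefit discussed above.
# All numbers are hypothetical; this is an idealized sketch, not vendor code.

def frame_time_serial(graphics_ms: float, compute_ms: float) -> float:
    """No mixed mode: the compute work waits for graphics to finish."""
    return graphics_ms + compute_ms

def frame_time_async(graphics_ms: float, compute_ms: float) -> float:
    """Idealized hardware scheduler: compute fills idle shader cycles,
    so the frame costs only the longer of the two workloads."""
    return max(graphics_ms, compute_ms)

graphics, compute = 10.0, 4.0  # made-up per-frame costs in milliseconds
print(frame_time_serial(graphics, compute))  # 14.0
print(frame_time_async(graphics, compute))   # 10.0
```

In this toy model the async win is the smaller workload's entire cost; on real hardware the overlap is only partial, but the direction of the effect is the same.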
[#7]
Quoted: (post #6 above) View Quote
Bold text. /thread
And yet here you are, spouting off that current new (and flagship) NVIDIA cards are made obsolete by a hot, power-hungry, last-gen AMD card (which is false). Lol... ok
[#9]
Oxide is a PR mouthpiece for AMD. Remember when Mantle and the Star Swarm benchmark came out? All the AMD fanbois came out and touted it as the end of NVIDIA. Then it was revealed that Oxide had disabled deferred contexts; once that was switched on and NVIDIA had done some driver tweaks, they showed just how far ahead of AMD they were.
Now another round has come about and again the fanbois are getting that tingle up their legs. Once NVIDIA works out its DX12 drivers and more game benchmarks are released, then, and only then, will anybody know for sure.
[#10]
Pascal is coming in the first quarter of 2016. That means within three months or so it goes to the factory for production. Its development dates to before DX12 was announced, and it uses the same basic architecture as previous NVIDIA generations. They aren't going to reinvent the wheel that's been working so well in DX11, so I seriously doubt NVIDIA has its own version of async compute in Pascal.
This is 100% a hardware problem for NVIDIA, compounded by AMD's API work being integrated into DX12 and Vulkan to take full advantage of AMD GPUs and not NVIDIA's. So it's going to stack against NVIDIA hard in DX12. Then there's G-Sync, which NVIDIA just lost to AMD's FreeSync. Sucks for people who bought high-end monitors using G-Sync. A lot of NVIDIA customers are pretty pissed off now; some want to start a class-action suit against the company. https://www.reddit.com/r/pcgaming/comments/3jfgs9/maxwell_does_support_async_compute_but_with_a/ Let me start with this: MAXWELL DOES SUPPORT ASYNC SHADERS/COMPUTE. View Quote But it emulates the feature in software. The driver sends the compute workload to the CPU to process while the GPU is processing graphics (link above). It's a clever trick to claim feature "support," one that breaks down when a game either needs those CPU cycles or has so much async compute that it floods the CPU, causing a massive performance loss. This is why Oxide had to disable async compute in their test; it would have tanked performance even harder. "just some unfortunate complex interaction between software scheduling trying to emulate it which appeared to incur some heavy CPU costs" AMD Simplified: Asynchronous Shaders https://www.youtube.com/watch?v=v3dUhep0rBs
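The Beyond3D-style probe mentioned earlier in the thread can be sketched with another toy model (hypothetical timings, not actual GPU measurements; the queue width and per-kernel cost are made up): submit one graphics task alongside an increasing number of compute kernels and watch how total time grows. If the device really mixes the queues, the graphics work hides inside the compute batches; if not, it gets tacked on serially.

```python
import math

# Toy sketch of an async compute probe (hypothetical timings, not real
# measurements): one graphics task submitted alongside n compute kernels.

def total_time_ms(n_kernels, queue_width=32, kernel_ms=1.0,
                  graphics_ms=1.0, mixed_mode=True):
    # Kernels execute in batches of `queue_width` at a time.
    batches = math.ceil(n_kernels / queue_width)
    if mixed_mode:
        # Graphics overlaps the compute batches (hardware scheduling).
        return max(graphics_ms, batches * kernel_ms)
    # No mixed mode: graphics runs first, then the compute batches.
    return graphics_ms + batches * kernel_ms

# With mixed mode, graphics plus up to 32 kernels costs one "step":
print(total_time_ms(32, mixed_mode=True))   # 1.0
# Without it, the same submission pays the graphics task on top:
print(total_time_ms(32, mixed_mode=False))  # 2.0
```

A curve that is always offset by the graphics cost, as in the serial case, is the signature that graphics + compute submissions are being serialized rather than interleaved.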
[#12]
All this talk about the 980 Ti and I'm still stuck in the last century with 980 SLI.
[#13]
NVIDIA will rework whatever they have to, if it's that big of a deal (it isn't), and AMD will be back where it belongs: at the bottom. AMD cards and CPUs blow.
[#16]
Seems like AMD lucked out: their architecture choices in previous generations are now suddenly good decisions for the next generation of DirectX. DX12 means nothing for current consoles; they can't change much. It also matters little for games that are out now or soon to come out, because they've already made their engine and renderer choices.
[#17]
The Xbox One is supposed to get DirectX 12 support this November, and the PS4 has already had a number of games use async compute. Should be interesting to see how much this helps cross-platform games.
Not surprised that AMD is ahead of nVidia in this area, though; they do introduce things earlier than nVidia does, from the Radeon 9700 Pro supporting tessellation (called TruForm) to having the first unified-shader GPU (Xbox 360), etc. They just do a much worse job of developer relations, and especially PR. From a CPU standpoint they don't compete that well against Intel, especially since Intel can out-manufacture most other fabs. But considering that Intel's primary development center for their Core CPUs is in Haifa, the recent Iran nuclear deal might take care of that for them ;).
[#18]
AMD has a four year tech lead on Nvidia? What dream land are you living in? LOL Might as well have said forty years because it would be just as blatant a lie.
Posted Via AR15.Com Mobile
[#20]
[#22]
Quoted: AMD has a four year tech lead on Nvidia? What dream land are you living in? LOL Might as well have said forty years because it would be just as blatant a lie. View Quote
Lol, this. Also, this happens every time a new generation comes out: AMD/ATI has the best chips for six months or so, then Intel/NVIDIA pass them up for the couple of years until the next change.
[#23]
Quoted: AMD has a four year tech lead on Nvidia? What dream land are you living in? LOL Might as well have said forty years because it would be just as blatant a lie. View Quote
Yup, and you can't even forget the HDMI 2.0 support in the Fury cards... wait, AMD did. 4K cards indeed. Granted, I didn't like NVIDIA's pricing for my setup, but it's not like I have a choice in the matter, with AMD obviously being four years ahead of me. PEAK LEAD TIMES PEOPLE!!1!ONE
[#25]
Dredging this one back up:
http://www.pcper.com/reviews/Graphics-Cards/Fable-Legends-Benchmark-DX12-Performance-Testing-Continues
1080p: 980 Ti wins over the R9 Fury X
4K: 980 Ti wins over the R9 Fury X
HMMMMMM