AMD's GPU market share continues to decrease, but recent benchmarks suggest that its cards benefit greatly from Windows 10's DirectX 12.
DirectX 12 is meant to provide low-level access to hardware and essentially bypass the driver optimization process that has been one of NVIDIA's key strengths over the past few years. AMD's hardware, going by metrics such as shader and transistor counts, has typically been more powerful than NVIDIA's. At launch, however, AMD's drivers are less optimized, which means Radeon cards often lose out to less powerful GeForce cards during the crucial first rounds of reviews. Over time, driver updates let Radeon cards approach the performance their hardware is actually capable of.
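To make the "low-level access" point concrete, here is a minimal C++ sketch of the Direct3D 12 model (my own illustration, not anything from a benchmark or a shipping game): the application itself creates the device, records a command list, and submits it to a queue explicitly, taking on scheduling work that a DX11-era driver would have handled behind the scenes, and that NVIDIA's driver teams have been so good at tuning per game.

```cpp
// Minimal, illustrative D3D12 setup (Windows SDK required).
// Real programs need swap chains, fences, and error handling.
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // With D3D12 the application, not the driver, owns work submission.
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    // A command queue: where recorded GPU work is submitted explicitly.
    D3D12_COMMAND_QUEUE_DESC queueDesc = {};
    queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

    // The app allocates and records its own command lists -- scheduling
    // that a DX11-era driver would have done on the game's behalf.
    ComPtr<ID3D12CommandAllocator> allocator;
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                   IID_PPV_ARGS(&allocator));
    ComPtr<ID3D12GraphicsCommandList> list;
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                              allocator.Get(), nullptr, IID_PPV_ARGS(&list));

    list->Close();                         // finish recording (empty here)
    ID3D12CommandList* lists[] = { list.Get() };
    queue->ExecuteCommandLists(1, lists);  // explicit submission to the GPU
    return 0;
}
```

Under DX12, how well that explicit submission is written is largely in the game developer's hands, which is exactly why a strong driver team matters less than it used to.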
Case in point: the R9 290 series, which traded blows with the GTX 780 at launch, is, based on early DX12 game benchmarks, competing well with the 980 Ti, the fastest single-GPU card available*. NVIDIA, however, need only adjust the pricing of its SKUs to reflect a smaller premium at similar performance levels to compete. After all, most games available are still DX9/10/11, where the best performance, though not necessarily the best performance for the price, still comes from NVIDIA.
What follows is a post from my old site, written in 2014.
Yesterday I was chatting with a fellow geek in Survivor City 1 and he mentioned that PC gamers using AMD cards ought to be worried on account of an NVIDIA development called GameWorks.
From what I understand, NVIDIA’s GameWorks is basically a contract that lets game developers get more performance from NVIDIA cards through special software and collaboration with NVIDIA programmers. The alleged downside is that any optimizations developed in the program may not be shared with NVIDIA’s rival, AMD.
There are many ways to interpret this. Forbes blogger Jason Evangelho decided to write about NVIDIA’s program in a clickbait article titled “Why ‘Watch Dogs’ Is Bad News For AMD Users — And Potentially The Entire PC Gaming Ecosystem”. 2
He mainly cites Robert Hallock from AMD marketing 3 and an article by Joel Hruska from the website ExtremeTech. 4 The gist is that NVIDIA’s position as market leader allows it to force developers to optimize for NVIDIA cards while legally barring those optimizations from reaching AMD cards.
The result? AMD cards end up looking unnecessarily worse in game reviews, which means AMD’s market share and profitability deteriorate, perhaps to the point of bankruptcy. The PC gaming landscape would then be ruled by NVIDIA, and consumers would be living under a regime of worse price/performance products.
It’s a compelling strategy at first glance. In fact, it’s so compelling that just about every company that’s been in a position of market leadership has tried it. In the long term, however, it usually backfires, because executives forget the key to business success: delivering the most value to the customer.
Big Blue: A short case study
IBM introduced its personal computer in 1981, and by 1983 it had beaten out Apple, Commodore, and a whole host of other companies to become the market leader. 5 The IBM PC not only popularized the term “PC” but was responsible for the success of companies like Intel and Microsoft. As the company that literally created the PC ecosystem, you’d think IBM would still be calling the shots today. Yet IBM left the PC market in 2004, and its personal computing division had been dead for much longer.
Some commentators think that IBM lost because the IBM PC used non-proprietary hardware from Intel and non-proprietary software from Microsoft. 6
I’d argue that it was precisely IBM’s use of non-proprietary hardware and software that gave its PC, and eventually PC-clone makers like Dell and HP, widespread market share. Developing your own OS and processor is extraordinarily expensive. The IBM PC was slower than its chief competitor, the Apple II, but it was cheaper; in today’s dollars, it was $800 cheaper. It was also open, which meant that anyone could develop hardware and software for it without licensing or patent worries. After a short while, companies like Compaq figured out ways to provide IBM PC-compatible computers for even less. IBM’s market share plummeted. IBM’s response? Go proprietary.
In 1987, IBM introduced the PS/2, which would, if all went according to plan, use IBM’s OS/2 and AIX operating systems along with its proprietary Micro Channel (MCA) bus. The mass of companies that had crystallized around developing hardware for the IBM PC’s open ISA bus would now be forced to pay license fees for MCA. In IBM’s mind, if people didn’t buy IBM PCs, at least IBM would get a cut through MCA.
In response, IBM’s competitors created their own alternatives without license fees. Those buses, EISA and VESA Local Bus, not only enjoyed far greater success; their modern-day descendant, PCI, enjoys unquestioned supremacy today. EISA and VESA Local Bus weren’t necessarily better than MCA, but they were good enough and, most importantly, cheaper.
It’s the same dynamic that led to VHS’s triumph over Sony’s proprietary Betamax format and USB’s dominance over Apple’s proprietary FireWire. 7
NVIDIA seems determined to repeat IBM’s misstep here, though not with GameWorks, but with G-Sync. G-Sync is a serious advance in graphics technology that essentially offers lag-free V-Sync. It’s proprietary and will be available through licensing fees. AMD’s response, FreeSync, requires no licensing fees and will be part of upcoming monitor display standards. Is there any question which implementation will prevail? 8
GameWorks: Glide Strikes Back
As for GameWorks 9, the tech industry provides a particularly trenchant cautionary tale in the form of Glide.
3dfx, the company that developed Glide, almost single-handedly created the whole market for gaming cards with the introduction of the Voodoo card. At the time, it was a revelation. 3D games were fairly new, and graphics card companies tended to focus on 2D performance with 3D as an afterthought. 3dfx focused almost entirely on 3D performance, and the Voodoo humiliated its competitors. 10 Part of the reason the Voodoo was so much faster was Glide. If you were a game developer, you could extract the most performance from the Voodoo by using Glide, a set of proprietary software routines designed for 3dfx’s graphics cards. Sound familiar?
Within a decade, 3dfx and Glide were extinct.
Glide and GameWorks offer(ed) real advantages. Games could work with the video card directly and harness its full capability rather than working through an interpreter. It’s the same reason that consoles are able to outperform PCs with equivalent hardware. So why did this approach live on in consoles but end up interred with 3dfx’s bones on the PC?
The refrain should be familiar by now. What happened was that 3dfx’s competitors rallied around Microsoft’s DirectX approach. DirectX is a set of standards that allows games to work closely with hardware, created in collaboration with video card companies and game developers. With DirectX, game developers have an idea of what sort of graphics technologies most PC gamers are able to use, and video card manufacturers know which technologies to optimize their hardware for, and perhaps which technologies to advocate for inclusion in the next DirectX version. And because DirectX, unlike Glide, wasn’t tied to any particular set of graphics hardware, it would allow for similar performance at lower cost.
However, unlike DirectX, Glide was available immediately. No need to wait for the standard to be finalized and for video card manufacturers to go through the process of designing and producing cards adhering to the standard. Glide pretty much guaranteed 3dfx the gaming crowd’s dollars until DirectX cards started shipping: a textbook first-mover advantage.
The arrival of DirectX wasn’t an unmitigated good for the consumer, though. One downside to the DirectX approach is that video card makers are less able to create competitive advantage through features. 11 For example, there’s no point in creating a powerful tessellation accelerator if DirectX doesn’t support tessellation. Instead, video card companies compete on increased performance in standard DirectX technologies, say Transform and Lighting (T&L) performance 12, which directly translates to better gameplay since PC gaming is almost exclusively a Windows affair and Windows means DirectX.
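As a small illustration of how this looks from a game developer’s seat, here is a hedged C++ sketch (mine, not the original post’s): the game asks Direct3D 11 for feature level 11_0, and if the installed card grants it, hardware tessellation (hull and domain shaders) is guaranteed, whether the card comes from NVIDIA or AMD, with no Glide-style vendor API involved.

```cpp
// Illustrative only: probe whether the installed GPU exposes D3D feature
// level 11_0, which guarantees hardware tessellation to the game.
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

int main() {
    const D3D_FEATURE_LEVEL wanted[] = { D3D_FEATURE_LEVEL_11_0 };
    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_9_1;
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;

    // Succeeds only if the hardware (and its driver) expose level 11_0.
    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr,
                                   0, wanted, 1, D3D11_SDK_VERSION,
                                   &device, &got, &context);
    if (SUCCEEDED(hr) && got == D3D_FEATURE_LEVEL_11_0) {
        // Tessellation (hull/domain shaders) is guaranteed here, regardless
        // of which vendor made the card -- no vendor-specific API needed.
    }
    if (context) context->Release();
    if (device)  device->Release();
    return 0;
}
```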
But if DirectX development ends up lagging (or is nonexistent), then companies might see an opportunity to develop new features with corresponding proprietary APIs. That’s what 3dfx did, and it took the introduction of NVIDIA’s Riva TNT in 1998, two years after 3dfx’s Voodoo 13, for DirectX cards to catch up. 14 After that, Glide was finished.
Does AMD really believe GameWorks will enjoy even a fraction of the success that Glide did?
Firstly, the DirectX response to 3dfx’s Glide was basically a crash program starting from the ground up. NVIDIA, with a 65% share of the discrete GPU market, is trying to develop an API that will compete with the very mature DirectX, which still has near-100% market share in PC gaming. In fact, Microsoft has already introduced DirectX 12, which includes the sort of optimizations that GameWorks and AMD’s counterpart Mantle provide. 15
Secondly, even ignoring DirectX 12, when we compare NVIDIA’s relative strength to 3dfx’s, we can see just how unlikely Evangelho and Hruska’s narrative becomes.
- 3dfx had 80-85% of the market at the Voodoo’s peak 16. NVIDIA’s cards represent 65% of the discrete video card market, with AMD at 35% 17.
- The Voodoo was significantly faster than the ViRGE and Rage offerings from S3 and ATI. It was 300% faster in MechWarrior 2, 400% faster in Shogo, 300% faster in Quake, etc. 18 NVIDIA’s near-top-end card today, the $700 GTX 780 Ti, is less than 25% faster in the GameWorks-optimized title Watch Dogs than AMD’s $600 Radeon 290X, barely more than its roughly 17% price premium. 19
Thirdly, game developers for the Xbox One and PS4, which are essentially AMD computers, won’t use GameWorks at all. The real question is whether they will use AMD’s GameWorks equivalent, Mantle, or Microsoft’s DirectX 12. 20

As for Evangelho and Hruska, they’ve adopted the version of open-source religion that believes that the marketplace is incapable of defending itself from “monopolies”. To be fair, that’s a widespread belief.
Notes:
- This is a location in the game DayZ: Overwatch that contains all kinds of goodies. No one actually survives there very long ↩
- http://www.forbes.com/sites/jasonevangelho/2014/05/26/why-watch-dogs-is-bad-news-for-amd-users-and-potentially-the-entire-pc-gaming-ecosystem/ ↩
- https://twitter.com/Thracks ↩
- http://www.extremetech.com/extreme/173511-nvidias-gameworks-program-usurps-power-from-developers-end-users-and-amd ↩
- Managing Technological Innovation: Competitive Advantage from Change, 310, http://www.amazon.com/gp/search?index=books&linkCode=qs&keywords=9780471225638 ↩
- There’s some truth to that, considering IBM enjoys success in the high-end computing business with its POWER microprocessors and AIX operating system. The PC space is radically different. Home users are willing to tolerate slightly worse performance and reliability if prices are low enough and the experience is substantially the same. Few users buy Xeon processors with ECC RAM, let alone ten-thousand-dollar POWER systems with full RAS capability. The only reason businesses buy POWER, or Intel’s Itanium, is that slightly worse reliability can end up costing millions; POWER/Itanium is a bargain in those situations. For home PC users, particularly gamers, it’s not worth it. Personally, I think ECC and ZFS provide a great deal of reliability in relation to the marginal cost, which is marginal in both the common and the economic sense ↩
- This isn’t to say that open formats always triumph over closed ones, e.g., MP3 vs. Ogg, or Windows vs. Linux ↩
- G-Sync’s cost isn’t only in licensing fees (a G-Sync monitor will cost a bit more than an identical monitor without that feature); it also currently requires an expensive dedicated hardware solution to implement. FreeSync uses technology already widely deployed in laptops. Further, the sorts of systems that will benefit most from G-Sync-type technology, i.e., low-powered machines that cannot deliver high frame rates, are Intel- and AMD-based. I don’t want to be harsh on NVIDIA, because without G-Sync, lagless V-Sync would either never have come to market or might have been introduced much later ↩
- Not related to the overpriced arcade chain ↩
- In particular, the S3 ViRGE, which was sold because computer makers could put a tick next to “3D accelerator card” for very little cost ↩
- Which approach is better? I’d have to agree with the marketplace and go with DirectX ↩
- T&L became part of DirectX 7.0 (or Direct3D as it might have been called back then) ↩
- http://www.maximumpc.com/article/features/voodoo_geforce_awesome_history_3d_graphics ↩
- A review of the TNT from Tom’s Hardware back in 1998! http://www.tomshardware.com/reviews/nvidia,87.html ↩
- http://www.pcworld.com/article/2109596/directx-12-vs-mantle-comparing-pc-gamings-software-supercharged-future.html ↩
- http://www.techspot.com/article/653-history-of-the-gpu-part-2/ ↩
- As of Q1 2014 http://jonpeddie.com/publications/add-in-board-report/ ↩
- A wonderful walk down memory lane at http://vintage3d.org/ ↩
- http://www.forbes.com/sites/jasonevangelho/2014/05/26/watch-dogs-pc-benchmark-results-multiple-amd-and-nvidia-cards-tested/ ↩
- It’ll be DirectX ↩