Tuesday, March 22, 2011

Game Developers Want DirectX Out of the Picture, According to AMD

It comes as no surprise to anybody that console graphics pale in comparison to the graphical possibilities of the PC. Compare a card like the GeForce GTX 580 and its 512 stream processors to the 48 units found in the Xenos GPU of the Xbox 360 and you might as well be comparing VHS to Blu-ray. That doesn't even take into consideration the aging GeForce 7-series GPU inside the PS3.

Given that gap in horsepower, it is surprising that PC games, though they often look much better than their console counterparts, do not crush console graphics into oblivion. Why is this, you may ask? Primarily because the majority of games are designed for consoles first and PCs second. AMD, however, says that this could change if PC game developers were able to program the hardware directly at a low level rather than going through an API like DirectX.

According to Richard Huddy, Worldwide Developer Relations Manager of the GPU Division at AMD, "It's funny. We often have at least ten times as much horsepower as an Xbox 360 or PS3 in a high-end graphics card, yet it's very clear that games don't look ten times as good. To a significant extent, that's because, one way or another, for good reasons and bad - mostly good, DirectX is getting in the way." According to Huddy, one of the most common requests he gets from a game developer is "Make the API go away."

"I certainly hear this in my conversations with games developers," Huddy added. "And I guess it was actually the primary appeal of Larrabee to developers - not the hardware, which was hot and slow and unimpressive, but the software - being able to have total control over the machine, which is what the very best games developers want. By giving you access to the hardware at the very low level, you give games developers a chance to innovate, and that's going to put pressure on Microsoft - no doubt at all. Wrapping it up in a software layer gives you safety and security but unfortunately tends to rob you of quite a lot of the performance, and most importantly it robs you of the opportunity to innovate."

Now you may be thinking to yourself: weren't shaders designed to let developers be more innovative with their graphics? Yes, as a matter of fact, they were. The ability to run small programs directly on the graphics hardware does allow real flexibility, especially once you move past the fixed-function pipeline that DirectX 8's programmable shaders replaced. Yet aside from a few stranger indie games, it is hard to deny that a whole lot of PC games look a whole lot like one another.

Huddy added, "The funny thing about introducing shaders into games in 2002 was that we expected that to create more visual variety in games, but actually people typically used shaders in the most obvious way. That means that they've used shaders to converge visually, and lots of games have the same kind of look and feel to them these days on the PC. If we drop API, then people really can render everything they can imagine, not what they can see - and we'll probably see more visual innovation in that kind of situation."

For the full story, including sections on the overhead of DirectX and the problems posed by multiple GPU architectures, head on over to bit-tech.

Source: bit-tech - Farewell to DirectX?
