Is 1080p at 60 FPS even possible on the Xbox One for higher-end games, or is it just a marketing pipe dream, with 900p the more realistic ceiling?
In a related report by The Inquisitr, the Watch Dogs native resolution was announced the other day, and neither the PS4 nor the Xbox One is capable of 1080p at 60 FPS. But while the PlayStation 4 can handle 900p, the Xbox One native resolution was stuck at an oddball 792p, something we haven't seen in a major title since the Xbox 360 days (I make it sound so long ago, don't I?).
Some gamers were probably pretty stunned by the decision to remove the Kinect. According to Microsoft's marketing head Yusuf Mehdi, the decision was made solely to make the $400 Xbox One bundle possible:
"That said, we've heard from a lot of our Xbox fans who say, 'Hey look, I want an Xbox One, but at $499, I probably have to wait a little while before I can afford to get one.' I do think we're going to get people now who move over, and then buy the Kinect later. So I do think the [price point] broadens the appeal and hopefully brings more people to Xbox One sooner."

Besides the $100 price difference, the lack of Xbox One 1080p performance has been another major negative that has some gamers backing off from buying the system. Even Titanfall came in at 792p, and Head of Xbox Phil Spencer claims that removing the Kinect may let developers squeeze a little more blood from the stone:
"I know that developers want to get every bit of functionality out of the box that they can. In conversations I'm having with our partners, that's something that's come up. We need to land the right plan there so that we've ticked off all the boxes to make sure we understand all of the long-term ramifications, but you bring up an idea and a workstream that we're focused on that makes a ton of sense. It's just about when we're ready to make those kinds of calls."

Back in January, it was claimed that an Xbox One patch could potentially improve GPU performance, since about 10 percent of the graphics resources are reportedly reserved for the Kinect and for app functionality. There's also hope that an Xbox One DirectX 12 update may increase efficiency to the point that the console could catch up to the PS4.
Unfortunately, DirectX 12 software optimizations are unlikely to help the Xbox One win a graphics comparison with the PlayStation 4. The raw hardware numbers don't lie: the gap in maximum theoretical performance between the two GPUs amounts to roughly the output of almost two Wii U GPUs. We're talking about a raw performance difference of roughly 38 percent, although that ignores the Xbox One's high-speed embedded memory and other factors.
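The rough math behind that percentage can be sketched from the widely cited theoretical peak compute figures for each GPU. These numbers come from public spec sheets, not benchmarks, and real-world performance depends on much more than peak FLOPS:

```python
# Widely cited theoretical peak GPU compute (assumed figures, in TFLOPS):
PS4_TFLOPS = 1.84       # PlayStation 4 GPU
XBOX_ONE_TFLOPS = 1.31  # Xbox One GPU

# PS4's advantage, measured against the Xbox One's baseline:
advantage = (PS4_TFLOPS - XBOX_ONE_TFLOPS) / XBOX_ONE_TFLOPS
print(f"PS4 raw GPU advantage: {advantage:.0%}")  # → roughly 40%
```

Measured against the PS4's figure instead, the gap works out to about 29 percent, so figures like the 38 percent above vary depending on the baseline and exact specs used.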
Still, what about the 900p vs. 1080p performance difference? Let's assume developers can squeeze out another 15 percent and that a hypothetical game is currently limited to 900p. Counting pixels, we're comparing 1600x900 (1,440,000 pixels) to 1920x1080 (2,073,600 pixels), which means 1080p renders 44 percent more pixels. A 15 percent gain doesn't come close to covering that, so the only way to bridge the remaining gap is to reduce detail, which is what happened with Watch Dogs, or settle for framerates under 30 FPS, which is how the Tomb Raider Xbox One port was handled. Of course, I'm ignoring how developers will learn to use the new platforms more efficiently over time, but I'm not expecting miracles.
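The pixel counting above is easy to verify. This short sketch reproduces the 44 percent figure and shows why a hypothetical 15 percent performance gain falls short (the 15 percent is the article's assumed number, not a measured one):

```python
# Pixel counts behind the 900p-vs-1080p comparison.
pixels_900p = 1600 * 900     # 1,440,000 pixels
pixels_1080p = 1920 * 1080   # 2,073,600 pixels

extra = pixels_1080p / pixels_900p - 1
print(f"1080p renders {extra:.0%} more pixels than 900p")  # → 44%

# A hypothetical 15% performance gain doesn't cover a 44% pixel increase:
hypothetical_gain = 0.15
print(hypothetical_gain < extra)  # → True
```

In other words, even granting the optimistic assumption, the extra headroom covers only about a third of the additional pixels that 1080p demands.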