Saturday, June 16, 2018

welche_xbox_one

Xbox One Edition. Minecraft: Xbox One Edition is a version of Minecraft for the Xbox One, released on September 5, 2014, and developed by 4J Studios in cooperation with Mojang [1]. It costs €19.99 in the Xbox Games Store, or €4.99 as an upgrade from the Xbox 360 Edition [7]. An announcement video was shown during Microsoft's press conference at E3 2013 [8]. This version builds on the Xbox 360 Edition [2]. The Xbox One Edition offers larger worlds, expanded multiplayer features, and improvements made possible by the Xbox One hardware [2]. On September 20, 2017, the Xbox One Edition was merged into the Bedrock Edition and is now simply called "Minecraft". The gameplay resembles that of the Xbox 360 Edition, but adds several further features and bug fixes. The trailer was produced by Hat Films. [10] List of Xbox One S HDR Compatible Games. The Xbox One S supports HDR playback for games specifically designed to take advantage of the feature. This list collects the announced Xbox One games which do or will support native HDR play: Battlefield 1 (to be added) [1], Deus Ex: Mankind Divided, Elder Scrolls Online: Tamriel Unlimited, Final Fantasy XV, Forza Horizon 3, Gears of War 4, Halo Wars 2, Hitman, Injustice 2, Mass Effect Andromeda, NBA 2K17, Pure Chess Ultra, Recore (to be added) [2], Resident Evil 7, Sniper: Ghost Warrior 3, Tom Clancy's Ghost Recon: Wildlands, Warframe, World of Tanks. Note that the Xbox One S also supports 4K playback for compatible Blu-ray discs, as well as video upscaling.
Console Version. Terraria split-screen version for consoles: PS3/PS4/Xbox 360/Xbox One. Terraria is available for download on the PS3 ($15.00 or £11.99 on the PSN store), the PS4 ($20.00 on the PSN store), and the Xbox 360 (£9.99 or $15.00 on the Xbox Marketplace). (It is also available on the Xbox One Store for $20 or your regional equivalent.) The console version is similar to PC version 1.2.4, with a few console-exclusive items. A 1.3 update is coming soon! Terraria can be bought from the PlayStation Store for PS Vita for $15.00/£11.99. It is almost identical to the PS3 and Xbox 360 editions. As the Vita lacks the L2/L3 and R2/R3 buttons, a manual/smart cursor option can be found in the bottom left corner. The Vita version lets you cross-play with PS3 users, although the Remote Play feature is not accessible as of now. Crossplay feature. Crossplay is a feature of the Terraria PS Vita and PS3 editions. It allows users who have bought both the PS Vita and PS3 editions to play in each other's worlds. (For example: PS3 users can play in a PS Vita user's world, and PS Vita users can play in a PS3 user's world.) This feature also merges the trophies a user earns on the PS Vita and PS3 editions into the same trophy list, as long as the user is using the same PSN (PlayStation Network) account on both platforms. 3DS and Wii U. Terraria has recently been released on the 3DS, both on the eShop and at retail. It has exclusive bosses and items, including Lepus. Hardmode can be unlocked, with Ocram and bosses up to Duke Fishron. It also has the Crimson update, but does not have the special crafting stations; instead, Crimson items can be crafted at a normal crafting station. The current version is 1.2 for America and Europe.
The game can be very buggy. Terraria was also released on the Wii U with version 1.2.4.1. New content. The Console Version has several new items and monsters not available in the PC version. They are: Soul of Blight (dropped by Ocram, a material in most console-exclusive items), Spectral Arrow (ammo), Tutorial Music Box (crafted from the five once-console-exclusive music boxes), Vulcan Bolt (ammo), new vanity sets, stronger versions of old monsters, and a new final boss along with its servants. Console Version History. Bugs/Glitches. Go here for a complete thread listing the current bugs and glitches; feel free to contribute if needed, as our knowledge of the console game is still growing. Terraria Xbox 360 Gameplay Trailer - Split Screen Multiplayer, New Final Boss, Pets, Music. Xbox One APU reverse engineered, reveals SRAM as the reason for small GPU. For months, Sony and Microsoft fanboys have lined up to hurl insults at each other over which console would pack more hardware, hit higher performance targets, or prove a better design for the next generation. With the two consoles launched, the game-to-game comparisons have mostly come out a wash, with a slight edge for the PS4. But there have still been questions about the underlying chip design — which architecture is more efficient, and what unique sauce went into each console? The fine folks at Chipworks have completed their teardown of the Xbox One and given us an answer to that question — and a few puzzles to go with it. The Xbox One die is 363 square millimeters, up from the PS4's 348 sq mm. The roughly 4% of additional space, despite the smaller GPU core, is mostly due to RAM. The Xbox One contains a whopping 47MB of on-die RAM, and that pushes the die size up considerably. It's also why Microsoft didn't have room on the APU for a larger GPU. Xbox One APU die shot, by Chipworks. There are some interesting differences to explore. First, consider the Xbox One's Jaguar CPU blocks.
Like the PS4, it has two quad-core modules — but the Xbox One has a bit of circuitry hanging off the CPU that the PS4 lacks. Here's a comparison of the Xbox One and PS4 CPU islands. We had to rotate the blocks to line them up identically, which is why the label is reversed. Xbox One (left) vs. PS4 (right) Jaguar CPU blocks. See the block in red? The PS4 doesn't seem to have an equivalent. What it actually does is unclear. It's a bit large to be the built-in audio or the IOMMU that HSA theoretically requires. There's nothing analogous on any of the Kabini floor plans we've ever seen. (It's also possible that this is a Photoshop artifact or deliberate obfuscation; companies often mask details on die shots.) Now, over to the GPU. Like the Sony PS4, the Xbox One contains more Compute Units than are actually active on the console. The chip has 14 CUs, 12 of which are turned on, while the PS4 has 18 active CUs out of 20 on-die. These are disabled to improve yield. Whether Sony or Microsoft might one day choose to enable the extra CUs in future console revisions is unknown — typically console manufacturers don't update core specs post-launch, but consoles have been trending towards greater upgradeability over the past two generations. It's not impossible that this could change. The other mystery? The Xbox One GPU cores are physically shorter than the PS4's equivalents. I don't mean the GPU block, which is obviously smaller — one GPU Compute Unit on the PS4 diagram is 50 pixels wide and 395 pixels tall. On the Xbox One, each Compute Unit is 42 pixels wide and 347 pixels tall. It looks as though Microsoft may have picked a tighter arrangement for its GPU core, again possibly to save the maximum amount of space and make room for as much SRAM on die as possible. Speaking of SRAM, the arrangement of the Xbox One's was a considerable puzzle when Microsoft unveiled the console architecture.
According to the company, the Xbox One doesn't really have a 32MB contiguous cache, but four 8MB cache blocks instead. There are two blocks of cache to the right of the GPU and a smaller block to the left. This smaller block is possibly used for cross-CPU communication. A die shot of the Xbox One APU, showing AMD's maker tag. It's hard to tell exactly how the Xbox One's 47MB of claimed SRAM fit into the floor plan, however. We know that the CPUs in question contain a total of 512K of L1 and 4MB of L2. If the two blocks to the right are ESRAM, each block should be 16MB, for a total of 32MB of cache there. The GPU should contain 512K to 1.5MB of L2 (512K being standard for a GCN chip of this size, with more L2 if Microsoft chose to boost that capability), and about 224K of L1 in total. That leaves about 10MB of cache unaccounted for. If the SRAM block between the two CPU modules is that large, it's far more dense than the SRAM to the right of the GPU. Chipworks also tore into the Xbox One controller, but it's not that interesting. It has an ultra-low-power Freescale microcontroller and a Cortex-M0+ core. A custom Microsoft WiFi chip handles communication with the mother ship. The chip count here is kept minimal to speed manufacturing and lower cost. A teardown of Kinect should be up and available in the not-too-distant future. The PS4 APU die shot, for comparison with the Xbox One. Different designs lead to similar places. After looking at both the Xbox One and PS4, I think we see two companies arriving at the same point through rather different approaches. Both manufacturers chose the architecture they felt would allow them to work most effectively. Microsoft invested more silicon in large, low-latency caches, while Sony sank more money into raw bandwidth. As far as performance is concerned, this could well end up a tie: the Xbox One should be able to access data more quickly, while the PS4 can stream sustained data far more effectively.
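As a sanity check, the SRAM accounting in the paragraph above can be tallied directly. All figures below are the article's own estimates (the GPU L2 is taken at its 512K lower bound), not independently verified numbers:

```python
# Tally the article's SRAM accounting for the Xbox One APU.
# All figures (in KB) come from the article's own estimates.
KB = 1
MB = 1024 * KB

claimed_total = 47 * MB     # Microsoft's claimed total on-die SRAM
cpu_l1 = 512 * KB           # 512K of L1 across both Jaguar modules
cpu_l2 = 4 * MB             # 4MB of CPU L2
esram = 32 * MB             # two 16MB ESRAM blocks beside the GPU
gpu_l2 = 512 * KB           # lower bound; standard for a GCN chip this size
gpu_l1 = 224 * KB           # ~224K of GPU L1 in total

accounted = cpu_l1 + cpu_l2 + esram + gpu_l2 + gpu_l1
missing = claimed_total - accounted
print(f"accounted for: {accounted / MB:.1f} MB")  # ~37.2 MB
print(f"unaccounted:   {missing / MB:.1f} MB")    # ~9.8 MB
```

The shortfall works out to just under 10MB, which is the "about 10MB of cache missing" figure the article arrives at.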
Since game developers can leverage both of those features, the final result could be a wash. Both companies also picked designs that should be relatively easy to migrate to new process nodes. As 20nm technology comes online, we'll probably see refreshes in 12-18 months. It won't surprise me if the first SSD designs start to pop up then, too — there's too much potential upside in a premium SKU with solid state storage for either company to ignore the possibility. Comments. So the PS4 has 32 ROPs compared to 16 on the Xbox One, and 384 more GPU cores… so performance-wise, how can this be a tie? But Caching™! It's magical! And obviously developers are also magical entities that write their multi-dozen-gigabyte games entirely in assembler in order to properly make use of these extra features. Not to mention you can totally fit an entire frame of 32-bit 1920×1080 pixels uncompressed in that <50MB on-die cache, let alone 60 of them so you have a single second of cached material. There's a reason L1/2/3 cache never goes past 12MB, and it's not because it's hard to make. Microsoft is going to get bitten in the arse with this move. You do not have to write code in assembler to take advantage of cache. Cache controllers do that via a layer of abstraction, so the programmers do not have to worry about it. And one would not use the L1 cache for display frames, because you are not likely to use that data again very soon. Cache is for memory that is expensive to access via the hard drive or RAM datapaths and is used frequently enough to warrant grabbing it and holding onto it. Only time will tell how its performance gain helps against the speedier datapaths in the PlayStation 4, but as of launch, games look pretty damn similar to me (maybe excluding the few with a resolution disparity, but that's a developer issue, as the Xbox and PS4 both support 1080p even though several titles were released for both that underperform in that category).
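The framebuffer arithmetic in the comment above is worth checking, since it partly cuts against the commenter's own point. Assuming 32-bit (4-byte) pixels at 1920×1080:

```python
# Check the commenter's framebuffer claim: does an uncompressed 1080p
# frame fit in the Xbox One's on-die SRAM? Does a full second of frames?
width, height = 1920, 1080
bytes_per_pixel = 4                       # 32 bits per pixel

frame_bytes = width * height * bytes_per_pixel
frame_mb = frame_bytes / (1024 * 1024)
print(f"one frame: {frame_mb:.1f} MB")    # ~7.9 MB

second_mb = frame_mb * 60
print(f"60 frames: {second_mb:.1f} MB")   # ~474.6 MB
```

A single frame (~7.9 MB) fits comfortably in the 32MB ESRAM, but 60 frames come to roughly 475 MB, so "a single second of cached material" would not fit in any on-die memory of this size.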
This whole argument about trying to show a marginal advantage for the PS4 goes contrary to consumer intelligence and behavior. The PS4 is running practically all games at 1080p @ 60 fps (frames per second) while the Xbox ONE is running practically all games at 720p @ 30 fps. The problem with the Xbox ONE is not that it cannot generate games at 1080p @ 60 fps, but that it does so only with games such as Forza 5 that do not require much GPU power, because a lot of the environment is made up of static images. Practically all cross-platform games are running on the Xbox ONE at 720p @ 30 fps. The Xbox ONE's anemic graphics throughput of 1.2 TFLOPS versus the PS4's 1.84 TFLOPS is showing up early in the fact that cross-platform games are overwhelmingly running on the PS4 at a native 1080p resolution at 60 fps. I am not saying that one should not buy the Xbox ONE, but to deceive consumers, as many in the press are doing, into believing that the two systems are closely matched is unconscionable. The bottom line is that most gamers, because of the prevalence of social media, are well informed about the two consoles and are already voting with their dollars. They already know that the technical specs of the two systems are substantially different and that such a difference cannot result in the same outcome. And the reports are indicating that the performance difference between the PS4 and Xbox ONE is vast. Some have argued that up-scaling DVD movies to 1080p is marginally identical to native 1080p, yet consumers in overwhelming numbers are buying Blu-ray movies. First off, I do not wish to get into some idiotic fanboy argument. I will, however, reply to you, since you seem to have taken my comment completely out of context. I was replying to the guy above me, who was making an erroneous computer engineering statement regarding cache.
I was simply explaining that cache is used to speed up computation by reducing the need to go to the memory datapath as often, which can significantly increase the performance of a machine, and that programmers DO NOT have to do anything to take advantage of it, because the hardware controller is in charge of making the cache decisions. All your impressive data about how awesome the PS4 is does nothing to contradict my factual statement about how cache works, and the fact is that only the future will tell if a system with a significantly sized cache can overcome the deficit of the slower main memory they decided to use. Now I will address your computer engineering mistakes. First, the Xbox has a GPU output of 1.32 TFLOPS, and indeed the PS4 has 1.84 TFLOPS. However, adding pipelines and having an impressive computational speed does NOT change speedup over another system LINEARLY. This is the huge misconception going around the fanboy wars these days. The statement goes something like: "Well, the PS4 has 2x this and 2x that, so it's 50% faster than the Xbox", and it is an asinine one to make. Calculating speedup when comparing two machines is incredibly complex and takes a lot of things into account, which is why the ACTUAL speedup is currently unknown, but MANY tech sites, like this one, are calling it close to a wash (based on how current x86 PC machines work). You do not experience linear gains from adding things; the actual gains are MUCH smaller than that. You may well be right that the PS4 will turn out to be a crazily more powerful machine later, but it's just not true now, nor does any of the computer science point to this being the case. So, the bottom line here is that the 'PRESS', for whatever stake you think they have in it, are not deceiving the people, but rather are simply reporting what they know and not speculating wildly as you and many fanboys across the interwebs are prone to do. That, my friend, is deceptive.
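The point about non-linear gains can be made concrete with Amdahl's law: when only part of a workload scales with extra hardware, the overall speedup is always less than the raw resource ratio. The parallel fraction used below is purely an assumed illustration, not a measurement of any real game engine:

```python
# Illustrative only: Amdahl's law shows why "1.5x the shader resources"
# does not translate into a 1.5x frame rate.
def amdahl_speedup(parallel_fraction: float, resource_ratio: float) -> float:
    """Overall speedup when only part of the work scales with extra hardware."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / resource_ratio)

# Assume 70% of a frame's work scales with shader count (an assumption),
# and one GPU has 1.5x the shader resources of the other:
print(round(amdahl_speedup(0.7, 1.5), 2))  # ~1.3x, not 1.5x
```

Only if the work were 100% parallelizable would the speedup equal the resource ratio, which is exactly the commenter's objection to the "2x the hardware, so 2x the speed" line of argument.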
Finally, I accept your point that "the bottom line is that most gamers, because of the prevalence of social media, are well informed about the two consoles and are already voting with their dollars", but I should point out that it seems to go contrary to your opinion that the PS4 is significantly better. It seems that if people are indeed 'well informed' and 'voting with their dollars', and if we are to take this as a metric of how much better one machine is than another, then the two systems are indeed very evenly matched, as they have nearly identical launch sales. Just wondering, from a sales perspective: didn't the X1 launch in 13 countries whereas the PS4 only launched in North America? The sales may have been nearly identical, but to me it seems like the X1 only balanced out because of the sheer number of countries it released in. It doesn't really matter, because his idea that consumers voting with their dollars shows which is better is a wash, and anyone who has any idea about consumerism knows better than that. Last gen we had the PS3/360/Wii; well, the Wii outsold both the PS3 and 360. Was it really the more powerful, better gaming console? No, it wasn't, when you look at it from the graphical-power side, but it still sold more. Why? (Money.) Because, historically and on average, consumers tend to lean towards value and cost-effective products that save money and can perform the same functions as the expensive brands. His idea that the PS4 runs all or most games at 1080p 60fps and the XB1 runs all at 720p 30fps is just flat-out untrue. He is clearly fabricating facts to bolster his Sony-biased opinions, thereby losing the credibility to publicly state any real knowledgeable opinions or facts. Money isn't the focus, it's value. There was some value in casual-friendly Nintendo gaming when pretty much everyone had skipped the GameCube generation, was in the mood for a Nintendo fix, and their console was half the price of the competition.
Now the Wii U is totally bombing even though it's $100 or more cheaper than the competing current-gen consoles. Why? Because what's the value in a system that plays fewer games, and plays what it does get at half the resolution of the PS4? Customers chase value, not whatever's cheapest. Bang for your buck, son, and the PS4 is the king of that right now. While that's true, there are still other factors you need to consider: – Initial "sales" are usually filling distribution channels and retail shelves. Actual sell-through to customers will not be known for a few months. To me that indicates Microsoft may be filling shelves more than selling to consumers; distributing to 13 countries required them to fill more pipelines. – This seeming Microsoft negative can actually backfire on Sony. If Sony can't deliver, their current higher demand will go stale. So they have to grow their distribution and satisfy their demand quickly in order to sustain their initial momentum. Initial sales are more about distribution, backed up by marketing to then get the products off the shelves. By March or April we'll know for sure who's winning. I'm afraid, though, that the real winner will wind up coming from left field: when it comes to 720p and 1080p gaming, an iPad Air or iPhone 5s coupled with an Apple TV is more than good enough and has its own advantages. Nearly identical launch sales, with the Xbox One released in 13 countries and the PS4 in just 2 at the time of this article. I'd say it is asinine of you to disregard the fact that consumers are speaking with their money. Pretty obvious, isn't it? As of Dec 1st the PS4 had sold 2.1 million units in 32 countries. The XB1 had sold, as of Dec 11th, 2 million units in 13 countries. So actually the PS4 is available in far more areas than the XB1 and has only a slightly higher figure. If the XB1 were available in the same 32 countries, it would be leading the sales war with ease by now. Basic math dictates it.
So you're saying the PS4 launched in two countries with limited supply, and somehow to date has magically launched in 48 countries worldwide one month later? Yet you can't buy one in stores because they're sold out? Sorry buddy, the PS4 launched in more than 2 countries at the time of this article. You're wrong there; money was speaking to the consumers: the cheaper console is the one they go for, for just that reason. Wow. A year later, that guy (like many PS fanboys) looks really dumb. Right now we are seeing games that are pretty much identical, with maybe a slight difference in native resolution or a few fps. I'm perfectly fine with that, as I don't think either is a big deal. Interesting post, you obviously know what you're talking about. Cheers. lol, what are you saying ("The PS4 is running practically all games at 1080p @ 60 fps (frames per second) while the Xbox ONE is running practically all games at 720p @ 30 fps")? Whatttttt. Don't lie to the people, fanboy. Only three Xbox One games run at 720p: Call of Duty: Ghosts, Killer Instinct, and Dead Rising 3. Ryse runs at 900p, same with Battlefield 4 at 900p with the day-one update. The rest of the games are running at 1080p: FIFA 14, Need for Speed Rivals, Kinect Sports Rivals, UFC 14, Crimson Dragon, Forza 5, LocoCycle, Fighter Within, NBA Live 14, Zoo Tycoon, Powerstar Golf, Madden NFL 25, etc. @Andrew: you caring this much is so sad. Except for the fact that only one game thus far has run at 720p 30 frames per second; all the rest ran at either 1080p 60 frames or 1240 80 frames. This isn't L1 cache, it's SRAM. It's not taken advantage of by default like L1/2/3 cache is; you have to make your own instruction cache. And even if it were, there's hardly ever more than 10MB used by these instructions; really, putting 50MB of that stuff on-die is a huge waste of space that could've been used to put in more actual processing power.
Not to mention the PS4 solves the problem better than the Xbox One: they're using GDDR5, which just has much more bandwidth overall, versus the much slower DDR3 in the Xbox One, so you don't have to cache things; the L1/2/3 cache is plenty. The reason the games aren't 1080p really is just that the hardware inside can't handle these games at 1080p. That AMD APU is a customized but otherwise generic off-the-shelf AMD Kaveri, which will be available for laptops in a few months, and we're talking about games that you need a pretty beefy rig to max out; there are 30GB+ of textures in both BF4 and COD: Ghosts, and the bandwidth demands are insane. They look good, but that's also the reason they won't run maxed out on consoles, because they're really just commodity PCs. That said, you'd need a side-by-side comparison to really spot the differences; hardware these days is powerful enough to run all these games with almost similar polygon counts and texture sizes. The difference is in the details, stuff that's left out like volumetric fog, more realistic shadows, etc. And all that said, I still heavily prefer developing for PCs, if only for the fact that I can debug on the same rig I'm writing the software on, and I don't have to go through all kinds of convoluted SDKs just to make something work like it does for me on my PC when I hit F5. The RAM and cache situation is a bit more complicated than you would like to lead people to believe. First off, CPUs love low latency and GPUs love high bandwidth. Because of this, Sony has crippled the PS4's CPU with GDDR5 and Microsoft has crippled the X1's GPU with DDR3. What does this mean? It means better CPU performance in the X1 and better GPU performance in the PS4.
But… the X1 actually has a higher CPU and GPU clock speed than the PS4, and its higher GPU clock actually makes it perform better than if Microsoft had just unlocked its two additional compute units, due to additional CUs not adding performance linearly (diminishing returns from additional CUs). As everyone should know, GPUs, ESPECIALLY AMD's, are highly CPU-dependent (this is why high-end i7s are used for GPU benchmarks: to eliminate CPU bottlenecks), which means that the PS4's GPU may turn out to be bottlenecked by its CPU. As for Microsoft's bandwidth problem, they added the high-speed SRAM as a cache to help solve this, and the GPU can actually access data from both the SRAM and the DDR3 at the same time, making its total accessible bandwidth larger than either of them separately. The X1's total bandwidth is estimated to be near (slightly above or below) that of the PS4. Now on to other things. Audio: Microsoft has done great here. Its custom gaming audio block (SHAPE) is capable of 512 audio channels, taking over virtually all of the audio processing and providing excellent-quality audio. The PS4, on the other hand, has an audio block capable of only 200 channels, offloading much of its audio processing to its slower CPU and providing reduced audio quality compared to the X1. Compute engines: These are part of the GPU and help with physics and AI. The problem with compute engines, though, is that they can only be used while the GPU isn't busy rendering graphics, which it will be doing almost all the time. The X1 is capable of a total of 16 commands at a time while the PS4 is capable of a total of 64. This is more of a wash, though, because it's extremely unlikely that anything besides exclusive titles will be able to use even 16. There is only one title out right now on either console that uses any; it's Killzone (a PS4 exclusive), and it only uses 1 out of 64. Move engines: Only the X1 has them, with a total of 4 move engine processors.
Move engines greatly reduce the overhead of transporting data between the CPU and the GPU, freeing up clock cycles for more important things (improving CPU and GPU performance). They also help transfer data to the SRAM. In the end, due to Microsoft's customizations, the X1 has an advantage in CPU power and not quite as much of a disadvantage in graphics as many would like you to believe. The PS4 has a small advantage in graphics which probably won't pan out to much in the end, due to the consoles not being that far apart and developers programming to the lowest common denominator. "The reason that the games aren't 1080p really just is because the hardware inside just can't handle these games in 1080p." This is wrong and you know it. Both consoles are more than capable of displaying games at 1080p60. The problem is that the developers haven't had enough time with the finalized hardware to more fully utilize it. Infinity Ward actually came out about a month ago saying that the X1's hardware is capable and that they were planning on 1080p60 for Ghosts, but they lowered it at the last minute to 720p to keep it at 60fps because of the change in the finalized hardware. The PS4 version actually shipped with 720p60 single-player, but they upped it at the last minute with a patch to 1080p60. The PS4 doesn't even run BF4 at 1080p60; it runs at 900p60. I guess that means the PS4 sucks too, huh? Or maybe, just maybe, it's like the last generation, and it will take time for the devs to figure things out. In the end you probably won't notice a difference between the two besides their exclusive titles, so just buy whichever has your favorite exclusives. You're right about the part that GPUs love bandwidth (because they have to fetch all the textures every frame) and CPUs love low latencies (because they have to perform in real time).
But: "The X1 actually has a higher CPU and GPU clock speed than the PS4 and it's higher GPU speed actually makes it perform better than if they had just unlocked it's two additional compute units due to additional CUs not adding performance linearly" -Yes, but the minimal overclock that the Xbox One has does not at all make it competitive with the PS4. More CUs make a much larger difference than the slight difference in clock speed. "As everyone should know GPU's, ESPECIALLY AMD's, are highly CPU dependent (this is why high end i7s are used for GPU benchmarks, to eliminate CPU bottlenecks) which means that the PS4's GPU may turn out to be bottlenecked by it's CPU." -False. GPU steering frameworks, such as DirectX, are heavily CPU-dependent. The reason benchmark rigs use high-end CPUs is because, well, why the hell would you use a low-end CPU if you're going to benchmark speed? This is just as true for Nvidia as it is for AMD. OpenGL has a much lower dependency on the CPU; however, the Xbox One does not support it (they've disabled it in the software/firmware), whereas the PS4 does. The GPU does not at all have to communicate with the CPU for this; that's what they invented DMA for. "As for Microsoft's bandwidth problem, they added the high speed SRAM as a cache to help solve this and the GPU can actually access data from both the SRAM and the DDR3 at the same time making it's total accessible bandwidth larger than either of them separately. The X1's total bandwidth is estimated to be near (slightly above or below) that of the PS4." -So what you're saying is that my HDD has 10GB/s+ of bandwidth, seeing as it would be added on top of the speed of the RAM? Sorry, but that's not how it works. It's nice that you have a 47MB cache in which you can stuff some variables and the like, but it will hardly impact performance, and it will certainly not make up for the sluggish speed of DDR3 vs GDDR5.
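The bandwidth dispute above is easy to put in numbers. The figures below are the commonly cited launch-era peak bandwidths for the two consoles; treat them as assumptions for illustration rather than verified measurements:

```python
# Rough comparison of the two memory systems, using commonly cited
# launch-era peak figures (assumptions, not verified measurements).
xbox_ddr3_gbps = 68.3      # 8GB of DDR3 main memory
xbox_esram_gbps = 102.0    # 32MB of ESRAM (Microsoft later quoted higher peaks)
ps4_gddr5_gbps = 176.0     # 8GB of GDDR5, uniform for all data

# The "add them together" argument only applies to the 32MB that actually
# sits in ESRAM at any given moment; everything else is limited to DDR3.
xbox_combined = xbox_ddr3_gbps + xbox_esram_gbps
print(f"X1 best case (data in ESRAM): {xbox_combined:.1f} GB/s")
print(f"X1 everything else:           {xbox_ddr3_gbps:.1f} GB/s")
print(f"PS4 all 8GB:                  {ps4_gddr5_gbps:.1f} GB/s")
```

On these numbers, the combined X1 figure does land near the PS4's, which is the first commenter's point; the rebuttal is that this best case only covers whatever fits in the 32MB of ESRAM, while the PS4 gets its full bandwidth to all 8GB.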
"Audio: Microsoft has done great here, it's custom gaming audio block (SHAPE) is capable of 512 audio channels, taking over virtually all of the audio processing and providing excellent quality audio. The PS4 on the other hand has an audio block capable of only 200 channels, offloading much of it's audio processing to it's slower CPU and providing reduced audio quality compared to the X1." -Games hardly ever use more than 100 channels at the same time, and they're usually mixed within the game engine into a single channel for the soundcard. It's a nice feature, but I doubt it makes much meaningful difference. "This is wrong and you know it. Both consoles are more than capable of displaying games in 1080p60. The problem is that the developers haven't had enough time with the finalized hardware in order to more fully utilize the hardware." -No. Sure, you could squeeze more performance out of pretty much anything by optimizing and rewriting, and it takes a lot of effort to do so, but in the end these consoles simply lack the raw horsepower to run it at 1080p out of the box at 60FPS. They lowered the resolution to guarantee 60FPS with the current state of the engine, which may improve given time, but that does not change the fact that the consoles are not fast enough to keep up with the engine. It's kind of like saying "My Toyota Prius can keep up with your Ferrari! It's just that I haven't tweaked the engine enough, so I'm stuck at half the speed you're going", or "My 8-year-old PC can't run that game on the highest settings right now because the game isn't optimized enough yet". TECHNICALLY you're right, but really it's just the same thing as "the consoles are too slow" put in other words. Both consoles suck. They're DRM-restricted commodity hardware that could be running just about anything you could imagine if it didn't have the console sticker they're wearing. They're both. Just. PCs. Slow ones, too.
In a few months you'll be able to get that exact chip, minus the customizations, in LAPTOPS. Average laptops. But yes, you won't notice a difference, because that would cause the lesser party to bitchslap the fuck out of the developers with lawyers. Hardware-wise, the PS4 is better than the Xbox One, but I wouldn't touch either with a stick myself. Jaguar has a DDR3/DDR4- and a GDDR5-compatible controller. It could be that processor-allocated memory uses a slightly different data path; it wouldn't be that hard to allocate blocks of RAM to DDR3 spec and maintain unified memory. The PS4 is actually 15% faster CPU-wise. This is incorrect. The SRAM is handled automatically if the developers choose to allow it. The Xbox OS will interface with the cache through the memory pipeline seamlessly, as it would allocate any memory resource. This was confirmed by the development team already. Well, perhaps. I can imagine it would be, but as I mentioned earlier: "And even if it was, there's hardly ever more than 10MB used by these instructions, really putting 50MB of that stuff ondie is a huge waste of space that could've been used to put in more actual processing power. Not to mention the PS4 solves the problem better than the Xbox One; They're using GDDR5 which just has much more bandwidth overall, versus the much slower DDR3 in the Xbox one so you don't have to cache things, the L1/2/3 cache is plenty." GDDR5 isn't as fast as you think :/ It has a much higher latency than normal RAM. If anything, both companies are fucking stupid for using only one type of RAM. They would've been better off if they had both cut the amount of RAM in half and used both DDR3 and GDDR5. Then they wouldn't have to do so much stupid shit to make them work.
And the PS4 fanboys talking about having more GDDR5 RAM being better are idiots. The GPU in the PS4 would never get a high enough FPS to utilize that amount of VRAM. Then again, they didn't even use a separate CPU/GPU combo, they used APUs, which I don't really like. Oh, and saying GDDR5 RAM is faster than DDR3 is stupid, because it's untrue. GDDR5 "is faster" because it's not like normal RAM: it takes a long time to do any single thing, but it can work in parallel. For a CPU, that's detrimental. It just takes one CPU-bound, RAM-hungry game for everyone to see the faults in the PS4.

If you want to get technical, the PS4 actually does have two types of RAM: it has 256 MB of DDR3 on top of the GDDR5. The Xbox One instead has 50 MB of on-die cache. Technically, the PS4 should perform better for graphics, mainly because of the extra shader cores, and the Xbox One slightly better under CPU load, because of the on-die cache and lower-latency RAM. However, the Xbox One runs a full Windows OS and uses a hypervisor to multitask between things like TV watching and playing DRM'd games, so that in turn affects CPU performance. The PS4, in turn, seems to reserve specific cores and memory for its "own purposes". But in the end, they're both very mediocre desktop hardware put in media-player packages to serve as DRM enforcement units. I could not care less which of the two is faster; even my slowest laptop outruns them with such ease that it really makes me wonder why people still bother with these DRM self-bondage leashes.

It would only outrun them slightly, and only if you got the fastest laptop out there, which is only slightly faster than a mid-range AMD setup.

Okay, as I suggested above, there are things people aren't thinking about: these are the same GPU and CPU families, with varying intricacies. Like the article suggests, Microsoft and Sony took slightly different avenues to arrive at the same point. Memory speed, everything.
Capability-wise, there isn't a single thing the PS4's 7870-class GPU can do that any other 7xxx-series part cannot; they are the same family, and both are DX11.1+ capable. Even had the Xbox used a 5xxx GPU, the same could be said, but performance would be slower on the 5xxx. Any DX11 card is, essentially, the same. So, we know the PS4 has 16 extra ROPs; that doesn't translate into anything more than a higher pixel count. Same capabilities, higher pixel count. I say "theoretical" because we don't know whether Microsoft has some amazing method with which to make up ground, e.g. some kind of amazing scaling technology; who knows. It's all in how you use the hardware.

Now, there is a debate about just how much you can notice on a 1080p TV, in terms of pixels. 4K isn't going to be affordable anytime soon, and by the time that happens we'll be halfway or more through this cycle and the next console rumors will be around the bend. Only then may you begin to notice the picture difference, and that's where I believe Sony did a better job future-proofing their system, and when you will see the benefit of 16 more ROPs. 384 more shaders? That means better-quality lighting and shadowing, and that's negligible. Again, it won't translate into a major difference if you compare the two, unless you make a habit of parking yourself in front of a wall and staring at every crack and surface detail. If so, maybe you ought to be a game developer and not a gamer.

The PS3 had smashing specs, far beyond the CPU that is even in my PC today. On paper, the PS3 was leaps and bounds ahead. The Cell could even double as a GPU and produce HDR (see Heavenly Sword) to make up for what the GPU lacked; couple that with more promising particle effects, physics, etc. This generation, it's just a difference in shaders and pixel count. Despite the PS3's specs on paper, Microsoft games still ended up looking better, if not identical. Software is the biggest varying difference.
Drivers, OS, SDKs, patents/methods, all of which we know MS excels at. So, what we know is that the PS4 has the better GPU, but what we DON'T know is whether Microsoft has some "amazing, magical piece" tucked somewhere inside the Xbox One.

Again, the PS3 and Xbox 360 were completely different machines, and it took 8 years for the PS3 to catch up with and exceed the 360's render performance and quality. We all know that; it's a moot point in the PS4 and Xbox One discussion. I don't care what one can notice on a 1080p or 4K TV; it all depends on personal environments and preferences. We are talking about the capabilities of the PS4 and Xbox One on a 1080p TV. Today, the PS4 and Xbox One are almost exactly the same. You are dismissing the PS4's 16 extra ROPs and 384 extra GPU cores, which will make a noticeable impact if used properly in cross-platform titles, while at the same time speculating that the Xbox One "might" have something amazing and magical somewhere that will pull it ahead of the PS4's rendering performance? The PS4 will always be ahead of the Xbox One, period. Now, how Sony and Microsoft enable developers, and what those developers do with these machines, will be seen in the coming years.

I'm not dismissing the PS4's greater raw power and bandwidth; I'm saying it does not equate to more capabilities. The APUs are of the same family. My point in extending back to the HD 5000 series is merely to suggest that even those cards can compete when software is being written for them. It's all DX11, for instance. Sure, the PS4 has more ROPs, which translates into more bandwidth, which translates into more pixels and polygons / higher resolution. There are 384 more shaders, for better shader effects, but that's a negligible difference, much as the PS3/360 era, surprisingly, was close as well, all things considered.
All I'm saying is that, with the selected hardware (and who knows what software methods/patents), they can come up with a slightly scaled-down ratio, still well above 1080p at lower polygon counts, and the differences would be negligible. You tell me what difference 1.83 TFLOPS vs 1.31 will make. At the end of the day, it does not translate into capabilities; it translates into more polygons and pixels (resolution). When Microsoft said "the PS4 cannot do anything the Xbox One cannot do", they were correct. It's all the same GPU family. Now, if the PS4 had a DX12/OpenGL 5 (?) comparable chip in it, then you could take what I said and throw it out the door, but the fact of the matter is, they are capable of the same things, with pixel/polygon count being the differing factor. At the end of the day, software is going to be the determining factor.

The little bits and pieces that are different are expendable, really; they won't make a difference. The PS4's hardware is faster, period. What DOES make a difference is that Microsoft is known for making development for their platforms like delicious fresh cake compared to the "moldy bread from last week" that the PlayStation's SDKs have been so far. Microsoft is sitting on THE best developer tool out there: Visual Studio (fellow developers will agree, unless they're Richard Stallman or Linus Torvalds, of course). I'm pretty sure we'll see many more titles, especially smaller ones, for the Xbox One, considering how easy it will likely be to compile and debug stuff on it. But hey, since we're talking about what's easy to develop for, you know what has a billionfold more software available right out of the box than either of these consoles will ever have, for the exact same price? PCs!
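The "more pixels, not more capabilities" argument is easy to sanity-check with the numbers quoted in this thread: the raw-FLOPS gap between the two consoles is close to the pixel-count gap between 1080p and the 900p many Xbox One titles shipped at.

```python
# Back-of-envelope: does the FLOPS gap line up with the resolution gap?
ps4_tflops, xb1_tflops = 1.84, 1.31      # peak figures quoted in the thread
flops_ratio = ps4_tflops / xb1_tflops    # ~1.40x

pixels_1080p = 1920 * 1080               # 2,073,600 pixels
pixels_900p = 1600 * 900                 # 1,440,000 pixels
pixel_ratio = pixels_1080p / pixels_900p # 1.44x

print(f"FLOPS ratio: {flops_ratio:.2f}x, pixel ratio: {pixel_ratio:.2f}x")
```

Roughly 40% more shader throughput buys roughly 44% more pixels at the same frame rate, which is consistent with the claim that the gap shows up as resolution rather than as new capabilities.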
Let me just take another moment here to rub in everyone's faces that these "consoles" are really just crippled laptops (AMD APUs are commonly found in low-to-midrange laptops) that will only ever run the few OEM-endorsed titles that appear during the few years a console "cycle" lasts. BRING ON THE BUTTHURT. YOUR TEARS ARE DELICIOUS.

I've been waiting for the PC market to merge with the console market for ages. It's like they keep reinventing the wheel.

I love how this guy was defending the PS4 from the very beginning, and then, when someone comes along and shoves facts in his face proving that both consoles are almost alike, he goes "PC is way better and everyone should buy a PC". Dude, for real, gaming started on consoles and it'll always be better on consoles. WHY? TITLES! I can name a list of at least 100 games that PC never saw and will never see. And PCs are better, true! BUT who plays anything other than MMOs on a PC anyway? Like, what, 50 people? The true essence of gaming has always been on consoles.

Hey, psst, consoles are PCs these days. Literally identical hardware, except a PC you can upgrade and customize. Did you know you can hook console controllers up to a PC? And did you know you can get a Steam box with specs equal to those of a PS4 for less money? The only reason the titles stay on the consoles is that Sony, Microsoft, and Nintendo would hate to lose their monopoly on dictating the price and quality of modern games, and they pay big money to developers simply not to release those games on the PC. If everyone got off those consoles and saw what PCs can do, developers would give those three companies the boot and release their otherwise console-only games on the PC. And eh, those that don't, you can always emulate.
A PS3 emulator isn't too far off, and a PS4 emulator is comparatively simple since it's PC architecture (x86 CPU, AMD chipset and GPU, if you want fancy nerd terminology). The PS1 was unique hardware, the PS2 somewhat too; the PS3 was just PowerPC/Cell, which is off-the-shelf hardware (PowerPC was used in pre-2007 Macs); the PS4 is literally a PC with an AMD laptop chipset. The only reason consoles still exist is that people are too stubborn and afraid of the unknown to try a PC for gaming.

Newer PS3 vs 360 games tend to look better on the PS3 now when it comes to things like lighting and particles. Not a huge deal. However, why are the PS4's custom chips, which offload things like streaming, downloading, and video encoding, being ignored in some earlier posts, like the one mentioning the Xbox One having more audio channels?

These reporters are going to owe their readers a big apology when their consoles end up performing significantly worse. How then can they explain the fact that practically all cross-platform games run at 1080p @ 60 fps on the PS4 while the same games run at 720p @ 30 fps on the Xbox One? Could they please explain to gamers how the PlayStation 4's graphics throughput of 1.84 TFLOPS versus the Xbox One's 1.2 TFLOPS makes it a wash? Are they saying that although the PlayStation 4 comes with an ultra-fast 8 GB of GDDR5 graphics memory, it does not end up with a significant performance advantage over the Xbox One, which comes with the much slower 8 GB of DDR3 PC memory? Are they trying to tell us that the Xbox One's science-fiction 32 MB of memory overcomes the overwhelming hardware advantage of the PS4, even in light of the fact that the PS4 has 18 CUs (Compute Units) versus the Xbox One's 12? Am I starting to read articles at tech sites that no longer make sense? It is starting to appear as if a lot of these tech articles are being written in Microsoft's PR division and handed out for general consumption.
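The TFLOPS figures thrown around in this thread follow directly from the CU counts: on a GCN-family GPU, peak throughput is CUs × 64 shader lanes × 2 FLOPs per lane per clock (a fused multiply-add) × clock speed. A quick check with the commonly reported clocks explains why both "1.2" and "1.31" appear for the Xbox One (the 800 MHz figure predates its 853 MHz up-clock):

```python
def peak_tflops(cus, clock_ghz, lanes=64, flops_per_lane=2):
    """Peak single-precision TFLOPS for a GCN GPU: one fused
    multiply-add per lane per clock counts as 2 FLOPs."""
    return cus * lanes * flops_per_lane * clock_ghz / 1000

ps4 = peak_tflops(18, 0.800)       # 18 CUs @ 800 MHz -> ~1.84
xb1_early = peak_tflops(12, 0.800) # 12 CUs @ 800 MHz -> ~1.23
xb1_final = peak_tflops(12, 0.853) # 12 CUs @ 853 MHz -> ~1.31

print(f"PS4: {ps4:.3f}  XB1 (800 MHz): {xb1_early:.3f}  XB1 (853 MHz): {xb1_final:.3f}")
```

So the 18-vs-12 CU difference is the whole story behind the headline numbers; neither figure is marketing fiction, they just come from different clock assumptions.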
It could be that, or it could be that these tech writers actually know what they are talking about and you don't. At the least, your claim that all Xbox One titles run at 720p/30fps is ludicrous and easily refuted (Forza: 1080p/60fps, Ryse: 900p/30fps, Battlefield 4: 720p/60fps, etc.). You can't even recite undisputed facts accurately and we're supposed to lend credence to your opinion? Um, I don't think so.

A game that needs only 1.3 TFLOPS will run fine on either system. That will be the cross-platform target for now. In the near future those same games will reduce base requirements by perhaps 0.1 or 0.2 TFLOPS due to optimization. This will free resources that let more be done in newer games, and thus better games will emerge. Developers say the PS4 is hard to optimize for without going to hardware-level instructions à la Mantle, which is NOT common practice and will take time to catch on. Microsoft, on the other hand, is always optimizing their tools and working directly with development studios. At least in the immediate future, Microsoft will likely take the lead on optimization. When the Xbox One edition of a game requires 0.5 TFLOPS less to do the same things as the PS4 edition, that gap in raw hardware power will mean nothing. The way Microsoft is acting, this is likely to be the case in about a year, or maybe two if there are complications. Effectively, this leaves both consoles at about the same output "power", and nobody will have any reason to complain about either side. Gamers can rejoice, as making games exclusive will be dumb since there will be no perceived reason to pick one console over another. Glitter and rainbows for days, son. However, Microsoft will still have more features with hardware support (such as video chat while gaming, multi-way video chat for parties, Twitch-quality h.264-encoded streaming video, etc.)
where Sony will either have to sacrifice system power to compete or admit defeat, as they did with cross-game party chat on the PS3. I'm not saying those features will ultimately matter to everyone, but the availability is appealing and (as with 360 parties) may prove invaluable in rare cases. This could be the real area where one system takes over the other.

Yep, it's going to come down to software, just like last generation. It doesn't matter if you have higher hardware specs (on paper) if the other guy is able to utilize his hardware more efficiently.

LOL. "HURR MS HAS BETTER SOFTWARE!" Complete delusional fanboy fail.

As for the Twitch-quality h.264-encoded streaming, at least for that one I doubt power would have to be sacrificed on the gaming end, unless the dedicated chip they have for encoding these streams is somehow deficient. The video chat might be different, though I don't see why it would take more resources than the windowing on the Xbox, unless the implementation is just lazy. For me, I am more interested in seeing the exclusive titles for the consoles than in perfect 1080p resolution. I have often leaned towards Sony's first-party studios on this, but that can change.
It's Sony, they always fuck it up.

Because sites these days are overly diplomatic to the point of absurdity, and will pretend that "720p and 1080p are practically the same", "fill rate doesn't matter", "advanced effects are overrated", blah blah blah. They make the PS3 Cell argument, which really does not apply when the systems are almost identical except that the PS4 just has more. Ease of development doesn't matter either? Apparently no one remembers that the PS3 was heavily criticized because of its split pool of memory, which, combined with a GPU with a lower fill rate (looking at you, 16 ROPs), made multi-platform development a nightmare. "But the Xbox One has DirectX…" What, you mean the API with some 20 years of legacy-support baggage? The API that has hardly been updated since Windows 7 released?
Sorry, but it's really annoying when there has been nothing but praise for Sony's new SDKs (see PS4 and Vita), yet journalists still go on about "well, Microsoft has the software advantage…". In conclusion, in a way Microsoft and Sony took two different approaches to arrive at the same place, and that place was 8 GB of system memory. Microsoft took the safe route of sticking with DDR3 and adding embedded RAM on the chip, ultimately making the die bigger (higher cost) and leaving less rendering overhead (smaller GPU). Sony gambled that they could secure enough 512 MB GDDR5 modules to meet the demand of a global market, and based on the massive number of initial units available, I'd say their higher-performing, more developer-friendly gamble paid off. #DealWithIt.

It is not even close to a tie. This article is old. Now that they have been out a while, it is easy to see that the PS4 crushes the X1, as almost every third-party game looks and plays better on the PS4 at 1080p/30 or 60fps, while on the X1 it's 900p/30fps 90% of the time. Also, the first-party games on the PS4 look stunning, like a PC on high settings (not ultra, but high).

Different rendering methods. The PS4 is, in PC fashion, pulling large 3 GB+ textures into its high-bandwidth, plentiful GDDR5 memory and crunching them with more muscle, using its more plentiful stream processors. Hence "raw power" and "higher bandwidth". The Xbone is designed to parallel-process tiles (tiled resources), which are much smaller data; each tile is really not much larger than 150×150 pixels. The ESRAM's low latency and its divvied-up 4×8 MB pools of on-die cache are the key. Keep in mind that with the tiled-resources rendering method you are only concerned with rendering what is in view, not everything to your sides, outside your peripheral vision, or behind you, so the amount of data being transferred is not the entire cache pool of 3-7 GB of texture data that makes up the whole world.
You're literally just drawing small tiles from that pool of data and rendering only what's needed. Think about how you transfer files to a USB thumb stick. If you drop several 1-2 KB files onto even a USB 1.1-rated (2000-era) drive, those files are at the mercy of latency before bandwidth is ever a concern. However, if you drop a media file, say a small movie between 700 MB and 4 GB+, onto the stick, the reverse becomes true. Now, where this all evens out is when we have to transfer several thousand of those small files. That has to be coordinated and distributed in a very balanced fashion with no disruption, and that's why, again, the divvied-up ESRAM, the move engines, and DDR3. In theory, tiled resources should leave GPU headroom, since we're no longer bound to the old way of rendering, which required brute power, lots of muscle, and everything crunching data at once. I'd watch Microsoft's tiled-resources demo.

Sure, MS could have opted for GDDR5 instead of ESRAM, but TR wouldn't work in as refined a manner; it requires ultra-low latency for seamlessness. GDDR5 is typically going to be closer to 11 ns latency, ESRAM is virtually zero, and DDR3 is around 5-7 ns. So, technically, even the PS4 and PC can do TR, but the difference is that with the extra move engines and ESRAM built in, the Xbox is literally designed for batch (parallel) processing, much like a server. There have been rumors for years about ESRAM technology for APU-equipped PCs in particular. Ultimately, both the Xbox One and PS4 could benefit from unlocking the remaining CUs that are currently disabled. It would benefit the Xbone's TR methodology greatly. Adding two more CUs wouldn't mean much when rendering in the traditional "muscle" sense, but TR works differently, and that's massive leverage; we're talking a consistent 1080p @ 60 FPS in most titles. Yep, as I have always said, it's about how the data is handled and what you do with the software.
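The USB analogy above boils down to a simple model: total time per request is latency plus payload over bandwidth, so many tiny transfers are latency-bound while one big transfer is bandwidth-bound. A toy sketch with made-up numbers (illustrative only, not real GDDR5/DDR3/ESRAM or USB timings):

```python
def transfer_time(n_items, bytes_each, latency_s, bw_bytes_per_s):
    """Total time to move n_items payloads over a link that charges
    a fixed latency per request plus payload/bandwidth."""
    return n_items * (latency_s + bytes_each / bw_bytes_per_s)

# A fast-but-laggy link vs a slow-but-snappy one (invented numbers).
fast_laggy = dict(latency_s=1e-3, bw_bytes_per_s=200e6)
slow_snappy = dict(latency_s=1e-5, bw_bytes_per_s=50e6)

small_on_fast = transfer_time(10_000, 2_048, **fast_laggy)   # 10k tiny files
small_on_slow = transfer_time(10_000, 2_048, **slow_snappy)
big_on_fast = transfer_time(1, 4e9, **fast_laggy)            # one 4 GB file
big_on_slow = transfer_time(1, 4e9, **slow_snappy)

# Many small transfers finish sooner on the low-latency link...
assert small_on_slow < small_on_fast
# ...while the single big transfer wins on the high-bandwidth link.
assert big_on_fast < big_on_slow
```

Under this model, which memory "is faster" genuinely depends on the access pattern, which is the crux of the GDDR5-vs-DDR3 argument earlier in the thread.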
Ignorant kids automatically consider the PS4's 50% faster performance (once again, just as with the PS3) to be endgame in the contest. Not even close. That 50% boost (actually slightly less, after Microsoft's speed bump) could work out to serve as a mere portion of headroom. It just depends on things like SDKs, drivers, game programming, and the operating system. People get lost in the specs on paper. They are admirable and exciting, but that's about as far as they should take you.

Here's why: Sony's PS3 had a supercomputing chip in it, and it was initially supposed to have two of them, but Sony had budget issues and opted, last minute, for an off-the-shelf GPU. On paper, the PS3 had the power to crush anything, even the PC (had it only had more RAM), and what happened during the PS3 vs 360 war? Microsoft was oftentimes ahead, if not directly neck and neck. Microsoft's unified design was easier to develop for and get the most out of, resulting in snappier, crisper, higher-resolution gameplay. The biggest contributing factor? They have the best SDKs and software all around: drivers, OS, etc. In fact, the same SDKs are used for PC, and had been used for years. One article I read back in 2007 called Sony's SDKs buggy and hard to use. I even recall one developer quote: "The PS3 is a complete waste of everyone's time and a pain in the ass." That's a direct quote. What good is the world's fastest automobile engine if the ECU is shit? I'm sorry, but you will find that Microsoft will, once again, make more out of less, just by exercising what they know from PC development: DirectX (the Xbox's namesake), operating systems, data handling, drivers, and networking.

We're so soaked in this eternal PS4/Xbox war that we don't even look at the most threatening contender, poised to throw PCs back to the top of gaming and looking very promising, I might add: the Steam box. Heck, Microsoft and Sony don't even hate each other as much as we choose to hate them.
They're probably on each other's speed dial. They obviously coordinated and settled on a hardware standard to make it easier to develop for, and port between, their two systems. You don't think they just coincidentally selected the same hardware, do you? Microsoft has taken the hit for bad PS2/PS3 ports to the Xbox/Xbox 360, and Sony has taken a hit for bad Xbox/PC ports to their machines. Sony sells products with Windows on them (Vaios), Microsoft uses Sony's Blu-ray technology; the list goes on.

Doesn't Sony use OpenGL for their PS4? Also, the Xbox 360 was better than the PS3 because the PS3 was really hard to program and optimize for. And no game company will give one console better graphics than the other unless they get paid to do it; they both just optimize the game until it hits a decent FPS.

Historically, they've predominantly used OpenGL, and I think OpenCL now, too. Last I read, they were using a modified version of DX11. Whether that means they are modifying existing DX11 components at Microsoft's discretion, or it's the supposed "acquired feature sets", I don't know. When I last read about it, Sony fans were taking that as something of an insult, forcing Sony to comment with something to the effect of "acquired/borrowed features" to sugar-coat the matter. I wouldn't blame them for using it; it's what 98% of the game industry optimizes their games for. I expect SteamOS will bring OpenGL back into the picture on a larger scale for the first time since the turn of the millennium. You used to have a choice of what to run your games in, OpenGL or DX, by modifying INI files. The only place I've seen the option anymore is in id games, as they actively support the open scene, but I may be wrong. I also wouldn't blame Sony if they now have to use Microsoft's XNA or some "equivalent" or whatever "borrowed" software they may have; it's what developers are used to. Anyway, the point being, they are paying each other royalties.
Almost everything uses USB, and Microsoft is one of the founding companies, along with Intel and IBM, that established it.

IMO, it's all moot; the new features in DX 11.2 came from OpenGL in the first place, and both consoles can use either. I don't mind OGL, great for spitting out decent games fast, but for those truly great AAA titles that stand out and define a platform, OpenGL is where it's at and worth the extra week of development. Anyway, none of those cool features we learned about in the new OGL and DX 11.2 will likely be seen in games for another 5-6 years.

The supposed "new features" you speak of were under development by Microsoft during the early SDK phase of DirectX 10. Things like tiled resources were a Microsoft thing first, stolen and implemented by OpenGL coders for fame, standardized later, and then (and this is the important part) never incorporated into hardware-compatible solutions. In DirectX 11.1/11.2, all of these features are either processed directly on dedicated graphics hardware or can easily be loaded into hardware compute systems with little loss in efficiency. Tiled resources in particular is a joke when done the software-only way OpenGL requires: it takes about 20% longer to draw a texture than if you just brute-forced large textures using tons of VRAM the hard way. The DirectX version, though, works flawlessly on the current set of cards (the nVidia 700 series, for example) and offers almost 50:1 VRAM savings over traditional texturing methods. Battlefield 4 on PC (and allegedly XB1, though that's a flaky rumor) makes use of many of these DX 11.2 features. With them being on the XB1, I expect a big rise in these features in PC versions of games, as they translate easily when porting from a console. We won't wait half a decade to see these features; they're being used in development-stage games as we speak. I'm not trying to sound condescending or anything, just stating the facts.
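The "almost 50:1 VRAM savings" figure quoted above is plausible as back-of-envelope arithmetic if only the tiles currently in view stay resident instead of the whole texture. The tile size and resident tile count below are assumptions chosen for illustration, not measured data from any title:

```python
TILE = 128 * 128 * 4                       # one 128x128 RGBA8 tile, in bytes

full_texture = 16384 * 16384 * 4           # a 16K x 16K RGBA8 texture = 1 GiB
resident = 320 * TILE                      # assume ~320 tiles visible = 20 MiB

print(full_texture // 2**20, "MiB if fully resident")   # 1024 MiB
print(resident // 2**20, "MiB with tiled residency")    # 20 MiB
print(f"savings ~{full_texture / resident:.0f}:1")      # ~51:1
```

The exact ratio obviously depends on scene and tile budget, but it shows how a number in the 50:1 neighborhood falls out of keeping only view-relevant tiles in memory.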
Ever since the new millennium, OpenGL has been stealing ideas and cloning practices from DirectX more and more, if only to stay relevant. I'm actually hoping SteamOS makes the main contributors to OpenGL realize that they can and should do their own thing rather than mimic what's successful. I, for one, foster the idea of competition, so long as it leads to progress.

Isn't Sony using a modified FreeBSD OS? So wouldn't it be easier to use OpenGL? They aren't using DirectX, only emulating certain feature sets in hardware-level code (similar to AMD's own Mantle API). Microsoft specifically stated they wouldn't allow "the competition" to use the most up-to-date version of DirectX in a console.

AMD chips use both. *Support both. It doesn't directly mean that Sony is allowed to use, or uses, DX.

Sony uses OpenCL, but their wrapper is capable of DirectX 11 (though at significantly reduced efficiency). None of that much matters, since in another year or two virtually all AAA developers will be using the low-level API instead of the DirectX/OpenCL wrapper on both consoles, and it will cease to be relevant.

I assume you mean OpenGL (the graphics library); OpenCL (the compute library) is for computing and will not be replaced by Mantle. I don't know about OpenGL, but they do use DX11.

The PS3 did things the 360 could not. No games on the 360 look as good as Killzone, Uncharted, Heavy Rain, or The Last of Us.

Killzone is PS3-only; that's why it had better graphics, because they optimized it only for the PS3. Like I said, if a company makes a game for the PS4 and the Xbone, they will look the same (except maybe for the resolution). They will not make an extra investment in special textures and optimization if they don't get paid for it. Maybe if it's also ported to PC there might be three different-quality texture sets.

The PS3 and Xbox 360 were apples and oranges when it came to writing code. The PS4 and Xbox One are both like pomegranates; the PS4 just has more delicious seeds in it.
They have the same architecture; the Xbox One OS could be "easily" ported to the PS4 and vice versa, which can't be said of the PS3 and Xbox 360. All this talk of DirectX is irrelevant when both consoles' game OS is already close to the metal (supposedly). It all comes down to how much headroom there is and what developers do with it. The Xbox One will hit the wall sooner than the PS4.

This generation, we can easily tell which one is more powerful already. Sony doesn't like to optimize or develop new tools unless they absolutely need to. They're infamous in developers' eyes for handing out PS3 dev kits with a manual 100% in Japanese and saying "figure it out yourselves". Microsoft is lauded for their developer support post-launch. Saying the Xbox One will hit the wall first is a joke, right?

Multiple developers have already stated that Sony's IDE is considerably better than Microsoft's right now. Not to mention Sony did wonders with the PS3 IDE after the first couple of years. You really have no idea what you're talking about. The 360 had its own problems with eDRAM post-launch. It took a good two years after launch for most games to finally reach 720p, because 10 MB wasn't enough capacity to fit an HD image. Developers had to learn tiling, and Microsoft had to refine the feature in their API. It wasn't all sunshine and unicorns for Microsoft.

Uncompressed, a 1080p image requires about 8 MB of space. There were indeed 1080p games (though primitive; Earth Defense Force, for example) within months of the system's launch. The real issue was not the eDRAM itself but developers misusing it, and that came down to a tools issue. Still a problem, but it wasn't inherent to the eDRAM's capacity. Both systems had issues. Both current consoles do as well, to be blunt.
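The eDRAM arithmetic in the exchange above is easy to verify: at 32 bits of color plus 32 bits of depth/stencil per pixel, a plain 720p framebuffer fits in the 360's 10 MiB of eDRAM, but adding even 2x MSAA overflows it, which is consistent with why predicated tiling came up at all. (The 4+4 bytes-per-pixel layout is an assumption for the sake of the arithmetic.)

```python
def framebuffer_bytes(width, height, msaa=1):
    """Framebuffer size assuming 32-bit color + 32-bit depth/stencil
    per sample, scaled by the MSAA sample count."""
    return width * height * (4 + 4) * msaa

EDRAM = 10 * 2**20                                   # Xbox 360's 10 MiB eDRAM

plain_720p = framebuffer_bytes(1280, 720)            # ~7.0 MiB: fits
msaa2x_720p = framebuffer_bytes(1280, 720, msaa=2)   # ~14.1 MiB: overflows
color_1080p = 1920 * 1080 * 4                        # ~7.9 MiB, color alone

assert plain_720p <= EDRAM
assert msaa2x_720p > EDRAM    # hence tiling for MSAA at HD resolutions
```

It also confirms the "1080p needs about 8 MB" figure: 1920 × 1080 × 4 bytes is roughly 8.3 MB for the color buffer by itself.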
However, the PS2 and PS3 both have a history of Sony waiting until an exclusive partner threatens to jump ship before investing in required development-tool updates (and, in the latter case, system firmware updates). Microsoft has had their last two systems go on record as the most developer-friendly with many studios, and the reasoning almost always pointed to Microsoft's willingness to improve their platform. I can get behind Microsoft on that viewpoint, as a more attractive tool set and support line usually leads to more developers, and thus more games with which to rake in royalties. I'm actually baffled at how Sony has handled things in the past; it's just bad business. Basically, any massively impressive titles on the PS2, and most early-era PS3 ones, were completely due to the developers and not Sony. Kudos to those guys, actually. That's not an easy pedestal to climb atop.

..Would still rather have a PC for the same price, with pretty much identical hardware, that does a thousand- if not a billionfold more right out of the box than these OEM-crippled DRM players ever will in their entire lifetime. Bring on the fanboy tears.

Well, you can buy a PC AND consoles. I don't see an issue here :-) $500, without Windows and accessories, will get you a nice PC; towards $1000, a powerful gaming PC.
Wait two years and it will provide much more power than the consoles (which by then will be $100-150 less).

Why would I still get a console that does only part of what my PC does? I could spend that extra money on a PC that not only matches but outperforms the consoles. For under $400 I have a PC that contains almost exactly the same hardware as these two consoles (AMD quad-core CPU, HD 7850 GPU, which is faster than the one in the APU, 8GB RAM, power supply, case, and 1TB 'hybrid HDD'), without being restricted to the few overpriced OEM-endorsed game titles that'll appear for it in the next few years. Not to mention I can upgrade it cheaply. I don't get consoles. They're literally crippled PCs with a few extra (useless) parts to look shiny, and a trademarked name. They made sense in the '90s, when they contained unique hardware that offered an edge over computers. In 2013? Not so much.

You get consoles to play exclusive games. For example, GTA5 is on the PS3 and Xbox 360, not on PC. Or maybe Gran Turismo, Heavenly Sword, Tekken, DOA... the list is long.

Which I only find more disgusting, and it gives me even MORE reason to boycott consoles, because those games would run just fine on any PC. The only reason 'console exclusives' exist is because people keep falling for that joke. The list is long, and full of shit. 90% of console exclusives get press simply because they are exclusive, not necessarily because they are any good. The best games are all multi-platform (GTA5, in a few months). There are barely two exclusives per console worth playing. That hardly justifies spending $400 + $500 = $900 (not to mention the more expensive games). Also, I would rather support cross-platform games than encourage companies to keep games out of gamers' hands based on the device they own.

I recommend you research before you make your arguments. GTA5 is coming to PC in 2014. It will also LOOK way better.

I said it's not on PC right now, not that it won't come. 
Read my post before making a stupid one.

I see. During the whole console generation there are always some exclusives that make it worth buying, for me, in addition to the PC :) Another thing, as rahuldey85 said below: some genres, like fighting games, are rare on PC.

You know what the difference is between a PC and a console? Software. That's all there is to it. If these consoles weren't locked down, you could literally install Windows on them right now. Or Linux. Or even Mac OS X if you wanted to. When a company says they will make their game a 'console exclusive', they're really saying they're going to put a special bit of DRM inside it so that you CAN'T run it on a PC, rather than 'focus on building it for a console'. You can do EVERYTHING, every last thing you can do on a console, on a PC. Gamepad? Just hook up your Xbox controller to your PC; it works fine. Want splitscreen? Hook up two of them. Consoles are a market of cartels and contracts between various companies that make a limited, normal PC look special in order to sell games at a higher price, for a device that will get replaced in its entirety, including ALL games written for it, in a matter of years. You know what happened to the games that were written in 1994 for the PC? I can still play them today, on my PC, same disk, same everything. Because PCs don't get deprecated and forgotten; they get upgraded. That's why I boycott consoles. I hate the philosophy behind them and everything they stand for. If I buy a device, I want to own it.

Why would I waste my money on a console when I have a more powerful device?

Except this isn't even remotely true. If you want identical hardware, you're looking at around $550-$600 to match the PS4. 
Of course, that doesn't overly matter, since optimization makes that more like $700-$800. My sincere advice is that you flush $400-500 down the toilet and pretend you bought a console.

"As far as performance is concerned, this could well end up a tie, as the Xbox One should be able to access data more quickly, while the PS4 can stream sustained data far more effectively."

The XB1's ESRAM basically compensates for its use of DDR3 RAM, which has 68GB/s of bandwidth; the ESRAM has 102GB/s of bandwidth. The PS4's GDDR5 has 176GB/s of bandwidth. What you have is a situation where you have to dedicate resources to get the maximum bandwidth out of the XB1, as you are working with a slow 8GB of DDR3 and a faster 32MB of ESRAM. The PS4 has one very fast pool of 8GB of GDDR5. The trade-off is cost: the RAM costs $28 more in the PS4, though the ESRAM makes the weaker XB1 APU cost $10 more, so a delta of $18. One thing not mentioned is the APUs' ROPs: 32 for the PS4 and just 16 for the XB1. AMD puts 32 ROPs on all their 1080p-targeted PC GPUs, meaning that the XB1 really wasn't designed for 1080p to begin with. From a spec perspective there really is no competition in regards to performance; the PS4 is a good deal ahead, and it's shown that in real-world examples on multiplatform games. The XB1, on the other hand, is clearly focused on Kinect and cable-TV pass-through, which it feels is worth $100 more.

Access latency != bandwidth.

Who's talking about latency? The issue here is bandwidth; it's the whole point of having that extra ESRAM. It doesn't matter, though; there is a reason why any mid-to-high-end graphics card uses GDDR5 right now. 32MB of eSRAM is useless for most graphics applications. It can't even fit a 1080p image, much less more than a handful of 1080p textures at a time. It can't make up for 8GB of GDDR5. Ever. It helps, but minimally.

Wait a year for the next cards to emerge and the first wave of DX 11.2 games to be churned out. 
You'll see games using only 500-700MB of vRAM on those cards but producing results similar to 2-3GB engine/card setups. If you want to brute-force huge textures back and forth in vRAM, you might need loads of space as well as high bandwidth. If you use 1/10 the size per texture, you need not only 1/10 the vRAM but technically only 1/10 the bandwidth (though some processes, like advanced AA, do benefit from higher bandwidth). Bandwidth means nothing if you optimize your data systems.

DX 11.2 won't be used until Microsoft takes away the Windows 8.1 exclusivity. 90% of PC gamers are still using Windows 7. Why the hell would developers code for DX 11.2? They won't. I know Microsoft is trying to force everyone onto Windows 8, but this will backfire on them. DX 11.2 won't be used for the foreseeable future. Also, you're referring to tiling, and your statement is complete and utter bullshit. No offense, but it is. Tiling only works when you're dealing with non-changing scenarios. It also has ridiculous pop-in. It's practically useless for any games other than flight sims (which is why they used a flight sim to demo it). Bandwidth still means plenty. DX 11.2 won't change that. Pull facts out of sources and not your ass.

According to Steam, 64% of gamers use Windows 8, and there's no reason not to upgrade to 8.1 since it's free. I can't help that you have an infantile attachment to an older OS. When DirectX 11 came out exclusively for Windows 7, most gamers made the switch from XP as well. According to you, 154% of gamers use something newer than XP, so by your own goofy logic you prove that (more than!) every single gamer grew out of that "XP is still better" phase once a viable reason presented itself. Tiled resources work perfectly fine in-studio. The only time we ran into pop-in was when running in software on OpenGL, as I explained. DirectX now allows us to render everything in a scene, including dynamically generated textures and shader systems, with no humanly perceptible pop-in. No offense, but your entire statement is bullshit when you haven't put your hands on the tech yourself and I have. Bandwidth does mean plenty; many changes in DX 11.2 just make it mean less-plenty.

The bandwidth argument reminds me of uninformed gamers building PCs in the early naughts. The common phrase of the time was "More RAM is always better." The first part ever recommended for upgrade when a game ran slow was more RAM. At the time no game used more than 2GB of RAM, and your OS barely used half that, yet everyone exclaimed that 8 or 16GB was absolutely required. The truth? You only need as much as you actually use. In the mentioned case, 4GB was more than enough, and you could spend your savings on faster RAM rather than more of it. Bandwidth is the same. Yes, you need a decent amount. No, you don't need more and more. Rather, you need to code such that you don't fall into an exponential trap and bottleneck yourself. Optimization unclogs the bandwidth bottlenecks and makes any unused bandwidth like unused RAM storage: dead weight. Yes, you may need to use it in the future, but you also may find a way to use even less by that point. Effectively, the options when you need more are to brute-force it with faster pipelines (monetarily costly) or to bunker down and re-code more efficiently (time-wise costly). So does more bandwidth mean anything? No if you optimize, and yes if you're lazy. It's really simple.

Who's talking latency? I am. The author. I'm talking about latency. Caches are used to mitigate the impact of lower-bandwidth memory subsystems. A memory controller capable of speculative prefetching can have information from memory queued up and ready to go while the CPU is yanking data out of cache. In gaming and on the desktop, latency matters more than bandwidth. Graphics is a special case (GPUs have typically relied on high-bandwidth, high-latency connections), but clearly that 32MB of EDRAM serves as a buffer for both GPU and CPU. The Xbox One is designed to hide its lower-bandwidth DDR3 by relying on a large, high-speed, low-latency cache. The PS4 is designed to maximize bandwidth by using a unified GDDR5 memory pool for CPU and GPU. The PS4's approach makes it simpler to program (a point we've addressed in other stories), but there's nothing intrinsically broken about the Xbox One's choice. Whether or not it turns out to have been the right choice? We will have to wait and see.

You've got to be kidding me. It's amazing someone with no technical knowledge is writing this article. There's a myth that every new memory format brings with it a latency penalty (that GDDR5 has more latency than DDR3). The myth is perpetuated by the method upon which latency labels are based: clock cycles. Consider: latency is a function of clock speed where RAM is concerned. At 667MHz, DDR2 has LOWER latency than DDR3 (7ns vs 10.5ns), and DDR at 333MHz has a latency of 6ns. If latency were the only issue, plain DDR would still be used. This is because cycle time is the inverse of clock speed (half the data rate). In real-world applications GDDR5 has latency equivalent to DDR3; it's only when you measure it as the inverse of clock speed that you get skewed latency figures. 
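The cycle-time arithmetic in that comment can be sketched in a few lines. This is a minimal illustration of the point being argued, not console documentation: the CAS (CL) values below are assumed for illustration, while the command clocks (1066MHz for the XB1's DDR3-2133, 1375MHz for the PS4's GDDR5-5500) come from the thread itself.

```python
# Absolute latency in nanoseconds = CAS cycles * cycle time,
# where cycle time is the inverse of the command clock.
# A newer format with a higher CAS count but a faster clock can
# therefore land in the same real-world latency range.

def cas_latency_ns(cas_cycles, clock_mhz):
    """First-word access latency in ns; cycle time = 1000/clock_mhz ns."""
    return cas_cycles * 1000.0 / clock_mhz

# XB1 DDR3: 1066 MHz command clock (2133 MT/s effective), CL14 assumed
ddr3 = cas_latency_ns(14, 1066)
# PS4 GDDR5: 1375 MHz command clock (5500 MT/s effective), CL15 assumed
gddr5 = cas_latency_ns(15, 1375)

print(f"DDR3-2133  @ CL14: {ddr3:.1f} ns")
print(f"GDDR5-5500 @ CL15: {gddr5:.1f} ns")
```

With these assumed CL values the two come out within a few nanoseconds of each other, which is the commenter's point: counting cycles alone, rather than cycles times cycle time, is what makes GDDR5 look slow.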
Also, the XB1 uses ESRAM, NOT EDRAM. This is a HUGE difference. If the XB1 APU were using EDRAM, it would have a much smaller footprint. For the XB1, the ESRAM itself isn't faster than the GDDR5 RAM. It's slower; even when you add ESRAM + DDR3, you're still getting sub-parity in speeds. Worse, you only have 32MB of ESRAM, and now you're juggling bandwidth around and have to move your data between multiple slow pools of memory. With the PS4 you have GDDR5 that can be accessed by both the CPU and GPU; more importantly, with HSA, CPU and GPU memory resources can be accessed by each other. Really, the proof is in the pudding: look at the multiplatform games. The XB1 is struggling by a factor of 50-100%. Those are real-world examples, not talk on paper.

You're right about the ESRAM/EDRAM bit. Need to fix that. That's relative RAM latencies measured in nanoseconds. The idea that each generation of RAM has higher real-world latency isn't a myth; it's how things work. (Apologies for the scale on that image.) Here's why I maintain what I've said about the latency issue. 1) Cerny says GDDR5 latency isn't "particularly" higher on the CPU side. For the GPU, he notes that GPUs are designed to be latency-tolerant. He doesn't say the two are equivalent, or deny that the GDDR5 latency is high on the GPU. This directly relates to 2): the memory controller AMD uses for all of its GCN cards is *very* high latency. This article doesn't show a GCN card, but let me promise you, I've specifically run these tests and pulled the figures. They *don't* get better. Even the R9 290X has extremely high latency compared to NV hardware, Intel hardware, or even an APU. Now, it's not clear what memory controller Sony is using for the PS4, but presumably they modified the GPU memory controller. Unless they seriously fixed the latency issue, it's going to bite. Finally: on-die cache should offer markedly better latency than off-die GDDR5. 
If the GPU has direct access to that 32MB of cache, it *better* be faster than accessing off-die RAM; if it isn't, MS really screwed something up. 3) The only reason for MS to implement a giant cache on-die is if they thought it would give them some sort of advantage. That's common sense. They didn't build it because they thought it would look pretty. Absent the kind of benchmark data I doubt we'll ever be able to see, it stands to reason that the Xbox One should offer better access latencies, while the PS4 obviously has an advantage in sheer bandwidth. The PS4 is also going to be easier to program, another mark in its favor.

We know why Microsoft went with the DDR3+ESRAM route: because they needed 8GB of memory to run three operating systems, the snap function, HDMI-in pass-through, and Kinect. Sony's original plan, according to Digital Foundry, was to go with 4GB of GDDR5 RAM. What happened was that GDDR5 prices dropped enough to push in 8GB. Still, the GDDR5 RAM in the PS4 costs $28 more than the DDR3 in the XB1, according to the iSuppli/IHS breakdown. The XB1's bill of materials being already high with the Kinect, going with cheaper DDR3 was desirable. So the reason MS went with DDR3 is COST, not latency. As far as latency itself and its impact on performance, we see several examples of DDR3 and GDDR5 performance differences, and they are huge. The problem with having multiple pools of memory is that data needs to be read, written, and refreshed. The bandwidth itself is 102GB/s, meaning that the ESRAM is only acting as a scratchpad on the XB1. The puny 32MB size of the ESRAM is also a problem; if you look at Intel's Haswell setup, it uses DDR3 plus 128MB of ESRAM. And outside of budget graphics solutions, every other performance solution is moving to GDDR5, which is why AMD's CPU/GPU Kaveri is moving to GDDR5 as well. So AMD too is moving to GDDR5 for their future APUs. 
You can't compare against Haswell's L4 cache; that pool of EDRAM *is* an L4, not just a buffer for the graphics card. I don't think there's any evidence to suggest the 32MB buffer on the Xbox One is analogous, though I agree it's not clear exactly what MS uses it for (and yes, they try to fudge the bandwidth). I'm not convinced it's actually a contiguous block of cache; they describe it as 4x8MB in their own documentation, which means hitting max bandwidth may involve striping data across all four cache blocks simultaneously. Point of order: I've seen Kaveri hardware. There's no evidence of Kaveri "moving" to GDDR5. AMD's FM2+ platform for Kaveri is still a DDR3 platform, and there's no GDDR5 on-die with the APU.

The reason I bring up Haswell is that Intel knew they needed 128MB of on-die cache. 32MB is insufficient, and its uses become very limited. Think about the memory required for a 1080p frame: 1920 × 1080 = 2,073,600 pixels, and (2,073,600 × 24)/8 = 6.22MB with no AA. For 60 frames, that's 373MB of memory (again, without AA): data that you have to be copying in and out of ESRAM. This small ESRAM size is a large reason (along with the 16 ROPs) that a lot of games are running at 720p. Haswell uses 128MB of cache because Intel figured that was the bare minimum for proper 1080p gaming. Yes, Haswell is EDRAM. As I said above, EDRAM is much smaller than ESRAM: EDRAM has 3x the space savings of ESRAM. My point was that MS should have gone with larger EDRAM rather than smaller ESRAM for the XB1. The 360 had 10MB of EDRAM, if you remember. In terms of the Kaveri APU, GDDR5 will NOT be, and never will be, "on-die". GDDR5 (or DDR3) is never integrated onto the die itself. There was initial rumbling of an FM3 Kaveri APU moving to DDR4/GDDR5, which would make sense. Again, DDR3 is a huge bottleneck for APU performance (this is the reason MS added the ESRAM). AMD's APUs don't have ESRAM like the XB1 and are more similar in design to the PS4. GDDR5/DDR4 is clearly the way to go. 
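The 1080p frame arithmetic quoted in that exchange is easy to verify. A minimal sketch of the same math, using the thread's own figures (24 bits per pixel, no AA, decimal megabytes):

```python
# Reproduces the framebuffer arithmetic from the comment above:
# one uncompressed 24-bit 1080p frame, and the traffic for 60 such frames.

def frame_bytes(width, height, bits_per_pixel):
    """Size of one uncompressed frame in bytes."""
    return width * height * bits_per_pixel // 8

one_frame_mb = frame_bytes(1920, 1080, 24) / 1e6  # decimal MB, as in the comment
per_second_mb = one_frame_mb * 60                 # 60 fps worth of frames

print(f"One 1080p/24-bit frame: {one_frame_mb:.2f} MB")  # ~6.22 MB
print(f"60 frames: {per_second_mb:.1f} MB")              # ~373 MB
```

So a single frame does fit in the 32MB ESRAM with room to spare; the commenter's point is about the sustained traffic of shuttling hundreds of megabytes of frame data per second in and out of that small pool.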
As far as Kaveri goes, I can tell you now, having seen the shipping hardware: Kaveri is FM2+ on DDR3-2133. AMD is still mulling the DDR3-2400 question; I think we'll see it shipping in half-supported mode. AMD has not unveiled any plans to move to DDR4, or even shown a CPU/chipset roadmap out past December 31, 2014. DDR3 is a huge bottleneck for integrated GPU performance, but that's *always* been true. The 68GB/s of bandwidth on the Xbox One is puny compared to the PS4, but it's 2x what Kaveri's GPU will have. AFAIK, the only mainstream desktop part on DDR4 next year is Haswell-E. I think even Intel's mobile chips are staying on DDR3 through the end of 2014.

They couldn't go with EDRAM because of GlobalFoundries' inability to fabricate EDRAM on the 28nm node. Nintendo was able to because Renesas Electronics could fabricate on the 40nm node. Could TSMC? Perhaps, but there may have been a number of reasons why TSMC was not chosen. @Joel: it's not just about what technology would be optimal; it's about what your business partners are able to provide.

You seem to be confusing a lot of different information. The image you cited shows memory cell cycle time in ns; there are very different types of latency where RAM is concerned: CAS, RL, RL-tRCD, etc. Latency in RAM is not a singular value. Also, the GDDR5 memory in the PS4 runs at 1375MHz (5500MHz effective), while the DDR3 in the XB1 runs at 1066MHz (2133MHz effective). So yes, the GDDR5 is "running at a high enough clock speed over DDR3", BUT increasing clock speed actually INCREASES latency. And comparing DDR3 and GDDR5 latency based on clock speed is the greater problem you're encountering: they are fundamentally different types of memory. The image you cite exactly supports an argument I'm making. In it you see DDR4 performance relative to DDR3. The thing is, DDR4 actually doubles the latency over DDR3. Here is an article about DDR4 RAM's latency. Another good article: basically, what you have is the fact that RAM is getting more complex. 
The reason you see RAM performance skyrocketing as latency increases is that the controller handles that complexity. Latency in itself doesn't have an impact on performance; it's a red herring, part of the larger design of the memory itself. The reality is that latency isn't a major factor in the performance difference between the XB1 and PS4. If MS could have afforded it, they would have gone with GDDR5 over DDR3+ESRAM, but they needed to keep cost down due to the inclusion of the Kinect. The real performance gap is the lack of GPU compute cores and half the ROPs. No amount of latency is going to make up for real hardware deficiencies.

Right. I've written up the DDR4 latency problem, and why you can't expect DDR4-2133 to suddenly blow DDR3-2133 away (or even necessarily equal it). It sounds like you've mistaken a point of mine, however. Looking back, I could've worded this better. When I said that performance could end in a draw, I was referring specifically to memory subsystem performance, not total system performance. Even if the Xbox One has a latency advantage (and I still think it probably does), that doesn't mean the Xbox One is the faster (or even 'just-as-fast') platform. Even if there's an offset between those two characteristics, the PS4 also has (though I believe this is unofficial) more ROPs, more TMUs, and more total bandwidth. GPUs generally *aren't* particularly latency-sensitive. The PS4, in my opinion, has a significant *overall* hardware advantage, even if the Xbox One's ESRAM cache helps balance its reduced bandwidth. As you've said, a 32MB ESRAM cache doesn't magically fix reduced ROPs or fewer GPU cores.

It should be noted that, according to Mark Cerny, Sony at one point had the PS4 designed with GDDR5 + 1088GB/s EDRAM. They dropped it due to the added complexity. As Sony learned with the PS3, non-first-party developers aren't going to invest in developing for exotic hardware. As for DDR4, these CPUs/GPUs are going to have to move to DDR4 and higher-latency memory. 
Can't stay on DDR3 forever. However, latency as a whole is not a singular factor. There are latencies between each part of the memory module; it's the design that overcomes them. The same thing happened from DDR2 to DDR3, as latency went up then too. There is a good video that discusses this. In regards to the PS4, I personally think that when DDR4 prices are good and speeds are fast, the PS4 could move from GDDR5 to DDR4, as long as the technical speed requirements are met. Down the road this could be a huge advantage in terms of price, since last-gen memory prices go up as it gets phased out. DDR3's days are numbered.

I would be very surprised to see Sony change something so fundamental about the core architecture post-launch. Heck, I think it's just barely possible that they might tweak clock speeds or enable the compute units that are currently disabled at some future date. Overhauling the memory uarch would be a much bigger deal.

I think we've seen the end of exotic console hardware from Sony. The Vita is basically a common ARM chip with a PowerVR SGX, and the PS4 is an AMD x86 APU without any fancy frills like ESRAM. I would expect a PS5 to follow that x86 route and have more logical continuity in its hardware, like Apple's iPhone 3, 4, 5, etc. There doesn't seem to be any reasonable way that Sony will revert to odd custom hardware like Cell, the Emotion Engine, etc. For this reason, I think Sony has left a good deal of fungibility in its components and supply chain. I don't think the APU will change, or any part of the core logic, but a bigger, faster HDD, and possibly faster RAM, could, provided all the minimum specs of GDDR5 are met. Games wouldn't run faster; it would merely be a substitute years down the line when GDDR5 is no longer mass-produced. It can get difficult and costly to source old memory, so it's really just a supply-chain benefit.

I think you're right about the lack of custom hardware. 
In the case of the Xbox One in particular (since it runs a version of Windows that's virtually guaranteed to be based on the same W8 kernel), the difference is now entirely software-based. On the Xbox, that's an even thinner gap. But Cell was a really odd duck. Prescient in some ways: you could make a strong case that Cell's SPEs were a forerunner of programmable GPUs (and I believe the Frostbite engine uses them for graphics processing). But overall, Cell was always tricky to program. Moreover, for whatever reason, all the money Sony sank into its development didn't give it a huge leg up over the Xbox last cycle.

Apparently my reply to you got eaten. Dammit. OK, long story short, here's why I'm reasonably certain DDR3 offers improvements over GDDR5 in the latency department. 1) The RAM latency hit isn't a myth. It's quite real. See here: unfortunately that's the largest size I can find, but it's far from mythical. Mark Cerny has said that the PS4's CPU memory latency isn't particularly higher, and that GPUs are "latency tolerant." That's not the same as saying there's no difference. 2) AMD's GCN memory controller has really high latencies compared to NV, far higher than what we see even on an APU. Even on Hawaii, memory latencies are 200-500ns, compared to half that or less for NV cards. 3) MS didn't build a 32MB ESRAM cache (I'll fix the EDRAM reference) because they thought it looked pretty. It's common knowledge that it compensates for the relatively limited bandwidth of the DDR3-2133 controller. When I look at the Xbox One, I see an APU backed up by a fast, low-latency cache, followed by a lower-latency DDR3-2133 memory bus with relatively limited bandwidth compared to the competition. When I look at the PS4, I see a higher-latency, higher-bandwidth design that's simpler and likely easier to optimize. What's the final real-world gap? Don't know. Probably never will. 
Sure, theoretically the PS4 GPU looks to be better, but where I'm not so sure is how bus contention between several units will affect real performance. Framebuffer DMA, shaders, CPU, audio, and IO all access the GDDR5 randomly, and each access needs to start a new memory burst sequence. Who outside AMD, and maybe MS/Sony, can really say how much stalling this causes in the GPU or CPU? At the moment I "assume" that the PS4 can reach higher FPS but that the XB1 might be better at generating stable FPS.

One thing that is consistent about the Xbox One is that cross-platform games are running at 720p @ 30fps, yet the PS4 is running practically all of these games at 1080p @ 60fps. And where some instability was observed, such as with Call of Duty, it was later learned that the game's graphics engine was at fault, demanding a lower frame rate than what the PS4 was attempting to generate. In other words, the PS4 was overpowering the game's modest requirements. Forza 5 on the Xbox One has given some gamers hope that 1080p @ 60fps is achievable for most games in the future, but the problem is that Forza 5 has so much static imagery, like buildings, roads, cars, and embankments, that not much graphics power is needed. So this thinking about the Xbox One is as bad as imagining that creating new worlds in the cloud will somehow enhance gameplay. Cross-platform games require more graphics-intensive processing, such as dynamically generating ocean waves, clouds, explosions, battles, and scenes with airplanes flying between clouds; and it is in these instances that the Xbox One stumbles. Developers are already showing that the simplicity of developing games for the PS4 is allowing them to showcase their games at their best, and that they will not take the unfair decision to limit a game's performance on the PS4 so as not to make the Xbox One look bad. 
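The 720p-versus-1080p gap argued over in these comments comes down to raw pixel counts, and the arithmetic also explains the "125% more pixels" figure quoted later in the thread:

```python
# Raw pixel counts behind the 720p vs 1080p debate.

def pixels(width, height):
    return width * height

p1080 = pixels(1920, 1080)  # 2,073,600 pixels
p720 = pixels(1280, 720)    #   921,600 pixels

ratio = p1080 / p720
print(f"1080p/720p pixel ratio: {ratio:.2f}x")         # 2.25x
print(f"Extra pixels: {(ratio - 1) * 100:.0f}% more")  # 125% more
```

Rendering at 1080p thus pushes 2.25x the pixels per frame of 720p, which is why a resolution jump at the same frame rate is such a heavy load on fill rate (ROPs) and bandwidth.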
I think eventually the XB1 will hit 900-1080p more frequently than it does now, but the 16-ROP situation will probably limit them more than the lack of compute units. As the Digital Foundry breakdown went into, Forza 5 really isn't much of a feat, given that it has baked-in lighting and cut-out audiences. Clear sacrifices were made to get 1080p60. As you've said, the biggest issue I see is the difficulty of programming for the XB1. This is the same mistake Sony made with the PS3: exotic hardware is a disadvantage. If the XB1 ends up with a lower installed base than the PS4, I can't imagine many developers will put in the resources to get the most out of the system. Even the expensive and late PS3 did well in continental Europe and Asia; if the XB1 loses those markets plus North America and the UK, it's going to be an uphill battle to get developers to devote larger resources to making a game look good on a system with a smaller installed base.

Ghosts is 720p/60 on the X1, but on the PS4 it's 1080p with a non-stable framerate of 60 that drops as low as the 30s.

That was the early report, but subsequent reports have revealed that Call of Duty's graphics engine is old and that the drops are due to developer oversight. So the game's old graphics engine is being updated to better interface with the PlayStation 4's graphics processor. Not trying to be condescending in any way, since we are all prone to making mistakes, but one thing I try to do when given a piece of information is to use common sense. The fact that the graphics hardware in the PS4 is 50% more powerful (i.e. faster) than the Xbox One's, and the fact that no other game has experienced instability while running on the PlayStation 4, is a strong indication that the problem is most likely not with the PS4. And the facts have shown that the problem is indeed with the Call of Duty game. Thanks for bringing this point up, because most gamers may not be aware of the latest news on this issue.

That's not true. 
The framerate does drop to the high 40s at some points, but the major reason you see a non-stable framerate is that the PS4 is often actually rendering at higher than 60fps. Digital Foundry addressed this already. There's supposed to be a patch that fixes it. We'll see what happens. Ghosts is poorly optimized on both systems. Battlefield 4, on the other hand, is 720p on the Xbox One and 900p on the PS4. It also runs at a more stable framerate on the PS4 (though both consoles have dips). Either way you look at it, the PS4 is obviously doing better right now.

Infinity Ward outright admitted that they didn't properly optimize the XB1 version. The PS4 version simply required less optimization due to its more standardized system architecture. The PS4 edition has no issues related to 'overpowering' the engine. It's a legitimate issue in the way they accessed the dynamic vertical sync function of the GPU: when the game needed vsync, it didn't trigger properly, and it similarly had a large delay in disengaging. There's apparently a patch in the works to fix that.

That doesn't explain the differences with Battlefield 4, or why the difference is so large between the Xbox One and PS4 with COD Ghosts. The PS4 has 125% more pixels. That's a huge difference. Even if they completely ignored the eSRAM, the difference shouldn't be that large on similar hardware. I'll agree that the Xbox One is capable of more, but their API is a mess right now. 
They need to get their IDE into a better state, and then we'll see a bit more parity. The PS4 will still beat them out; that won't ever change, as it has more powerful hardware, period. I think going forward we'll see more of a difference in effects rather than resolution, though. Effects make more of a visual difference anyway.

More power is useless dead weight if your application (or game) doesn't require it. Optimization, again. Battlefield 4 on Xbox One looks almost picture-perfect next to the PC edition, just running at 720p. The PS4 edition has mismatched textures, blurred surfaces, simplified physics (namely explosion particle simulations and 'flickering' fire sprites), less-detailed surface shaders (displacement reduced to normal mapping alone), lower-quality ambient occlusion (less accurate, with depth-independent scale), and the list goes on. The equivalent is something like a mix of medium and high settings on PC, but with the texture mismatching still unexplained. So is there an explanation for how they got more resolution out of the PS4 edition? Yeah: they cranked down the settings. The art style carries the game most of the way, but a discerning eye for detail (likely a PC player anyway) would have good cause to complain. COD was not fully optimized on Xbox One in time for release, as admitted directly in an Infinity Ward press release. Since both editions look largely the same apart from resolution, I can only speculate that the Xbox One version could indeed have been able to render at a higher resolution if given the time to optimize properly. Does Microsoft need to improve things on their end? Yes. That's my whole point. Optimization, be it by the hardware developer or the software developer, is always a major factor in console output capabilities. Raw power, on the other hand, has proven to be at best a quick bandaid for a long-term issue rather than a full solution, as the PS3 best represents. 
Even with more potential power to tap, it still took better tools and optimization before the PS3 started regaining market share via proper exclusives. Sony proved that more power and no support can lead to years (three!) of stagnant development while weaker competition squeezes more bang for the buck out of a lower power quotient. Completely wrong, stupid, and delusional. The Order is now easily the best-looking next-gen game, and PS4 multiplats consistently run better by a significant margin. PS4 hardware advantages: +6 CUs, +560 GFLOPS, +16 ROPs, +8 ACEs/CQs, and faster unified memory. The PS4 OS may also have less overhead or fewer reserves. Not to mention the huge amount of processing power needed on PC to run it at 1080p/60 FPS due to the poorly programmed engine. Yeah, that really was surprising. Assassin’s Creed IV also ran horribly unoptimized on PC (AC3-level settings ran well, but the new filters and effects could single-handedly cut my GTX 770 to half the framerate). This isn’t too unexpected, though. These games are running preliminary engines for next-gen; it’s like they were ported to the new hardware (or PC) after being made for the older consoles. In six months we’ll start seeing more engines dedicated to DX 11.1/11.2 and the new consoles. It’ll get better in time.
Just look at Just Cause versus Just Cause 2. One ran amazingly slowly, the other was rather well optimized, and both used the same base engine. :/ Correct. The X1 is actually able to access both the eSRAM (150 GB/s real-world) and DDR3 (60 GB/s real-world) at the same time, not one at a time like you say, so its total bandwidth is both of them combined. This provides more than enough bandwidth for the X1, which only requires about 120 GB/s. Also, the low latency of the DDR3 gives the CPU better performance than GDDR5, and the CPU is clocked higher, both of which reduce the CPU bottleneck with the GPU (also clocked higher). Then there’s the audio block (512 channels vs. 200 on the PS4, which offloads much of it to its slower CPU) and the move engines (4 processors; the PS4 doesn’t have any), which eliminate almost all of the overhead from data transfers between the CPU, GPU, and memory, freeing up clock cycles for more important things. Bullshit. The Xbox has slower memory bandwidth in real-world game performance. Idiot napkin math like “add the numbers” is useless. Well, the Steam box is better!! Why would I waste my money on a console when I have a more powerful device? PlayStation fanboys must be so used to getting egg on their faces over the last 6 years, and it’s about to happen again. lol. It’s hilarious when people who don’t understand the article at all try to play fanboy in the comment section. I totally agree, and reading your post above, you don’t have a clue. It’s like saying all 8-megapixel cameras are better than all 7-megapixel cameras; that just isn’t the case. Your response to my logical argument above is a fallacy of false analogy, comparing resolution to memory bandwidth? Try again.
Only this time, either address my entire argument with a logical response, or don’t bother. We know the specs of the consoles. The PS4 has better specs. Unless you have proof that the Xbox One runs faster in benchmarks (and it won’t, since the PS4 has the more developed API right now), there’s no logical reason why the PS4 won’t outperform the Xbox One. You’re dealing with mights; I’m dealing with the most likely scenario, especially since third-party games seem to agree with me. Keep trying though. Your argument is based on picking numbers Top Trumps-style to suit the PS4. You’re forgetting that the APU for the Xbox One is more expensive than the PS4’s; why is that, we wonder? Maybe it’s better designed for developers and DirectX. And where’s your benchmark proof?? LOL. It’s more expensive, therefore it must be better! Flawless logic! Logical argument?! HA! Sony fanboys like you don’t even know the meaning of that!! And your HATER LEVELS ARE OVER 9,000. Ugh, more ridiculous equality articles. There is no tie when it comes to RAM. GDDR5 is MUCH better for most graphical uses. Sure, the Xbox One will be able to multitask better, but that’s not needed when you’re playing one game. The PS4 will not only stream textures and reflections faster, it will stream more of them, or higher-quality ones, per second. It makes a huge difference. No amount of make-believe will change that. Stop pretending they’re equal. There’s a reason why GDDR5 is used in graphics cards. It’s not because eSRAM is just as good. No. Seriously. It’s not. The PS4 has a better GPU. That just exists. You can’t make the GPUs equal each other. They don’t. The Xbox One has more overhead for Kinect.
That gap will narrow over the generation, but it will always be there. There’s nothing equal here. The PS4 is better for gaming, end of story. I have no idea why people pretend otherwise. Just grow some balls and admit it. Stop placating Microsoft here. Maybe apps or TV will take off and they’ll do well there, but as far as games go, they’re at a hardware disadvantage. That’s the cold, hard reality. Also, SSDs won’t become the norm anytime soon. Most consumers have no idea what SSD means, but they do understand hard drive sizes. I’d also rather have the extra storage capacity over faster load times, especially at a quarter of the price. As for your “there is no tie when it comes to RAM” argument, you obviously didn’t read the article thoroughly. The XB1 GPU design was created to take advantage of DDR3’s lower latency, which is why they intentionally didn’t opt for GDDR5, whereas Sony went for the full-throttle (albeit higher-latency) setup and designed their rig around that. Due to those specific architecture choices, they are roughly equivalent. The One is faster off the line (it has an extra 32 MB of eSRAM over the PS4, and a higher clock speed: 1.75 GHz to Sony’s 1.6 GHz), and the other has a higher top speed, to use car parlance. As for the custom GPU, again, your argument is flawed. The GPUs are both custom AMD Radeon setups. And while Sony’s custom chip has a slight processing advantage (more compute units), the MS design ensures the “low latency” tasks of the XB1 can be offloaded to Azure; and since the system’s architecture was DESIGNED to take advantage of latency, it effectively evens out in the end. Whatever slight disadvantage MS has in this product is far slighter than it would generally appear. I dare say the people at Chipworks, ExtremeTech, and others are more knowledgeable about the subject than you. Why not just enjoy gaming and agree they are fairly similar and on par, instead of the whole “mine is better” tantrum?
That said, we agree on your point about solid-state drives. Sony solved the “latency” issue with an added shared memory controller, therefore the latency has been reduced to almost nothing; just wait another year or two and the PS4 will be OP against the Xbone, sending M$ to the hall of shame. LOL. Sony fanboy! What weak garbage!! Metal Gear Solid 720p to 1080p, anyone? LOL. Your combination of ignorance, arrogance, and sheer delusion suggests you’re mentally ill, much like misterxmedia and astrograd. I’m sorry if the truth upsets you. Delusional, demented, completely wrong, and mentally ill. According to Thurrott, the PS4 has periods of slowdown and gets choppy. MSFT did their homework; Sony failed. LOL. Delusional idiot. LMAO… 50% more shaders and twice the bandwidth = same performance? In what universe do you live? No voodoo black magic eSRAM or wishful thinking could EVER make up for this huge gap, period. Besides, eSRAM is a pain devs must cope with, whereas the PS4 can enjoy near-straight ports from PC and vice versa thanks to practically identical architecture: x86 code, shaders, and GDDR5. The very simple and undeniable argument is this: when it comes to mid- to high-end dedicated GPUs for gaming on PC, how many of them use DDR3 or an onboard eSRAM cache? The answer is a resounding ZERO. The XB1 is a jack of all trades, master of none, especially when it comes to pure gaming, which is the essence of what defines a video game console. The other easy argument is that the XB1 must already compromise on resolution, as low as 720p on current titles, just to keep 40-60 FPS, precisely because it lacks the required horsepower. What will it be 2-3 years down the road when BF5 and GTA VI come out? And in 4-5 years? 720p at low/medium quality settings on every game, that’s what. Just buy a PS4 now and wait 2-3 years to get your XB1, without the NSA-approved spy cam, for $99 in the discount bin at Walmart to play the handful of REAL exclusives that will never be published on the PS4.
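The earlier “just add eSRAM and DDR3” versus “napkin math is useless” exchange can be made concrete with a toy model. The 150 GB/s and 60 GB/s figures are the commenter’s claimed real-world numbers, and the linear blend below is purely illustrative: the point is only that traffic falling outside the 32 MB eSRAM sees the slower pool, so the naive 210 GB/s sum is an upper bound, not a working figure.

```python
# Toy model (illustrative only): why summing eSRAM and DDR3 peak bandwidth
# overstates what a real workload sees. Figures are the commenter's claims.
ESRAM_GBPS = 150.0  # claimed real-world eSRAM bandwidth
DDR3_GBPS = 60.0    # claimed real-world DDR3 bandwidth

def effective_bandwidth(esram_hit_fraction):
    """Blend the two pools by how much traffic the 32 MB eSRAM captures."""
    return esram_hit_fraction * ESRAM_GBPS + (1 - esram_hit_fraction) * DDR3_GBPS

for hit in (0.25, 0.50, 0.75):
    print(f"{hit:.0%} of traffic in eSRAM -> ~{effective_bandwidth(hit):.1f} GB/s")
# Even a 75% hit rate yields 127.5 GB/s, well short of the naive 210 GB/s sum.
```

How close a real workload gets to the sum depends entirely on how well the render targets fit in 32 MB, which is exactly the scheduling pain the thread is arguing about.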
I don’t think that accessing some data a bit quicker will mean squat once devs start using GPGPU on the PS4. The lower CU count on the XB1 will prove to be a hindrance. The Xbox One can do GPGPU as well, except it has more bandwidth available to do so: the PS4 can do it at 19 GB/s whereas the Xbox One can do it at 30 GB/s. This was an interesting experiment; it seems that the PS4 is ahead now :) feel free to check out my site http://chromecastdongle.com/ Folks, I am not tech-savvy enough to comment on the specific hardware comparisons, but I will say this: MS did not invest hundreds of millions of dollars to lose out to Sony over a few bucks in hardware choices. The XB1 is a very complex machine which will evolve into what MS intended it to be. What that is may be known to some insiders, but to the majority of consumers it will become clear when the gaming software becomes refined to the true capabilities of the hardware/software. My guess is that the XB1 in 1-3 years’ time will surpass the PS4 in the gaming software sphere due to the XB1’s hardware/software capabilities. Sony’s hardware choices were partly intended to give the impression of a more modern and advanced console, due in part to MS appearing weaker. A stupid and wrong “MS couldn’t possibly have thrown gamers under the bus!” assumption. They did; deal with it. PS4 hardware advantages: +6 CUs, +560 GFLOPS, +16 ROPs, +8 ACEs/CQs, and faster unified memory. The PS4 OS may also have less overhead or fewer reserves. It’s not +6 CUs. The Xbox One has 14; 2 CUs are disabled. And you know the PS4 has 18 but is balanced for 14 as well, right? The PS4 actually has 20 CUs with 2 disabled, so 18 functional. That’s 18 vs. 12, or +6. The disabled CUs are turned off for yields and will never be enabled. All 18 CUs can be used for graphics; 14+4 is a myth. All you fanboys and ego-feeders are pathetic… Just buy a damn console and play some games (-_-) This article is still a massive load of BS.
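The “+6 CUs, +560 GFLOPS” line repeated through this thread can be sanity-checked from figures that were widely reported at the time: GCN GPUs have 64 shaders per CU doing 2 FLOPs per clock (fused multiply-add), the PS4 runs 18 CUs at 800 MHz, and the Xbox One runs 12 CUs at 853 MHz after its upclock. Those numbers come from outside this article, so treat the sketch as a cross-check, not a spec sheet:

```python
# Sanity check of the "+6 CUs, +560 GFLOPS" claim using widely reported
# GCN figures: 64 shaders per CU, 2 FLOPs per shader per clock (FMA).
def gflops(cus, clock_ghz, shaders_per_cu=64, flops_per_clock=2):
    return cus * shaders_per_cu * flops_per_clock * clock_ghz

ps4 = gflops(cus=18, clock_ghz=0.800)  # ~1843 GFLOPS
xb1 = gflops(cus=12, clock_ghz=0.853)  # ~1310 GFLOPS, post-upclock

print(f"PS4 {ps4:.0f} vs XB1 {xb1:.0f} GFLOPS, delta {ps4 - xb1:.0f}")
# Delta is ~533 GFLOPS against the 853 MHz upclock; against the original
# 800 MHz Xbox One clock it would be ~614, so "+560" sits between the two.
```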
The Xbox One has 14 CUs, but to balance their system they saw a bigger improvement from upclocking versus unlocking the 2 extra CUs. Sony has 18 CUs, but 4 are separate; the system is balanced for 14, and the other 4 are there if devs want to access them. 1+1 does not equal 2 in console architecture. The reason Xbox got more performance from an upclock vs. 2 more CUs is that it’s bottlenecked by its 16 ROPs, while the PS4 has 32 ROPs. That leaked slide is a suggestion, not proof that 4 CUs are unable to be used for rendering. Cerny denied 14+4 in an interview: “Mark Cerny: That comes from a leak and is not any form of formal evangelisation. The point is the hardware is intentionally not 100 per cent round. It has a little bit more ALU in it than it would if you were thinking strictly about graphics. As a result of that you have an opportunity, you could say an incentivisation, to use that ALU for GPGPU.” As we’re now discovering, what’s on paper doesn’t equate to actual performance, and it comes back down to the same old argument that the PS3 is more powerful than the Xbox 360… but games looked better on the latter… essentially Sony marketing BS! Comparisons of Thief on the Xbox One and PS4 showed that the latter had inferior anti-aliasing (it was actually really bad), which resulted in noticeable jaggies, and the PS4 also had issues loading textures, which made the game look very low-res. Revelations that Sony was throwing around the term 1080p to show Killzone Shadow Fall running in Full HD turned out to be a partial lie: the multiplayer section of the game actually runs at 960×1080 resolution… which is basically the same as 1280×720… keeping the 1080 bit meant they could maintain the lie.
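Killzone Shadow Fall’s multiplayer framebuffer figure is easy to put in numbers; whether 960×1080 counts as “basically 1280×720” is then a judgment call:

```python
# Comparing Killzone Shadow Fall's 960x1080 multiplayer framebuffer
# against standard 720p and 1080p pixel counts.
kz_mp = 960 * 1080   # 1,036,800 pixels
p720 = 1280 * 720    # 921,600 pixels
p1080 = 1920 * 1080  # 2,073,600 pixels

print(f"960x1080 vs 720p:  {kz_mp / p720:.3f}x")   # 1.125x, slightly above 720p
print(f"960x1080 vs 1080p: {kz_mp / p1080:.3f}x")  # exactly half of full 1080p
```

So the multiplayer buffer has about 12.5% more pixels than native 720p, but exactly half of native 1080p: it keeps all 1080 rows and halves the columns, which is why the “1080” in the output resolution could still be advertised.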
Twitch broadcast resolution also shows the PS4 up even more, with a rather unimpressive 960×520 resolution (yes, FIVE TWENTY), which is much lower than the HD resolution the Xbox One broadcasts in. I’m not really impressed that Sony keeps hanging onto full HD like it’s THAT meaningful. It isn’t; games that look good, run well, and are actually great to play are what matter. For Sony there are no good games, and anyone who says that cross-platform games look better on the PS4 is delusional. Watch the comparison videos from a neutral standpoint and you’ll see all that extra power equates to little or no difference, and sometimes even worse image quality!! Mate, seriously. You are so full of crap it’s literally seeping from your pores. 1. Yes, early on with the 360 and PS3, the 360 had the better-quality games. But it was well noted that games developed exclusively for the PS3 had graphics levels that 360 games could not match, and late into the generation, after developers learned tricks and techniques for the Cell architecture, the PS3 began to get the better-quality games graphically (minus a few, like Skyrim), or there was no discernible difference. 2. Thief is one game, a game that was very poorly optimized for both consoles and released before it should have been, to try to make money as soon as possible. 3. Don’t kid yourself about the Killzone Shadow Fall graphics. It took 4 months for anybody to notice it. 4 MONTHS of people playing this game online! If it were basically the same as 720p, the difference would have been far more noticeable than you are making it out to be. Guerrilla Games used a complex technique to pull this off, and I guarantee it is a technique we will see more of this generation on both Xbox One and PS4. Twitch may broadcast at 520p, but that’s because it requires far lower bandwidth to broadcast.
Not everybody has the upload speeds required to broadcast, nor the download speeds to stream, at 720p, which requires much higher bandwidth. You are also kidding yourself if you think 520p is the set resolution it will stream at for this entire generation; a firmware/software update will come to bolster that. Mate, I am sorry, but you are the delusional one. One game where it runs poorly, but more poorly on the PS4 than on the Xbox One, and one game where the resolution in only the multiplayer section was so close to 1080p that it took 4 months to notice, and all of a sudden it’s as if all PS4 games run worse than or identical to their Xbox One counterparts. Games like Assassin’s Creed, FIFA 14, Battlefield 4, Call of Duty Ghosts, and NBA 2K14, which were the major multiplatform games, have all been shown and proven to run better and look better on the PS4 than on the Xbox One. Metal Gear Solid V: Ground Zeroes is the next major multiplatform game due for release, I believe, and the differences are astonishing. The Xbox One version looks barely better than the 360 version, and the PS4 version blows all other versions out of the water. Hideo Kojima himself came out saying just that… at the risk of losing sales. On paper the PS4 is more powerful, and that has been proven early on. That isn’t to say this will be the case all throughout the generation; hell, it could be an entire role reversal where eventually the Xbox One versions look best in a couple of years. On-paper specs mean a crap tonne; hence why a GPU like the GTX Titan, GTX 780, or GTX 780 Ti craps all over the GPUs in the PS4 and Xbox One. They have far more power in them, hence why they cost more than the actual consoles themselves. Please seriously look at both sides of the picture. I have no issue with your purchase of an Xbox One; please don’t have an issue with my purchase of a PS4. We are gamers; we love games.
I plan to add an Xbox One to my collection of PC, PS4, PS3, PS Vita, 3DS, and 360 later this year because there are some good games coming. Like, holy crap man, going through your comment history goes to show just how much of a Microsoft fanboy you are. It seems in your mind they can do no wrong. It’s a custom Kaveri chip, btw… not Jaguar. It’s a custom Kaveri [NOT Kabini] chip with full hardware-coherent HSA… hence why you can’t see or fathom where or why those little “RED BOXES” are there… chump. There’s a second APU on the SoC… I have both a PS4 Day One and an Xbox Day One, running on a Sony 65-inch XBR W850 through a B&W sound bar, all connected with AudioQuest Chocolate HDMI cables, and I have to say both systems look great! I have Forza 5 and it looks beautiful. DR3 looks good for only 720p. I have been playing Need for Speed on PS4 and the game looks very pretty! Both systems look great, but all the games I will be getting in the future, I will be getting for PS4. I like the Xbox One, but the PS4 just feels better. I have been a long-time PC gamer, building my own high-end computers. My PC setup is an i7 980X 6-core @ 4.4 GHz, 12 GB RAM @ 2000 MHz, with two EVGA GTX Titan Black GPUs in SLI on a Dell 27-inch Ultra LCD, so I can max out all my games at 2560×1440, and I still love gaming on my PS4 and Xbox setup. This article is old. Now that they have been out a while, it is easy to see that the PS4 crushes the X1, as almost every third-party game looks and plays better on the PS4 at 1080p/30 or 60 fps, while the X1 90% of the time is only 900p/30 fps. Also, the first-party games on the PS4 look stunning, like a PC on high settings (not ultra, but high). If anyone is confused about why this article suggested that this may be a wash/tie, let me explain. Sorry, I know it’s two years old, but the questions gnaw at me. Think of these as cars off a starting line. Let’s pretend that both corporations have the same depot that they courier work to, and cars/shuttles to do this with.
Both Microsoft and Sony are equal distances away, it is a straight shot for both, and they have as many cars as they have lanes available. Microsoft’s team consists of 3 cars that can depart simultaneously and make the trip going 0-100 in 4 seconds over a quarter mile, while Sony has a team of 6 cars that only go 0-100 in over 10 seconds. Microsoft has the head start and is back sooner, but Sony gets more accomplished per trip. Not the best comparison, but easy enough to understand. Having virtually zero-latency eSRAM and DDR3 means data can be accessed and fetched faster, keeping things moving along efficiently and fluidly, but Sony’s higher-bandwidth build has noticeably higher access latency, due to the nature of GDDR5, while allowing for more throughput, so more things can be done at once. Access times depend on things like hardware-induced and software-induced latency. Going for a low-latency build means there isn’t this period where nothing is happening. This is why, despite a 40% raw hardware performance advantage on Sony’s side on paper, via technical specs, we are not seeing 40% differences in multiplatform games. Disgruntled fans attribute this to parity-locking consoles, but this isn’t the case. I believe console specs are locked down, yes, but consoles aren’t going to be locked to one another. There’s just no way… that’s a bit of conspiracy theory and bitterness/resentment from those that deem themselves on the superior side of the fence. Today, as far as how things are currently developed, where this hurts Microsoft is in the resolution race. Going back to our car analogy, it’s a matter of not having enough cars and lanes available to move things along; pushing pixels is an area where bandwidth easily dominates.
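The car analogy above maps onto a standard first-order model of memory access: total time ≈ fixed latency + size ÷ bandwidth. The latency and bandwidth values below are illustrative placeholders only (real DRAM latencies are in the nanosecond range), chosen just to show the crossover the analogy describes, where small fetches reward low latency and large streams reward bandwidth:

```python
# Toy latency-vs-bandwidth model for the "cars off a starting line" analogy.
# All numbers are illustrative placeholders, not measured hardware figures.
def transfer_time_us(size_kb, latency_us, bandwidth_gb_s):
    """Time to fetch one block: fixed latency plus streaming time."""
    # 1 GB/s moves 1000 bytes per microsecond.
    return latency_us + (size_kb * 1024) / (bandwidth_gb_s * 1000)

for size_kb in (4, 64, 4096):
    low_lat = transfer_time_us(size_kb, latency_us=0.1, bandwidth_gb_s=68)
    high_bw = transfer_time_us(size_kb, latency_us=0.3, bandwidth_gb_s=176)
    winner = "low-latency pool" if low_lat < high_bw else "high-bandwidth pool"
    print(f"{size_kb:>5} KB: {low_lat:7.2f} us vs {high_bw:7.2f} us -> {winner}")
# Small fetches favor the low-latency pool; large streams favor bandwidth.
```

With these placeholder numbers the low-latency pool wins tiny fetches and the high-bandwidth pool wins large streaming transfers, which is exactly the "faster off the line" versus "higher top speed" trade-off described earlier.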
The interesting thing here is that while Sony focuses on utilizing brute strength to outclass the competition in sheer pixels and constantly fulfill expectations for 1080p, on top of any additional bells and whistles, you oftentimes end up with the advantage in Microsoft’s hands, as the R7 260x-class GPU in the Xbox is a beautiful 900p GPU that should have plenty left over for bells, whistles, and polish. This is why you see reviews like “[such and such game] runs 1080p on PS4 and 900p on Xbox, but frame rates are better on Xbox, while visual disparities between the two resolutions are negligible,” etc. It has happened the other way around too, sure, as it also depends on the developer and how each platform was leveraged. In keeping Xbox games locked in at 900p, the Xbox retains enough “oomph” to make up for it in other areas. So, yes, on paper the PS4 is a dominant force, but real-world results don’t always work out that way, and there are things we aren’t really paying attention to that we wouldn’t think would matter, and tricks that may be employed. The story is further complicated by whether a game is CPU-bound or GPU-bound. PC gamers know of GPU-bound and CPU-bound games; we all feel it in our individual builds and are always reminded of how lacking we are on the CPU/GPU side. Some games stress the GPU over the CPU, while other games stress the CPU over the GPU. GPU-intense games will always do better on the PS4 by some margin. However, as we experienced with Assassin’s Creed Unity, CPU-bound games will run favorably on the Xbox. This is something that nobody realizes; everyone assumes the two are using the exact same CPU side. This is not the case. Microsoft collaborated and jointly put in an order for a highly customized APU that stressed CPU and eSRAM performance, while Sony opted for GPU bandwidth and faster throughput on the memory-controller side… and, again, faster throughput does not always mean faster in all ways.
Going back to the CPU: even with both consoles clocked to the same spec, the Xbox One still has the faster CPU side, and it sits on 5 billion transistors, compared to the more usual 1.5-3 billion in modern chips. That transistor budget has its own advantages, such as lower temperatures due to the APU/CPU requiring less power, as well as more overclocking headroom, and all of this can positively impact CPU caching as well as IPC performance. In a PC gamer’s eye, such as mine, both consoles sport very similar GPUs. Naturally, were I to have the choice, I’d rather have the PS4’s GPU in my PC, for the simple matter of higher resolution possibilities, but in most new and demanding games, the difference between having a PS4 GPU and an Xbox GPU in my system would not be so profound, and I would still have a hard time hitting 1080p consistently without too many cutbacks to graphical effects, such as texture resolution. This is the same as saying that both GPUs are low-end, no matter how you slice it. For instance, if I am trying to run Star Citizen on my PC, having a PS4’s GPU is not going to help me hit 1080p in the least. I’ll still be hitting only 900p with medium textures, while the Xbox GPU will only be able to hit 720p at medium settings, or 900p at low settings with some additional effects and post-processing disabled.
Xbox One backward compatibility questions and answers. Frequently asked questions. Check our list of available games. Also, don’t forget to vote for the games you would like to play on Xbox One! Press and hold the View and Menu buttons at the same time. Yes! Yes, as long as you are all playing the same game. Yes. Avatars and achievements will be in place when you sign in on Xbox One. To delete all Xbox 360 games and profiles saved on the console, follow these steps. Press the Xbox button to open the guide. Select System > Settings > System > Storage, then select Clear local Xbox 360 storage. This does not affect games saved in the cloud, or Xbox One games and profiles. Yes. However, you cannot be signed in on more than one Xbox One or Xbox 360 at the same time. Console Version. Terraria split-screen version for consoles: PS3/PS4/Xbox 360/Xbox One. Terraria is available for download on the PS3 ($15.00 or £11.99 in the PSN shop), the PS4 ($20.00 in the PSN shop), and the Xbox 360 (£9.99 or $15.00 in the Xbox Marketplace). (It is also available in the Xbox One Store for $20 or your regional equivalent.) The console version is similar to PC version 1.2.4, with a few console-exclusive items. A 1.3 update is coming soon! Terraria can be bought from the PlayStation Store for the PS Vita for $15.00/£11.99. It is almost identical to the PS3 and Xbox 360 editions. As the Vita lacks the R2/R3 and L2/L3 buttons, a manual/smart cursor option can be spotted in the bottom-left corner.
The Vita version lets you cross-play with PS3 users, although the Remote Play feature is not accessible as of now. Crossplay feature. Crossplay is a feature of the Terraria PS Vita and PS3 editions. It allows users who have bought both editions to play in each other’s worlds (for example, PS3 users can play in a PS Vita user’s world, and PS Vita users can play in a PS3 user’s world). This feature also merges the trophies a user earns in the PS Vita edition and the PS3 edition into the same trophy list, as long as the user is using the same PSN (PlayStation Network) account on both platforms. 3DS and Wii U. Recently, Terraria was released on the 3DS, both on the eShop and at retail. It has mobile-exclusive bosses and items, including Lepus. Hardmode can be unlocked, and it has Ocram and bosses up to Fishron. It also has the Crimson update, but does not have the special crafting items; instead, Crimson items can be crafted at a normal crafting station. The current version is 1.2 for America and Europe. The game can be very buggy. Terraria was also released on the Wii U with version 1.2.4.1. New content. The Console Version has several new items and monsters not available in the PC version: Soul of Blight (dropped by Ocram, a material in most console-exclusive items), Spectral Arrow (ammo), Tutorial Music Box (crafted from the 5 once-console-exclusive music boxes), Vulcan Bolt (ammo), new vanity sets, stronger versions of old monsters, and a new final boss along with its servants. Console Version History. Bugs/Glitches. Go here for a complete thread listing the current bugs and glitches; feel free to contribute if needed, as our knowledge of the console game is still growing. Xbox One Edition. Digital release: September 5, 2014. [3] Retail disc (Blu-ray): November 18, 2014.
Minecraft: Xbox One Edition was the Xbox One edition of Minecraft Legacy Console Edition, developed by 4J Studios [12] before the Better Together Update. An announcement trailer was shown during Microsoft’s press conference at E3 2013. [13] This edition built off the Xbox 360 Edition. [1] On May 22, 2014, Mojang confirmed that the game would be released in August 2014. On August 29, 4J Studios tweeted that the game had been handed over to Xbox for final testing. Minecraft: Xbox One Edition was officially released on September 5, 2014. When the Better Together Update was released on September 20, 2017, the Xbox One Edition was made unavailable for digital purchase on the Xbox Store, but its DLC is still available, and a retail version of the game can still be bought. Bedrock Edition is a free download for all digital owners of Xbox One Edition. Owners of the disc version who bought DLC from the digital store, or who played at least five hours between September 20, 2016 and January 30, 2018, also received it for free. Xbox One Edition is a separate game from Bedrock Edition and is still playable for anyone who owns it. DLC for Xbox One Edition can still be purchased, and most of it carries over to Bedrock Edition. When selecting a world in Bedrock Edition, any world from Xbox One Edition can be imported. Initially, Xbox One Edition was not planned to receive further updates, starting with the lack of an equivalent to TU58, but it later received CU50, an update adding the MINECON Earth Skin Pack, and updates then resumed entirely with CU51, an equivalent to TU60 that also added all the features from TU58. Compared to Xbox 360 Edition, Xbox One Edition included larger world sizes of up to 5120×5120 blocks (36 times larger), an 18-chunk render distance, amplified worlds, and enhancements offered by the Xbox One.
[1] Despite being very similar to Bedrock Edition, Xbox One Edition still has all of the Legacy Console Edition exclusive features, the most notable being the built-in mini games. Because of licensing restrictions, some of the Xbox One Edition DLC is unavailable in Bedrock Edition, and some of it is not on the Marketplace and can only be obtained if it was bought for the Xbox One Edition. For Xbox 360 Edition players who wish to transfer worlds, it is preferable to buy the disc version of Xbox One Edition instead of the new Bedrock Edition, and to do so before January 30, 2018. With no current method of transferring worlds directly from the Xbox 360 Edition to Bedrock Edition on Xbox One, the only route is to import the Xbox 360 world into Xbox One Edition, and then import the resulting Xbox One world into Bedrock Edition after obtaining it by playing Xbox One Edition for five hours. It is unknown what will happen to the transfer functionality once the Xbox 360 Edition is updated with new content that does not exist in the Xbox One Edition. The following DLC is only usable in Xbox One Edition and does not carry over to Bedrock Edition, either because of licensing restrictions or because it is for the mini games: First Birthday Skin Pack, Second Birthday Skin Pack, Third Birthday Skin Pack, Fourth Birthday Skin Pack, Fifth Birthday Skin Pack, Battle Map Pack 1, Battle Map Pack 2, Battle Map Pack 3, Battle Map Pack 4, Halloween Battle Map, Festive Battle Map, Fallout Battle Map Pack, Glide Beasts Track Pack, Glide Giants Track Pack, and Glide Myths Track Pack. The Mass Effect Mash-Up Pack, Doctor Who Skins Volumes I and II, and Skin Packs 1-6 are also currently not included in Bedrock Edition, but Mojang stated in the Bedrock 1.2 patch notes that they are working on getting the licenses to add them. If the Minecon 2015 Skin Pack was purchased during the limited time it was available, it transfers to Bedrock Edition and is usable in multiplayer.
The Simpsons Skin Pack, as well as the Marvel skin packs, also transfer, but are not usable in multiplayer.

In the default settings, the controls are always displayed on the HUD.

Gameplay:
- Jump (double-tap to fly while in Creative)
- Drop item
- Open crafting menu in Creative Mode or Survival
- Open inventory
- Place block/use item
- Mine/use block or item
- Change the selected block
- Move (moved forward twice in rapid succession) - Sprint (pressed down)
- Change camera angle
- Look (pressed down)
- Sneak/walk, descend while flying in Creative mode, dismount an entity [14]

Interface (Crafting/Smelting/Brewing/Inventory):
- Pick up/drop item
- Exit interface
- Take half of the highlighted stack/show information about the selected block or item
- Move selected stack to chest/brewing stand, move between inventory and hotbar
- Move pointer/change selected item
- Access Creative inventory (if in Creative; pressed down)
- Crouch/crouch in bed

Minecraft: Xbox One Edition Holiday Pack was released on November 24, 2015 with seven "fan-favorite" DLC packs included. This edition contains all of the same features of Minecraft: Xbox One Edition along with the bundled DLC. The bundle is meant to celebrate the Christmas holidays and costs US$29.99. [16] [17] The following DLC is included in the Xbox One Edition Holiday Pack: [16] [17]

The Minecraft Marvel Skin Packs Bundle was released on November 30, 2015 and contains all of the Marvel Skin Packs. The bundle was meant to be a discount for the Marvel Skin Packs. On December 22, 2015 the bundle was removed from the Xbox Marketplace, making it no longer available for the Xbox One and the Xbox 360. The bundle cost $6.99 and was available on the Xbox One and the Xbox 360. [18] [19] The Marvel Skin Packs from the bundle still transfer to Bedrock Edition, but they are not usable in online or LAN multiplayer.

↑ a b c d Xbox, "Minecraft: Xbox One Edition Announce Trailer", YouTube, 2013-06-10.
↑ Edwin Evans-Thirlwell, "Minecraft Xbox 360: the five greatest technical challenges", Official Xbox Magazine, 2012-05-25. Accessed 2013-07-29.
↑ a b https://mojang.com/2014/05/minecraft-on-xbox-one-ps4-and-ps-vita-soon/
↑ In Portugal and Russia, the PEGI rating is 6+.
↑ http://www.cero.gr.jp/search/search.cgi?chkPF=XboxOne&name=Minecraft&txtCP=
↑ http://www.gamerating.org.tw/search_product.php?id=6b27e88fdd7269394bca4968b48d8df4
↑ http://grac.or.kr/Statistics/Popup/Pop_StatisticsDetails.aspx?8d5adcc84ca553bb6b0da4a613c3d8ebcecd98f227eaeaf17ed258610ac557e8
↑ http://grac.or.kr/Statistics/Popup/Pop_StatisticsDetails.aspx?c02b4f3494d27f9e9a2f3177c805a052cecd98f227eaeaf17ed258610ac557e8
↑ http://portal.mj.gov.br/ClassificacaoIndicativa/jsps/ConsultarJogoForm.do
↑ https://store.xbox.com/pt-BR/Xbox-One/Games/Minecraft-Xbox-One-Edition/582e7bcc-11bc-4702-ab1b-b31566f8e327
↑ https://store.xbox.com/en-NZ/Xbox-One/Games/Minecraft-Xbox-One-Edition/582e7bcc-11bc-4702-ab1b-b31566f8e327
↑ 4J Studios, "Happy to confirm we're developing Minecraft PS3, PS4 & PS Vita Editions for Mojang, as well as Xbox 360 & Xbox One Editions!", Twitter, 2013-08-21.
↑ "Minecraft: Xbox One Edition Reveal Trailer - E3 2013 Microsoft Conference", IGN, 2013-06-10.
↑ http://www.gamefront.com/minecraft-xbox-360-controls-are-surprisingly-not-terrible/
↑ Hat Films, "Minecraft Xbox One Edition E3 Announcement", YouTube, 2013-06-10.
↑ a b http://majornelson.com/2015/11/24/minecraft-xbox-one-edition-holiday-pack-is-now-available-for-xbox-one/?linkId=19019904
↑ a b https://store.xbox.com/Xbox-One/Bundle/Minecraft-Xbox-One-Edition-Holiday-Pack/e983405f-eacc-45b1-b489-d7a924bc29f9 Xbox Live Marketplace
↑ https://store.xbox.com/en-GB/Xbox-One/Bundle/Minecraft-Marvel-Skin-Packs-Bundle/bab88306-91e0-4185-a7a5-c2fffdaf0152
↑ https://twitter.com/4jstudios/status/678889627285979137
H1Z1 Coming to PS4 and Xbox One This Summer, PC Version Getting Split Into Two Games.

Q&A: We talk with Daybreak about the huge changes and new versions of the zombie MMO. Last updated by Eddie Makuch on February 5, 2016 at 10:05AM.

Today, Daybreak Games made some major announcements about its zombie MMO franchise H1Z1. The game is coming to PlayStation 4 and Xbox One this summer, while the PC edition is being split into two distinct packages, each of which will be sold separately, the studio announced.

On PC, H1Z1 is becoming two independent games: H1Z1: King of the Kill and H1Z1: Just Survive. King of the Kill is described as a "large-scale, high-intensity" shooter that includes a number of game modes, including Battle Royale, which was created by a player who goes by the name PlayerUnknown. The regular H1Z1 game--which has sold 2.5 million copies in its unfinished state, Daybreak said--is changing names and will be known as H1Z1: Just Survive. This is the regular game as you know it.

King of the Kill and Just Survive will sell for $20 each when they launch on PC through Steam Early Access. If you currently own or buy H1Z1 on or before February 16, you'll get both games when they separate a day later. H1Z1 currently sells for $20.

Separating the games in this way was something the H1Z1 community asked for, Daybreak chief publishing officer Laura Naviaux told GameSpot in an interview. She explained that it became clear to Daybreak, from its own observations and player feedback, that people were largely playing the game in two distinct ways (Battle Royale vs. open-world), so it made sense to split them up in an effort to better serve both audiences.
"They have completely different communities that have different needs, desires, and wants, and we really want to be able to cater and grow the games to be able to nurture both of those players bases distinctly," she said. "After a lot of thought, we came to the conclusion that we should really split these games into two different products and the player community had actually asked us to do that." When the split happens on February 17, all the items that existing H1Z1 players have will be replicated across Just Survive and King of the Kill. This includes your crates, keys, and items, among other things. Just Survive and King of the Hill exist under the same umbrella, but are developed by two different teams inside of Daybreak games. Just Survive is staying in Early Access until the end of 2016, while King of the Kill will leave Early Access this summer, launching simultaneously with the PS4 and Xbox One versions. A specific release date has not been announced. On console, only King of the Kill will be offered, at least for launch this summer. A Daybreak games representative told GameSpot that the plan is to bring Just Survive, or some form of it, to console. eventually. "We're working on our plans for bringing Just Survive out of Early Access and to console, but no details on timing or additional info at this time." H1Z1 creative director Jens Andersen also confirmed to GameSpot that the console versions of King of the Kill will offer microtransactions, just as the PC edition does. He stressed they will be for cosmetic items only, so as to avoid a "pay-to-win" scenario. "All of the microtransactions that we have right now are vanity-based; there is no power being sold," he explained. "In fact, it's very important for the nature of the King of the Kill experience that players start on an even playing field." 
With regard to graphics and performance on the PS4 and Xbox One versions of King of the Kill, Andersen said the goal is to achieve parity for resolution and frame rate. He didn't share any specific targets, but said the team's goal is to make each version look and run as well as possible.

Although another Daybreak title, DC Universe Online, supports cross-platform play between console and PC, the studio would not confirm whether this will also be true for H1Z1. "Those are things we're figuring out right now," Naviaux said. "We want to make sure [we serve] the player communities and populations as best we can, so we're taking all of that into consideration to figure out the best path for H1Z1."

Some have also remarked that H1Z1 on PC is not the best-looking game out there, so we wondered if Daybreak might be planning to introduce some visual improvements for the full launch. Andersen confirmed this is indeed happening.

"Absolutely; in fact, we just recently finished some work on the toxic gas that is featured in the Battle Royale mode, and it looks amazing now," Andersen explained. "Right now in the game it looks pretty janky, to be honest with you. It was a first pass. We are in Early Access. So being in Early Access, we might add things that are fun to the game that prove out mechanics that aren't yet visually polished to the fidelity level that we expect in the final product."

"An example would be, you're going to see an amazing improvement to the gas, the visuals of the gas. We're also working on tree improvements right now. Cool stuff is happening both visually and in terms of level design that's really going to enhance the game and make it even higher fidelity than it is right now, both in gameplay and visuals."

Daybreak, formerly known as Sony Online Entertainment, split off from Sony a year ago this month. H1Z1 is the studio's second game announced for Xbox One, following DC Universe Online.
John Smedley, who was CEO of Sony Online Entertainment and Daybreak for a period of time, left the developer at the end of 2015 amid a clash with a hacker. GameSpot will have more on H1Z1 from our conversation with Daybreak in the coming days. For now, let us know what you think of these changes and announcements in the comments below!

PS4 vs. Xbox One Native Resolutions and Framerates.

This page compares the native resolutions and framerates of PS4 games and Xbox One games. Native resolution indicates the resolution a game is rendered in before any potential upscaling. Most, if not all, PS4 and Xbox One games output at 1080p, but some might not have a native 1080p resolution.

PS4 vs. Xbox One Native Resolutions and Framerates Comparison Chart. This chart is a work in progress; very few companies have gone on record stating the native resolutions of their Xbox One and PS4 games. If you have information to add to this chart, please add it in alphabetical order and make sure it's properly cited. Because this page can be edited by anyone with an IGN account, beware of false information: entries may be inaccurate.

Halo 2: Anniversary Campaign 1328x1080p @ 60fps [n]
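The native-versus-output distinction above can be made concrete with a quick calculation using the chart's Halo 2: Anniversary entry; the 1920x1080 output frame used below is the standard 1080p resolution the chart assumes, not a figure stated for that specific game.

```python
# Sketch: how much of a standard 1080p output frame a sub-native render fills.
# Chart entry: Halo 2: Anniversary campaign rendered at 1328x1080.
native_w, native_h = 1328, 1080   # rendered (native) resolution, from the chart
output_w, output_h = 1920, 1080   # standard 1080p output frame (assumed target)

native_pixels = native_w * native_h
output_pixels = output_w * output_h
fraction = native_pixels / output_pixels  # share of output pixels rendered natively

print(f"Native pixels: {native_pixels}")           # 1,434,240
print(f"Output pixels: {output_pixels}")           # 2,073,600
print(f"Natively rendered share: {fraction:.1%}")  # ~69.2%; the rest is upscaled
```

Since both resolutions share the same 1080-pixel height, the ratio reduces to the horizontal one, 1328/1920, so only the width is stretched during upscaling.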
Xbox Live Rewards is all about YOU. That’s why we designed the program with you in mind. Whether you’re an avid gamer or a movie or TV buff, we Reward you for doing what you love. And the more you do with Xbox, the more you get. Sign up and start getting rewarded today. Earning is easy. Welcome to Xbox Live Rewards. Join to get access to exclusive offers and Rewards. Turn your dedication into dollars. Redeem your Rewards for digital games, movies, TV shows, and more in the Microsoft Store.
