Gaming PS4 vs Xbox One - The suckiest thread in the history of suckyness

Which one will you buy?


  • Total voters
    538

ciderman9000000

New Member
Newbie
Joined
Jul 22, 2006
Messages
29,640
Location
The General
Memory management is one of the most divisive points that separate the two systems. The question must surely be that if GDDR5 is the preferred set-up, why didn't Microsoft choose it? Still cash-rich to the extreme, clearly the firm could afford to pay the premium for GDDR5. We wondered whether it was fair to assume that this higher bandwidth RAM was ruled out very early on in the production process, and if so, why?

"Yeah, I think that's right. In terms of getting the best possible combination of performance, memory size, power, the GDDR5 takes you into a little bit of an uncomfortable place," says Nick Baker. "Having ESRAM costs very little power and has the opportunity to give you very high bandwidth. You can reduce the bandwidth on external memory - that saves a lot of power consumption and the commodity memory is cheaper as well so you can afford more. That's really a driving force behind that... if you want a high memory capacity, relatively low power and a lot of bandwidth there are not too many ways of solving that."

The combined system bandwidth controversy

Baker is keen to tackle the misconception that the team has created a design that cannot access its ESRAM and DDR3 memory pools simultaneously. Critics say that they're adding the available bandwidths together to inflate their figures and that this simply isn't possible in a real-life scenario.

"You can think of the ESRAM and the DDR3 as making up eight total memory controllers, so there are four external memory controllers (which are 64-bit) which go to the DDR3 and then there are four internal memory controllers that are 256-bit that go to the ESRAM. These are all connected via a crossbar and so in fact it will be true that you can go directly, simultaneously to DRAM and ESRAM," he explains.

The controversy surrounding ESRAM has taken the design team very much by surprise. The notion that Xbox One is difficult to work with is perhaps quite hard to swallow for the same team that produced Xbox 360 - by far and away the easier console to develop for, especially so in the early years of the current console generation.

"This controversy is rather surprising to me, especially when you view as ESRAM as the evolution of eDRAM from the Xbox 360. No-one questions on the Xbox 360 whether we can get the eDRAM bandwidth concurrent with the bandwidth coming out of system memory. In fact, the system design required it," explains Andrew Goosen.

"We had to pull over all of our vertex buffers and all of our textures out of system memory concurrent with going on with render targets, colour, depth, stencil buffers that were in eDRAM. Of course with Xbox One we're going with a design where ESRAM has the same natural extension that we had with eDRAM on Xbox 360, to have both going concurrently. It's a nice evolution of the Xbox 360 in that we could clean up a lot of the limitations that we had with the eDRAM.

"The Xbox 360 was the easiest console platform to develop for, it wasn't that hard for our developers to adapt to eDRAM, but there were a number of places where we said, 'gosh, it would sure be nice if an entire render target didn't have to live in eDRAM' and so we fixed that on Xbox One where we have the ability to overflow from ESRAM into DDR3, so the ESRAM is fully integrated into our page tables and so you can kind of mix and match the ESRAM and the DDR memory as you go... From my perspective it's very much an evolution and improvement - a big improvement - over the design we had with the Xbox 360. I'm kind of surprised by all this, quite frankly."

 

ciderman9000000

New Member
Newbie
Joined
Jul 22, 2006
Messages
29,640
Location
The General
Indeed, the level of coherence between the ESRAM and the DDR3 memory pools sounds much more flexible than many previously thought. Many believed that the 32MB of ESRAM is a hard limit for render targets - so can developers really "mix and match" as Goosen suggests?

"Oh, absolutely. And you can even make it so that portions of our your render target that have very little overdraw... for example if you're doing a racing game and your sky has very little overdraw, you could stick those sub-sets of your resources into DDR to improve ESRAM utilisation," he says, while also explaining that custom formats have been implemented to get more out of that precious 32MB.

"On the GPU we added some compressed render target formats like our 6e4 [6 bit mantissa and 4 bits exponent per component] and 7e3 HDR float formats [where the 6e4 formats] that were very, very popular on Xbox 360, which instead of doing a 16-bit float per component 64bpp render target, you can do the equivalent with us using 32 bits - so we did a lot of focus on really maximising efficiency and utilisation of that ESRAM."

How ESRAM bandwidth doubled in production hardware

Further scepticism surrounds the sudden leap in ESRAM's bandwidth from an initial 102GB/s to where it is now - 204GB/s. We ran the story first based on a developer leak of a blog post the Microsoft tech team wrote back in April, but sections of "the internet" were not convinced. Critics say that the numbers don't add up. So how did the massive increase in bandwidth come about?

"When we started, we wrote a spec," explains Nick Baker. "Before we really went into any implementation details, we had to give developers something to plan around before we had the silicon, before we even had it running in simulation before tape-out, and said that the minimum bandwidth we want from the ESRAM is 102GB/s. That became 109GB/s [with the GPU speed increase]. In the end, once you get into implementing this, the logic turned out that you could go much higher."

The big revelation was that ESRAM could actually read and write at the same time, a statement that seemingly came out of the blue. Some believed that based on the available information from the leaked whitepapers, this simply wasn't possible.

"There are four 8MB lanes, but it's not a contiguous 8MB chunk of memory within each of those lanes. Each lane, that 8MB is broken down into eight modules. This should address whether you can really have read and write bandwidth in memory simultaneously," says Baker.

"Yes you can - there are actually a lot more individual blocks that comprise the whole ESRAM so you can talk to those in parallel. Of course if you're hitting the same area over and over and over again, you don't get to spread out your bandwidth and so that's one of the reasons why in real testing you get 140-150GB/s rather than the peak 204GB/s... it's not just four chunks of 8MB memory. It's a lot more complicated than that and depending on how the pattern you get to use those simultaneously. That's what lets you do read and writes simultaneously. You do get to add the read and write bandwidth as well adding the read and write bandwidth on to the main memory. That's just one of the misconceptions we wanted to clean up."

Goosen lays down the bottom line:
"If you're only doing a read you're capped at 109GB/s, if you're only doing a write you're capped at 109GB/s," he says. "To get over that you need to have a mix of the reads and the writes but when you are going to look at the things that are typically in the ESRAM, such as your render targets and your depth buffers, intrinsically they have a lot of read-modified writes going on in the blends and the depth buffer updates. Those are the natural things to stick in the ESRAM and the natural things to take advantage of the concurrent read/writes."

Microsoft's argument seems pretty straightforward then. In theory, Xbox One's circa 200GB/s of "real-life" bandwidth trumps PS4's 176GB/s peak throughput. The question is just to what extent channelling resources through the relatively tiny 32MB of the much faster ESRAM is going to cause issues for developers. Microsoft's point is that game-makers have experience of this already owing to the eDRAM set-up on Xbox 360 - and ESRAM is the natural evolution of the same system.

 

ciderman9000000

New Member
Newbie
Joined
Jul 22, 2006
Messages
29,640
Location
The General
Memory bandwidth is one thing, but graphics capability is clearly another. PlayStation 4 enjoys a clear advantage in terms of on-board GPU compute units - a raw stat that is beyond doubt, and in turn offers a huge boost to PS4's enviable spec sheet. Andrew Goosen first confirms that both Xbox One and PS4 graphics tech is derived from the same AMD "Island" family before addressing the Microsoft console's apparent GPU deficiency in depth.

"Just like our friends we're based on the Sea Islands family. We've made quite a number of changes in different parts of the areas... The biggest thing in terms of the number of compute units, that's been something that's been very easy to focus on. It's like, hey, let's count up the number of CUs, count up the gigaflops and declare the winner based on that. My take on it is that when you buy a graphics card, do you go by the specs or do you actually run some benchmarks?" he says.

"Firstly though, we don't have any games out. You can't see the games. When you see the games you'll be saying, 'what is the performance difference between them'. The games are the benchmarks. We've had the opportunity with the Xbox One to go and check a lot of our balance. The balance is really key to making good performance on a games console. You don't want one of your bottlenecks being the main bottleneck that slows you down."

Tweaking Xbox One balance and performance

Microsoft's approach was to go into production knowing that there'd be some headroom for increasing performance from the final silicon. Goosen describes it as "under-tweaking" the system. Actual in-production games were then used to determine how to make use of the available headroom.

"Balance is so key to real effective performance. It's been really nice on Xbox One with Nick and his team - the system design folks have built a system where we've had the opportunity to check our balances on the system and make tweaks accordingly," Goosen reveals. "Did we do a good job when we did all of our analysis and simulations a couple of years ago, and guessing where games would be in terms of utilisation. Did we make the right balance decisions back then? And so raising the GPU clock is the result of going in and tweaking our balance."

"We knew we had headroom. We didn't know what we wanted to do with it until we had real titles to test on. How much do you increase the GPU by? How much do you increase the CPU by?" asks Nick Baker.

"We had the headroom. It's a glorious thing to have on a console launch. Normally you're talking about having to downclock," says Goosen. "We had a once in a lifetime opportunity to go and pick the spots where we wanted to improve the performance and it was great to have the launch titles to use as the way to drive an informed decision on performance improvements we could get out of the headroom."

Goosen also reveals that the Xbox One silicon actually contains additional compute units - as we previously speculated. The presence of that redundant hardware (two CUs are disabled on retail consoles) allowed Microsoft to judge the importance of compute power versus clock-speed:

"Every one of the Xbox One dev kits actually has 14 CUs on the silicon. Two of those CUs are reserved for redundancy in manufacturing, but we could go and do the experiment - if we were actually at 14 CUs what kind of performance benefit would we get versus 12? And if we raised the GPU clock what sort of performance advantage would we get? And we actually saw on the launch titles - we looked at a lot of titles in a lot of depth - we found that going to 14 CUs wasn't as effective as the 6.6 per cent clock upgrade that we did."

Assuming linear scaling of compute power with the addition of two extra CUs, the maths may not sound right here, but as our recent analysis - not to mention PC benchmarks - reveals, AMD compute units don't scale in a linear fashion. There's a law of diminishing returns.
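The raw arithmetic behind the two options Microsoft tested, under the idealised assumption that ALU throughput scales linearly with both CU count and clock (which, as noted above, real games don't bear out):

```python
# Theoretical ALU gain of two extra CUs versus the 800MHz -> 853MHz clock bump.
base_cus, base_clock = 12, 800e6       # shipping CU count, original GPU clock (Hz)
baseline = base_cus * base_clock

more_cus = 14 * base_clock             # enable the two redundant CUs
faster_clock = base_cus * 853e6        # keep 12 CUs, raise the clock

print(f"14 CUs @ 800MHz: +{more_cus / baseline - 1:.1%} ALU (the 'almost 17 per cent')")
print(f"12 CUs @ 853MHz: +{faster_clock / baseline - 1:.1%} ALU, but the clock also "
      f"lifts fill rate, geometry throughput and ESRAM bandwidth")
```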

 

ciderman9000000

New Member
Newbie
Joined
Jul 22, 2006
Messages
29,640
Location
The General
"Everybody knows from the internet that going to 14 CUs should have given us almost 17 per cent more performance," he says, "but in terms of actual measured games - what actually, ultimately counts - is that it was a better engineering decision to raise the clock. There are various bottlenecks you have in the pipeline that can cause you not to get the performance you want if your design is out of balance."

"Increasing the frequency impacts the whole of the GPU whereas adding CUs beefs up shaders and ALU," interjects Nick Baker.

"Right. By fixing the clock, not only do we increase our ALU performance, we also increase our vertex rate, we increase our pixel rate and ironically increase our ESRAM bandwidth," continues Goosen.

"But we also increase the performance in areas surrounding bottlenecks like the drawcalls flowing through the pipeline, the performance of reading GPRs out of the GPR pool, etc. GPUs are giantly complex. There's gazillions of areas in the pipeline that can be your bottleneck in addition to just ALU and fetch performance."

GPU Compute and the importance of the CPU

Goosen also believes that leaked Sony documents on VGLeaks bear out Microsoft's argument:

"Sony was actually agreeing with us. They said that their system was balanced for 14 CUs. They used that term: balance. Balance is so important in terms of your actual efficient design. Their additional four CUs are very beneficial for their additional GPGPU work. We've actually taken a very different tack on that. The experiments we did showed that we had headroom on CUs as well. In terms of balance, we did index more in terms of CUs than needed so we have CU overhead. There is room for our titles to grow over time in terms of CU utilisation."

Microsoft's approach to asynchronous GPU compute is somewhat different to Sony's - something we'll track back on at a later date. But essentially, rather than concentrate extensively on raw compute power, their philosophy is that both CPU and GPU need lower latency access to the same memory. Goosen points to the Exemplar skeletal tracking system on Kinect on Xbox 360 as an example for why they took that direction.

"Exemplar ironically doesn't need much ALU. It's much more about the latency you have in terms of memory fetch, so this is kind of a natural evolution for us," he says. "It's like, OK, it's the memory system which is more important for some particular GPGPU workloads."

The team is also keen to emphasise that the 150MHz boost to CPU clock speed is actually a whole lot more important than many believe it is.

"Interestingly, the biggest source of your frame-rate drops actually comes from the CPU, not the GPU," Goosen reveals. "Adding the margin on the CPU... we actually had titles that were losing frames largely because they were CPU-bound in terms of their core threads. In providing what looks like a very little boost, it's actually a very significant win for us in making sure that we get the steady frame-rates on our console."

This in part explains why several of the custom hardware blocks - the Data Move Engines - are geared towards freeing up CPU time. Profiling revealed that this was a genuine issue, which has been balanced with a combination of the clock speed boost and fixed function silicon - the additional processors built in to the Xbox One processor.

"We've got a lot of CPU offload going on. We've got the SHAPE, the more efficient command processor relative to the standard design, we've got the clock boost - it's in large part actually to ensure that we've got the headroom for the frame-rates," Goosen continues - but it seems that the systems's Data Move Engines can help the GPU too.

 

ciderman9000000

New Member
Newbie
Joined
Jul 22, 2006
Messages
29,640
Location
The General
"Imagine you've rendered to a depth buffer there in ESRAM. And now you're switching to another depth buffer. You may want to go and pull what is now a texture into DDR so that you can texture out of it later, and you're not doing tons of reads from that texture so it actually makes more sense for it to be in DDR. You can use the Move Engines to move these things asynchronously in concert with the GPU so the GPU isn't spending any time on the move. You've got the DMA engine doing it. Now the GPU can go on and immediately work on the next render target rather than simply move bits around."

Other areas of custom silicon are also designed to help out the graphics performance.

"We've done things on the GPU side as well with our hardware overlays to ensure more consistent frame-rates," Goosen adds. "We have two independent layers we can give to the titles where one can be 3D content, one can be the HUD. We have a higher quality scaler than we had on Xbox 360. What this does is that we actually allow you to change the scaler parameters on a frame-by-frame basis."

Dynamic resolution scaling isn't new - we've seen it implemented on a lot of current-gen titles. Indeed, the first example in the current generation was on a Sony title: WipEout HD. Impact on image quality can be rough at 720p, but at higher resolutions and in concert with superior scaling, it could be a viable performance equalising measure.

"I talked about CPU glitches causing frame glitches... GPU workloads tend to be more coherent frame to frame. There doesn't tend to be big spikes like you get on the CPU and so you can adapt to that," Goosen explains.

"What we're seeing in titles is adopting the notion of dynamic resolution scaling to avoid glitching frame-rate. As they start getting into an area where they're starting to hit on the margin there where they could potentially go over their frame budget, they could start dynamically scaling back on resolution and they can keep their HUD in terms of true resolution and the 3D content is squeezing. Again, from my aspect as a gamer I'd rather have a consistent frame-rate and some squeezing on the number of pixels than have those frame-rate glitches."

"From a power/efficiency standpoint as well, fixed functions are more power-friendly on fixed function units," adds Nick Baker. "We put data compression on there as well, so we have LZ compression/decompression and also motion JPEG decode which helps with Kinect. So there's a lot more to the Data Move Engines than moving from one block of memory to another."

We've been talking in-depth for over an hour and our time draws to a close. The entire discussion has been completely tech-centric, to the point where we'd almost forgotten that the November launch of Xbox One is likely to be hugely significant for Nick Baker and Andrew Goosen personally. How does it feel to see the console begin to roll off the production line after years in development?

"Yeah, getting something out is always, always a great feeling [but] my team work on multiple programs in parallel - we're constantly busy working on the architecture team," says Baker.
Goosen has the final word:

"For me, the biggest reward is to go and play the games and see that they look great and that yeah, this is why we did all that hard work. As a graphics guy it's so rewarding to see those pixels up on the screen."

 

WeasteDevil

New Member
Joined
Jun 21, 2001
Messages
109,016
Location
Salford in Castellón de la Plana
http://www.eurogamer.net/articles/digitalfoundry-vs-the-xbox-one-architects

By Richard Leadbetter, published Sunday, 22 September 2013

Digital Foundry vs. the Xbox One architects
"There's a lot of misinformation out there and a lot of people who don't get it. We're actually extremely proud of our design."

Two months away from the release of the next generation consoles, many have already made up their minds about which machine offers more gaming power before a single game has been released. Compare basic graphics and memory bandwidth specs side-by-side and it looks like a wash - PlayStation 4 comprehensively bests Xbox One to such a degree that sensible discussion of the respective merits of both consoles seems impossible. They're using the same core AMD technologies, only Sony has faster memory and a much larger graphics chip. But is it really that simple?

In the wake of stories from unnamed sources suggesting that PS4 has a significant advantage over its Xbox counterpart, Microsoft wanted to set the record straight. Last Tuesday, Digital Foundry dialled into a conference call to talk with two key technical personnel behind the Xbox One project - passionate engineers who wanted the opportunity to put their story across in a deep-dive technical discussion where all the controversies could be addressed. Within moments of the conversation starting, it quickly became clear that balance would be the theme.

"For designing a good, well-balanced console you really need to be considering all the aspects of software and hardware. It's really about combining the two to achieve a good balance in terms of performance," says Microsoft technical fellow Andrew Goosen.

"We're actually very pleased to have the opportunity to talk with you about the design. There's a lot of misinformation out there and a lot of people who don't get it - we're actually extremely proud of our design. We think we have very good balance, very good performance, we have a product which can handle things other than just raw ALU [GPU compute power]. There are also quite a number of other design aspects and requirements that we put in around things like latency, steady frame-rates and that the titles aren't interrupted by the system and other things like that. You'll see this very much as a pervasive ongoing theme in our system design."

"Andrew said it pretty well: we really wanted to build a high performance, power-efficient box," adds hardware architecture team manager Nick Baker. "We really wanted to make it relevant to the modern living room. Talking about AV, we're the only ones to put in an AV in and out to make it media hardware that's the centre of your entertainment."

We've seen the Xbox One dash and the media functions are pretty cool, but first and foremost, it's all about the games. It's safe to say that there are two major areas of controversy surrounding the Xbox One design - specifically the areas in which it is considered weaker than the PlayStation 4: the memory set-up and the amount of GPU power on tap. Both systems have 8GB of RAM, but Sony chose 8GB of wide, fast GDDR5 with 176GB/s of peak throughput while Microsoft opted for DDR3, with a maximum rated bandwidth of just 68GB/s - clearly significantly lower. However, this is supplemented by on-chip ESRAM, which tops out at 204GB/s. In theory then, while marshalling and dividing resources between the two memory pools will be a factor, Xbox One clearly has its own approach for ensuring adequate bandwidth across the system.
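For reference, the headline numbers in that paragraph fall straight out of bus width times transfer rate (the 256-bit buses and the 5.5Gbps GDDR5 pin rate are the widely reported specs, not figures from this article):

```python
# Peak-bandwidth arithmetic behind the "68GB/s vs 176GB/s" framing.
def peak_gb_s(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

print(f"Xbox One DDR3-2133, 256-bit bus:   {peak_gb_s(256, 2.133):.0f} GB/s")
print(f"PS4 GDDR5 at 5.5Gbps, 256-bit bus: {peak_gb_s(256, 5.5):.0f} GB/s")
# Plus, on Xbox One, the on-chip ESRAM at up to ~204GB/s for whatever fits in 32MB.
```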

:lol:
 

WeasteDevil

New Member
Joined
Jun 21, 2001
Messages
109,016
Location
Salford in Castellón de la Plana
"Everybody knows from the internet that going to 14 CUs should have given us almost 17 per cent more performance," he says, "but in terms of actual measured games - what actually, ultimately counts - is that it was a better engineering decision to raise the clock. There are various bottlenecks you have in the pipeline that can cause you not to get the performance you want if your design is out of balance."

"Increasing the frequency impacts the whole of the GPU whereas adding CUs beefs up shaders and ALU," interjects Nick Baker.

"Right. By fixing the clock, not only do we increase our ALU performance, we also increase our vertex rate, we increase our pixel rate and ironically increase our ESRAM bandwidth," continues Goosen.

"But we also increase the performance in areas surrounding bottlenecks like the drawcalls flowing through the pipeline, the performance of reading GPRs out of the GPR pool, etc. GPUs are giantly complex. There's gazillions of areas in the pipeline that can be your bottleneck in addition to just ALU and fetch performance."

GPU Compute and the importance of the CPU

Goosen also believes that leaked Sony documents on VGLeaks bear out Microsoft's argument:

"Sony was actually agreeing with us. They said that their system was balanced for 14 CUs. They used that term: balance. Balance is so important in terms of your actual efficient design. Their additional four CUs are very beneficial for their additional GPGPU work. We've actually taken a very different tack on that. The experiments we did showed that we had headroom on CUs as well. In terms of balance, we did index more in terms of CUs than needed so we have CU overhead. There is room for our titles to grow over time in terms of CU utilisation."

Microsoft's approach to asynchronous GPU compute is somewhat different to Sony's - something we'll track back on at a later date. But essentially, rather than concentrate extensively on raw compute power, their philosophy is that both CPU and GPU need lower latency access to the same memory. Goosen points to the Exemplar skeletal tracking system on Kinect on Xbox 360 as an example for why they took that direction.

"Exemplar ironically doesn't need much ALU. It's much more about the latency you have in terms of memory fetch, so this is kind of a natural evolution for us," he says. "It's like, OK, it's the memory system which is more important for some particular GPGPU workloads."

The team is also keen to emphasise that the 150MHz boost to CPU clock speed is actually a whole lot more important than many believe it is.

"Interestingly, the biggest source of your frame-rate drops actually comes from the CPU, not the GPU," Goosen reveals. "Adding the margin on the CPU... we actually had titles that were losing frames largely because they were CPU-bound in terms of their core threads. In providing what looks like a very little boost, it's actually a very significant win for us in making sure that we get the steady frame-rates on our console."

This in part explains why several of the custom hardware blocks - the Data Move Engines - are geared towards freeing up CPU time. Profiling revealed that this was a genuine issue, which has been balanced with a combination of the clock speed boost and fixed function silicon - the additional processors built in to the Xbox One processor.

"We've got a lot of CPU offload going on. We've got the SHAPE, the more efficient command processor relative to the standard design, we've got the clock boost - it's in large part actually to ensure that we've got the headroom for the frame-rates," Goosen continues - but it seems that the systems's Data Move Engines can help the GPU too.

:lol:
 

WeasteDevil

New Member
Joined
Jun 21, 2001
Messages
109,016
Location
Salford in Castellón de la Plana
"Imagine you've rendered to a depth buffer there in ESRAM. And now you're switching to another depth buffer. You may want to go and pull what is now a texture into DDR so that you can texture out of it later, and you're not doing tons of reads from that texture so it actually makes more sense for it to be in DDR. You can use the Move Engines to move these things asynchronously in concert with the GPU so the GPU isn't spending any time on the move. You've got the DMA engine doing it. Now the GPU can go on and immediately work on the next render target rather than simply move bits around."

Other areas of custom silicon are also designed to help out the graphics performance.

"We've done things on the GPU side as well with our hardware overlays to ensure more consistent frame-rates," Goosen adds. "We have two independent layers we can give to the titles where one can be 3D content, one can be the HUD. We have a higher quality scaler than we had on Xbox 360. What this does is that we actually allow you to change the scaler parameters on a frame-by-frame basis."

Dynamic resolution scaling isn't new - we've seen it implemented on a lot of current-gen titles. Indeed, the first example in the current generation was on a Sony title: WipEout HD. Impact on image quality can be rough at 720p, but at higher resolutions and in concert with superior scaling, it could be a viable performance equalising measure.

"I talked about CPU glitches causing frame glitches... GPU workloads tend to be more coherent frame to frame. There doesn't tend to be big spikes like you get on the CPU and so you can adapt to that," Goosen explains.

"What we're seeing in titles is adopting the notion of dynamic resolution scaling to avoid glitching frame-rate. As they start getting into an area where they're starting to hit on the margin there where they could potentially go over their frame budget, they could start dynamically scaling back on resolution and they can keep their HUD in terms of true resolution and the 3D content is squeezing. Again, from my aspect as a gamer I'd rather have a consistent frame-rate and some squeezing on the number of pixels than have those frame-rate glitches."

"From a power/efficiency standpoint as well, fixed functions are more power-friendly on fixed function units," adds Nick Baker. "We put data compression on there as well, so we have LZ compression/decompression and also motion JPEG decode which helps with Kinect. So there's a lot more to the Data Move Engines than moving from one block of memory to another."

We've been talking in-depth for over an hour and our time draws to a close. The entire discussion has been completely tech-centric, to the point where we'd almost forgotten that the November launch of Xbox One is likely to be hugely significant for Nick Baker and Andrew Goosen personally. How does it feel to see the console begin to roll off the production line after years in development?

"Yeah, getting something out is always, always a great feeling [but] my team work on multiple programs in parallel - we're constantly busy working on the architecture team," says Baker.
Goosen has the final word:

"For me, the biggest reward is to go and play the games and see that they look great and that yeah, this is why we did all that hard work. As a graphics guy it's so rewarding to see those pixels up on the screen."

:lol:
 

WeasteDevil

New Member
Joined
Jun 21, 2001
Messages
109,016
Location
Salford in Castellón de la Plana
The combined system bandwidth controversy

Baker is keen to tackle the misconception that the team has created a design that cannot access its ESRAM and DDR3 memory pools simultaneously. Critics say that they're adding the available bandwidths together to inflate their figures and that this simply isn't possible in a real-life scenario.

"You can think of the ESRAM and the DDR3 as making up eight total memory controllers, so there are four external memory controllers (which are 64-bit) which go to the DDR3 and then there are four internal memory controllers that are 256-bit that go to the ESRAM. These are all connected via a crossbar and so in fact it will be true that you can go directly, simultaneously to DRAM and ESRAM," he explains.

The controversy surrounding ESRAM has taken the design team very much by surprise. The notion that Xbox One is difficult to work with is perhaps quite hard to swallow for the same team that produced Xbox 360 - by far and away the easier console to develop for, especially so in the early years of the current console generation.

"This controversy is rather surprising to me, especially when you view as ESRAM as the evolution of eDRAM from the Xbox 360. No-one questions on the Xbox 360 whether we can get the eDRAM bandwidth concurrent with the bandwidth coming out of system memory. In fact, the system design required it," explains Andrew Goosen.

"We had to pull over all of our vertex buffers and all of our textures out of system memory concurrent with going on with render targets, colour, depth, stencil buffers that were in eDRAM. Of course with Xbox One we're going with a design where ESRAM has the same natural extension that we had with eDRAM on Xbox 360, to have both going concurrently. It's a nice evolution of the Xbox 360 in that we could clean up a lot of the limitations that we had with the eDRAM.

"The Xbox 360 was the easiest console platform to develop for, it wasn't that hard for our developers to adapt to eDRAM, but there were a number of places where we said, 'gosh, it would sure be nice if an entire render target didn't have to live in eDRAM' and so we fixed that on Xbox One where we have the ability to overflow from ESRAM into DDR3, so the ESRAM is fully integrated into our page tables and so you can kind of mix and match the ESRAM and the DDR memory as you go... From my perspective it's very much an evolution and improvement - a big improvement - over the design we had with the Xbox 360. I'm kind of surprised by all this, quite frankly."
This is total bollocks for example.

You still have to get stuff in and out of the eSRAM to the main RAM, and that can only be done at 68GB/s max.
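To put a number on that objection: moving the entire 32MB of ESRAM contents to or from main RAM is bounded by the 68GB/s DDR3 bus. A rough back-of-envelope, ignoring any other traffic competing for that bus:

```python
# Best-case time to move the full 32MB ESRAM contents over the 68GB/s DDR3 bus.
esram_bytes = 32 * 1024 * 1024
ddr3_bytes_per_s = 68e9

transfer_ms = esram_bytes / ddr3_bytes_per_s * 1000
print(f"Full 32MB transfer: ~{transfer_ms:.2f} ms (a 60fps frame is 16.7ms)")
```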
 

Redlambs

Creator of the Caftards comics
Joined
Aug 22, 2006
Messages
42,421
Location
Officially the best poker player on RAWK.
This is total bollocks for example.

You still have to get stuff in and out of the eSRAM to the main RAM, and that can only be done at 68GB/s max.
Therein lies Cider's problem.

He's unwilling to accept a power percentage figure when it favours the PS4, but is all too willing to accept a clearly fudged figure in favour of the XB1. Fanboyism at its finest, except he takes it to new levels by attempting to pass off other people's posts as his own. It's sad.

Still, he has a Wii so that puts him automatically at a higher level than you, you big dope.
 

Alock1

Wears XXXL shirts and can't type ellipses
Joined
Nov 30, 2011
Messages
16,109
Cider still cutting and pasting from elsewhere I see, whilst failing to understand what he's putting out there.

This thread is beyond tedious now.
Dude.. this is a football forum - if you want opinions and discussions with only people who you deem to have the relevant expertise, I'm sorry but you're in the wrong place.

If you deem the article bollocks, fine.. but what is wrong with him posting it here? You can't deny the expertise of the people in the article; of course they have a vested interest - but that doesn't deem everything said worthless.

Ciderman90210 and Weaste have both trolled in this thread; and you've accused me of the same.. but at least try to add to discussion rather than just dismissing everybody else's opinion because you know better.
If you can't be bothered to take the time to discuss and make use of your expertise in this thread; just simply use it to undermine everybody else - why are you bothering, seriously?

Every post you make recently is; '[insert name here] is a fanboy, he just doesn't understand anything. i'm not a fan boy and know everything; so I'm gonna claim everything else is bullshit without giving any sort of explanation, or trying to add to the discussion.'
 

WeasteDevil

New Member
Joined
Jun 21, 2001
Messages
109,016
Location
Salford in Castellón de la Plana
Dude.. this is a football forum - if you want opinions and discussions with only people who you deem to have the relevant expertise, I'm sorry but you're in the wrong place.

If you deem the article bollocks, fine.. but what is wrong with him posting it here? You can't deny the expertise of the people in the article; of course they have a vested interest - but that doesn't deem everything said worthless.

Ciderman90210 and Weaste have both trolled in this thread; and you've accused me of the same.. but at least try to add to discussion rather than just dismissing everybody else's opinion because you know better.
If you can't be bothered to take the time to discuss and make use of your expertise in this thread; just simply use it to undermine everybody else - why are you bothering, seriously?

Every post you make recently is; '[insert name here] is a fanboy, he just doesn't understand anything. i'm not a fan boy and know everything; so I'm gonna claim everything else is bullshit without giving any sort of explanation, or trying to add to the discussion.'
:lol:

It's not the football forum, it's the Entertainment forum. And YOU are talking to a real developer here you tit!
 

Redlambs

Creator of the Caftards comics
Joined
Aug 22, 2006
Messages
42,421
Location
Officially the best poker player on RAWK.
Dude.. this is a football forum - if you want opinions and discussions with only people who you deem to have the relevant expertise, I'm sorry but you're in the wrong place.

If you deem the article bollocks, fine.. but what is wrong with him posting it here? You can't deny the expertise of the people in the article; of course they have a vested interest - but that doesn't deem everything said worthless.

Ciderman90210 and Weaste have both trolled in this thread; and you've accused me of the same.. but at least try to add to discussion rather than just dismissing everybody else's opinion because you know better.
If you can't be bothered to take the time to discuss and make use of your expertise in this thread; just simply use it to undermine everybody else - why are you bothering, seriously?

Every post you make recently is; '[insert name here] is a fanboy, he just doesn't understand anything. i'm not a fan boy and know everything; so I'm gonna claim everything else is bullshit without giving any sort of explanation, or trying to add to the discussion.'
Go bleat on elsewhere. Sure you like having your tongue up Cider's arse so much, but my point is simple.

How can your beloved Cider talk about figures being bullshit on one side, then post a load of shite about figures on the other side?

If he was able to explain that then fine, but can he?
 

amolbhatia50k

Sneaky bum time - Vaccination status: dozed off
Joined
Nov 8, 2002
Messages
96,065
Location
india
Dude.. this is a football forum - if you want opinions and discussions with only people who you deem to have the relevant expertise, I'm sorry but you're in the wrong place.

If you deem the article bollocks, fine.. but what is wrong with him posting it here? You can't deny the expertise of the people in the article; of course they have a vested interest - but that doesn't deem everything said worthless.

Ciderman90210 and Weaste have both trolled in this thread; and you've accused me of the same.. but at least try to add to discussion rather than just dismissing everybody else's opinion because you know better.
If you can't be bothered to take the time to discuss and make use of your expertise in this thread; just simply use it to undermine everybody else - why are you bothering, seriously?

Every post you make recently is; '[insert name here] is a fanboy, he just doesn't understand anything. i'm not a fan boy and know everything; so I'm gonna claim everything else is bullshit without giving any sort of explanation, or trying to add to the discussion.'
There's so much fanboyism in this thread, you expect it not to be pointed out and laughed at?

The problem in this thread is clearly the ones calling out the fanboi crap. Clearly.
 

x42bn6

Full Member
Joined
Jun 23, 2008
Messages
18,887
Location
西田麻衣の谷間. Being a nerd, geek and virgin
"Yeah, I think that's right. In terms of getting the best possible combination of performance, memory size, power, the GDDR5 takes you into a little bit of an uncomfortable place," says Nick Baker.
I wonder if Microsoft have unwittingly and subconsciously described how they feel about GDDR5 in the PS4 here. This is a really weird statement and reeks of FUD.
 

WeasteDevil

New Member
Joined
Jun 21, 2001
Messages
109,016
Location
Salford in Castellón de la Plana
I wonder if Microsoft have unwittingly and subconsciously described how they feel about GDDR5 in the PS4 here. This is a really weird statement and reeks of FUD.
Because it's bollocks! I've said it before, as soon as Cerny said 1.84TF, 8GB GDDR5, 176GB/s (this was higher BTW at one point) they shit themselves. It's been nothing more than damage control ever since.
 

WeasteDevil

New Member
Joined
Jun 21, 2001
Messages
109,016
Location
Salford in Castellón de la Plana
Lambs, I don't get this bit!

"If you're only doing a read you're capped at 109GB/s, if you're only doing a write you're capped at 109GB/s," he says. "To get over that you need to have a mix of the reads and the writes but when you are going to look at the things that are typically in the ESRAM, such as your render targets and your depth buffers, intrinsically they have a lot of read-modified writes going on in the blends and the depth buffer updates. Those are the natural things to stick in the ESRAM and the natural things to take advantage of the concurrent read/writes."

It's not possible, the bus is 4x256 no? So 1024 bit. It can surely only push that amount of data per cycle. How the feck does it mix them? It's 110GB/s no? Is this tosser talking about reading from one 8MB block and writing to the other? Still doesn't make any sense. It would still be 256 + 256. You can't read and write at the same time!
 

Alock1

Wears XXXL shirts and can't type ellipses
Joined
Nov 30, 2011
Messages
16,109
There's so much fanboyism in this thread, you expect it not to be pointed out and laughed at?

The problem in this thread is clearly the ones calling out the fanboi crap. Clearly.
Somebody posted an article; just because they have an agenda of posting pro Xbox articles doesn't make that particular notion 'fanboyism'. It's an article relevant to the thread and there is no problem in posting it. We have posted articles in here from Sony experts; with their opinion being deemed worthy of discussion even though they have a bias/vested interest in the products we are talking about - why should it be different for Microsoft employees?

Go bleat on elsewhere. Sure you like having your tongue up Cider's arse so much, but my point is simple.

How can your beloved Cider talk about figures being bullshit on one side, then post a load of shite about figures on the other side?

If he was able to explain that then fine, but can he?
Him being a hypocrite doesn't mean what he posted was irrelevant to the topic, I found it interesting. I reserve judgment but honestly expect the PS4 console to perform better, but I definitely thought there were some interesting points made - even if the points made were with a PR spin on them. Why not respond to the points made in Ciderman90210's article rather than simply dismissing it without giving a reason, and criticizing him? My point is, your way of responding with 'Cider/Weaste/Alock hasn't got a clue' doesn't add anything to the discussion.

You and Weaste telling us that you have knowledge/expertise that we don't is fine; and I respect that - but it doesn't mean I'm just going to take your word on everything, especially if you don't make any effort to showcase your expertise and just say 'it's bollocks'. Quote the article, call it out on its bullshit or maybe point out if there is any good reasoning in there - or why bother commenting? It just seems unnecessary and is as bad as any 'fanboy' trolling.

In fact, you might not realise, but you are talking to a real life game developer and a fecking computer scientist.
I thought you were an English teacher with a degree in Computer Science? But still; so what? But either way, you still have an agenda in this thread, and for that reason you make it hard for people to take you seriously and actually believe the stuff you come out with. I understand you know more than me about this, but when every post made is either anti-ms/xbox or pro-sony with no in between - what do you expect?

Redlambs himself said only a few pages back that it's irrelevant how many games he has worked on or which games he has worked on. There have been articles posted in here which include the opinion of developers, game critics and other people with some sort of expertise in the gaming field - anything pro Xbox you are happy to dismiss as bollocks; ignoring that they might know what they are talking about. If it's pro Sony, you are more than happy to talk it up.

--
It's nice on forums like this which aren't tech specific to have guys who know what they are talking about.. but all you do is make fun of other people's lack of knowledge, and assume superiority on the discussion. Why not put your information to good use, and explain to us without any sort of agenda, without any sort of 'know-it-all' tint on it.
 

WeasteDevil

New Member
Joined
Jun 21, 2001
Messages
109,016
Location
Salford in Castellón de la Plana
I thought you were an English teacher with a degree in Computer Science?
I used to be a University lecturer in Manchester focusing on OOA and OOD and programming in 68K ASM, Sparc ASM, and C and C++ (generally under Solaris - a Unix system). I have a first class honours degree in Computer Science and an unfinished PhD. Sorry, but I married a Spanish woman and moved to Spain, so I couldn't continue with that at the time as I couldn't speak Spanish, never mind Catalan, so I changed path. I have however worked in the field and have made technical presentations to the directors of Microsoft Spain - mostly to do with what was called Navision.

What are your qualifications?
 

Hectic

Full Member
Joined
Jun 8, 2006
Messages
75,376
Supports
30fps
Fanboyism at its finest, except he takes it to new levels by attempting to pass off other people's posts as his own. It's sad.
Out of interest, are you referring to his posts on this page? If so, hasn't he gone to great lengths to obviously not pass off these posts as his own? How could you read that and think he's trying to claim it?
 

WeasteDevil

New Member
Joined
Jun 21, 2001
Messages
109,016
Location
Salford in Castellón de la Plana
Redlambs himself said only a few pages back that it's irrelevant how many games he has worked on or which games he has worked on.
It is irrelevant to the topic. It's not irrelevant to the subject, he knows his stuff. He's more of a game designer though rather than a coder, a programmer (that's me, and I'm fecking very good at it).
 

WeasteDevil

New Member
Joined
Jun 21, 2001
Messages
109,016
Location
Salford in Castellón de la Plana
Out of interest, are you referring to his posts on this page? If so, hasn't be gone to great lengths to obviously not pass off these posts as his own? How could you read that and think he's trying to claim it?
He's been trolling the whole thread for about a week! It's to the extent that people looking for real information can't, because the thread goes to shit when Cider posts his shit. At one point the thing also became a Project Spark wanking fest.

Thread ban, I'll take one as well if needs be just to remove the wanker. Shame, as I have reasonable things to say, but I'll take the thread ban if you remove that trolling cnut.
 

Hectic

Full Member
Joined
Jun 8, 2006
Messages
75,376
Supports
30fps
Somebody posted an article; just because they have an agenda of posting pro Xbox articles doesn't make that particular notion 'fanboyism'. It's an article relevant to the thread and there is no problem in posting it. We have posted articles in here from Sony experts; with their opinion being deemed worthy of discussion even though they have a bias/vested interest in the products we are talking about - why should it be different for Microsoft employees?



Him being a hypocrite doesn't mean what he posted was irrelevant to the topic, I found it interesting. I reserve judgment but honestly expect the PS4 console to perform better, but I definitely thought there were some interesting points made - even if the points made were with a PR spin on them. Why not respond to the points made in Ciderman90210's article rather than simply dismissing it without giving a reason, and criticizing him? My point is, your way of responding with 'Cider/Weaste/Alock hasn't got a clue' doesn't add anything to the discussion.

You and Weaste telling us that you have knowledge/expertise that we don't is fine; and I respect that - but it doesn't mean I'm just going to take your word on everything, especially if you don't make any effort to showcase your expertise and just say 'it's bollocks'. Quote the article, call it out on its bullshit or maybe point out if there is any good reasoning in there - or why bother commenting? It just seems unnecessary and is as bad as any 'fanboy' trolling.



I thought you were an English teacher with a degree in Computer Science? But still; so what? But either way, you still have an agenda in this thread, and for that reason you make it hard for people to take you seriously and actually believe the stuff you come out with. I understand you know more than me about this, but when every post made is either anti-ms/xbox or pro-sony with no in between - what do you expect?

Redlambs himself said only a few pages back that it's irrelevant how many games he has worked on or which games he has worked on. There have been articles posted in here which include the opinion of developers, game critics and other people with some sort of expertise in the gaming field - anything pro Xbox you are happy to dismiss as bollocks; ignoring that they might know what they are talking about. If it's pro Sony, you are more than happy to talk it up.

--
It's nice on forums like this which aren't tech specific to have guys who know what they are talking about.. but all you do is make fun of other people's lack of knowledge, and assume superiority on the discussion. Why not put your information to good use, and explain to us without any sort of agenda, without any sort of 'know-it-all' tint on it.
Great post and a very reasonable response.
 

Alock1

Wears XXXL shirts and can't type ellipses
Joined
Nov 30, 2011
Messages
16,109
I used to be a University lecturer in Manchester focusing on OOA and OOD and programming in 68K ASM, Sparc ASM, and C and C++ (generally under Solaris - a Unix system). I have a first class honours degree in Computer Science and an unfinished PhD. Sorry, but I married a Spanish woman and moved to Spain, so I couldn't continue with that at the time as I couldn't speak Spanish, never mind Catalan, so I changed path. I have however worked in the field and have made technical presentations to the directors of Microsoft Spain - mostly to do with what was called Navision.

What are your qualifications?
That's awesome.. So I was right, and you still waste your knowledge with knob head posts and an obvious agenda.

I'm currently doing a Law degree, and don't have near as much knowledge on these sorts of things as you. Luckily for me, I've never called out an opinion in this thread as bollocks.. well, not one backed up with fact/expertise - go through my posts; everything I say is clearly an opinion and on things that aren't clear cut and are open to subjectivity.

I've been called a fan boy troll in this thread; but I've probably posted more information on both consoles.. complimenting both Sony and Microsoft and also criticizing both.
 

Hectic

Full Member
Joined
Jun 8, 2006
Messages
75,376
Supports
30fps
He's been trolling the whole thread for about a week!
That I do not doubt at all, I meant specifically this page though with all that Xbox article stuff, surely no-one thought he was trying to claim that as his own, or was Redlambs' post referring to the stuff from the past week?
 

amolbhatia50k

Sneaky bum time - Vaccination status: dozed off
Joined
Nov 8, 2002
Messages
96,065
Location
india
Out of interest, are you referring to his posts on this page? If so, hasn't he gone to great lengths to obviously not pass off these posts as his own? How could you read that and think he's trying to claim it?
No I think that was earlier.
 

Redlambs

Creator of the Caftards comics
Joined
Aug 22, 2006
Messages
42,421
Location
Officially the best poker player on RAWK.
Alock, the point you are making about me is utter rubbish. I've been on here a long time doing this and I've said plenty about these consoles, well before the popular sites and magazines knew anything too.

What I don't do is comment on every single thing and go over it time and time again, I used to and it got boring for everyone. Saying I add nothing to the debate is a huge load of bollocks, but you are new to all this so I accept you haven't been around when I release info or add my opinion on this sort of thing.

If I want to talk about the internals of these machines again then I will. But when you have someone like Cider who really isn't interested in it and is mostly trolling (and if you don't believe he is, then you really don't know him and his behaviour on this site), then it becomes galling. If you want to look back, I've tried talking to him properly but he doesn't really want to know. And the copying and pasting posts isn't about that article he pasted either, it's about taking other people's opinions from other forums and passing them off as his own. Get that straight.

As I've said, for years I've been doing this and I choose to release info on here and very rarely specific gaming sites BECAUSE it's a football forum that used to be full of genuinely interested and impartial people, people that I like. So don't fecking lecture me on that side of it, it's not my fault Cider has come in here and tried to make it Neogash pt.2.
 

WeasteDevil

New Member
Joined
Jun 21, 2001
Messages
109,016
Location
Salford in Castellón de la Plana
That's awesome.. So I was right, and you still waste your knowledge with knob head posts and an obvious agenda.
What agenda do you think I have? If the XB1 was more powerful than the PS4 I would tell you so. The problem is it's not, and Microsoft dancing about with their PR to try to somehow remove the power difference concerns me even more, and I will call them out on it. This is a company that wanted to totally shaft you, yet they were forced to make a turnaround. Believe the shit they say if you want, there is a significant difference in performance between XB1 and PS4.
 

Redlambs

Creator of the Caftards comics
Joined
Aug 22, 2006
Messages
42,421
Location
Officially the best poker player on RAWK.
Great post and a very reasonable response.
It would be if he kept up with the points being made.

Cider is trolling, he's been caught passing off posts from other sites as his own and has dragged this thread down.

I have no problem with posting things about the consoles or articles. As for Alock himself, he is moaning about me calling him a fanboy ages ago, but that's exactly how he entered this thread. I appreciate he's changed tack now though; other than the brown-nosing of Cider, I have no issue with him and think he often adds to the discussion well.