Slide 4 of 28

The concept of eDRAM sounds intuitive enough, but why hasn't it seen much use until recent years?
One major reason is the cost of integrating capacitors into the CPU chip. Unlike SRAM, which is built purely from transistors, each DRAM cell pairs a capacitor with a transistor. Small capacitors are cheap to produce and can be densely packed on a dedicated DRAM process, but they become much more costly to fabricate on a transistor-only logic process. This is why, until recently, DRAM has been manufactured on separate dies. The bandwidth gains of on-chip memory, however, have now made this integration worthwhile.

Bottom line: expect eDRAM to be more expensive per bit than conventional DRAM, but to provide better performance.


Amusingly enough, the Xbox 360's 10 MB of eDRAM isn't enough to hold a full 1280x720 framebuffer once multisample anti-aliasing is enabled, forcing developers to render in tiles. Here's a post that's rather critical of the 360's eDRAM.
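The back-of-the-envelope arithmetic is easy to check. The sketch below assumes 4 bytes per pixel for color and 4 for depth/stencil (common 32-bit formats; the 360's actual buffer layouts may differ), and shows that a plain 720p render target fits in 10 MB while the MSAA variants do not:

```python
# Rough framebuffer-size estimates for a 1280x720 render target.
# Assumption: 4 B color + 4 B depth/stencil per sample (32-bit formats).
EDRAM_BYTES = 10 * 1024 * 1024  # the Xbox 360's 10 MB of eDRAM

def framebuffer_bytes(width, height, msaa=1, bytes_per_sample=8):
    """Total bytes for color + depth at the given MSAA sample count."""
    return width * height * msaa * bytes_per_sample

for msaa in (1, 2, 4):
    size = framebuffer_bytes(1280, 720, msaa)
    verdict = "fits" if size <= EDRAM_BYTES else "does NOT fit"
    print(f"{msaa}x MSAA: {size / 2**20:.1f} MB -> {verdict} in 10 MB")
```

Under these assumptions, 1x is about 7 MB (fits), while 2x (~14 MB) and 4x (~28 MB) overflow the eDRAM, which is exactly why tiled rendering was needed.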

Overall, this seems to reflect the issues we've seen with the increasing use of heterogeneous hardware. One concern is that it makes software that takes full advantage of the hardware harder to write. Another is that balancing the hardware (in this case, the choice of 10 MB) is a very important design decision. This is similar to the issues discussed in (lecture 22, slide 28).