Not all DDR3 SDRAM is created equal. That’s the message Samsung is spreading lately by talking about its 20nm-class DDR3 SDRAM. The company is using 1.5V, 50nm-class DDR3 SDRAM as a benchmark and says that a server loaded with 96Gbytes of the 1.5V DDR3 SDRAM consumes 65.3W just in the DRAM. Multiply that by the tens of thousands of servers in a data center and you’ll quickly realize that the energy and cooling costs just for the DRAM are significant. Compare that with the 21.8W that Samsung claims is consumed by its 1.35V, 20nm-class DDR3 SDRAM. That’s a 67% reduction in power consumption, but it’s not just a result of the lower operating voltage, because the 30nm-class DDR3 SDRAM also runs at 1.35V yet consumes 33.6W, roughly 50% more than the 20nm-class parts.
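If you want to check those percentages yourself, here’s a quick back-of-the-envelope sketch in Python. The wattages are Samsung’s; the arithmetic is mine:

```python
# Back-of-the-envelope check of the quoted DRAM power figures for a
# server loaded with 96Gbytes of DDR3 SDRAM (wattages from Samsung).
p_50nm = 65.3   # W, 1.5V 50nm-class DDR3
p_30nm = 33.6   # W, 1.35V 30nm-class DDR3
p_20nm = 21.8   # W, 1.35V 20nm-class DDR3

# Savings of the 20nm-class parts versus the 50nm-class baseline.
print(f"20nm vs 50nm: {1 - p_20nm / p_50nm:.0%} less power")  # ~67%

# The 30nm-class parts run at the same 1.35V yet draw noticeably more,
# so the lower voltage alone can't explain the difference.
print(f"30nm vs 20nm: {p_30nm / p_20nm - 1:.0%} more power")  # ~54%
```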
Why? According to this article in The Register, “The paths traversed by the electrons in the smaller process chips are shorter, so less energy is needed to push them around the chips.” That’s from Peyman Blumstengel, a senior manager for strategic business development at Samsung Semiconductor Europe GmbH. In tech speak, I’d say that the lower impedances of the shorter on-chip traces permit the use of lower-power I/O drivers.
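To put some rough numbers behind that, here’s a first-order sketch of the classic CMOS switching-power relation, P = C·V²·f. Trace capacitance scales roughly with trace length, so shorter on-chip paths mean less switched capacitance. The capacitance and frequency figures below are invented purely for illustration:

```python
# First-order CMOS switching-power relation: P = C * V^2 * f.
def dynamic_power(c: float, v: float, f: float) -> float:
    """Switching power in watts: effective capacitance * voltage^2 * frequency."""
    return c * v**2 * f

f_io = 800e6    # Hz, an arbitrary example I/O clock
c_old = 2.0e-9  # F, hypothetical aggregate switched capacitance, older node
c_new = 1.3e-9  # F, hypothetical: shorter traces at the smaller node

# The 1.5V -> 1.35V drop alone buys (1.35/1.5)^2, about a 19% saving...
print(dynamic_power(c_old, 1.35, f_io) / dynamic_power(c_old, 1.5, f_io))  # ~0.81

# ...and the reduced capacitance compounds on top of the voltage drop.
print(dynamic_power(c_new, 1.35, f_io) / dynamic_power(c_old, 1.5, f_io))  # ~0.53
```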
Lots more numbers and figures on Samsung’s site, here. Oh, and I think this will be a topic of discussion at Memcon in September. You can sign up over there on the right side of this blog.
This is unworthy. It is not the lower impedance of the tracking that accounts for the reduced power, but a combination of higher parasitic impedances, i.e.:
a) the reduced tracking and gate capacitance (due to reduced track lengths), and
b) increased leakage impedance (due presumably to a change to fully-depleted channels)
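Putting the commenter’s two factors into a toy model, total power as a dynamic C·V²·f term plus a static V·I_leak term; every component value here is invented purely to show the structure:

```python
# Toy model of the commenter's two factors: a dynamic C*V^2*f term set
# by parasitic capacitance plus a static V*I_leak term set by leakage.
def total_power(c: float, v: float, f: float, i_leak: float) -> float:
    """Dynamic switching power plus static leakage power, in watts."""
    return c * v**2 * f + v * i_leak

# Hypothetical older node: more track/gate capacitance, more leakage.
old = total_power(c=3.0e-9, v=1.5, f=800e6, i_leak=1.0)
# Hypothetical smaller node: shorter tracks (less capacitance) and
# higher leakage impedance (less leakage current).
new = total_power(c=2.0e-9, v=1.35, f=800e6, i_leak=0.4)
print(f"old node: {old:.1f} W, smaller node: {new:.1f} W")  # 6.9 W vs 3.5 W
```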