The heat signature of a 4U rig is the equivalent of a full 42U rack, and most data centers are designed for that density spread out over 42U, not 4. It's like trying to cool a heat gun in a shoebox. Very tough. The heat problem (at least here in the US) hasn't been much of an issue because it was winter and the heat was a desirable benefit along with the bitcoin mined. There are liquid-cooled solutions available, and Martin Enclosures just released a new line of racks - http://www.martinenclosures.com/product/bitrack/ - that sounds like it was developed to help deal with the heat problem.
This totally confuses me. Let's take the most dense system on the market, the SP31: you are telling me most 42U racks don't pull more than 6 kW (say, a single NEMA L6-30P)? I am not sure what rock you are living under, but I have been running and/or hosting gear in data centers since the '90s, and it was a rare rack that had such low power usage.
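For reference, the "roughly 6 kW" figure for a single NEMA L6-30P works out from the plug's 30 A rating. A minimal sketch, assuming a 208 V or 240 V supply and the standard 80% continuous-load derating (the supply voltage and derating rule are assumptions, not figures from this thread):

```python
# Rough circuit-capacity math for a single NEMA L6-30P feed.
# Assumed: 208 V or 240 V supply, 80% continuous-load derating.

def usable_watts(amps: float, volts: float, derate: float = 0.8) -> float:
    """Continuous power a branch circuit can safely deliver."""
    return amps * volts * derate

# NEMA L6-30P is a 30 A locking plug.
print(usable_watts(30, 208))  # 4992.0 W
print(usable_watts(30, 240))  # 5760.0 W -- roughly the "6 kW" figure
```

So a rack fed by one such circuit tops out just under 6 kW of continuous draw.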
I can point to three global banks, two media companies, two global ad agencies, a computer manufacturer, and an $8B outsourcing company, and tell you with the degree of accuracy that only comes from being embedded in their operations groups on projects that their rack loads are 2.4-3.6 kW today. I wouldn't have believed it.
Contrast that with animation and geospatial (oil & gas) shops, which run 30 kW/cabinet or more for their applications. I have blogged about high density and the HPC markets, and how density will be a bazillion watts a foot, but the reality doesn't support the projections in a broad sense. In pockets, yes; generally, no.
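To put that gap in perspective, here is a quick per-square-foot comparison using the loads mentioned above. The 25 ft² footprint (cabinet plus its share of aisle and support space) is an illustrative assumption, not a figure from this thread:

```python
# Convert the rack loads above into floor-area density.
# Assumed footprint: 25 ft^2 per cabinet including aisle share.

def watts_per_sqft(rack_kw: float, footprint_sqft: float = 25.0) -> float:
    """Floor density implied by a given per-cabinet load."""
    return rack_kw * 1000 / footprint_sqft

print(watts_per_sqft(3.0))   # 120.0 W/ft^2 -- typical enterprise rack
print(watts_per_sqft(30.0))  # 1200.0 W/ft^2 -- HPC / geospatial pocket
```

A 10x spread like that is why a building provisioned for the first number can't simply absorb the second wall to wall.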
High density applications run in *some* racks, not in every single rack from wall to wall. The building simply wasn't designed with that in mind.
The high density environments I am most familiar with are in converted shipping containers. At one animation studio they run 34 kW/cabinet in every cabinet in specially designed computer rooms. They exist, just not broadly.