Innovations and approaches to go green in Asia
Increasing the energy efficiency of data centers in the tropics is not hopeless, but it is tougher, and the gains are significantly smaller than for similar facilities in temperate climates.
Yet tropical Southeast Asia is also home to two-thirds of a billion people, and its digital economy is expanding rapidly. As more data centers are built in the region, are there innovations or operational strategies that can help data center providers go green?
Dabbling in hybrid liquid cooling

In a video call, Lee walked DCD through a hybrid cooling system he developed for data centers. Much like the water-cooling solutions popular with PC gaming enthusiasts, the coolant is contained within a closed loop, effectively eliminating water loss. Lee takes this further, feeding the water into a heat exchanger at the rack level, which transfers the heat to an outdoor dry cooler via another loop.
When tested under loads of 32kW, Lee says his hybrid cooling system achieved a PUE of just 1.091, with the advantage of being far simpler to implement than full immersion liquid cooling. And since 95 percent of the heat is removed via liquid cooling, Lee says his solution could allow data centers to be built without chillers and CRAC units. Without these systems and the raised floors that normally accompany them, the result is less M&E work, lower costs, and faster fit-out of data centers.
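The arithmetic behind those figures is straightforward. A minimal sketch using the numbers reported above (the function name and the 5 percent air-side remainder are illustrative, not from Lee's design documents):

```python
def pue(it_power_kw: float, overhead_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power."""
    return (it_power_kw + overhead_kw) / it_power_kw

# Figures reported for Lee's hybrid system: a 32kW IT load at PUE 1.091
# implies roughly 2.9kW of cooling and power-distribution overhead.
it_load = 32.0
overhead = it_load * 0.091
print(round(pue(it_load, overhead), 3))  # 1.091

# If 95 percent of the heat leaves via the liquid loop, only about 5
# percent must be handled by room-level air cooling such as CRAC units.
air_side_heat = it_load * 0.05
print(round(air_side_heat, 1))  # 1.6 (kW)
```

The small air-side remainder is what makes a chiller-free, CRAC-free design plausible in this scenario.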
But getting a greenfield data center built without chillers is a tough sell in most parts of the world, and even more so in tropical locations. However, a surprise finding in tests conducted at AI Singapore earlier this year prompted Lee to commercialize his hybrid cooling solution for existing facilities.
On a single-rack testbed consisting of 20 servers, each with one AMD Epyc processor and four Nvidia RTX 2080 GPUs, Lee says his hybrid cooling system recorded a surprising reduction in IT power of 25 percent. The sharp drop in junction temperature on the microprocessors led to the large fall in IT power consumption, Lee said. This was possible due to the use of liquid cooling, along with a unique, high-efficiency "oblique fin" cold plate he designed.
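To put the 25 percent figure in perspective, here is a rough back-of-the-envelope sketch. The per-component wattages are assumptions based on typical TDPs, not measurements from the testbed:

```python
# Assumed nameplate figures (approximate TDPs, not from the article).
CPU_W = 200            # one AMD Epyc processor per server
GPU_W = 215            # one Nvidia RTX 2080 GPU
SERVERS = 20
GPUS_PER_SERVER = 4

# Compute-only rack power, ignoring fans, drives, and PSU losses.
rack_watts = SERVERS * (CPU_W + GPUS_PER_SERVER * GPU_W)
print(rack_watts)  # 21200

# The reported 25 percent saving, attributed to lower junction
# temperatures (which reduce leakage current and fan speeds).
saving = rack_watts * 0.25
print(rack_watts - saving)  # 15900.0
```

On these assumed numbers, the saving would be on the order of 5kW for a single rack, which is why the result surprised the researchers.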
This means that existing data centers could deploy Lee's solution to benefit from considerably lower power consumption. By removing the second cooling loop and rejecting the waste heat through standard rear-door heat exchangers, no construction alterations would be required. Lee says he is now in advanced discussions with at least two operators in the region about his solution.
So why has hybrid liquid cooling not found greater use? Lee hypothesized that this might be because the vast majority of equipment vendors come from the US or Europe. With lower-hanging fruit such as ambient air cooling available in temperate climates, there may be little urgency to develop cooling systems targeted specifically at tropical deployments.
Efficiency begins with the design
But standard systems deployed well can make a world of difference, according to Darren Hawkins, CEO of SpaceDC. "Building an efficient data center boils down to the design," he told DCD in a call. The Singapore-based provider is currently building a data center campus in Indonesia that is scheduled to open in the second half of this year. Façade elements will be installed to keep solar heat gain to a minimum, says Hawkins. Of course, a larger data center also allows for a lower carbon footprint, thanks to factors such as economies of scale, a reduced PUE, and fewer people needed for security and maintenance.
A common error is underestimating the importance of the right skills for the deployed systems: "Skillset is particularly important, especially with the complex, large data centers that we are building.

"You want to ensure that the technology that you deploy is controlled and maintained at its peak performance. Your tech is your enabler. But if your operations team is not familiar with it, then that could lead to further troubles."
When adaptability is critical
Settling for the lowest common denominator is likely not on his team's mind, considering that SpaceDC's forthcoming facilities don't use raised flooring, and don't even rely on the power grid for normal operations. Instead, JAK1 and JAK2 will use natural gas-driven reciprocating engines to power the data center campus.
Hawkins says that this less common option was chosen after an exhaustive audit of the energy sources available in Jakarta: their quality, history of distribution, the available capacity, and how power is distributed. "We were easily able to identify that the grid has several issues each month. They are not necessarily blackouts, but low voltages and high harmonics. This means that data centers here will normally go to generator power," he explained.
In view of this, delivering the most resilient data center design meant adapting with an on-site power plant. As a bonus, the gas-powered generators allow absorption chillers to recycle their waste heat for cooling, while the cleaner power from the generators translates into a more efficient energy chain, with equipment such as UPS units operating in high-efficiency mode.
Hawkins attributes this to the ability to adapt: "We supply everything [hyperscalers] need in terms of continuous cooling, contiguous space, and reliable power. We take that and turn it into a building that is adapted to the climate. The requirements are the same, but how you deliver them is very different. It is this ability to translate and deliver across different cities: not only to cope, but to execute."
With new innovations and a willingness to adapt, there is no reason data center operators in tropical climates cannot build energy-efficient facilities.