How an alleged employee slanders us (3) – WEDOS and free-cooling


So here we have another article in the campaign against WEDOS. The author no longer claims to be our employee, but he makes another false claim about us.

The third article has already been published, in which our unsuccessful competitors vent their frustration and business impotence :-). So let's write the truth about how things really are at WEDOS.

Economically and ecologically only with WEDOS

From the beginning we have said that we approach everything economically, and in the end it turns out to be ecological as well. We chose the most economical solution from the start. We picked the most energy-efficient servers on the market and we use only energy-efficient processors, which is so unusual that even Intel wrote about us on their English website.

We are gradually improving everything and saving more and more on operating costs, especially electricity. It started with the servers, continued with LED lighting in the building and free-cooling, and ended with buying the most economical UPS on the market (only rotary UPSs are more economical).

We now even heat our offices with waste warm air from the server room, saving nearly CZK 100,000 a year in gas costs.

Over time we have also taken the trouble to build a cold aisle between the racks, and we have sealed all the gaps in all the rack cabinets to prevent leakage and mixing of cold and warm air.

We claim that we are probably one of the most environmentally friendly datacenters far and wide. 🙂

What’s the cooling situation here? Like this:

Why are we doing this? Does that make sense? Definitely!

It is no secret that these measures bring real savings of hundreds of thousands of crowns per month, and we only have about 240 servers. Project that into the future… it will be millions a month that we will have and our competitors will not. That can amount to tens of millions of crowns a year, which we can invest in development or new services for our customers.

To illustrate, our servers currently draw over 50 kW of power. They contain energy-efficient processors; without them, consumption would be more than 20 kW higher. Without these special energy-saving servers, real consumption would be tens of kW higher.

If we didn't have other cost-saving measures, we would bleed financially. Without the cold aisle alone we would pay hundreds of thousands of crowns more per month for electricity. With a cold aisle, cold air is blown under a raised floor, which is closed and has openings (grilles) only where cooling is needed. The cooler air under the floor passes up through the floor vents and is used for cooling. The fronts of the servers are enclosed in the so-called cold aisle, where the cold air arrives and has nowhere to escape, because the aisle is closed on all sides. The air therefore has to pass through the servers, where it warms up, so it is significantly warmer behind the servers than in front of them. The warm air is extracted, re-cooled and reused. The underfloor space is significantly cooler than the room itself, and the hot and cold air do not mix, which avoids energy losses.
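As a rough sanity check on the physics behind the cold aisle, the temperature rise of the air passing through the servers follows from the heat load and the airflow. The sketch below uses only the two figures quoted in this article (about 50 kW of heat and 30,000 m3/hr of circulated air); the air density and specific heat are standard textbook approximations.

```python
# Rough sanity check, using only figures quoted in the article:
# how much does the air warm up between the cold and hot aisle,
# given ~50 kW of heat and ~30,000 m3/hr of circulated air?

RHO_AIR = 1.2    # kg/m^3, air density at room conditions (approx.)
CP_AIR = 1005.0  # J/(kg*K), specific heat of air (approx.)

def air_temperature_rise(heat_w: float, flow_m3_per_h: float) -> float:
    """Temperature rise of the airstream: dT = P / (m_dot * c_p)."""
    mass_flow = RHO_AIR * flow_m3_per_h / 3600.0  # kg/s
    return heat_w / (mass_flow * CP_AIR)

dt = air_temperature_rise(50_000, 30_000)
print(f"Air warms by about {dt:.1f} degrees C across the servers")
```

With these numbers the air warms by roughly 5 °C on its way through the racks, which is why keeping the two airstreams separated matters so much.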

Why do we have our free-cooling…

Saving money comes first. We don't want to cut corners on our services, but we do want to solve everything economically.

With air conditioning, the same air keeps circulating inside the server room and is cooled by compressors. It is an expensive and energy-intensive solution. In our case, when the servers (together with the UPS) produce about 50 kW of heat, we need about 33 kW of electrical power for cooling if we use air conditioning.

If we use our free-cooling, it's different. Only about 2-12 kW of electrical power is enough to remove 50 kW of heat. In summer it is more; on cooler days, or at cooler times of day (night or morning), it is significantly less. Right now it is around 2 kW. Note that on average we save about 30 kW of power. Multiply that by the hours in a day, the days in a year and the price of electricity, and you'll see that the savings are huge: in our case currently close to CZK 1 million per year. As the number of servers grows, it will only increase.
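The arithmetic behind that claim is easy to check. The sketch below multiplies the ~30 kW average saving quoted above over a full year; the per-kWh electricity price is an illustrative assumption (not a figure from this article), chosen to show that the result lands near CZK 1 million.

```python
# Back-of-the-envelope check of the claimed ~CZK 1 million/year saving.
# The electricity price is an illustrative assumption, not a figure
# from the article.

avg_saved_kw = 30.0        # average power saved vs. compressor cooling
hours_per_year = 24 * 365  # 8760 hours
price_czk_per_kwh = 3.8    # assumed unit price of electricity

saved_kwh = avg_saved_kw * hours_per_year
saved_czk = saved_kwh * price_czk_per_kwh
print(f"{saved_kwh:,.0f} kWh/year, roughly CZK {saved_czk:,.0f}")
```

At any plausible Czech electricity price, saving 30 kW around the clock adds up to hundreds of thousands of kWh and high six figures in crowns per year.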

We installed our free-cooling this winter and it works great. It was a big investment (it was close to a million crowns in total), but it has already paid for itself.

We also have 6 air conditioning units in the data centre, each with a cooling capacity of 89 kW. So we have huge reserves and also full redundancy behind our free-cooling.

So how does it work?

Our free-cooling does not work as described in the article. It is not a matter of sucking air in from outside on one side and blowing it out on the other. It is a fairly complex mechanism that we designed ourselves specifically for this purpose.

In principle, the free-cooling system keeps air circulating throughout the server room. Colder outside air is mixed into it through dampers according to the current conditions, and warmer air is taken out (the warm air is now also taken to the upper floor to heat the offices).

Temperature (and humidity) sensors are placed in many spots all over the server room (and outside the building, in the ducts and under the floor) so that everything is monitored, and the control system (based on the KNX bus, meaning the whole server room, including lights and sockets, is controlled centrally from a PC) adjusts the damper positions and the speed of the intake and exhaust motors accordingly.

As a result, the temperature at the server intakes is stable, with no spikes. The inside temperature does not track the outside temperature; the damper and air-induction system takes care of that. The accuracy depends on how we configure it. We currently allow a temperature deviation of 0.6 °C, which is more accurate than any commercial air conditioner, and we have also tested regulation to within 0.1 °C and it works.
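The regulation described above can be pictured as a simple deadband controller: while the intake temperature stays inside the allowed band, nothing moves; outside it, the dampers are nudged. The sketch below is a minimal illustration of that idea only; the setpoint, step size and function names are assumptions, and the real KNX-based system has far more inputs and actuators.

```python
# Minimal sketch of deadband control as described in the article.
# The setpoint and step size are illustrative assumptions; only the
# 0.6 degree deadband is a figure quoted in the text.

SETPOINT = 24.0  # target intake temperature in degrees C (assumed)
DEADBAND = 0.6   # allowed deviation quoted in the article

def damper_adjustment(intake_temp: float, damper_pct: float) -> float:
    """Open the outside-air damper when too warm, close it when too
    cold; do nothing while the temperature stays inside the deadband."""
    step = 5.0  # % of damper travel per control cycle (assumed)
    if intake_temp > SETPOINT + DEADBAND:
        return min(100.0, damper_pct + step)  # admit more cold outside air
    if intake_temp < SETPOINT - DEADBAND:
        return max(0.0, damper_pct - step)    # recirculate more warm air
    return damper_pct                          # inside the band: hold

print(damper_adjustment(24.9, 50.0))  # too warm -> damper opens to 55.0
print(damper_adjustment(24.3, 50.0))  # inside deadband -> stays at 50.0
```

The deadband is what prevents the "rapid temperature changes" the critic imagines: the system only acts when the measured deviation exceeds the configured tolerance.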

If the outside temperature is too high, free-cooling switches off and the air conditioning switches on, fully automatically and without human intervention. In this exceptionally hot summer we used the air conditioning for a maximum of 100 hours, always only during the hottest afternoon hours, and in the evening the cooling switched back from air conditioning to free-cooling. The average annual temperature in Hluboká is around 9 °C, so we can use free-cooling without any problems.

Since our datacenter is a former civil defense shelter (built for wartime), partially underground and in the city centre (among residential and administrative buildings), we also have to deal with noise. The free-cooling system has special silencers that keep the noise outside below 40 decibels (even a modern refrigerator is noisier).

Our free-cooling is now equipped with motors that provide an air exchange volume of 30,000 m3/hr. Yes, 30,000,000 litres of air "swirl" through the server room every hour. Some of that air is drawn in cool from outside and some is recirculated indoor air (depending on how warm and humid it is outside). Everything is blended in the mixing chamber to the exact target temperature.
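How a mixing chamber hits an exact supply temperature follows from a simple weighted average: the mixed temperature is the flow-weighted mix of outside air and warm return air, and you can solve for the outside-air fraction. The temperatures in the sketch below are illustrative assumptions, not measured WEDOS values.

```python
# How a mixing chamber can hit an exact supply temperature: blend a
# fraction f of outside air with recirculated return air. The example
# temperatures are illustrative assumptions.

def outside_air_fraction(t_return: float, t_outside: float,
                         t_target: float) -> float:
    """Solve t_target = f*t_outside + (1-f)*t_return for f,
    clamped to the physically possible range [0, 1]."""
    if t_return == t_outside:
        return 0.0
    f = (t_return - t_target) / (t_return - t_outside)
    return max(0.0, min(1.0, f))

# Warm return air at 29 C, outside air at 10 C, target supply of 24 C:
f = outside_air_fraction(29.0, 10.0, 24.0)
print(f"Mix in about {f:.0%} outside air")
```

In this assumed example roughly a quarter of the airflow needs to come from outside; as the outside temperature drifts, the dampers simply shift that fraction.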

The whole system is controlled automatically from a PC, everything is interconnected with the air conditioning, and it all works without human intervention.

The free-cooling system is fitted with fire dampers that hermetically seal the data hall off from the other areas in case of fire. The dampers are controlled electronically directly from the control panel of the fixed fire-extinguishing system, but they are also fitted with thermal fuses and close by themselves should the electronics fail.

Free-cooling is designed to be used at outside temperatures of up to 33 °C, and the servers are certified for 33 °C intake air. In practice we use free-cooling only up to about 28 °C, because above that the servers spin up additional fans and consumption rises, at which point it pays to cool with air conditioning. You know very well how many days (or rather hours) a year it is over 28 °C in the shade…
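The automatic switchover described above amounts to threshold logic with a little slack so the system does not flip back and forth around the limit. The sketch below is an assumption-laden illustration: only the 28 °C limit comes from this article; the hysteresis value and function name are made up.

```python
# Sketch of the automatic mode switch: free-cooling up to roughly
# 28 C outside, compressor air conditioning above that. Only the
# 28 C limit is from the article; the hysteresis is assumed.

FREECOOL_LIMIT = 28.0  # degrees C outside, quoted in the article
HYSTERESIS = 1.0       # degrees C, assumed, to avoid rapid flapping

def select_mode(outside_temp: float, current_mode: str) -> str:
    """Return the cooling mode for the next control cycle."""
    if current_mode == "free-cooling" and outside_temp > FREECOOL_LIMIT:
        return "air-conditioning"
    if (current_mode == "air-conditioning"
            and outside_temp < FREECOOL_LIMIT - HYSTERESIS):
        return "free-cooling"
    return current_mode

print(select_mode(30.0, "free-cooling"))      # hot afternoon -> AC
print(select_mode(22.0, "air-conditioning"))  # evening -> free-cooling
```

The hysteresis gap means a brief dip to 27.5 °C does not immediately switch the compressors off again, which matches the described behaviour of switching back only in the evening.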

So what is false in the article?

Filtration

Of course we deal with filtration. We bought very expensive filters and they are changed regularly. A service company takes care of everything, so we don't have to worry about anything. Hluboká is also relatively clean 🙂 . But we haven't underestimated anything. Come and take a look and you will see there is no dust anywhere :-).

Emerging hot spots

Imagine 30 million litres of air moving every hour in a room of about 250 m². A nice breeze. The air mixes thoroughly and there are no critical spots. The airflow is strong enough to carry paper boxes.

Rapid temperature changes

As already stated, there is no such problem. The system is more accurate than commercial air-conditioning systems costing millions of crowns. That is because we regulate not only the ratio of incoming cool air but also its flow. Everything is precisely controlled: the speed of both motors (supply and exhaust) as well as the air dampers via actuators. The dampers govern not only how much air comes in from outside but also the mixing ratio with the outgoing warm air, and by adjusting them we control how much air leaves the building and how much recirculates inside the data room.

Evidence (a graph of temperatures over 2 days; these are the exact temperatures logged at every change, not averages over a few minutes):

Environmental humidity

We have humidity sensors in many places in the server room (plus outside, under the floor and in the ducts), and we make sure we stay within the limits the servers allow. If the limits are exceeded, free-cooling can be shut down, or it shuts itself down automatically.

There is no problem with a lack of humidity because, as is well known, outside air is much more humid than air in rooms cooled purely by air conditioning. Data halls cooled only by air conditioning therefore have to be artificially humidified. With free-cooling it's the other way around: you have to make sure the humidity does not exceed certain limits. Of course we monitor this, again with the help of the dampers. It takes very little: you change the ratio of outgoing (dry) to incoming (moist) air and you have the correct values. If that were not enough, free-cooling would be switched off and the air conditioning turned on for a while to dehumidify. But we've never needed that yet. Static electricity is dissipated by the anti-static floor, which is earthed throughout (just like the racks).
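The same flow-weighted mixing logic used for temperature applies, to a good approximation, to absolute humidity: the moisture content of the mixed air is the weighted average of the two streams. The humidity values and limit in the sketch below are illustrative assumptions, not measured WEDOS data.

```python
# Approximate humidity of mixed air: the flow-weighted average of the
# absolute humidity (grams of water per kg of dry air) of the two
# streams. All numeric values here are illustrative assumptions.

def mixed_abs_humidity(f_outside: float, ah_outside: float,
                       ah_return: float) -> float:
    """Absolute humidity (g/kg) after mixing a fraction f_outside of
    outside air with recirculated return air."""
    return f_outside * ah_outside + (1.0 - f_outside) * ah_return

# Moist outside air (9 g/kg) mixed 25/75 with drier return air (6 g/kg):
ah = mixed_abs_humidity(0.25, 9.0, 6.0)
print(f"Mixed air carries about {ah:.2f} g/kg of moisture")
```

In this assumed example the mix stays comfortably dry; if outside air were much moister, lowering the outside-air fraction would pull the result back under the limit, which is exactly the damper trick described above.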

WEDOS free-cooling is a sophisticated, top-level system

Our free-cooling is our own "invention", but it is based on 16 years of experience. It works great; we can't complain. Everything is fully automatic, reliable and accurate, and cheaper to run than any commercial solution we have found on the market. Several commercial cooling companies have already expressed interest in our system, as have data centre operators. Here's a video of free-cooling at work at 27 °C outside. No problem. Reliable.

What are we planning next?

We're already testing oil cooling: servers immersed in oil and cooled by the oil rather than by air. What are the advantages? Further huge energy savings. Oil dissipates heat about 300 times better than air, so you need less energy to remove the heat. You also produce less heat, because you remove all the fans from the servers, and fans can account for up to 50% of a server's power consumption. The whole surface of the server is cooled, so it can even be overclocked. The server sits away from dust, there are no vibrations (the oil absorbs them) and, last but not least, non-flammable versions of the oil are available. Yes, we are testing this at WEDOS too and we want to put it into real operation. Here is our video of a server in oil:

What’s next?

Let's see what the author writes next. We're looking forward to it. He must be very annoyed with us. His articles are basically advertising for us, because we can easily refute all the lies in each of them.

We're glad that someone takes so much trouble over us and devotes so much time to us. It shows that we do our job well. Very good. It's advertising for us. The more we are talked and written about, the more popular and successful we will become.

I just hope that people don’t think that we are making this anti-campaign ourselves in order to make ourselves visible.

Message to the author of the anti-campaign

You don't have to hide your identity. You don't have to use a fake first and last name. Wouldn't you like to come and see us before you write more articles about us? You've obviously never been here; you're piecing together information from the internet and making a lot of assumptions. You're just embarrassing yourself.

We are not afraid of criticism. We can face it.

We don't understand how you can draw conclusions when you have never seen the system and don't know how it works. You imagine that the system simply blows a "draft" of air through the room. Actually, no, we do understand: when the competition is desperate, they'll do exactly that…

We have nothing to be ashamed of. Free-cooling works exactly as it should and saves us hundreds of thousands of crowns a month compared to any other cooling solution…

You don't believe us? Then come and see; we'll show it to anyone. Masarykova 1230, Hluboká nad Vltavou.

What are we responding to?

Here is a screenshot of the article we are responding to. The author of the anti-campaign keeps editing his articles so as not to look like a fool. So don't be surprised :-).

Below is a screenshot of the site we are responding to (just for preservation). We assume it comes from the competition, so it will serve as evidence before any further changes are made.

Our alleged employee has set up a website wxxxxyyyg.com (address modified; a screenshot is at the end of this article). The first article claimed we had got our electricity supply wrong. In the next one we learned different information.