Radiators in Engine Cooling Systems

Pressurized coolant

Because the thermal efficiency of internal combustion engines increases with internal temperature, the coolant is kept at higher-than-atmospheric pressure to increase its boiling point. A calibrated pressure-relief valve is usually incorporated in the radiator's fill cap. This pressure varies between models, but typically ranges from 4 to 30 psi (30 to 200 kPa).[4]
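
As a rough illustration of how much the cap pressure buys, the boiling point at a given gauge pressure can be estimated with the Clausius-Clapeyron relation. The sketch below is a minimal approximation assuming plain water (no glycol), a constant heat of vaporization, and ideal-gas vapour; the helper name boiling_point_c is ours, not from any library:

```python
import math

# Estimate water's boiling point under a radiator cap's relief pressure
# using the Clausius-Clapeyron relation. Assumes pure water, a constant
# enthalpy of vaporization, and ideal-gas vapour; figures are approximate.

R = 8.314            # gas constant, J/(mol*K)
H_VAP = 40660.0      # enthalpy of vaporization of water, J/mol
T_BOIL_ATM = 373.15  # boiling point at 1 atm, K
P_ATM_KPA = 101.325  # atmospheric pressure, kPa

def boiling_point_c(gauge_psi):
    """Boiling point (deg C) at the given gauge pressure (psi)."""
    p_abs = P_ATM_KPA + gauge_psi * 6.8948  # 1 psi = 6.8948 kPa
    inv_t = 1.0 / T_BOIL_ATM - (R / H_VAP) * math.log(p_abs / P_ATM_KPA)
    return 1.0 / inv_t - 273.15

for psi in (4, 15, 30):  # ends of the quoted cap range, plus a common 15 psi
    print(f"{psi:>2} psi cap -> water boils near {boiling_point_c(psi):.0f} C")
```

On these assumptions, a common 15 psi cap raises the boiling point from 100 °C to roughly 121 °C.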

As the coolant system pressure increases with a rise in temperature, it will reach the point where the pressure-relief valve allows excess pressure to escape. This stops when the system temperature stops rising. In the case of an over-filled radiator (or header tank), pressure is vented by allowing a little liquid to escape. This may simply drain onto the ground or be collected in a vented container which remains at atmospheric pressure. When the engine is switched off, the cooling system cools and the liquid level drops. In some cases, where the excess liquid has been collected in a bottle, it may be 'sucked' back into the main coolant circuit; in other cases, it is not.


Coolant

Before World War II, engine coolant was usually plain water. Antifreeze was used solely to prevent freezing, and often only in cold weather. If plain water is left to freeze in the block of an engine, it expands as it turns to ice, and this expansion can cause severe internal engine damage.

Development of high-performance aircraft engines required coolants with higher boiling points, leading to the adoption of glycol or water-glycol mixtures; these in turn brought glycols into wider use for their antifreeze properties.

Since the development of aluminium and mixed-metal engines, corrosion inhibition has become even more important than freeze protection, and is needed in all regions and seasons.


Boiling and overheating

An overflow tank that runs dry may result in the coolant vaporizing, which can cause localized or general overheating of the engine. Severe damage, such as blown head gaskets and warped or cracked cylinder heads or cylinder blocks, may result if the vehicle continues to run while overheated. Sometimes there is no warning, because the temperature sensor that feeds the temperature gauge (whether mechanical or electrical) is then exposed to water vapour rather than liquid coolant, giving a dangerously false reading.

Opening a hot radiator drops the system pressure, which may cause it to boil and eject dangerously hot liquid and steam. Therefore, radiator caps often contain a mechanism that attempts to relieve the internal pressure before the cap can be fully opened.


History

The invention of the automobile water radiator is attributed to Karl Benz. Wilhelm Maybach designed the first honeycomb radiator for the Mercedes 35hp.


Auxiliary radiators

It is sometimes necessary for a car to be equipped with a second, or auxiliary, radiator to increase cooling capacity when the size of the original radiator cannot be increased. The second radiator is plumbed in series with the main radiator in the circuit. This was the case when the Audi 100 was first turbocharged, creating the 200. Auxiliary radiators are not to be confused with intercoolers.

Some engines have an oil cooler, a separate small radiator to cool the engine oil. Cars with an automatic transmission often have extra connections to the radiator, allowing the transmission fluid to transfer its heat to the coolant in the radiator. These may be oil-air radiators, essentially a smaller version of the main radiator. More simply, they may be oil-water coolers, where an oil pipe is inserted inside the water radiator. Though the water is hotter than the ambient air, its higher thermal conductivity offers comparable cooling (within limits) from a less complex, and thus cheaper and more reliable,[citation needed] oil cooler. Less commonly, power steering fluid, brake fluid, and other hydraulic fluids may be cooled by an auxiliary radiator on a vehicle.

Turbocharged or supercharged engines may have an intercooler, which is an air-to-air or air-to-water radiator used to cool the incoming air charge, not to cool the engine.


Aircraft

Aircraft with liquid-cooled piston engines (usually inline engines rather than radials) also require radiators. As airspeed is higher than for cars, these are efficiently cooled in flight and so do not require large areas or cooling fans. Many high-performance aircraft, however, suffer extreme overheating problems when idling on the ground: a Spitfire could idle for a mere seven minutes before overheating.[6] This is similar to modern Formula 1 cars, which, when stopped on the grid with engines running, require ducted air forced into their radiator pods to prevent overheating.


Surface radiators

Reducing drag is a major goal in aircraft design, including the design of cooling systems. An early technique was to take advantage of an aircraft's abundant airflow to replace the honeycomb core (many surfaces, with a high ratio of surface to volume) with a surface-mounted radiator. This uses a single surface blended into the fuselage or wing skin, with the coolant flowing through pipes at the back of this surface. Such designs were seen mostly on World War I aircraft.

As they are so dependent on airspeed, surface radiators are even more prone to overheating when ground-running. Racing aircraft such as the Supermarine S.6B, a racing seaplane with radiators built into the upper surfaces of its floats, have been described as "being flown on the temperature gauge", with coolant temperature the main limit on their performance.[7]

Surface radiators have also been used by a few high-speed racing cars, such as Malcolm Campbell's Blue Bird of 1928.


Pressurized cooling systems

It is generally a limitation of most cooling systems that the cooling fluid not be allowed to boil, as the need to handle gas in the flow greatly complicates design. For a water-cooled system, this means that the maximum amount of heat transfer is limited by the specific heat capacity of water and the difference between the ambient temperature and 100 °C. This provides more effective cooling in the winter, or at higher altitudes where the temperatures are low.
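
To make the limit concrete: the heat a non-boiling water loop can carry away scales with the coolant flow rate, water's specific heat, and the available temperature rise. A minimal sketch, with illustrative flow rates and temperatures rather than figures from any real engine:

```python
# Upper bound on heat rejection for a non-boiling water loop:
# heat carried = flow rate * specific heat * temperature rise.
# The flow rate and temperatures below are illustrative only.

CP_WATER = 4186.0  # specific heat of water, J/(kg*K)

def max_heat_rejection_kw(flow_kg_s, t_ambient_c, t_max_c=100.0):
    """Most heat (kW) the flow can carry if it must stay below boiling."""
    return flow_kg_s * CP_WATER * (t_max_c - t_ambient_c) / 1000.0

print(max_heat_rejection_kw(1.0, 20.0))   # ~335 kW at 20 C ambient
print(max_heat_rejection_kw(1.0, -20.0))  # ~502 kW in cold air: the
                                          # winter/altitude advantage
```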

Another effect that is especially important in aircraft cooling is that the coolant's boiling point falls as pressure falls, and ambient pressure drops more rapidly with altitude than air temperature does. Thus, generally, liquid cooling systems lose capacity as the aircraft climbs. This was a major limit on performance during the 1930s, when the introduction of turbosuperchargers first allowed convenient travel at altitudes above 15,000 ft, and cooling design became a major area of research.
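
A rough estimate of the effect can be had by combining the standard-atmosphere barometric formula with the same Clausius-Clapeyron approximation sketched earlier; again this assumes pure water and a constant heat of vaporization, so the output is an estimate, not flight data:

```python
import math

# Why an unpressurized system loses capacity with altitude: ambient
# pressure falls (ISA barometric formula below 11 km), which lowers
# the coolant's boiling point (Clausius-Clapeyron approximation).

R, H_VAP = 8.314, 40660.0      # J/(mol*K); J/mol for water
P0, T0_BOIL = 101.325, 373.15  # sea-level pressure (kPa); boiling point (K)

def pressure_kpa(alt_m):
    """ISA ambient pressure (kPa) below 11 km."""
    return P0 * (1.0 - 2.25577e-5 * alt_m) ** 5.25588

def boiling_point_c(p_kpa):
    """Boiling point of pure water (deg C) at ambient pressure p_kpa."""
    inv_t = 1.0 / T0_BOIL - (R / H_VAP) * math.log(p_kpa / P0)
    return 1.0 / inv_t - 273.15

for ft in (0, 15000, 25000):
    p = pressure_kpa(ft * 0.3048)  # feet -> metres
    print(f"{ft:>5} ft: {p:5.1f} kPa, water boils near {boiling_point_c(p):.0f} C")
```

On these assumptions, water boils near 84 °C at 15,000 ft and near 74 °C at 25,000 ft, steadily shrinking the usable temperature rise of an unpressurized system.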

The most obvious, and most common, solution to this problem was to run the entire cooling system under pressure. This kept the coolant's boiling point, and thus the usable temperature range, fixed while the outside air temperature continued to drop, so such systems actually gained cooling capability as they climbed. For most uses, this solved the problem of cooling high-performance piston engines, and almost all liquid-cooled aircraft engines of the World War II period used this solution.

However, pressurized systems were also more complex and far more susceptible to damage: because the cooling fluid was under pressure, even minor damage to the cooling system, such as a single rifle-calibre bullet hole, would cause the liquid to spray rapidly out of the hole. Failures of the cooling system were, by far, the leading cause of engine failures.


Evaporative cooling

Although it is more difficult to build an aircraft radiator that is able to handle steam, it is by no means impossible. The key requirement is to provide a system that condenses the steam back into liquid before passing it back to the pumps and completing the cooling loop. Such a system can take advantage of the latent heat of vaporization, which for water is about five times the energy needed to heat the liquid from 0 °C to its boiling point. Additional gains may be had by allowing the steam to become superheated. Such systems, known as evaporative coolers, were the topic of considerable research in the 1930s.

Consider two cooling systems that are otherwise similar, operating at an ambient air temperature of 20 °C. An all-liquid design might operate between 30 °C and 90 °C, offering 60 °C of temperature difference to carry away heat. An evaporative cooling system might operate between 80 °C and 110 °C. At first glance this appears to be a much smaller temperature difference, but the analysis overlooks the enormous amount of heat energy soaked up during the generation of steam, equivalent to roughly 500 °C of additional sensible heating. In effect, the evaporative version is operating between 80 °C and 560 °C, a 480 °C effective temperature difference. Such a system can be effective even with much smaller amounts of water.
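
The arithmetic behind these figures follows directly from standard property values for water; a minimal check:

```python
# Latent heat of vaporization vs. sensible heating, for 1 kg of water.

CP_WATER = 4.186  # specific heat of liquid water, kJ/(kg*K)
L_VAP = 2257.0    # latent heat of vaporization at 100 C, kJ/kg

print(L_VAP / (CP_WATER * 100.0))  # ~5.4: latent heat is ~5x the heat
                                   # needed to warm the liquid 0 -> 100 C
print(L_VAP / CP_WATER)            # ~539: boiling 1 kg absorbs as much
                                   # heat as ~500 C of sensible heating
```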

The downside of an evaporative cooling system is the area of the condensers required to cool the steam back below the boiling point. As steam is much less dense than water, a correspondingly larger surface area is needed to provide enough airflow to cool the steam back down. The Rolls-Royce Goshawk design of 1933 used conventional radiator-like condensers, and these proved to be a serious source of drag. In Germany, the Günter brothers developed an alternative design combining evaporative cooling with surface radiators spread over the aircraft's wings, fuselage, and even the rudder. Several aircraft were built using their design and set numerous performance records, notably the Heinkel He 119 and Heinkel He 100. However, these systems required numerous pumps to return the liquid from the spread-out radiators, proved extremely difficult to keep running properly, and were much more susceptible to battle damage. Efforts to develop this system had generally been abandoned by 1940. The need for evaporative cooling was soon negated by the widespread availability of ethylene-glycol-based coolants, which had a lower specific heat but a much higher boiling point than water.


Meredith effect

An aircraft radiator contained in a duct heats the air passing through, causing the air to expand and gain velocity. This is called the Meredith effect, and high-performance piston aircraft with well-designed low-drag radiators (notably the P-51 Mustang) derive thrust from it. The thrust was significant enough to offset the drag of the duct in which the radiator was enclosed, allowing the aircraft to achieve zero cooling drag. At one point, there were even plans to equip the Supermarine Spitfire with an afterburner, by injecting fuel into the exhaust duct after the radiator and igniting it.[citation needed] Afterburning is achieved by injecting additional fuel into the engine downstream of the main combustion cycle.
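
As a back-of-the-envelope model, the radiator duct can be treated as a momentum balance: gross thrust is the mass flow of cooling air times its velocity gain through the duct, and "zero cooling drag" means this gross thrust cancels the duct's own drag. All numbers below are illustrative, not P-51 data:

```python
# Simplified momentum balance for a radiator duct: heated air leaves
# faster than it entered, and the momentum gain is gross thrust that
# can offset the duct's drag. Illustrative numbers only.

def net_duct_force_n(mass_flow_kg_s, v_in_m_s, v_out_m_s, duct_drag_n):
    """Gross thrust from the air's momentum gain, minus duct drag (N)."""
    return mass_flow_kg_s * (v_out_m_s - v_in_m_s) - duct_drag_n

# 20 kg/s of cooling air accelerated from 180 m/s to 200 m/s gives
# 400 N of gross thrust; if the duct's drag is also 400 N, the net
# cooling drag is zero.
print(net_duct_force_n(20.0, 180.0, 200.0, 400.0))  # 0.0
```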
