Relative humidity is defined as the ratio of the actual amount of water vapor in the air to the maximum amount the air can hold at a given temperature. Specific humidity is defined as the ratio of the mass of water vapor to the mass of dry air, expressed in g/kg. Absolute humidity is the mass of moisture contained in 1 m³ of moist air, expressed in g/m³.
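The three humidity measures above can be sketched as simple ratios. The following is an illustrative example only; the input values are hypothetical, except the 17.30 g/m³ saturation value at 20 °C quoted later in the text.

```python
def relative_humidity(actual_vapor_g_per_m3, saturation_vapor_g_per_m3):
    """Relative humidity: actual vapor content divided by the maximum
    the air can hold at the given temperature, as a percentage."""
    return 100.0 * actual_vapor_g_per_m3 / saturation_vapor_g_per_m3

def specific_humidity(vapor_mass_kg, dry_air_mass_kg):
    """Specific humidity: mass of water vapor per mass of dry air, in g/kg."""
    return 1000.0 * vapor_mass_kg / dry_air_mass_kg

# Hypothetical example: 8.65 g/m3 of vapor at 20 C, where saturation
# is 17.30 g/m3 -> relative humidity of 50 %.
print(relative_humidity(8.65, 17.30))   # 50.0
print(specific_humidity(0.010, 1.2))    # ~8.33 g/kg
```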
At each temperature there is a point at which the air can no longer absorb moisture. This state is called the saturation point or dew point (relative humidity reaches its maximum, i.e. 100%). For saturated compressed air, the term pressure dew point is used. At this point condensation begins to form and must be removed from the compressed air as quickly as possible.
The amount of condensate in the compressed air depends on the ambient temperature of the air drawn into the compressor: the higher the air temperature, the more condensate is generated. The summer months are a critical period because the intake air temperature is at its highest. For example, at an ambient temperature of 20°C the air can hold at most 17.30 g/m³ of moisture, whereas at 35°C that value more than doubles, to 39.63 g/m³. If ambient air at 35°C is compressed to a pressure of 8 bar and then cooled back to the same temperature, it can no longer retain that amount of moisture: the volume has been reduced to 1/8 m³, which can hold only 4.95 g of vapor, because the maximum absolute humidity depends only on temperature and volume, not on pressure. The remaining 7/8 of the moisture, 34.68 g (39.63 − 4.95), is separated out as condensate.
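The worked example above can be checked with a short calculation, using only the saturation value of 39.63 g/m³ at 35°C quoted in the text:

```python
# Condensate example from the text: 1 m3 of saturated air at 35 C
# is compressed to 8 bar and cooled back to 35 C.
saturation_35c_g_per_m3 = 39.63   # max moisture at 35 C (value from the text)
initial_volume_m3 = 1.0
pressure_ratio = 8                # compression to 8 bar

# Compression shrinks the volume to 1/8 m3; since maximum absolute
# humidity depends only on temperature and volume, the compressed air
# can retain only 1/8 of the original moisture as vapor.
compressed_volume_m3 = initial_volume_m3 / pressure_ratio
retained_g = saturation_35c_g_per_m3 * compressed_volume_m3
condensate_g = saturation_35c_g_per_m3 * initial_volume_m3 - retained_g

print(f"retained as vapor: {retained_g:.2f} g")   # 4.95
print(f"condensate:        {condensate_g:.2f} g") # 34.68
```

The printed figures reproduce the 4.95 g and 34.68 g from the example above.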
If the condensate is not quickly removed from the compressed air by means of a condensation (refrigeration) or adsorption dryer, it is carried into the compressed air line network, where it poses an enormous risk. The temperature of the air drawn into the compressor is therefore very important and should be kept as low as possible, especially in the summer months.