Water absorption testing plays a crucial role in evaluating how much moisture roofing tiles can absorb. Excessive water absorption can lead to undesirable consequences such as cracking, particularly when absorbed water freezes and expands within the tile body. Hence, this test is essential for determining the suitability of roofing tiles for various applications, especially in environments prone to rainfall and freeze-thaw cycles. The ideal roofing tile should have a considerably low absorption capacity.
To ensure uniformity and reliability in water absorption testing, various international codes, such as those of ASTM International (formerly the American Society for Testing and Materials) and the Indian Standard, have been established. One prominent method is ASTM C373. According to this standard, non-porous tiles typically exhibit a water absorption of 0.1-0.5%, while porous products may show absorption in the range of 9-15%.
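As a minimal sketch, the two absorption ranges quoted above can be turned into a simple classification check. The function name and the "outside the quoted ranges" label are illustrative choices, not part of ASTM C373:

```python
def classify_absorption(absorption_pct: float) -> str:
    """Classify a tile body by measured water absorption (percent),
    using the illustrative ranges quoted in the text."""
    if 0.1 <= absorption_pct <= 0.5:
        return "non-porous"
    if 9.0 <= absorption_pct <= 15.0:
        return "porous"
    # Values between or beyond the two quoted bands are not covered
    # by the ranges given in the text.
    return "outside the quoted ranges"

print(classify_absorption(0.3))   # non-porous
print(classify_absorption(12.0))  # porous
```

Such a check is only as good as the measurement feeding it; the test procedure below is what produces the absorption percentage.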
The primary objective of the water absorption test is to determine the percentage of water absorption in roofing tiles. This metric serves as a critical indicator of the tiles’ ability to resist moisture, influencing their overall durability and performance.
For conducting the water absorption test on roofing tiles, the following apparatus is essential:
- A weighing balance for recording the dry and wet masses of the tiles
- A drying oven for bringing the tiles to a constant dry mass
- A water tank or container large enough to fully immerse the tiles
Two tiles are selected from the sample for the water absorption test. These tiles represent the overall characteristics of the roofing material.
The water absorption test is conducted through the following step-by-step procedure:
1. Dry the tiles in the oven to a constant mass, allow them to cool, and record the dry mass M1.
2. Immerse the tiles completely in clean water for 24 hours.
3. Remove each tile, wipe off surface water with a damp cloth, and record the wet mass M2.
4. Calculate the percentage of water absorption from the two recorded masses.
The percentage of water absorption is calculated using the following equation:
W = ((M2 − M1) / M1) × 100 (Equation 1)
Where:
W = water absorption, expressed as a percentage
M1 = dry mass of the tile
M2 = wet (saturated) mass of the tile after immersion
After performing the water absorption test, the water absorption of the given roofing tiles is reported as a percentage, which can then be compared against the applicable standard's limits to judge the tiles' moisture resistance.
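The calculation in Equation 1 can be sketched as a short function. The mass values in the usage example are hypothetical, chosen only to illustrate the arithmetic:

```python
def water_absorption(m1_dry_g: float, m2_wet_g: float) -> float:
    """Water absorption W in percent, per Equation 1:
    W = ((M2 - M1) / M1) * 100."""
    if m1_dry_g <= 0:
        raise ValueError("dry mass must be positive")
    return (m2_wet_g - m1_dry_g) / m1_dry_g * 100.0

# Illustrative masses (hypothetical, not measured values):
w = water_absorption(m1_dry_g=2000.0, m2_wet_g=2150.0)
print(f"Water absorption: {w:.1f}%")  # 7.5%
```

In practice the test is run on more than one tile, and the individual percentages are averaged before comparing against the standard's limit.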