Ancient civilizations established themselves around water sources. While the importance of an ample quantity of water for drinking and other purposes was apparent to our ancestors, drinking water quality was not well understood or documented. Although historical records have long mentioned aesthetic problems (an unpleasant appearance, taste, or smell) with drinking water, it took thousands of years for people to recognize that their senses alone were not accurate judges of water quality.
Water treatment originally focused on improving the aesthetic qualities of drinking water. Methods to improve the taste and odor of drinking water were recorded as early as 4000 B.C. Ancient Sanskrit and Greek writings recommended water treatment methods such as filtering through charcoal, exposing to sunlight, boiling, and straining. Visible cloudiness (later termed turbidity) was the driving force behind the earliest water treatments, as many source waters contained particles that had an objectionable taste and appearance. To clarify water, the Egyptians reportedly used the chemical alum as early as 1500 B.C. to cause suspended particles to settle out of water. During the 1700s, filtration was established as an effective means of removing particles from water, although the degree of clarity achieved was not measurable at that time. By the early 1800s, slow sand filtration was beginning to be used regularly in Europe.
During the mid to late 1800s, scientists gained a greater understanding of the sources and effects of drinking water contaminants, especially those that were not visible to the naked eye. In 1855, epidemiologist Dr. John Snow proved that cholera was a waterborne disease by linking an outbreak of illness in London to a public well that was contaminated by sewage. In the late 1880s, Louis Pasteur demonstrated the “germ theory” of disease, which explained how microscopic organisms (microbes) could transmit disease through media like water.
During the late nineteenth and early twentieth centuries, concerns regarding drinking water quality continued to focus mostly on disease-causing microbes (pathogens) in public water supplies. Scientists discovered that turbidity was not only an aesthetic problem; particles in source water, such as fecal matter, could harbor pathogens. As a result, the design of most drinking water treatment systems built in the U.S. during the early 1900s was driven by the need to reduce turbidity, thereby removing microbial contaminants that were causing typhoid, dysentery, and cholera epidemics. To reduce turbidity, some water systems in U.S. cities (such as Philadelphia) began to use slow sand filtration.
While filtration was a fairly effective treatment method for reducing turbidity, it was disinfectants like chlorine that played the largest role in reducing the number of waterborne disease outbreaks in the early 1900s. In 1908, chlorine was used for the first time as a primary disinfectant of drinking water in Jersey City, New Jersey. The use of other disinfectants, such as ozone, also began in Europe around this time, but these disinfectants were not employed in the U.S. until several decades later.
Federal regulation of drinking water quality began in 1914, when the U.S. Public Health Service set standards for the bacteriological quality of drinking water. The standards applied only to water systems that provided drinking water to interstate carriers like ships and trains, and covered only contaminants capable of causing contagious disease. The Public Health Service revised and expanded these standards in 1925, 1946, and 1962. The 1962 standards, regulating 28 substances, were the most comprehensive federal drinking water standards in existence before the Safe Drinking Water Act of 1974. With minor modifications, all 50 states adopted the Public Health Service standards either as regulations or as guidelines for all of the public water systems in their jurisdiction.
By the late 1960s it became apparent that the aesthetic problems, pathogens, and chemicals identified by the Public Health Service were not the only drinking water quality concerns. Industrial and agricultural advances and the creation of new man-made chemicals also had negative impacts on the environment and public health. Many of these new chemicals were finding their way into water supplies through factory discharges, street and farm field runoff, and leaking underground storage and disposal tanks. Although treatment techniques such as aeration, flocculation, and granular activated carbon adsorption (for removal of organic contaminants) existed at the time, they were either underutilized by water systems or ineffective at removing some new contaminants.
Health concerns spurred the federal government to conduct several studies on the nation’s drinking water supply. One of the most telling was a water system survey conducted by the Public Health Service in 1969, which showed that only 60 percent of the systems surveyed delivered water that met all the Public Health Service standards. Over half of the treatment facilities surveyed had major deficiencies involving disinfection, clarification, or pressure in the distribution system (the pipes that carry water from the treatment plant to buildings), or combinations of these deficiencies. Small systems, especially those serving fewer than 500 customers, had the most deficiencies. A 1972 study found 36 chemicals in treated water taken from treatment plants that drew water from the Mississippi River in Louisiana. As a result of these and other studies, new legislative proposals for a federal safe drinking water law were introduced and debated in Congress in 1973.
Chemical contamination of water supplies was only one of many environmental and health issues that gained the attention of Congress and the public in the early 1970s. This increased awareness eventually led to the passage of several federal environmental and health laws, one of which was the Safe Drinking Water Act of 1974. That law, with significant amendments in 1986 and 1996, is administered today by the U.S. Environmental Protection Agency (EPA) Office of Ground Water and Drinking Water and its partners.
Since the passage of the original Safe Drinking Water Act, the number of water systems applying some type of treatment to their water has increased. According to several EPA surveys, from 1976 to 1995 the percentage of small and medium community water systems (systems serving people year-round) that treat their water increased steadily. For example, in 1976 only 33 percent of systems serving fewer than 100 people provided treatment. By 1995, that number had risen to 69 percent.
Most large urban systems have provided some treatment since their establishment in the early 1900s, as they draw their water from surface sources (rivers, lakes, and reservoirs), which are more susceptible to pollution. Larger systems also have the customer base to provide the funds needed to install and improve treatment equipment. Because distribution systems have extended to serve a growing population (as people have moved from concentrated urban areas to more suburban areas), additional disinfection has been required to keep water safe until it is delivered to all customers.
Today, filtration and chlorination remain effective treatment techniques for protecting U.S. water supplies from harmful microbes, although additional advances in disinfection have been made over the years. In the 1970s and 1980s, improvements were made in membrane development for reverse osmosis filtration and in other treatment techniques such as ozonation. Some treatment advancements have been driven by the discovery of chlorine-resistant pathogens in drinking water that can cause illnesses like hepatitis, gastroenteritis, Legionnaires’ disease, and cryptosporidiosis. Other advancements resulted from the need to remove a growing number of chemicals found in sources of drinking water.
According to a 1995 EPA survey, approximately 64 percent of community ground water and surface water systems disinfect their water with chlorine. Almost all of the remaining surface water systems, and some of the remaining ground water systems, use another type of disinfectant, such as ozone or chloramine.
Many of the treatment techniques used today by drinking water plants include methods that have been used for hundreds and even thousands of years (see the diagram below). However, newer treatment techniques (e.g., reverse osmosis and granular activated carbon) are also being employed by some modern drinking water plants.
Recently, the Centers for Disease Control and Prevention and the National Academy of Engineering named water treatment as one of the most significant public health advancements of the 20th century. Moreover, the number of treatment techniques, and combinations of techniques, is expected to increase over time as more complex contaminants are discovered and regulated. The number of systems employing these techniques is also expected to increase, thanks to the recent creation of a multi-billion dollar state revolving loan fund that will help water systems, especially those serving small and disadvantaged communities, upgrade or install new treatment facilities.