1 Introduction
World population growth, global change, ever-growing irrigation, and economic development have led and will keep leading to increased water use and severe scarcity, especially in arid and semiarid regions [1]. The World Economic Forum [2] highlighted the water crisis as a worldwide risk because of its potential global impact. Bouwer [3] estimated the global renewable water supply at around 7000 m3 per person per year (2000 m3 is considered adequate for good living standards, and less than 500 m3 implies water scarcity). Therefore, water shortages result not from a global lack of water, but from the spatial and temporal mismatch between demand and supply. Things will get worse. Agriculture, livestock, and industry account for more than 80% of all human water consumption. The need to feed an ever-increasing population will lead to a parallel increase in water demand. Although water might not be the biggest hurdle to meeting this challenge [4], water systems will be stressed. Water engineers have traditionally overcome the problem through reservoirs (but these are reaching a limit in many regions of the world), water transfers (but these are unpopular and limited in distance), desalination (but this comes at a steep energy cost), and groundwater (see below).
Groundwater represents 99% of the liquid fresh water of the planet [5], and it is the only readily available resource in many arid regions. Therefore, its use has increased sharply since the 1960s, leading to worldwide depletion of water levels [6]. Water-level depletion causes adverse effects, such as seawater intrusion (and loss of submarine groundwater discharge to coastal ecosystems), land subsidence, and especially the reduction of inflow to groundwater-dependent bodies. (Many rivers are losing their base flow and many wetlands are becoming dry.) In short, groundwater is always there; it can be used for moderate extraction or as a strategic reserve, but it cannot be viewed as a long-term solution to meet a growing water demand.
Integrated Water Resources Management is emerging as a primary way to address water scarcity [3]. Surface water, groundwater, pristine water, and wastewater must be combined to address this situation by implementing existing and new techniques for reconciling the demands of people, agriculture, industry, and the environment [6]. Otherwise, water scarcity might have drastic consequences, such as population migration or wars [7]. Water reuse is increasingly considered an alternative for managing the entire water cycle and for supporting the circular economy, for two reasons: first, discharging effluents from wastewater treatment plants into surface waters is becoming more expensive because of stricter requirements to protect the quality of the receiving water and the environment; second, those effluents might themselves be considered an important water resource [3]. Levine and Asano [8] summarized this challenge by stating that, with the current demand, “society no longer has the luxury of using water only once,” and that recycling/reuse of water is needed for sustainable water resource management.
In this context, water quality becomes the leading problem. Wastewater was traditionally discharged, directly or after treatment, into receiving water bodies. As a result, a growing number of contaminants, from heavy metals to emerging micropollutants, have been detected in these bodies.
Water treatment technologies can help alleviate the pressure on water quality in industrialized countries. But these technologies can be expensive: they require a considerable infusion of capital and infrastructure, the use of chemical products, and the management of the residuals they generate. Therefore, even in countries that can afford such technologies, the current trend is to reduce the production of residues and the use of chemical products during treatment, leaning more toward “natural” treatments for supply-water production [7]. In this context, the artificial recharge of aquifers (AR) stands out as an increasingly realistic option because it increases available groundwater resources and helps improve the quality of the recharge water.
One of the most important difficulties to overcome in the area of water reuse is negative public opinion [9]. Water reuse, for nonpotable purposes (e.g., irrigation or industry) or indirect potable ones (e.g., discharge into drinking water reservoirs or supplies), has been presented in many regions of the world as the alternative for facing the increasing demand for quality water [9]. Public acceptance is a determinant of whether a water reuse project succeeds. Negative perceptions may lead to failure [10], as in San Diego (USA) and Toowoomba (Australia) [11, 12]. In some cases, projects have been carried out with little public awareness, whereas in others they have received strong public support.
Little information exists on how the public perception of water reuse has evolved over time, although some changes have taken place. Historically, the acceptance of water recycling has depended on the intended use; recycling water for irrigation or industrial cooling is generally more acceptable than recycling it for potable uses [13]. Factors such as age, political affiliation, public dialogue and information, trust in the technologies and in the public management agencies, or perception of the good quality of the reused water are also related to acceptance [9, 11–13]. Interestingly, awareness that water reuse is already part of the management scheme can help increase public acceptance [9].
The objective of this introductory chapter is to provide the background for the rest of the book by summarizing the history of water engineering, as well as the emerging challenges and technologies. Managed aquifer recharge, which is not covered in other chapters, is also discussed here.
2 A historical appraisal: History of water treatment, sanitation and reuse
Water technology has gone through two well-defined periods, the ancient and the contemporary eras, separated by the dark Middle Ages and a slowly recovering modern age. Ancient times witnessed a slow but continuous improvement in almost all branches of water technology. In fact, it could be argued that water needs drove technological development. Well construction is as old as history, with records in the book of Genesis. Around 3000 BC, a network of some 700 wells fed the city of Mohenjo-Daro in the Indus Valley. Almost the same can be said of water conveyance. Bromehead [14] mentions the aqueduct of Nineveh, whose maker, Sennacherib, recorded on the masonry: “To make the orchards luxuriant I cut and directed a canal with iron pickaxes from the borders of Kisiri to the plain about Nineveh through mountain and lowland.” Hydraulic engineering matured in Roman times. The only technology the Romans did not master was the design of spillways, which prevented them from building reservoirs on running rivers. Still, they did build dams to create reservoirs fed through canals from nearby rivers; the Cornalvo and Proserpina dams in Mérida (Spain) are still operational.
Water treatment underwent a parallel effort: “Sanskrit and Greek writings recommended water treatment methods such as filtering through charcoal, exposing to sunlight, boiling, and straining” [15]. Wastewater was not treated in ancient times, but efforts to get rid of it appear to hav...