The inheritance of the genetic information contained in our DNA has been studied for the past 60 years, with the research efforts culminating in the Human Genome Project. The
Human Genome Project aimed to map and sequence every gene in the human organism and to understand how these were positioned in our chromosomes. The successful completion of the project required an internationally coordinated effort by many research groups and has provided the basic information we need to understand how genes work at the molecular level. This latter effort has been described as being part of the “post-genomic” era, and its aims are to understand how information contained in the sequence of DNA bases can be turned into proteins that create the structures of the cell, and how this process is controlled.
The mechanism by which genes are controlled is probably the most active and fascinating area of research in science today. How does one type of cell, for example a fibroblast, “know” that it is different from a neuron or a muscle cell, given that all these cells carry essentially the same protein-coding information in their genomes? At least some of this control of cell type comes from specific transcription factor proteins that instruct some genes to be expressed while others remain silent, but this cannot explain how the cell remembers to produce only cells of its own type when it divides. Maintaining cellular identity and function is most probably effected by so-called “epigenetic” mechanisms.
The term epigenetics
seems to be used to explain a wide range of biological observations, so it is useful to have a precise
definition of its meaning. The word “epigenetics” was first used by
Conrad Waddington in 1942, and his definition of the subject was as follows: “a branch of biology which studies the causal interactions between genes and their products which bring the phenotype into being.” With the benefit of modern hindsight, we can see that this is quite a broad description, as it covers most of the mechanisms by which cellular identity and function can be maintained. However, Waddington’s concept of a “gene” did not benefit from the investigations of the 1950s and 1960s. In spite of this, his original definition is remarkably close to the epigenetic control of gene transcription that we will cover in later chapters of this book.
Our increasing knowledge of genome functions has refined the definition of epigenetics, and today the term is generally accepted as meaning the study of changes in gene function that are mitotically and/or meiotically heritable and that do not entail a change in the sequence of DNA. We cannot assume that this definition will be final, of course; even with our current knowledge we can see that restricting the focus of epigenetics to “gene” function alone might be viewed as erroneous because there is increasing evidence that epigenetic mechanisms can control the functions of noncoding sequences of DNA (that is, those sections of the genome that do not contain sequences formally identified as genes). However, for the purposes of this broad discussion of the topic of epigenetics, the definition is good enough.
A system capable of controlling gene function is interesting to most biologists, but some of the possible consequences of epigenetic control of gene expression have also attracted interest from nonscientists because of the impact that our lifestyles may have on the health or behavior of future generations. At a first glance this would seem to go against the “traditional” principles of genetics, because the bulk of the scientific data in the first 50 years of the twentieth century seemed to suggest that the phenotypes of animals and their offspring can be determined solely on the basis of the genes they possess and pass on to future generations. Any changes that occur in phenotype would thus be due only to alterations in the DNA sequences of the genes and would therefore fall under the heading of “evolution” in the Darwinian sense of that word.
The acceptance of Darwin’s theories took years, and Darwin’s concept of evolution had to compete with several nineteenth-century ideas. The most prevalent of these competing hypotheses was probably the theory of inheritance of acquired characteristics, published by
Jean Baptiste Lamarck in 1801. Lamarck suggested that if an organism changes its phenotype to adapt to its environment, those changes could be passed on to its offspring. Darwin suggested that this would not occur and that the only mechanism by which phenotypic change could occur stemmed from some organisms’ having variations that help them to survive in their environment and be more successful at producing offspring. Because these useful traits arise from the parent’s genes, the only way that the offspring will acquire such a survival advantage is by receiving copies of the advantageous genes from their parents. Ultimately, the evidence generated by the study of genetics supported Darwin, and Lamarck’s theory was discredited, leading to a central dogma of twentieth-century biology that the only way for traits to be passed on was through the inheritance of
genes and that genes could not be affected by events in the outside world. This belief seemed to stand to reason; after all, it seemed to be common sense that something bad that happened during the life of one’s grandfather could not have any effect on one’s own health. However, some of the worst events of the twentieth century seem to suggest otherwise.
Epigeneticists often refer to the “Dutch Hunger Winter” of 1944–1945, and we will be making our own references to this World War II event in the chapters describing the impact of epigenetic mechanisms on a variety of diseases. The Hunger Winter resulted from a German blockade of food and fuel shipments into western Holland from September 1944 until the liberation of the country from Nazi German rule in May 1945. The blockade, coupled with a harsh winter, caused widespread
famine and resulted in a large number of deaths. Because the Hunger Winter was well documented, it has allowed us to measure the effects of famine on human health. The results of many of these studies suggest a link between starvation of pregnant women and the health of their offspring. Several epidemiological investigations found that the resulting children were more susceptible to diseases such as diabetes, heart disease, and obesity than the children of normally fed mothers. Furthermore, mental illnesses such as
schizophrenia seemed to be more prevalent in the children of Hunger Winter mothers. More surprisingly, similar propensities to develop diseases were eventually observed in the grandchildren of Hunger Winter mothers, thereby countering a possible argument that starvation in the mother could simply lead to alteration in the development of her unborn fetus. These data seem to suggest that a grandmother’s diet could affect the health of several generations, which further implies that an adaptation to her environment has produced a heritable trait—something that is not supposed to happen if we accept that Darwinian evolution is the only cause of phenotypic change.
One might still be tempted to argue that it was the mother’s malnutrition that damaged the developing fetus and therefore caused some form of damage to the child’s genes by introducing mutations in the DNA sequences whose harmful effects became evident only later, in adult life; however, recent research indicates otherwise. Prompted by the findings from the
Dutch Hunger Winter, more recent studies have focused on the problems arising in the children of men who either are obese or have suffered starvation. It is known that there is a correlation between the pattern of epigenetic marks (mostly DNA methylation) on the insulin-like growth factor gene IGF2
and the body mass index of the father, with hypomethylation of this gene being observed in newborns arising from couples in which the father is obese. These data imply that the nutritional state of the father may also contribute to the health of the children; this could only have been transmitted to the child via the father’s spermatozoa. Although it is still possible that the starvation or
obesity of the father could have introduced mutations into his own IGF2
gene before the child’s conception, it seems unlikely that similar mutations would occur in all obese individuals to confer similar IGF2
hypomethylation in the offspring. It is noteworthy that hypomethylation of IGF2
was also observed in the children of Hunger Winter mothers six decades after the children’s birth in 1944–1945. There is therefore a considerable body of evidence supporting the acquisition of a heritable trait as a consequence of the nutritional state of the parents.
Other diseases can also be considered to have an epigenetic basis, regardless of how well a child’s parents ate. Only 10% of
breast cancers that run in families can be linked to known genetic mutations. Exposing pregnant rats to chemicals known to influence breast cancer risk in humans (such as a
high-fat diet or the synthetic estrogen ethinyl estradiol) caused female
offspring to develop a higher incidence of mammary tumors than the offspring of rats that were not exposed to these risk agents. Interestingly, the increased risk of breast cancer does not seem permanent, at least for some of the risk agents. For example, the great-granddaughters of rats exposed to a high-fat diet had no greater tumor incidence than those of control pregnant rats. The effects of ethinyl estradiol, however, seemed to be more durable: the great-granddaughters also had an enhanced risk of developing cancer.
We do not know the exact mechanisms by which chemical exposure can alter the pattern of epigenetic modifications in the genomes of affected animals, and this is one of the reasons why epigenetics is such a fascinating and potentially fruitful area of investigation. After all, if certain environmental exposures or behavior patterns in previous generations can influence the health of the current generation, altering those exposures or behaviors in the current generation might improve the health of the next.