Creating A Memory of Causal Relationships

An Integration of Empirical and Explanation-based Learning Methods

Michael J. Pazzani

About This Book

This book presents a theory of learning new causal relationships by making use of perceived regularities in the environment, general knowledge of causality, and existing causal knowledge. Integrating ideas from the psychology of causation and machine learning, the author introduces a new learning procedure called theory-driven learning that uses abstract knowledge of causality to guide the induction process. The resulting system, known as OCCAM, uses theory-driven learning when new experiences conform to common patterns of causal relationships, empirical learning to learn from novel experiences, and explanation-based learning when there is sufficient existing knowledge to explain why a new outcome occurred. Together these learning methods construct a hierarchically organized memory of causal relationships. As such, OCCAM is the first learning system with the ability to acquire, via empirical learning, the background knowledge required for explanation-based learning. Please note: the accompanying program runs on Common Lisp.


Information

Year: 2013
ISBN: 9781134992324
Edition: 1

Chapter 1

Introduction

1.1. Predicting the Outcome of Events

Understanding what caused an event to occur enables the understander to predict, to plan for, to produce, to prevent, and to explain the occurrence of the event. Therefore, learning causal relationships is a crucial task in understanding and mastering the environment. An additional benefit of learning causal relationships is that future learning can be constrained by ignoring those possibilities that are inconsistent with existing causal knowledge.
In this book, I present an integrated theory of learning to predict and explain the outcome of events. This theory is implemented in a computer program called OCCAM. The theory integrates aspects of previous research in learning and memory (DeJong & Mooney, 1986; Kolodner, 1984; Lebowitz, 1980; Mitchell et al., 1986a; Schank, 1982) to model human learning in several different domains under a variety of circumstances. These domains include simple physical causes (e.g., breaking glass and inflating balloons), children's social interactions (e.g., coercion and agency), and complex planning situations (e.g., kidnapping and economic sanction incidents).
Many tasks require an understander to reason about causality. In the following sections, I consider the tasks of prediction, explanation, planning, and inference. For each task, I give examples of physical causality and social causality. In physical causality, a result occurs as a consequence of transmission of some sort of force. In contrast, transmission of forces does not play a major role in determining human behavior. Instead, human behavior is considered to be a consequence of intentions to achieve some goal. In spite of the differences between these domains, the same procedure is able to learn, store, and retrieve knowledge in each of the domains.

1.1.1. Prediction

Prediction is the task of determining the consequences of a future or a hypothetical event. For example, a political analyst at the Rand Corporation filled out a questionnaire to indicate the likely outcome of several hypothetical economic sanction incidents. Examples of some responses are given below:
Question: What would happen if the US refused to sell computers to South Korea unless South Korea stopped exporting automobiles to Canada?
Answer: S. Korea will probably buy computer equipment from some other country.
Question: What would happen if the US offered to sell coal to West Germany if West Germany agreed not to buy coal from South Africa?
Answer: W. Germany would agree since it wouldn't cost them a thing-- unless this move meant retaliation by S. Africa on W. Germany on some essential exports that W. Germany was highly dependent on.
Question: What would happen if the US threatened to cut off food aid to Ethiopia unless Ethiopia modernized its agricultural production?
Answer: Ethiopia would not agree. It would just ask for more help from the Eastern Bloc.
These economic sanction incidents are all examples of social causality. Countries, like people, have certain goals (e.g., survival and economic growth) and their actions are planned to pursue these goals. The ability to make predictions is dependent on an understanding of these goals.
In addition to social causality, I have looked at examples of utilizing knowledge of physical causality in prediction. For example, OCCAM learns to predict that a small child will be able to successfully inflate a balloon only after the balloon has been stretched.

1.1.2. Explanation

A prediction answers questions about what will happen under certain circumstances. Explanation, on the other hand, requires answering questions about why a cause results in an effect. The ability to explain is essential if a computer is to be trusted to make a prediction. If a computer (or a human) cannot articulate a convincing line of reasoning to justify a prediction, who would be willing to believe the prediction?
Typically, an explanation consists of a set of intermediate states that connect a cause and an effect. In physical causality, these intermediate states are states of the world. In social causality, the intermediate states are often mental states.
In addition to asking the political analyst at the Rand Corporation to make a prediction about hypothetical sanction incidents, I also requested an explanation to justify the prediction. An example explanation is given below:
Question: What would happen if the US refused to sell computers to South Korea unless South Korea stopped exporting automobiles to Canada?
Answer: S. Korea will probably buy computer equipment from some other country.
Question: Why?
Answer: If the US restricts S. Korea's supply of computers, they would be willing to pay a higher price for the computers and some other country would move in.
This explanation references several intermediate states: South Korea's goal of obtaining computers, South Korea's willingness to pay a higher price to obtain the computers, and some other country's goal of making a profit by selling computers to South Korea.
In the realm of physical causality, when OCCAM learns to predict that a small child will be able to successfully inflate a balloon after the balloon has been stretched, it also constructs a series of intermediate states: pulling on a balloon results in a state (i.e., the balloon is stretched out) that enables the child to make the balloon bigger by blowing air into the balloon.
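To make this chain-of-states view of explanation concrete, the sketch below represents the balloon explanation as an ordered list of links from cause to effect. This is an illustrative Python fragment, not OCCAM's actual Common Lisp representation, and the link labels and descriptions are hypothetical.

```python
# Illustrative sketch only: an explanation as an ordered chain of intermediate
# states connecting a cause to its effect. Labels are hypothetical, not OCCAM's.
balloon_explanation = [
    ("action", "the child pulls on the balloon"),
    ("state",  "the balloon is stretched out"),
    ("action", "the child blows air into the balloon"),
    ("effect", "the balloon becomes inflated"),
]

def render_explanation(chain):
    """Join the links of the causal chain into a readable line of reasoning."""
    return " -> ".join(f"{kind}: {description}" for kind, description in chain)

print(render_explanation(balloon_explanation))
```

For a social-causality explanation such as the sanction example above, the intermediate links would instead name mental states, such as South Korea's goal of obtaining computers.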

1.1.3. Planning

One aspect of planning is to predict, prevent or prepare for anticipated events. If an outcome is not desirable, it may be possible to come up with a plan to change the outcome. For example, the United States stockpiles oil and strategic materials to mitigate the effects of an interruption in the supply of these materials. Similarly, South Africa has been stockpiling commodities that it imports to avoid economic hardship in the event that stricter economic sanctions are implemented and enforced.
Planning also requires the planner to reason about physical causality. Parents often give small children plastic cups to drink out of. Knowledge of physical causality (glass cups break when they are dropped; plastic cups are unbreakable; small children are likely to drop things) helps to prevent the undesirable consequences of giving a small child a glass cup. OCCAM is able to acquire knowledge of causality to support planning.
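A rough sketch of how stored causal knowledge might support this kind of planning decision is given below. It is not taken from OCCAM; the rule and event names are invented purely for illustration.

```python
# Hedged illustration (not OCCAM's code): causal rules used to screen a plan
# for undesirable consequences, as in the glass-cup example above.
causal_rules = {
    ("glass cup", "is dropped"): "the cup breaks",
    ("plastic cup", "is dropped"): "nothing happens",
}
likely_events = {"small child": ["is dropped"]}

def predicted_consequences(obj, agent):
    """Return the outcomes the causal rules predict when `agent` handles `obj`."""
    return [causal_rules.get((obj, event), "unknown")
            for event in likely_events.get(agent, [])]

# A planner can compare alternatives and choose the one with harmless outcomes.
print(predicted_consequences("glass cup", "small child"))    # ['the cup breaks']
print(predicted_consequences("plastic cup", "small child"))  # ['nothing happens']
```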

1.1.4. Inference

An important task in natural language understanding is inferring information that is not explicitly stated in a text. For example, consider the following story:
Kidnapping-1
John Doe, who was abducted on his way to school Monday morning, was released today after his father left $50,000 in a trash can in a men's room at the bus station.
This short story leaves many things unstated. For example, it does not state why the father put money in a trash can nor why John Doe was abducted or released. However, a typical adult reading this story has no difficulty answering these questions. General knowledge about kidnapping, including the motives of a kidnapper and the goals of a parent, must be used to infer the missing information.
Similarly, in the following story, knowledge of physical causality is necessary to infer a causal connection between the events:
Snowstorm-1
After two days of snow and rain, the roof of the old building collapsed, injuring two occupants sleeping in an upstairs bedroom.
In this story, the causal connection between the snow and the roof collapse is not explicitly stated. A temporal connection is given from which a typical adult reading this story can infer a causal connection. Similarly, there is no explicit mention of how the two occupants were injured, but a typical person reading this story can infer that part of the roof that collapsed must have fallen on the occupants.

1.1.5. Knowledge of causality facilitates future learning

Learning is one important task that can be aided by knowledge of causality. The learning of new causal knowledge can be facilitated by focusing on relationships which are consistent with existing knowledge. To illustrate, consider how one might learn that on cold winter days, roads that have been salted are less slippery than roads that have not been salted.
One way to acquire this knowledge is purely empirical. The slipperiness of roads on cold days would be noted under different conditions and eventually a regularity could be detected.
An alternative means of acquiring this knowledge is analytical. The fact that salted roads are less slippery is a direct consequence of two facts:
• Ice is slippery.
• Salt melts ice.
A learner who knows these two facts has a great advantage when it comes to learning that salted roads are less slippery. It is possible to deduce the effect of spreading salt on an icy road. Here, learning that salted roads are less slippery consists of simply storing the results of this deduction.
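The sketch below illustrates this analytical route under the simplifying assumption that facts are stored as triples; it is not the book's implementation. The new relationship is deduced from the two known facts and then stored, so it never has to be rediscovered empirically.

```python
# Minimal sketch of analytical learning (not OCCAM's implementation):
# a new causal relationship is deduced from existing facts and then cached.
facts = {
    ("ice", "is", "slippery"),
    ("salt", "melts", "ice"),
}

def deduce_slipperiness_reducers(facts):
    """If a substance melts a material that is slippery, deduce that
    spreading the substance reduces slipperiness."""
    return {
        (x, "reduces", "slipperiness")
        for (x, relation, y) in facts
        if relation == "melts" and (y, "is", "slippery") in facts
    }

# "Learning" here consists of simply storing the result of the deduction.
facts |= deduce_slipperiness_reducers(facts)
assert ("salt", "reduces", "slipperiness") in facts
```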
As OCCAM learns, it acquires knowledge that facilitates future learning.
For example, in physical causality OCCAM acquires the following knowledge:
• OCCAM is presented with examples of people attempting to open a refrigerator. Some people pull on the door and it opens; others pull on the door and it doesn't open. After many examples, it is able to determine that the age of the person pulling on the door (as opposed to the hair color or eye color, etc.) is a good predictor for determining whether the door will open. In addition to learning a specific fact (adults are strong enough to open a refrigerator), it also learns some general knowledge (adults are strong) that may be transferred to other problems.
• Once OCCAM has learned that adults are strong, it is presented with examples of people attempting to inflate balloons. Some people blow into the balloon and it is inflated; others blow into the balloon and it does not inflate. However, because it has already learned about strength (in the context of opening refrigerators), ...
