
eBook - ePub
Assessing Change in English Second Language Writing Performance
- 200 pages
- English
About this book
This book introduces a new framework for analyzing second language (L2) learners' written texts. The authors conducted a major study on changes and differences in English L2 learners' writing performance to advance understanding of the nature of L2 writing development over time, in relation to L2 instruction and testing, and to offer a model that professionals and researchers can use in their own longitudinal and cross-sectional studies of L2 writing development. Grounded in research, data, theory, and technology, this will be a welcome how-to for language test developers, scholars, and graduate students of (L2) writing and assessment.
Assessing Change in English Second Language Writing Performance by Khaled Barkaoui and Ali Hadidi is available in PDF and ePUB format, under Languages & Linguistics.
Introduction
Recently, there has been a growing interest in researching and assessing second language (L2) development longitudinally in order to understand how L2 proficiency develops over time, how long it takes L2 learners to attain proficiency in L2, and the effects of L2 instruction and individual and contextual factors on L2 development (e.g., Barkaoui, 2014; Graham & Macaro, 2008; Lyster & Izquierdo, 2009; Norris & Manchón, 2012; Ortega & Byrnes, 2008; Ortega & Iberri-Shea, 2005; Polio, 2017; Polio & Park, 2016; Usborne et al., 2009). Although numerous studies have examined the effects of different approaches to L2 instruction on L2 proficiency development, few such studies have examined the effects of L2 instruction on L2 writing development. Most of these studies tend to focus on changes in scores on L2 proficiency tests (e.g., Elder & O'Loughlin, 2003; Ling et al., 2014; O'Loughlin & Arkoudis, 2009). A few studies have also examined changes in the linguistic characteristics of L2 learners' texts after L2 instruction, focusing mainly on changes in the grammatical aspects of L2 learners' texts (i.e., accuracy, complexity, and fluency; e.g., Storch, 2009; Storch & Tapper, 2009). This book extends this line of research by proposing a broader framework for analyzing L2 learners' texts and reporting on a study that used this framework to examine changes in the writing performance of a sample of L2 learners on different types of writing tasks after a period of English language study.
This line of research offers two main benefits. First, longitudinal research can help us understand the nature and durability of the effects of L2 instructional practices on the nature, rate, and level of L2 proficiency development (Ortega & Byrnes, 2008; Ortega & Iberri-Shea, 2005). Findings from this research can thus inform both theory and practice. For example, identifying which aspects of L2 proficiency develop, and how long each aspect takes to develop, can inform materials design, decisions on the length and timing of instruction, and the like. Second, this line of research can provide information about the sensitivity of L2 proficiency tests to changes in L2 proficiency over time. Most L2 proficiency tests are designed to measure L2 proficiency at one point in time, rather than L2 development over time. This does not mean that these tests are not used to assess L2 development or progress over time and/or in relation to L2 instruction. However, when L2 proficiency tests are used to make claims about changes in L2 proficiency over time and/or in relation to L2 instruction, empirical evidence needs to be provided that such tests are sensitive to change in the construct being measured; that is, that they can reliably detect true changes in L2 proficiency over time.
To establish such evidence in the context of L2 writing tests, we need to adopt a longitudinal approach to examine whether L2 instruction is associated with changes in L2 ability, which in turn are associated with changes in the scores and linguistic characteristics of learners' written responses (cf. Norris & Manchón, 2012; Polio, 2017; Polio & Park, 2016). This research can also examine factors contributing to variability in test performance between learners (i.e., cross-sectionally) as well as over time (i.e., longitudinally). One would expect that changes in learners' L2 proficiency will be reflected in changes in the linguistic characteristics of their L2 texts, which, in turn, are reflected in changes in their writing scores. To the extent that empirical evidence backs these assumptions, the test's validity argument is supported.
However, most studies on L2 writing tests tend to be cross-sectional, comparing essay scores and characteristics across tasks and/or test-takers at one point in time. A few studies have examined changes in test performance over time or the relationships between L2 test scores and L2 instruction. Some of these studies have adopted a cross-sectional approach by comparing the scores of test-takers with different lengths of English study at one point in time (e.g., Wang et al., 2008), while others adopted a quasi-experimental approach by comparing the test scores of the same learners before and after relevant L2 instruction, mainly in relation to IELTS Academic (e.g., Archibald, 2000; Brown, 1998; Elder & O'Loughlin, 2003; O'Loughlin & Arkoudis, 2009; Rao et al., 2003; Read & Hayes, 2003) and, recently, TOEFL-iBT® (Ling et al., 2014) and TOEFL ITP (Choi & Papageorgiou, 2014).
In the quasi-experimental studies, a finding that test scores after relevant L2 instruction are substantially higher than pre-instruction scores is interpreted as supporting the claim that the test measures L2 ability (Bachman, 2004; Brown, 2005). The reasoning is that changes in test scores after instruction reflect changes in the construct being measured (i.e., L2 ability) as a result of L2 instruction. This reasoning is based on the assumption that the instruction is effective. The quasi-experimental studies above indicate that test scores do change after a period of English language instruction, but the direction and magnitude of score change vary depending on course characteristics, language skill, and learner characteristics (e.g., initial English proficiency, first language [L1], educational level). For example, O'Loughlin and Arkoudis (2009) found that, after a period of English language instruction, the greatest average improvement in IELTS scores was in listening and reading and the least average improvement was in writing; students with lower initial scores for listening, reading, and writing tended to improve significantly more than students with higher initial scores; and undergraduate students improved more than postgraduate students. Recently, Ling et al. (2014) compared the scores of two groups of L2 learners who took TOEFL-iBT® practice tests before and after a period of six or nine months of English language instruction in China and the USA. They found moderate to substantial levels of improvement on each of the TOEFL-iBT® sections, with students in the USA showing greater gains in writing and speaking, while students in China showed greater gains in reading and listening scores. Similarly, Choi and Papageorgiou (2014) examined changes in TOEFL ITP scores after an extended period of English language instruction in two contexts (Turkey and Thailand). They found that section and total test scores improved significantly in both contexts, suggesting that TOEFL ITP can be used to monitor L2 proficiency development. It should be noted, however, that TOEFL ITP does not assess L2 writing.
A few studies adopted a quasi-longitudinal approach by comparing the scores of test-takers who repeated L2 proficiency tests twice (e.g., Green, 2005; Gu et al., 2015; Zhang, 2008). Green (2005), for example, compared the writing scores of more than 15,000 test-takers who each took IELTS twice in the period January 1998 to June 2001. He found considerable individual variation in the rate of score gain in writing across test occasions, and that test-taker age, region of origin (e.g., East Asia, Europe), number of years spent learning English, period between tests, and writing score at occasion 1 were significantly associated with gains in writing scores across test occasions. Generally, however, initial writing score was a better predictor of scores on the second test than was the interval between tests. Furthermore, relatively rapid gains appeared more likely for test-takers with initial scores at the lower end of the IELTS scale. Finally, Gu et al. (2015) used data from 4,606 students who took TOEFL Junior Standard, which assesses improvement in young learners' proficiency in English as a foreign language, more than once between early 2012 and mid-2013, in order to examine the relationship between (a) the interval between test occasions, used as a proxy for changes in underlying L2 proficiency because of learning, and (b) changes in test scores across test occasions. TOEFL Junior Standard is composed of multiple-choice questions in three sections: listening, reading, and language form and meaning. Gu, Lockwood, and Powers found that candidates obtained higher scores on the second administration and that there was a positive, statistically significant relationship between interval length and score gains. Generally, students with longer intervals between retesting (interpreted by Gu, Lockwood, and Powers as an indicator of more English language learning opportunities) exhibited greater gains in both section and total test scores than did students with shorter intervals. Gu, Lockwood, and Powers concluded that these findings "provide initial support for the claim that [TOEFL Junior] can be used to monitor growth for young English language learners" (p. 11).
One limitation of these studies is that they examined only changes in test scores. Examining test scores, however, provides only a partial picture of L2 development and the factors affecting it. When examining test-takers' responses to L2 performance tests, such as writing tests, it is also important to analyze the linguistic characteristics of test-takers' responses in order to understand the nature and rate of change in specific aspects of language performance over time (Barkaoui & Knouzi, 2012). There is a long tradition in applied linguistics of analyzing L2 learners' texts to describe and explain variability in L2 writing performance. As we discuss in Chapter 2, however, this research tends to focus on the grammatical aspects of writing (e.g., grammatical accuracy, syntactic and lexical complexity) and, to a lesser extent, discourse aspects (e.g., coherence, cohesion), and, thus, does not provide a complete picture of the construct or the development of L2 writing ability. Moreover, most of this research is cross-sectional, comparing L2 writing performance across tasks and/or learner groups at one point in time. In Chapter 2, we propose a broader framework for analyzing L2 learners' texts that we think covers the construct of L2 writing comprehensively and is more sensitive to differences and changes in L2 writing ability. In Chapter 3, we review the few studies that have examined L2 writing performance longitudinally. In the following chapters, we report on a study that used the framework in Chapter 2 to examine changes in the linguistic characteristics of texts written by a sample of English language learners in response to TOEFL-iBT® writing tasks before and after a period of English language instruction.
The book includes nine chapters. Chapter 2 discusses the importance of examining L2 learners' written texts, the different models and frameworks in the literature for analyzing learners' written texts, and an outline of the framework we propose. Chapter 3 reviews previous studies that have examined changes in L2 learners' written texts over time and/or after L2 instruction. Chapter 4 describes the context, research questions, and methods of the current study. Chapters 5 to 8 report the findings of the study, which are then summarized and further discussed in Chapter 9. Chapter 9 also discusses the implications of the study for assessment, teaching, and research.
2
A Framework for Analyzing L2 Learners' Texts
There is a long tradition of collecting and analyzing learner language (oral or written) in SLA (second language acquisition) research in order to describe and explain L2 learners' underlying linguistic knowledge (Ellis & Barkhuizen, 2005). As Ellis and Barkhuizen (2005) explained, a key assumption in this research is that how learners perform a language task (e.g., giving a presentation, writing an essay) "serves as the principal source of information about what they know about the language" (p. 6). Ellis and Barkhuizen (2005) outlined different methods for analyzing learner language in terms of expression (form) and/or content, with each method reflecting "different conceptions of what it means to acquire an L2" (p. 359).
Similarly, in L2 writing research, L2 learners' written texts are often analyzed to gain insight into L2 proficiency and its variability and/or development (e.g., Hinkel, 2002, 2005; Leki et al., 2008; Plakans, 2014; Polio & Friedman, 2017; Wolfe-Quintero et al., 1998). Polio and Friedman (2017), for example, outlined several reasons for analyzing learner texts, such as identifying the effects of feedback and instruction, comparing writing performance across tasks, contexts, and learner groups, and/or validating tests (cf. Hinkel, 2005; Plakans, 2014). Plakans (2014) also explained that studies on L2 learners' texts in the context of L2 writing tests can identify variation in writing performance across tasks and proficiency levels and provide evidence for the construct of writing and the interpretation of scores from such tests.
In fact, analyzing L2 learners' texts is a very common approach to explaining the meaning of L2 writing test scores. Many studies analyze test-takers' written responses to writing tests and then examine the relationships between the linguistic and discourse characteristics of these responses and their scores (e.g., Banerjee et al., 2007; Barkaoui, 2010a, 2010b; Barkaoui & Knouzi, 2012; Cumming et al., 2005; Frase et al., 1999; Kennedy & Thorp, 2007; Mayor et al., 2007). This approach is based on the assumption that the quality of test performance (as reflected in test scores) can be partially explained by examining the characteristics of the performance itself (Chapelle, 2008; Cumming et al., 2005). Research on the linguistic characteristics of test-takers' written responses can thus provide important insights concerning the validity of score-based inferences about learners' abilities, and the effects of contextual, individual, and task factors on the characteristics of L2 learners' texts (Banerjee et al., 2007; Barkaoui, 2007; Barkaoui & Knouzi, 2012; Cumming et al., 2005; Hinkel, 2002; Iwashita et al., 2008; Plakans, 2014; Weigle, 2002).
Previous Frameworks for Examining L2 Learners' Texts
One limitation of previous studies, as Plakans (2014) observed, is that they tend to focus on the linguistic features of L2 learners' written texts as indicators of L2 proficiency or development, rather than as indicators of L2 writing proficiency or development (cf. Norris & Manchón, 2012; Polio & Park, 2016). For example, Wolfe-Quintero et al. (1998) reviewed more than 30 studies that analyzed L2 learners' written texts using numerous measures of syntactic and lexical complexity, accuracy, and fluency (often termed CAF). However, Wolfe-Quintero, Inagaki, and Kim emphasized early on in their book that they "are not interested in measuring the ability to 'write well' in a second language, but in measuring language development as it is manifest in a written modality" (p. 2, emphasis added). In other words, their focus was not how L2 writing ability develops, but on the development of L2 as reflected in specific linguistic features of writing, in this case measures of CAF.
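To make the idea of CAF measures concrete: the studies reviewed by Wolfe-Quintero et al. typically rely on T-unit-based and error-based measures that require manual or parser-assisted coding, but some widely used proxies can be computed directly from a text. The sketch below is a hypothetical illustration only, not a measure used in this book's study; the function name and the three proxies chosen (text length as a rough fluency index, type-token ratio as a rough lexical-diversity index, and mean sentence length as a crude complexity index) are our own simplifications.

```python
import re

def caf_proxies(text: str) -> dict:
    """Compute rough, illustrative proxies for fluency and complexity.

    These are simplified stand-ins for CAF measures; real studies use
    T-unit-based and error-based measures requiring manual coding.
    """
    # Tokenize naively: lowercase alphabetic word forms (apostrophes kept).
    words = re.findall(r"[a-zA-Z']+", text.lower())
    # Split sentences on terminal punctuation (a crude approximation).
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    total_words = len(words)  # fluency proxy: overall text length
    # Lexical diversity proxy: unique word forms over total words.
    ttr = len(set(words)) / total_words if total_words else 0.0
    # Complexity proxy: mean words per sentence.
    mwps = total_words / len(sentences) if sentences else 0.0
    return {
        "words": total_words,
        "ttr": round(ttr, 3),
        "mean_words_per_sentence": round(mwps, 2),
    }

sample = "The test was hard. I studied a lot, and I think I did well."
print(caf_proxies(sample))
# → {'words': 14, 'ttr': 0.857, 'mean_words_per_sentence': 7.0}
```

Note that the type-token ratio is sensitive to text length, which is one reason the literature distinguishes many competing operationalizations of each CAF dimension.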
This focus on linguistic features, specifically CAF, can be due to (a) a historical view of writing as a means for teaching and assessing linguistic knowledge (i.e., vocabulary and grammar), rather than a goal in itself, and/or (b) the disagreement that Ellis and Barkhuizen (2005) noted in the field of SLA concerning whether L2 ability includes linguistic knowledge only...
Table of contents
- Cover
- Half Title
- Series Information
- Title Page
- Copyright Page
- Dedication
- Table of Contents
- List of Figures
- List of Tables
- Series Editors' Foreword
- Acknowledgments
- 1 Introduction
- 2 A Framework for Analyzing L2 Learnersā Texts
- 3 Research on L2 Writing Development
- 4 Method
- 5 Changes in Grammatical Aspects
- 6 Changes in Discourse Aspects
- 7 Changes in Sociolinguistic and Strategic Aspects
- 8 Changes in Content and Source Use
- 9 Discussion and Implications
- References
- Appendix A: Rating Scale and Instructions for Rating Linguistic Accuracy
- Appendix B: Rating Scale and Instructions for Rating Coherence and Cohesion
- Appendix C: Rating Scale and Instructions for Rating Text Structure
- Appendix D: Rating Scale and Instructions for Rating Argument Quality
- Appendix E: Rating Scale and Instructions for Rating Register
- Appendix F: Rating Scale and Instructions for Rating Metadiscourse Use
- Appendix G: Source Use Coding Scheme and Guidelines
- Appendix H: Rating Scale and Instructions for Rating Source Use
- Author Index
- Subject Index