
3. Paraphrases ideas from source text appropriately    1 2 3 4
4. Selects ideas from the source text well    1 2 3 4
5. Connects ideas from source text with own    1 2 3 4
6. Clearly develops a thesis position    1 2 3 4
7. Provides support for position    1 2 3 4
8. Follows logical organization and cohesion    1 2 3 4
9. Displays grammatical accuracy    1 2 3 4
10. Uses clear specific vocabulary    1 2 3 4
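
      The scale above is analytic: each criterion receives its own 1–4 rating. To illustrate how such ratings might be combined into a single task score, here is a minimal Python sketch; the criterion keys and the equal weighting are assumptions made for illustration, not the scoring model of any operational test.

```python
# Illustrative only: aggregates analytic rubric ratings (1-4 per criterion)
# into a simple summary score. Equal weighting is an assumption; operational
# integrated-writing tests may weight criteria or use holistic scales.

RATING_MIN, RATING_MAX = 1, 4

def score_response(ratings: dict[str, int]) -> dict[str, float]:
    """Return the total and the 1-4 average for one rated response."""
    for criterion, rating in ratings.items():
        if not RATING_MIN <= rating <= RATING_MAX:
            raise ValueError(f"{criterion}: rating {rating} outside "
                             f"{RATING_MIN}-{RATING_MAX}")
    total = sum(ratings.values())
    return {"total": total, "average": total / len(ratings)}

# Hypothetical ratings for one essay on the criteria shown above.
example = {
    "paraphrases_source_ideas": 3,
    "selects_source_ideas": 2,
    "connects_source_with_own": 3,
    "develops_thesis": 4,
    "supports_position": 3,
    "organization_cohesion": 3,
    "grammatical_accuracy": 2,
    "vocabulary": 3,
}

print(score_response(example))  # {'total': 23, 'average': 2.875}
```

      In practice, a testing program might weight the source‐use criteria differently from the language criteria, or report an analytic profile rather than a single total.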

      Validity of test‐score interpretation and use needs to be justified on the basis of evidence for a correspondence between test scores and the integrated ability that the test is intended to measure, as well as evidence for the utility of the test scores. Test developers and researchers need to consider how to elicit such evidence in order to conduct validation research. When language is divided into skill areas, defining the construct appears manageable; questions arise, however, with the construct underlying integrated assessment. Examining writing processes in a thematically linked integrated assessment, Esmaeili (2002) concluded that reading and writing could not be viewed as stand‐alone constructs. In a study of non‐native and native English speakers, Delaney (2008) found that reading‐to‐write tasks elicited processes attributable to a unique construct that was not merely a combination of reading ability and writing skill but also involved discourse synthesis. Plakans (2009) likewise found evidence of discourse synthesis in writers' composing processes on integrated writing assessments and concluded that the evidence supported interpreting such a construct from test scores. Using structural equation modeling and qualitative methods, Yang and Plakans (2012) found that writers used complex, interrelated strategies in reading–listening–writing tasks, further supporting the idea that processes related to discourse synthesis (selecting, connecting, and organizing) improved test performance. In a similar study focused on summarization tasks, Yang (2014) used structural equation modeling (SEM) to provide evidence that the task required comprehension and construction strategies as well as planning, evaluating, source‐use, and discourse synthesis strategies. While research into validity and integrated assessment is building momentum, ongoing attention and research are needed to refine evolving definitions, to innovate in task types, and to improve approaches to scoring.
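
      As a concrete illustration of the pattern such studies look for, the following Python sketch simulates analytic ratings in which a latent discourse synthesis factor drives the source‐use criteria and a separate language factor drives the accuracy criteria, then inspects the resulting correlation matrix. All names, loadings, and data here are hypothetical; studies such as Yang and Plakans (2012) and Yang (2014) fit full structural equation models to real test scores rather than simulated ones.

```python
# Hypothetical simulation (not real test data): two correlated latent
# abilities generate analytic rubric ratings. If discourse synthesis is a
# distinct component of the construct, the source-use criteria should
# correlate more strongly with one another than with the language criteria.
import numpy as np

rng = np.random.default_rng(42)
n = 500  # simulated test takers

# Standardized latent abilities, modestly correlated (r = 0.4 assumed).
synthesis = rng.normal(size=n)
language = 0.4 * synthesis + np.sqrt(1 - 0.4**2) * rng.normal(size=n)

def noise():
    """Rating-specific measurement error added to each observed score."""
    return rng.normal(scale=0.6, size=n)

ratings = {
    "select_ideas":  0.8 * synthesis + noise(),
    "connect_ideas": 0.8 * synthesis + noise(),
    "organize":      0.7 * synthesis + noise(),
    "grammar":       0.8 * language + noise(),
    "vocabulary":    0.8 * language + noise(),
}

data = np.column_stack(list(ratings.values()))
corr = np.corrcoef(data, rowvar=False)

print(list(ratings))
print(np.round(corr, 2))
# Expected pattern: the three synthesis-driven criteria intercorrelate
# around 0.6 but correlate only around 0.25 with grammar/vocabulary --
# the kind of factor structure SEM studies test against real scores.
```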

      Integrating skills in assessment will benefit from continued investigation as well as creative innovation. It represents an advance over the earlier approach that defined language as four skills plus grammar and vocabulary. Since integration occurs in classroom and authentic contexts (Hirvela, 2016), its emergence, or reemergence, in testing is inevitable. Meanwhile, research must continue to examine the integrated skills construct(s) to better understand how they can best be defined and measured, and how scores from assessments of integrated skills can be used.

      SEE ALSO: Assessment of Speaking; Assessment of Writing; English for Academic Purposes; Rating Scales and Rubrics in Language Assessment; Task‐Based Language Assessment; Validation of Language Assessments

      1 Ascención, Y. (2005). Validation of reading‐to‐write assessment tasks performed by second language learners (Unpublished doctoral dissertation). Northern Arizona University, Flagstaff.

      2 Barkaoui, K. (2015). Test takers' writing activities during the TOEFL iBT® Writing Tasks: A stimulated recall study. ETS Research Report Series, 1, 1–42.

      3 Brown, J. D., Hilgers, T., & Marsella, J. (1991). Essay prompts and topics: Minimizing the effect of mean differences. Written Communication, 8, 533–56.

      4 Carroll, J. B. (1961). Fundamental considerations in testing for English language proficiency of foreign students. Washington, DC: Center for Applied Linguistics.

      5 Cumming, A., Kantor, R., Baba, K., Erdosy, U., Eouanzoui, K., & James, M. (2005). Differences in written discourse in writing‐only and reading‐to‐write prototype tasks for next generation TOEFL. Assessing Writing, 10, 5–43.

      6 Delaney, Y. A. (2008). Investigating the reading‐to‐write construct. Journal of English for Academic Purposes, 7, 140–50.

      7 Esmaeili, H. (2002). Integrated reading and writing tasks and ESL students' reading and writing performance in an English language test. Canadian Modern Language Review, 58(4), 599–622.

      8 Gebril, A. (2010). Bringing reading‐to‐write and writing only assessment tasks together: A generalizability analysis. Assessing Writing, 15, 100–17.

      9 Gebril, A., & Plakans, L. (2013). Toward a transparent construct of reading‐to‐write tasks: The interface between discourse features and proficiency. Language Assessment Quarterly, 10, 9–27.

      10 Gebril, A., & Plakans, L. (2014). Assembling validity evidence for assessing academic writing: Rater reactions to integrated tasks. Assessing Writing, 21, 56–73.

      11 Hirvela, A. (2016). Connecting reading and writing in second language writing instruction (2nd ed.). Ann Arbor: University of Michigan Press.

      12 Huang, H. T., & Hung, S. T. (2013). Comparing the effects of test anxiety on independent and
