Analysis and comprehension of multimodal texts

Author(s)
Daly, Ann E
Unsworth, Leonard
Publication Date
2011
Abstract
This research is part of an Australian Research Council linkage project developing a model of image-language relations in multimodal texts (Unsworth, Barnes and O'Donnell, 2006-2008). The materials used in the investigation were the 2005 and 2007 NSW Basic Skills Tests (BST). Data from the tests enabled an analysis of variance comparing the mean difficulty of BST items assessing the comprehension of different types of image-language relations, as outlined by Unsworth and Chan (2008). Because there was a range of difficulty among the items assessing each type of image-language relation, other aspects of the texts that might contribute to difficulty were also investigated. Accordingly, complexity within the targeted written language segments and complexity within the images were analysed separately, using selected features of Functional Grammar (Halliday, 2004/1994) and Visual Grammar (Kress & van Leeuwen, 1996), to determine whether these factors also affected item difficulty. In conclusion, the findings about multi-semiotic text complexities and item difficulties are considered in order to identify factors that might be relevant to the comprehension of multimodal texts and to suggest implications for future research about language and literacy learning in primary schools.
Citation
Australian Journal of Language and Literacy, 34(1), pp. 61-80
ISSN
1839-4728
1038-1562
Language
en
Publisher
Australian Literacy Educators' Association (ALEA)
Title
Analysis and comprehension of multimodal texts
Type of document
Journal Article
Entity Type
Publication
