Browsing by FOR 2008 "010499 Statistics not elsewhere classified"
Report: The Adaptation of Curvature Measures to Assess Nonlinearity in 'Functionless' Models (University of New England, School of Mathematics, Statistics and Computer Science, 2006). Thomas, K; Ellem, Bernard.
Bates and Watts (1980) presented curvature measures for assessing the effects of nonlinearity in regression models. The practical and routine methods for assessing this nonlinearity during data analysis employ the methodology of profile plots and traces (Bates and Watts, 1988). In their book on Nonlinear Regression, these profiling methods were clearly demonstrated with the fitting of nonlinear regression models, compartmental models and multiresponse models to data. This paper demonstrates that these profile methods can be extended to models similar to the above. This new class of models does not require that an analytical function or mathematical form of solution be specified; such 'functionless' models generally require numerical schemes for their solution.
Journal Article (Open Access): Computational Gains Using RPVM on a Beowulf Cluster (Technische Universitaet Wien, Institut fuer Statistik und Wahrscheinlichkeitstheorie, 2003). Carson, B; Mason, IA.
The Beowulf cluster (Becker et al., 1995; Scyld Computing Corporation, 1998) is a recent advance in computing technology that harnesses the power of a network of desktop computers using communication software such as PVM (Geist et al., 1994) and MPI (Message Passing Interface Forum, 1997). Whilst the potential of a computing cluster is obvious, expertise in programming is still developing in the statistical community. Recent articles in R News (Li and Rossini, 2001; Yu, 2002) entice statistical programmers to consider whether their solutions could be effectively calculated in parallel. Another R package, SNOW (Tierney, 2002; Rossini et al., 2003), aims to provide a wrapper interface to these packages, independent of the underlying cluster communication method used in parallel computing. This article concentrates on RPVM and builds upon the contribution of Li and Rossini (2001) by taking an example with obvious orthogonal components and detailing the R code necessary to allocate the computations to each node of the cluster. The statistical technique used to motivate our RPVM application is the gene-shaving algorithm (Hastie et al., 2000a,b), for which S-PLUS code has been written by Do and Wen (2002) to perform the calculations serially. The first section is a brief description of the Beowulf cluster used to run the R programs discussed in this paper. This is followed by an explanation of the gene-shaving algorithm, identifying the opportunities for parallel computing of bootstrap estimates of the "strength" of a cluster and the rendering of each matrix row orthogonal to the "eigen-gene". The code for spawning child processes is then explained comprehensively, and the conclusion compares the speed of RPVM on the Beowulf to serial computing.
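The abstract above describes scattering bootstrap estimates of cluster "strength" to the nodes of a Beowulf cluster with RPVM. The original R/PVM code is not reproduced here; purely as a hedged illustration of the same scatter-and-gather pattern, the sketch below uses Python's standard multiprocessing pool, with a hypothetical strength statistic standing in for the gene-shaving measure.

```python
# Hedged sketch: parallel bootstrap of a cluster-"strength" statistic.
# This is NOT the authors' RPVM code; it only illustrates the scatter/gather
# pattern the abstract describes, using a standard-library process pool in
# place of PVM worker nodes.
import numpy as np
from multiprocessing import Pool

def strength(data):
    """Hypothetical stand-in for the gene-shaving cluster-strength measure:
    variance of row means relative to total variance."""
    return data.mean(axis=1).var() / data.var()

def one_bootstrap(args):
    data, seed = args
    rng = np.random.default_rng(seed)
    # Resample columns (observations) with replacement and recompute strength.
    cols = rng.integers(0, data.shape[1], size=data.shape[1])
    return strength(data[:, cols])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    expression = rng.normal(size=(50, 20))      # toy "gene x sample" matrix
    tasks = [(expression, seed) for seed in range(200)]
    with Pool() as pool:                        # workers play the role of cluster nodes
        boot = pool.map(one_bootstrap, tasks)   # scatter bootstrap replicates, gather results
    print("observed strength:", strength(expression))
    print("bootstrap mean +/- sd:", np.mean(boot), np.std(boot))
```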
Journal Article: Connecting experimental probability and theoretical probability.
In this reflective paper, I explore the thinking of a group of pre-service teachers as they reason about experimental probability and theoretical probability. I am particularly interested in investigating whether pre-service teachers could construct a bidirectional link between experimental probability and theoretical probability, similar to the tentative model I introduce elsewhere (2008) for coordinating the two perspectives on distribution. Overall, this research study contributes to understanding how pre-service teachers can build connections that help them conceptualize, and support students to embrace, elements that act as links between the two approaches to probability.
Conference Publication: Connecting Probability to Statistics using Simulated Phenomena.
This article addresses the use of probability to build models in computer-based simulations, through which exploring data and modelling with probability can be connected. The article investigates students' emerging reasoning about models, probability, and statistical concepts through an observation of grade 9 students, who used TinkerPlots2 to model a sample simulation based on probabilistic models of populations and tested models by comparing their behaviour with the generated data. Results from this research study suggest that students' use of probability to build models in computer-based simulations helps students to conceive of objects as comprising a set of data and the data distribution as being a choice made by the modeller to create approximations of real or imagined phenomena, where approximations depend on signal and variation.
Book Chapter: Connecting Probability to Statistics Using Simulated Phenomena.
This chapter addresses the use of probability to build models in computer-based simulations, through which exploring data and modelling with probability can be connected. The chapter investigates students' emerging reasoning about models, probability, and statistical concepts through an observation of grade 9 students, who used TinkerPlots to model a sample simulation based on probabilistic models of populations and tested models by comparing their behaviour with the generated data. Results from this research study suggest that students' use of probability to build models in computer-based simulations helps students to conceive of objects as comprising a set of data and the data distribution as being a choice made by the modeller to create approximations of real or imagined phenomena, where approximations depend on signal and variation.
Conference Publication: The Coordination of Distributional Thinking (International Association for Statistical Education (IASE) & International Statistical Institute (ISI), 2007).
My aim is to trace the thinking-in-change (Noss & Hoyles, 1996) during the co-ordination of two epistemologically distinct faces of distribution. By co-ordination here, I refer to the connection between a data-centric perspective on distribution, which identifies distribution as an aggregated set of actual outputs, and a modelling perspective on distribution, which views distribution as a set of possible outcomes and associated probabilities. The co-ordination requires that the learner connect, in both directions, the data that form a distribution of results with the modelling distribution. This dual connection is, I believe, at the heart of informal inference.
Conference Publication: Creating models and simulations of real world phenomena by defining attributes for a phenomenon.
This paper highlights how students may develop new practices when using a variety of digital tools for creating models and simulations of real world phenomena by defining attributes for a phenomenon that can be measured. When applying a probability model to a real world situation, we conceive of that model as including a random process governed by a theoretical density distribution, with data sampled from this distribution. In this paper, the visualizations used when generating empirical sampling distributions of model parameters to assist model testing and revision dynamically link the sampling process with the development of an empirical distribution. We present and discuss several examples, along with conceptual and mathematical ideas that support the use of specific visualizations and aspects of modelling, for fostering the development of students' robust knowledge of the logic of inference when using computer-based simulations to model and investigate connections between real contexts and data, and probability distributions. The paper further discusses how the big data era brings challenges and opportunities to the modelling and simulation field.
Book: Data Visualization and Statistical Literacy for Open and Big Data.
Good information is invaluable for decision-making. With the explosion of computational power and digital data storage technologies, and more recently the huge increase in mobile communications devices, vast quantities of data are becoming available. But these data are unwieldy. Where data were once scarce, hard to gather, and therefore had to be subject to the very careful analytical processes familiar from mathematical statistics and probability theory, data now come in overwhelming quantity (volume), at high speeds (velocity), and in many different forms (variety). The challenges and opportunities in Big Data arise from the attempt to use data effectively when they are available in massive volume, at high velocity, and in great variety. This book brings together a number of works that explore a range of significant issues related to Big Data. The range extends from concerns for gathering data, as seen in the chapter on web scraping, through issues in processing data, as seen in the chapter on national statistical institutes, to concerns for using and analysing data, as seen in several chapters related to teaching with Big Data.
Conference Publication: Developing a framework for reasoning about explained and unexplained variation (International Association for Statistical Education (IASE) & International Statistical Institute (ISI), 2010).
As a principal form of statistical thinking, consideration of variation impacts on all aspects of statistics. There has been extensive research about students' reasoning about variation, but little research focusing on helping students model variation as a combination of explained and unexplained variation. A study analysed responses to a measurement instrument that was developed to assess tertiary students' informal reasoning about variation, focusing on explained and unexplained variation. Selected students were also interviewed. This paper reports the analysis of the responses that informed the refinement of a framework that describes six components of reasoning about explained and unexplained variation. Implications for researchers and educators will also be discussed.
Conference Publication: Developing a modeling approach to probability using computer-based simulations (Local Organizing Committee of the 12th International Congress on Mathematical Education, 2012).
This research study investigates how middle school students use probability to model random behaviour in real-world contexts and the connections that they build among fundamental probabilistic concepts when they engage in exploring computer-based simulations that treat probability as a modelling tool. This article also discusses how students use the modelling approach to probability to draw basic inferences about data from examination of distributions. The results suggest that the way students express the relationship between signal and noise is of importance while building models from the observation of a real situation. This relationship seems to have particular importance in students' abilities to build comprehensive models that link observed data with modelling distributions.
Journal Article: Drawing inference from data visualisations.
This article investigates how 14- to 16-year-old students interpret representations of multivariate data generated by data visualisation tools and how they then seek to construct their own meaningful data visualisations that highlight emerging important aspects of data. Students were asked a single question - about where they would like to live - that involved reasoning about a complex data set with many different variables, which they were able to explore using a dynamic visualisation tool that allowed them to easily generate multiple visualisations of the relevant data set. Findings show the diverse inferences that students articulated to reason about covariation between multiple variables while using the cycle of inquiry and visual analysis. Students revisited their specific kinds of inferences while using complex data visualisation tools, inventing and revising their visual representations of data. Once they obtained the necessary insight, they readily made an informed decision.
Conference Publication: The Emergence of Distribution From Causal Roots.
Our premise, in line with a constructivist approach, is that thinking about distribution, and about stochastic phenomena in general, must develop from resources already established. Our prior research has suggested that, given appropriate tools to think with, meanings for distribution might emerge out of knowledge about causality. In this study, based on the second author's ongoing doctoral research, we consider the relationship between the design of a microworld, in which students can control attempts to throw a ball into a basket, and the emergence of meanings for distribution. We suggest that the notion of statistical error, or noise, is a rich idea for helping students to bridge their deterministic and stochastic worlds.
Conference Publication: The Emergence of Stochastic Causality (Universidad Michoacana de San Nicolas de Hidalgo (UMSNH) [University of Saint Nicholas of the State of Michoacan], 2008). Pratt, Dave.
This paper describes a range of students' (age 15) expressions of the interplay between causality and variation, which we relate to dimensions of complex causality implicit in mastering the concept of distribution. The results indicate support for our conjecture that it is possible to harness causality in order to help students to invent new ideas of "stochastic causality", a sense of control over random processes through the careful design of a simulation, which places emphasis upon deterministic as well as stochastic behaviour. This result stands apart from mainstream research, which tends to separate the determined and stochastic worlds.
Journal Article: Estimating Parameters from Samples: Shuttling between Spheres.
In order to better understand the thinking of students learning to make informal statistical inferences, this research examined the thinking of senior secondary school students (age 17) engaged in the task of using observed data to make point estimates of a population parameter within a computer-based simulation. Following the "Growing Samples" instructional model, the point estimation activity involved sampling and estimating across three tasks with different sample sizes. This research study aimed to trace the evolution of the students' thinking, with particular attention to use of the statistical concepts in making informal inferences from sampling. The students in this study were observed to rely primarily on mathematical thinking, which perhaps inhibited their ability to construct meanings about the basic statistical concepts underpinning sampling when performing point estimates. At times in the process students were seen to shift between mathematical thinking, statistical thinking, and thinking about the context, but the mathematical thinking seemed to dominate their attempts to create estimates. These research findings are useful for informing the teaching of point estimation of a population parameter to school-aged students. The research findings also stress the need for teachers to rethink the relationship between statistical thinking and mathematical thinking in order to promote statistical thinking in relevant learning situations for their students.
Conference Publication: GeoGebra in Teaching and Learning Introductory Statistics.
This article discusses the special opportunities for teaching statistics that technology offers teachers who aim to provide rich learning experiences for their students. These opportunities involve automation of many activities, such as quickly organising data, computing measures, and generating graphs. By automating the tasks of computing statistics or generating data, technology facilitates students' ability to visualise abstract concepts, affording an opportunity to focus on conceptual understandings and data analysis. This article also examines how GeoGebra can be integrated into the curriculum and learning process of introductory statistics to engage college students in cycles of investigation including (a) managing data, (b) developing students' knowledge for understanding specific statistical concepts, (c) performing data analysis and inference, and (d) exploring probability models. Recommendations are included for ways mathematics educators can engage college learners in developing their knowledge for exploring data concepts and doing statistics with GeoGebra.
Journal Article: GeoGebra in Teaching Introductory Statistics.
This article discusses the special opportunities for teaching statistics that technology offers teachers who aim to provide rich learning experiences for their students. These opportunities involve automation of many activities, such as quickly organising data, computing measures, and generating graphs. By automating the tasks of computing statistics or generating data, technology facilitates students' ability to visualise abstract concepts, affording an opportunity to focus on conceptual understandings and data analysis. This article also examines how GeoGebra can be integrated into the curriculum and learning process of introductory statistics to engage college students in cycles of investigation including (a) managing data, (b) developing students' knowledge for understanding specific statistical concepts, (c) performing data analysis and inference, and (d) exploring probability models. Recommendations are included for ways mathematics educators can engage college learners in developing their knowledge for exploring data concepts and doing statistics with GeoGebra.
Journal Article: Informal Inferential Reasoning: Interval Estimates of Parameters.
This research examined the informal inferential reasoning of senior secondary school students (age 17) when engaged in a computer-simulated sampling activity calling for the estimation of population parameters. The students undertook a task involving interval estimation of parameters within a computer-simulated environment. The research observed the students while they made and then explained their parameter estimates, in order to better understand how the students formed the interval estimates, with particular attention to the different strategies they adopted in forming these estimates. Activities involved sampling and estimating across three different sample size situations, followed by a reflection stage to compare the estimates. Results of the analysis of the discussion between the students and the researcher during the students' activities are presented. A number of strategies for forming an interval estimate emerged. The students experimented with choosing different strategies for forming the interval estimate when a new sample (observed values) was drawn. The research findings are useful for informing the teaching of interval estimation to school-aged students.
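The activity described above has students form interval estimates of a population parameter from samples of increasing size. As a hedged illustration of that idea only, not of the authors' simulation environment, the following sketch draws samples of three sizes from an assumed population and reports a simple bootstrap percentile interval for the mean at each size.

```python
# Hedged sketch of the "growing samples" idea: interval estimates of a
# population mean from samples of increasing size. The population and the
# percentile-interval method are illustrative assumptions, not the study's
# actual materials.
import numpy as np

rng = np.random.default_rng(1)
population = rng.normal(loc=170, scale=8, size=100_000)  # hypothetical population

for n in (10, 50, 250):                                  # three growing sample sizes
    sample = rng.choice(population, size=n, replace=False)
    # Bootstrap the sample mean to get a simple 90% percentile interval.
    boot_means = [rng.choice(sample, size=n, replace=True).mean() for _ in range(2000)]
    lo, hi = np.percentile(boot_means, [5, 95])
    print(f"n={n:4d}  sample mean={sample.mean():6.2f}  90% interval=({lo:6.2f}, {hi:6.2f})")
```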
Journal Article (Open Access): Model-based Informal Inference.
Following recent scholarly interest in teaching informal linear regression models, this study looks at teachers' reasoning about informal lines of best fit and their role in pedagogy. The case results presented in this journal paper provide insights into the reasoning used when developing a simple informal linear model to best fit the available data. This study also suggests potential in specific aspects of bidirectional modelling to help foster the development of robust knowledge of the logic of inference for those investigating and coordinating relations between models developed during modelling exercises and informal inferences based on these models. These insights can inform refinement of instructional practices using simple linear models to support students' learning of statistical inference, both formal and informal.
Conference Publication: Multidirectional Modelling for Fostering Students' Connections between Contexts and Data, and Probability Distributions.
This article investigates how 14- to 15-year-olds build informal conceptions about data distributions and theoretical probability distributions as they engage in a multidirectional modelling process using computer-based simulations. The students in this study engaged in modelling in two directions. First, students examined data from an unknown stochastic process and built a model of the processes that might explain the outputs. Second, the students constructed representations that generated data whose distributions were good predictors of real world samples. This study shows shifts in the conceptual structures across the two directions and points to the potential of specific aspects of multidirectional modelling for fostering the development of students' robust knowledge of the logic of inference when using computer-based simulations to model and investigate connections between real contexts and data, and probability distributions.
Conference Publication: Reasoning About Sampling in the Context of Making Informal Statistical Inferences (International Collaboration for Research on Statistical Reasoning, Thinking and Learning (SRTL), 2011).
This research study examined how senior secondary school students develop understanding of the core statistical concepts of "sample" and "sampling" when making statistical inferences, and how students build interconnections between these concepts. This was observed as students engaged in making interval estimates of a population parameter within a computer-simulated environment. Activities involved sampling and estimating across three different sample size situations, followed by a reflection stage to compare the estimates. Results of the four stages of the students' activities are presented. Discussion of the results will be shared at the SRTL-7 forum.
Dataset: Shedding, kinetics, molecular epidemiology and improved surveillance of inclusion body hepatitis caused by fowl adenovirus type-8b in broiler chicken (University of New England, 2019).
Inclusion body hepatitis caused by fowl adenovirus is an important disease of chickens. A major risk factor identified in this thesis was multi-age sites. Viral load, faecal shedding of virus, clinical signs and bursal atrophy were reduced in chickens carrying maternal antibody or those infected at 14-17 days of age relative to 1-3 days of age. Methods to quantify FAdV-8b in litter and dust were developed and the utility of these tests was shown. The presence of high levels in litter darkling beetles is suggestive of a role in epidemiology. FAdV-8b was readily inactivated in litter at temperatures of 45°C or above.
Thesis, Masters Research (Open Access): Spatio-Temporal Analysis of EEG data using Wavelets and Geostatistical methods (2010). Grant, Paul; Murison, Robert.
My intention here is to provide an overview of the methodology used in applying wavelet analysis techniques to signal analysis, combined with the application of geostatistical methods. After defining the problem and presenting the application of wavelets to mismatch negativity trials, I provide a succinct walk-through of wavelet analysis and some applications, introduce the possibility of spatial correlation, and develop models to allow for it. Finally, we construct methods enabling us to detect the latency of onset of mismatch negativity responses.
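The thesis combines wavelet decomposition of EEG signals with geostatistical models for spatial correlation. As a minimal sketch of the wavelet step alone, assuming the PyWavelets package and a synthetic trace in place of real mismatch negativity trials, the code below decomposes a noisy signal and uses the size of the finest detail coefficients as a crude proxy for locating the onset of a response; the threshold rule is an illustrative assumption, not the thesis's method.

```python
# Minimal sketch of multilevel wavelet decomposition of a 1-D signal,
# assuming the PyWavelets package; the synthetic "EEG" trace and the
# crude onset rule are illustrative, not the thesis's actual method.
import numpy as np
import pywt

fs = 250                                    # assumed sampling rate (Hz)
t = np.arange(0, 1, 1 / fs)
signal = np.random.default_rng(2).normal(0, 0.5, t.size)
signal[t >= 0.4] += -2.0 * np.exp(-(t[t >= 0.4] - 0.4) / 0.05)   # synthetic deflection at 400 ms

coeffs = pywt.wavedec(signal, "db4", level=4)   # [approx, detail_4, ..., detail_1]
detail1 = coeffs[-1]                            # finest-scale detail coefficients
threshold = 3 * np.median(np.abs(detail1)) / 0.6745     # robust noise-level estimate
onset_idx = int(np.argmax(np.abs(detail1) > threshold)) # first coefficient above threshold (0 if none)
latency_ms = 1000 * onset_idx * (t.size / detail1.size) / fs   # rough map back to signal time
print("approximate onset latency (ms):", latency_ms)
```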
Report: Statistical Curvature in Two Dimensions: A Teaching Application (University of New England, School of Mathematics, Statistics and Computer Science, 2005). Thomas, Kylie.
This report presents a novel method of teaching curvature measures via simple 2D examples, using the elementary formulae for curvature encountered by undergraduate students. Using this procedure, students can verify algebraically and numerically the invariance of intrinsic curvature and corroborate the best parameterisation by examination of parameter effects curvature, again mathematically and empirically. For serious users of curvature measures, this elementary exposition also reconciles the general definition of 'statistical curvature' coined by Efron (1975) with the approach of Amari (1990), albeit for the exponential connection only.
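For readers wanting the starting point of such a teaching example, the elementary plane-curve curvature formula that undergraduate students typically meet can be written as follows; the notation here is the standard textbook form, not necessarily the report's.

```latex
% Standard curvature of a plane curve, in parametric and explicit form.
\kappa(t) = \frac{\lvert x'(t)\,y''(t) - y'(t)\,x''(t) \rvert}
                 {\bigl( x'(t)^{2} + y'(t)^{2} \bigr)^{3/2}},
\qquad
\kappa(x) = \frac{\lvert y''(x) \rvert}{\bigl( 1 + y'(x)^{2} \bigr)^{3/2}} .
```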
Journal Article: Students' Construction of Meanings about the Co-ordination of the Two Epistemological Perspectives on Distribution.
The importance of the connection between theoretical frequencies and observed relative frequencies in pedagogy is advocated by various probability researchers. This study examines an associated area of importance to mathematics education. Little is known about the process by which the co-ordination of the data-centric and modelling perspectives on distribution might be achieved. The focus of this paper is on variation in students' (aged 14 to 15 years) evolving meanings about the co-ordination of two distinct epistemological perspectives on distribution. Extracts from two case studies illustrate students' construction of two interpretations for the two perspectives on distribution through their attempts to transform, directly or indirectly, the specific modelling distribution and to observe how that changes a graph/histogram of the actual outcomes. This is done by using on-screen control mechanisms to change the way that the computer generates the data within a carefully designed computer simulation.
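The two perspectives the paper co-ordinates can be illustrated in a few lines of simulation: a modelling distribution generates outcomes, and the data-centric view summarises the outcomes actually observed. The sketch below uses an assumed binomial model purely for illustration; it is not the study's microworld.

```python
# Generic sketch of the two perspectives on distribution: a modelling
# distribution (theoretical probabilities) versus the data-centric view
# (relative frequencies of outcomes actually generated). The Binomial(10, 0.3)
# model is an assumed example, not the study's actual simulation.
from math import comb
import numpy as np

n_trials, p = 10, 0.3
rng = np.random.default_rng(3)
outcomes = rng.binomial(n_trials, p, size=500)   # generate actual outcomes from the model

counts = np.bincount(outcomes, minlength=n_trials + 1)
for k in range(n_trials + 1):
    observed = counts[k] / outcomes.size                              # data-centric: relative frequency
    modelled = comb(n_trials, k) * p**k * (1 - p)**(n_trials - k)     # modelling: theoretical probability
    print(f"k={k:2d}  observed={observed:.3f}  model={modelled:.3f}")
```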
Journal Article: Students' Emerging Reasoning About Data Tables of Large-Scale Data.
This study investigated thirty-two Year 9 secondary school students' (15-year-olds') reasoning about data tables of large-scale data. Eight groups of four students, drawn from six classes, participated in a workshop that examined the components of population change for EU and candidate countries, namely natural increase of population, net overseas migration for Europe and their country, and total population growth. Students investigated trends in real data displayed in tables, and responded to a set of reflective questions. Analysis of the reasoning used by the students revealed four levels of data-table comprehension - reading the data, reading within the data, reading beyond the data, and reading behind the data - similar to the levels described for students working with smaller data sets.
Conference Publication: Towards the design of tools for the organization of the stochastic (European Society for Research in Mathematics Education (ERME), 2006). Pratt, Dave.
This paper reports on one aspect of the ongoing doctoral research of the first named author. This study builds on prior work, which identified that students of age 11 years had sound intuitions for short-term randomness but had few tools for articulating patterns in longer-term randomness. This previous work did, however, identify the construction of new causal meanings for distribution when students interacted with a computer-based microworld. Through a design research methodology, we are building new microworlds that aspire to capture how students might use knowledge about the deterministic to explain probability distribution as an emergent phenomenon. In this paper, we report on some insights gained from early iterations and show how we have embodied these ideas into a new microworld, not yet tested with students.