Date of Award
Spring 3-2009
Document Type
Thesis
Degree Name
Master of Science in Educational Technology (MESET)
First Advisor
Mark Hawkes
Second Advisor
Vicki Sterling
Third Advisor
Haomin Wang
Abstract
Although technology has impacted education in many ways over the past half century, its impact on student assessment is perhaps the least understood. The role of technology in supporting authentic task assessment is an under-studied phenomenon. Given that assessment is a critical skill set for teachers, and that few pre-service teachers at the college level have sufficient experience with assessment, this research was undertaken to understand how technology-supported micro-lessons and different forms of assessment might affect professional learning. The research questions are as follows:
1. Do technology-supported self, peer, and instructor assessments enhance students' ability to perform assessment tasks?
2. Do students' perceptions of the value of technology-supported self, peer, and instructor assessments change from before to after participation in technology-supported peer- and self-assessment episodes?
3. Is peer or self-assessment consistent with the instructor's assessment?
4. Are there any statistically significant differences among self, peer, and instructor assessment scores? If so, what are they?
5. Does gender affect self-assessment and perceptions of the different forms of assessment?
6. Do the micro-lesson tasks and the assessment practices enhance the pre-service teachers' professional learning?
From the spring of 2006 to the spring of 2008, 161 junior- and senior-level undergraduate education students at a Midwestern university developed micro-lessons on educational psychology topics and presented them to their peers. The lessons were evaluated by peers, by the students themselves, and by an expert teacher. The project studied peer, self-, and instructor performance assessments with the support of technology. It also investigated changes in the pre-service teachers' perceptions of assessment and feedback after completing the project, and surveyed their reflections on professional learning.
About 82.8% of the participants thought that the project helped improve their assessment skills, and over two-thirds thought they had developed more effective skills for assessing their peers' and their own lesson presentations. Paired t-tests between the pre- and post-surveys showed statistically significant increases in the mean scores for "the value in grading my own micro-lesson via DVD recording," "self-assessment is worth the time and effort," and "my interest in micro-lesson assessment enhanced by my tablet technology." There was a moderate correlation between self- and instructor assessments and a high correlation between peer-group and instructor assessments. Scores given by the peer group and the instructor were not significantly different, but self-assessment scores were significantly lower than both peer-group and instructor assessments. There was no statistically significant difference between male and female students in either their self-assessments or their perceptions of the different forms of assessment. The participants believed that the project had enhanced their professional learning and hoped to apply what they had learned to other classes and future projects. Ninety-four percent of the participants thought the instructor's feedback was thoughtful and constructive, while 40% were not satisfied with their peers' feedback.
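As a purely illustrative sketch (not drawn from the thesis itself), the kinds of analyses named above, paired t-tests on pre/post survey items and correlations among self, peer, and instructor scores, could be run with standard statistical tooling. The file names and column names below are hypothetical placeholders.

```python
# Illustrative sketch only: file and column names are hypothetical, not from the thesis.
# Mirrors the analyses named in the abstract: paired t-tests on pre/post survey items
# and correlations between assessment sources.
import pandas as pd
from scipy import stats

# Hypothetical pre/post survey responses, one row per participant.
survey = pd.read_csv("survey_pre_post.csv")

# Paired t-test on a single Likert item, pre vs. post participation.
t_stat, p_value = stats.ttest_rel(survey["self_value_pre"], survey["self_value_post"])
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")

# Hypothetical micro-lesson rubric scores from the three sources.
scores = pd.read_csv("microlesson_scores.csv")

# Pearson correlations between assessment sources.
r_self, _ = stats.pearsonr(scores["self_score"], scores["instructor_score"])
r_peer, _ = stats.pearsonr(scores["peer_score"], scores["instructor_score"])
print(f"self vs. instructor r = {r_self:.2f}; peer vs. instructor r = {r_peer:.2f}")

# Paired comparison of self vs. instructor scores.
t_si, p_si = stats.ttest_rel(scores["self_score"], scores["instructor_score"])
print(f"self vs. instructor: t = {t_si:.2f}, p = {p_si:.3f}")
```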
Recommended Citation
Fu, Hongxia, "Technology in Support of Performance Assessment" (2009). Masters Theses & Doctoral Dissertations. 402.
https://scholar.dsu.edu/theses/402