A post today by Brian at ‘Learning is Messy’ has engaged some of my over-arching thoughts on technology in education. I am a strong proponent of technology in the classroom, but I carry a mighty asterisk when I say that, because what I actually support is meaningful technology in the classroom. Brian uses a wonderful example from the book ‘Understanding by Design’ by Grant Wiggins and Jay McTighe, which should be required reading for all teachers. The example describes a highly involved cross-curricular activity that takes place during the fall. It uses the idea of apples and harvest as the thread to connect math, language, science and art. While all the activities are ‘fun’ and ‘engaging’, they don’t necessarily take into account the expected learning outcomes of the curriculum. Brian sums it up well when he says:
I’ll bet too, that a teacher doing this unit would overwhelmingly get very positive feedback from the students’ parents, especially any that volunteered to help with it. So would the teacher most likely do the same “unit” again next year? Even if they moved grade levels because they were told what a great job they did and how much the children SEEMED to learn?
I think this is a major problem with most attempts to integrate technology into education. Educators find out about something ‘cool’ and integrate it into their program, then have to scramble to find a way to link it with meaningful assessment. It’s a completely backwards way of thinking, and counterproductive to creating a meaningful learning environment. I subscribe fully to the ideas presented in ‘Understanding by Design’ where, in a nutshell, they suggest all program development start at the end (what am I trying to assess, and how will I confirm those goals have been reached?) and be planned back toward the beginning (now that I know what proof I require for assessment, what learning activities will provide students with the knowledge and the opportunities to apply that knowledge in meaningful ways?).
Technology in education shouldn’t be the end goal; it should be the means to help students gain useful skills and enhance learning and understanding. In my classroom I focus on technology that allows students to share their experimental data (access to larger data sets can provide stronger evidence for their hypotheses, while also requiring them to consider other forms of experimental error), tools that let students collect and analyze data in ways not easily replicated with pen and paper, and ways to share with a wider audience, since this ‘authentic’ audience tends to make students take greater care in creating their final products.
The rub in all this is: at what point are students going to be exposed to these tools so that they can use them effectively? Curricula are already bursting at the seams with ‘required content’, making it a great challenge to set aside time to engage students with useful technology. What is needed is a school-wide commitment to technology integration, with identified subjects where certain skills will be taught and assessed. With this foundation, students can build upon their skill set and learn to figure out new technologies on their own, which I feel is one of the fundamental goals.
I concede that this is easier said than done, but that is the nature of learning. Learning is hard, and learning takes time, effort and commitment. Too often we seek the easy ‘quick-fix’ solutions.