TLT

Teacher Learning of Technology-Enhanced Formative Assessment
Group Page: 
PERG
Contact(s): 
Beatty, Ian D.
Leonard, William J.
Funding: 
US National Science Foundation grant ESI-0456124 (TPC)
Starting date: 
2005-07-01

TLT was a five-year research project studying how secondary science and mathematics teachers learn to use an electronic "classroom response system" to implement a specific pedagogical approach called Technology-Enhanced Formative Assessment (TEFA).

Background: A classroom response system (CRS) is technology that helps an instructor poll students' responses to a question, displaying a graphical chart of the class's aggregated answers. These systems, although simple in concept, can have a beneficial and even transformative effect on instruction.* TEFA is a pedagogy that Gerace, Leonard, Beatty, and colleagues developed over 15 years to support effective science teaching with a CRS. Teachers at the university, high school, and middle school levels have succeeded with TEFA, but mastering it is often challenging and time-consuming, and takes extensive support.

Objectives: The TLT project was designed with three goals: (1) to better understand teacher learning of CRS technology and TEFA, and consequent changes to their practice; (2) to better understand effective and efficient methods of teacher professional development in TEFA; and (3) to develop tools and techniques for the evaluation of teachers' TEFA mastery, of suitable design and quality for use in a controlled, randomized study of the effects of TEFA on student learning.

Professional Development: Over the course of the project, staff conducted intensive, sustained, on-site professional development (PD) programs for 40 middle- and high-school science and math teachers at six schools in three western Massachusetts school districts. The PD focused on use of CRS technology (provided to the teachers by the project), practice of the TEFA pedagogy, development of supporting curriculum elements, and attendant teaching issues. It began with a four-day summer workshop, continued with a year of weekly or biweekly after-school meetings, and ended with one or two years of monthly after-school meetings. The PD was itself conducted according to the TEFA approach.

Research: The project used a longitudinal, delayed-intervention design. Data were collected via several channels, including interviews and regular questionnaires for participating teachers, surveys of their students, videotaping of classes being taught, and video- and audio-recording of professional development meetings. Analysis was mixed-methods, focused on detailed, heavily triangulated case studies and cross-case analysis. Significant new instrumentation was developed, tested, and refined over the course of the project.

Outcomes: Project staff compiled detailed case studies of four participants, with partial profiles of several others, and developed an initial model of teacher learning and pedagogical transformation called the "model for the co-evolution of teacher and pedagogy." A TEFA PD program was developed and iteratively improved, and the TEFA pedagogy was more clearly articulated, defended, elaborated, and disseminated. Experiences, methods, and preliminary results were presented at several professional conferences.

* For references, contact Ian Beatty (idbeatty@uncg.edu).

The TLT project is funded primarily by grant ESI-0456124 (TPC program) from the US National Science Foundation. Any opinions, findings, conclusions, and recommendations expressed here or in other project publications are those of the principal investigators and do not necessarily reflect the views of the NSF.

Additional project support has been provided by InterWrite Learning (now owned by eInstruction), makers of the PRS-RF classroom response system.

TLT Publications

Stuff we've written about/in/from the TLT project
Contact(s): 
Beatty, Ian D.

publications

(Journal articles, book chapters, and distributed conference papers.)

Beatty, I. D., & Feldman, A. (2009). Illuminating teacher change and professional development with CHAT. Proceedings of the Annual Meeting of the National Association for Research in Science Teaching (NARST), Garden Grove, CA, April 20.

Beatty, I. D., & Gerace, W. J. (2009). Technology-enhanced formative assessment: A research-based pedagogy for teaching science with classroom response technology. Journal of Science Education and Technology, 18(2), 146.

Beatty, I. D., Feldman, A., Lee, H., St. Cyr, K., & Harris, R. (2008). Teacher learning of technology-enhanced formative assessment. Conference paper accompanying a special symposium at the Annual International Conference of the National Association for Research in Science Teaching (NARST), Baltimore, MD, April 1.

Feldman, A., & Capobianco, B. M. (2008). Teacher learning of technology enhanced formative assessment. Journal of Science Education and Technology, 17(1), 82-99.

Beatty, I. D., Leonard, W. J., Gerace, W. J., & Dufresne, R. J. (2006). Question based agile teaching: Teaching science (well) with an audience response system. In Banks, D. A. (Ed.), Audience Response Systems in Higher Education: Applications and Cases. Hershey, PA: Idea Group.

Beatty, I. D., Leonard, W. J., Gerace, W. J., & Dufresne, R. J. (2006). Designing effective questions for classroom response system teaching. American Journal of Physics, 74(1), 31-39.

reports

Beatty, I. D. (2008). TPPI: The TLT Pedagogical Perspectives Interview (technical report beatty-2008tpp). Amherst, MA: University of Massachusetts Scientific Reasoning Research Institute.

TLT Talks

Project-related presentations, workshops, and posters

presentations

Beatty, Ian D. & Feldman, Allan (2009-04-20), "Figuring out what the heck CHAT has to say about teacher change" (listed as "Co-evolution of practice and pedagogy: A model for science teacher change in the context of professional development" in the program), a talk presented to the Annual Meeting of the National Association for Research in Science Teaching (NARST), Garden Grove CA. → conference paper

Lee, H., Feldman, A., & Beatty, I. D. (2009-04-18). "Teachers' implementation of classroom response system to perform formative assessment in secondary science/math classes," a presentation at the Annual Meeting of the National Association for Research in Science Teaching, Garden Grove CA. → conference paper

Feldman, A. (2009-01-16). "Technology-Enhanced Formative Assessment plus Thinking Journey: An Innovative Approach to Science Teaching and Learning," an invited presentation at the University of South Florida, Tampa, FL.

Lee, H. & Feldman, A. (2009-01-10). "Barriers and affordances to science and math teachers' implementation of Technology Enhanced Formative Assessment," a talk presented at the Annual Meeting of the Association for Science Teacher Education, Hartford, CT.

St. Cyr, K., Beatty, I. D., Feldman, A., Gerace, W. J. & Leonard, W. J. (2009-01-08). "Teacher change facilitated by sustained school situated professional development: Exemplar learning of Technology Enhanced Formative Assessment (TEFA)," a paper presented at the Association for Science Teacher Education (ASTE) International Conference, Hartford, CT.

Feldman, A. (2008-11-12). "Technology-Enhanced Formative Assessment: An Innovative Approach to Science Teaching and Learning," an invited presentation at the Hebrew University of Jerusalem, Israel.

Beatty, I. D. (2008-10-24). "How do Physics students access their knowledge?", an invited seminar presented to the faculty and graduate students of the Cognitive Psychology Area of the Department of Psychology, University of North Carolina at Greensboro.

Harris, R., Lee, H., St. Cyr, K., Beatty, I., Feldman, A., Gerace, W., & Leonard, W. (2008-06-13). "Technology-enhanced formative assessment: A study of teacher change," a presentation at the University of Massachusetts Amherst School of Education Centennial, Amherst, MA.

Leonard, W. (2008-05-29). "Instructional innovations: Four ways to change the classroom dynamic," an invited presentation for the Summer Institute on Student Engagement, Berkshire Community College, Pittsfield, MA.

Beatty, I. D. (2008-05-13). "Modeling teacher change", an invited talk presented to the Bureau of Educational Research, Department of Educational Psychology, and Department of Physics at the University of Illinois, Urbana-Champaign, IL.

Beatty, I. D. (2008-04-22). "Modeling teacher change", an invited talk presented to the Department of Physics at the University of North Carolina, Greensboro, NC.

Beatty, I. D., Feldman, A., Lee, H., St. Cyr, K. & Harris, R. (2008-04-01). "Teacher learning of technology-enhanced formative assessment," a special symposium presented at the Annual International Conference of the US National Association for Research in Science Teaching (NARST), Baltimore, MD.

Feldman, A., Beatty, I. D., Leonard, W. J. & Gerace, W. J. (2008-03-24). "Technology-Enhanced Formative Assessment: An innovative approach to the teaching and learning of science," a contributed talk at the Annual Meeting of the American Educational Research Association (AERA), New York, NY.

Feldman, A., Beatty, I. D., Leonard, W. J. & Gerace, W. J. (2008-01-10). "Technology-Enhanced Formative Assessment: An Innovative Approach To Student-Centered Science Teaching," a paper presented at the Association for Science Teacher Education (ASTE) International Conference, St. Louis, MO. http://srri.umass.edu/publications/beatty-2008tia

Beatty, I. D. (2007-12-04). "Teacher Learning of Technology-Enhanced Formative Assessment: A research project involving secondary school science and math, classroom response systems, and teacher professional development," an invited colloquium for the University of Massachusetts STEM Institute, Amherst, MA.

Lee, H. (2007-10-18). "Teachers' concerns and obstacles to implementing classroom response system in secondary science/math classes," a talk presented at the Annual Meeting of the Association of Science Teacher Educators: Northeast Region, University of Massachusetts Amherst, MA.

Gerace, W. J. & Beatty, I. D. (2007-06-02). "Getting started with educational research," a seminar for postgraduate students in the RNA project, Faculty of Education, University of Johannesburg, South Africa.

Beatty, I. D. (2007-02-12) "De-trivializing classroom response systems," an invited seminar for the Physics Education Research Group, Department of Physics, The Ohio State University.

Beatty, I. D., Leonard, W. J., Feldman, A., & Gerace, W. J. (2006-07-25) "Illuminating teacher learning of technology-enhanced formative assessment," contributed talk DH05 at the Summer Meeting of the American Association of Physics Teachers (AAPT), Syracuse NY. AAPT Announcer 36(2) 133. http://ianbeatty.com/blog/?p=23

Gerace, W. J. & Beatty, I. D. (2005-11-25), "Learning to think with physics: A minds-on and hands-on approach to physics instruction," an invited seminar for the Pedagogical Institute, Nicosia, Cyprus.

Beatty, I. D. (2005-09-27) "Formative assessment and agile teaching: Re-framing physics instruction," an invited talk at the 90th Reunión Nacional de Física of the Asociación Física Argentina, La Plata, Argentina.

workshops

Beatty, I. D., Gerace, W. J., Leonard, W. J., & Feldman, A. (2008-11-15) "Technology-enhanced formative assessment (TEFA) with a classroom response system," a workshop at the Inaugural Conference on Classroom Response Systems: Innovations and Best Practices, Delphi Center for Teaching and Learning, University of Louisville, KY.

Leonard, W. (2008-05-29) Collaborative Learning. Invited workshop for the Summer Institute on Student Engagement, Berkshire Community College, Pittsfield, MA.

Beatty, I. D., Gerace, W. J., Leonard, W. J. & Feldman, A. (2008-03-29) "Teaching with classroom response technology ('clickers')," a workshop at the Annual National Conference of the US National Science Teachers Association (NSTA), Boston, MA.

Phillis, R. W., Schneider, S. E., Lavoie, N., Beatty, I. D., & Maloy, R. W. (2008-03-05) "Writing effective PRS questions," a workshop for the campus community by the "PRS Best Practice Fellows" project of the UMass President's Office and the UMass Amherst Center for Teaching, Amherst, MA.

Beatty, I. D. & Gerace, W. J. (2007-11-16). "Teaching Science with Technology-Enhanced Formative Assessment," an invited workshop for Bahamas public school science teachers organized by the Bahamas Ministry of Education, Nassau, Bahamas.

Gerace, W. J. & Beatty, I. D. (2007-11-06). "Question driven instruction with classroom response technology," an invited workshop for Connecticut public school teachers, Greater Hartford Academy of Math and Science, Hartford, CT.

Gerace, W. J. & Beatty, I. D. (2007-10-20). "Question driven instruction with classroom response technology," an invited workshop at the Fall Joint Meeting of the New England Sections of the American Physical Society and the American Association of Physics Teachers (AAPT), University of Connecticut, Storrs, CT.

Beatty, I. D. & Gerace, W. J. (2007-06-04). "QDI+TEFA: A radical research-based pedagogy with radical results," a workshop for the Faculty of Education, University of Johannesburg, South Africa.

Beatty, I. D. & Gerace, W. J. (2007-05-31). "A research project on science teacher professional development," a workshop for the Faculty of Education, University of Johannesburg, South Africa.

Gerace, W. J. & Beatty, I. D. (2007-05-28). "Constructivism: Implications for instruction and learning," part 1 of a workshop for the Faculty of Education, University of Johannesburg, South Africa.

Beatty, I. D. & Gerace, W. J. (2007-05-28). "Formative assessment and dialogical discourse: Magic keys to constructivist, student-centered instruction," part 2 of a workshop for the Faculty of Education, University of Johannesburg, South Africa.

Gerace, W. J. & Beatty, I. D. (2006-05-25) "Agile teaching of physics," a workshop for science teachers at the Makerere University Experimental School, Kampala, Uganda.

Gerace, W. J. & Beatty, I. D. (2006-05-05) "Agile teaching of physics," a workshop for faculty and in-service science teachers at the University of KwaZulu-Natal, Durban, South Africa.

Gerace, W. J. & Beatty, I. D. (2005-11-21 to 24), "A constructivist approach to promoting active learning in secondary physics classes," a series of four invited workshops for physics teachers organized by the Cyprus Ministry of Education in Nicosia, Larnaka, Pafos, and Limassol, Cyprus.

posters

Harris, R., Lee, H., & St. Cyr, K. (2008-06-13). "Technology Enhanced Formative Assessment: A study of teacher change," a poster presented at the Centennial Marathon Workshop at School of Education, University of Massachusetts Amherst, MA.

Beatty, I. D., Leonard, W. J., Gerace, W. J. & Feldman, A. (2006-07-26) "Teacher learning of technology-enhanced formative assessment," contributed poster EJ07-24 at the Summer Meeting of the American Association of Physics Teachers (AAPT), Syracuse NY. http://ianbeatty.com/sites/srri/files/posters/AAPT_2006–06_Poster_EJ07–24.pdf

Pedagogical Model

Technology-Enhanced Formative Assessment (TEFA)
Contact(s): 
Beatty, Ian D.
The most complete description and defense of the TEFA pedagogy has been published in our recent paper, "Technology-enhanced formative assessment: A research-based pedagogy for teaching science with classroom response technology." Below is an extraordinarily brief summary.

TEFA is designed around four pedagogical "pillars":

  1. question-driven instruction (QDI),

  2. dialogical discourse (DD),

  3. formative assessment (FA), and

  4. meta-level communication (MC).

TEFA structures classroom learning with an iterative "question cycle":

  1. present a question;

  2. allow individual thinking or small-group work;

  3. collect responses;

  4. display a histogram of the responses;

  5. elicit and discuss the reasoning behind each response;

  6. continue discussing ideas, related situations, etc.; and

  7. provide wrap-up or closure (summary, mini-lecture, segue to another question, etc.).

A "classroom response system" (CRS) facilitates interaction with students and supports the question cycle:

  1. students enter responses into radio-frequency "clickers";

  2. a radio-frequency receiver communicates with the clickers;

  3. software aggregates the responses and presents a histogram of class-wide response choices; and

  4. additional features support review, diagnosis, etc.
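Functionally, the aggregation step above is simple. The following is an illustrative sketch only (not the actual PRS software), assuming responses arrive as (student, answer) pairs and that a student who clicks twice overwrites their earlier answer:

```python
from collections import Counter

def aggregate_responses(responses):
    """Keep each student's most recent answer, then tally answer choices.

    `responses` is a list of (student_id, answer_choice) pairs in the
    order received; a student may click more than once.
    """
    latest = {}                      # student_id -> last answer submitted
    for student, answer in responses:
        latest[student] = answer
    return Counter(latest.values())  # answer_choice -> count

def render_histogram(counts, width=20):
    """Return a crude text histogram of class-wide response choices."""
    total = sum(counts.values()) or 1
    lines = []
    for choice in sorted(counts):
        bar = "#" * round(width * counts[choice] / total)
        lines.append(f"{choice}: {bar} {counts[choice]}")
    return "\n".join(lines)

# Example: four students, one of whom changes their answer from A to C.
clicks = [("s1", "A"), ("s2", "B"), ("s3", "A"), ("s1", "C"), ("s4", "B")]
print(render_histogram(aggregate_responses(clicks)))
```

In this hypothetical example the class histogram shows one A, two Bs, and one C, because student s1's second click replaces the first.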

Professional Development

Intensive, sustained, collaborative, on-site PD for TEFA
Contact(s): 
Leonard, William J.

Overview

PD is sustained: a 3-year program for each cadre.

All teachers in each cadre are from the same school.

The PD is taught by the project PIs, who use and model the TEFA pedagogy.

Participating Teachers

Typical Weekly PD Lesson & Homework

Checking in: Teachers talk about how they are doing (with technology, with question design, with managing discussion, etc.), what they have been trying, and what is on their minds.

Sample question cycle: Instructors model how a typical PRS question/discussion/closure cycle might look, and introduce new question styles.

Theoretical underpinnings: Instructors provide some structure and pedagogical theory to help teachers understand TEFA pedagogy and apply it consistently.

Working on new questions: Teachers collaborate in small groups to develop questions for their own use during the upcoming week, and to get feedback on them.

Homework: After each meeting, teachers reflect on their practice (make a journal entry; read, summarize, and react to an article; predict what will happen when they ask a certain question in class; etc.).

Research

Studying how to teach TEFA, and how teachers learn it
Contact(s): 
Beatty, Ian D.

Overview

TLT uses a longitudinal, delayed-intervention, mixed-methods research design to illuminate teacher learning.

Instrumentation Developed or Under Development

(in collaboration with SRI International, a project subcontractor)

TCOP: TEFA Classroom Observation Protocol

We are working on a live observation protocol aimed at capturing key indicators of TEFA skill and of primary pedagogical practices. Meanwhile, we visit and videotape complete TEFA-using classes for subsequent coding and analysis. Short pre- and post-observation interviews accompany each visit/observation.

TVBI: TEFA Video-Based Interview

In this semi-structured interview protocol, a teacher is shown selected five-minute clips from different videotapes of their own teaching, and asked to reflect upon how and why their practice has evolved.

TIL: TEFA Implementation Log

This short, simple paper form documents one day's use (or not) of TEFA in each class and the teacher's perceptions of how it went.

TMRS: TEFA Monthly Reflection Survey

This web-based survey uses multiple-choice and free-response questions to probe teachers' TEFA use, their self-perceived skills and learning, and factors that have helped or hindered them.

TPPI: TEFA Pedagogical Perspectives Interview

This two-part semi-structured interview (one hour per part) probes a teacher's outlook and beliefs on pedagogy, general aspects of classroom practice, and classroom roles.

TPPS: TEFA Pedagogical Perspectives Survey

This web-based multiple-choice survey also probes aspects of teachers' views on pedagogy and classroom practice, but in a more scalable way. Large portions were drawn from published instruments, for validity and comparison with other studies.

TCFS: TEFA Contextual Factors Survey

This web-based survey of multiple-choice and free-response questions solicits a teacher's perceptions of aspects of their teaching context (technological, administrative, logistical, and social) that might help or hinder attempts to implement TEFA.

TLPI: TEFA Lesson Planning Interview (abandoned)

This one-hour, artifact-based, semi-structured interview probes a teacher's lesson planning practices, considerations, and priorities, aimed at surfacing instructional priorities and orientation.

TPBS: TEFA Professional Background Survey

This web-based survey captures a teacher's professional preparation and background, to support case studies.

TSS: TEFA Student Survey

This optically-scanned paper survey is for students in teachers' classes, eliciting their perceptions of the learning environment.

Data Collection

Data about teachers' pedagogical beliefs and views: TPPI (yearly), TPPS (yearly), TLPI (2x/year, discontinued)

Data about teachers' practice and TEFA implementation: TCOP (4x/year), TIL (daily), TSS (2x/year)

Data about teachers' reflections on their own learning: TMRS (monthly), TVBI (yearly)

Data about teachers' backgrounds and context: TCFS (yearly), TPBS (once)

Preliminary Findings

Some tentative observations on our early experiences and data
Contact(s): 
Beatty, Ian D.

These findings are all very preliminary and tentative, and are as of August 2008.

When example content is used in PD, teachers tend to focus overly on superficial features.

Teachers seem to have surprising (to us, anyway) difficulty separating the intended pedagogic message from the context in which it is presented. Many become distressed if too many examples are given in a subject they care little about, or if they perceive the level to be too high for their students. Math teachers say there is too much science, and science teachers say there is too much math. Many don't understand that the questions modeled during PD are designed for them, not necessarily for their students.

Technology can be an impediment to FA practice.

CRS technology is intended to enhance practice of FA, but initially, using it presents a barrier. Further, teachers' initial facility with the hardware and software significantly impacts their learning curve for question development and implementation of the TEFA approach.

[I've been most focused on] getting the technology to work. Check TV, program, pass out clickers, check if everybody's unit is working then ask a question. Talking with kids is easy.

Teachers follow a predictable general trajectory of skills development and focus of attention.

Teachers concentrate on technology first, then progress to question design. Next, they work on managing whole-class discussion, and eventually on interpreting student responses. Integrating TEFA with other constraints and aspects of teaching is an area of focus that develops throughout the "trajectory".

The toughest part for me is designing questions... I think the course is definitely addressing the issue...

Individual teachers choose widely different details of TEFA practice to focus on.

Two teachers became very attentive to how long they waited after asking a question. One became interested in the nature of the questions themselves. Another invented new ways of managing classroom discussion. Yet another wrestled with the tension between structure and control on the one hand and open-ended discussion on the other.

There's always that moment of ta-da! And you look at it and you think 'Wow!'... And on the fly, mentally, you say 'well now how do I handle this?'

In student surveys, some teachers have improved and some have not, but overall ratings have stayed about the same.

Based on three rounds of student surveys of cadre 1, teachers vary widely in how they changed during the first year of PD. Early results suggest that the classroom environment created by this cadre of teachers has not yet changed significantly --- but these comparisons are not controlled for student population, subject, course level, etc.

I've been very pleased with [TEFA] as a way of diagnosing preconceptions or misconceptions. And I like it very much for that.

Middle school teachers have lower expectations for students than high school teachers, hindering TEFA adoption.

The teachers in cadre 1 include middle and high school math and science teachers. In general, the middle school teachers had much lower expectations for their students' ability to participate in TEFA and engage in quality discussions.

Reflection leads to improvement.

The teachers who were most successful with the approach during the first year seem to be those who were most self-reflective. When TEFA wasn't working as well as they'd hoped, they focused on their own beliefs and actions, and on what they could do differently. Those who were less successful with TEFA tended to attribute difficulties to students' abilities and attitudes, parents' attitudes, etc.

[Participation in the PD] does make me more reflective... I do reflect more often [about] what I'm doing and how well it connects with the kids, without a doubt it does force me to look at myself more closely.

Project Rationale

A literature-based argument for the need for a project like TLT

ACT Assessment scores for 2004 "reveal that an alarming number of graduating high school seniors continue to be unprepared for college science and math courses" (ACT, 2004). The need for reform of high school science and math instruction is clear, and ideas for reform are plentiful --- some more credible than others. Formative assessment is arguably the pedagogic practice that most dramatically improves student learning, especially for traditionally underachieving groups, so great attention should be paid to high school STEM reforms based upon it.

"Formative assessment" (FA) means using assessment (measurement) of student knowledge and learning in order to provide guiding feedback for students and teachers: assessment for learning, rather than assessment of learning. For assessment to be formative, the feedback must be used to adapt teaching and learning (Black & Wiliam, 1998a; Bransford et al., 1999; Sadler, 1989). According to the research literature, "innovations which include strengthening the practice of formative assessment produce significant, and often substantial, learning gains" across ages, school subjects, and countries --- gains "larger than most of those found for educational interventions" (Black & Wiliam, 1998b). Many studies show that FA is particularly beneficial for students who are traditionally "low achievers," with potential to help narrow the achievement gap between students from different socioeconomic strata (Black & Wiliam, 1998b; Stiggins, 2002). This last observation is particularly relevant in light of the No Child Left Behind act. FA can elicit richer classroom discourse and help students become more engaged and motivated (Gallagher, 2000). It can help students become aware of the limits of their understanding and the actions they can take to progress (Ramaprasad, 1983; Sadler, 1989). It can also catalyze significant teacher learning, as feedback gives teachers new understanding of student learning (Black et al., 2002) and the effectiveness of their own practices (Bransford et al., 1999).

Practicing FA is difficult. It requires new modes of pedagogy, deep changes in classroom practice, adoption of new roles by teacher and students, and a renegotiation of the implied contract between teacher and students (Black et al., 2002; Black & Wiliam, 1998b). Overall, FA is not well understood by teachers, even those attempting to implement it. Common pitfalls that lead to "weak practice" of FA include: encouraging superficial, rote, detail-oriented learning; failing to engage in critical review or discussion of assessment questions used; confusion between summative and formative uses, including over-emphasis of grading; and emphasizing competition among students rather than personal improvement (Black & Wiliam, 1998a).

The literature provides some insight into what makes for effective FA (Black et al., 2002; Black & Wiliam, 1998a). Assignments or questions used should encourage deeper and more reflective learning by requiring inferential and deductive reasoning, and they must be designed to reveal the thinking used by students to reach answers, not just their correctness. In addition to good instruments, teachers need the ability to interpret results and decide how to respond. Students learn more when the amount and kind of feedback they receive is tuned to their particular and immediate need. FA is perhaps most effective as "real-time" FA, in which feedback is almost continuously obtained to monitor learning and instruction is almost continuously adjusted (Stiggins, 2002).

Although the literature is plentiful on the effects of FA and the ways it has been implemented, it offers only scattered anecdotal accounts of teachers' learning of FA-based pedagogy. There is evidence that practicing FA effectively requires subject-specific pedagogical content knowledge as well as generic FA skills that apply across subjects (Black et al., 2002).

Practicing real-time FA with the number of students found in typical classrooms can be difficult or impossible because of the communication and information-processing load it places on the teacher. Classroom response systems (CRSs) help to overcome this problem by providing a supplemental, technology-enhanced channel of communication between teacher and students. A CRS consists of a set of input devices for students, such as laptop computers, PDAs, programmable calculators, or remote-control style "clicker" keypads; a computer for the instructor; some kind of network or mechanism connecting these together; and software to control the system. It enables a teacher to present a question, problem, or task to the class; have students enter their responses into input devices; collect and aggregate the answers; and display the aggregated answers as a histogram or other representation. Using such a tool, the teacher can efficiently pose questions to the class, motivate all students to answer the question, assimilate the range of answers, and motivate a class-wide discussion. CRSs are a key enabling technology for real-time FA (Beatty, 2004; Dufresne et al., 1996; Roschelle et al., 2004).

Using a CRS does not automatically result in FA. The UMass Physics Education Research Group, of which Leonard, Gerace, and Beatty are members, has developed a pattern of implementing real-time FA with a CRS that we call technology-enhanced formative assessment (TEFA). In this pattern, a typical class is structured around an iterative "question cycle" during which the teacher presents a question to students; provides them a few minutes to discuss it amongst themselves in small groups and reach consensus; collects answers through the CRS; displays the histogram to the class; moderates a class-wide, student-centered discussion around arguments for and against various possible answers; and selects some kind of "follow-up" micro-lecture, question cycle, or other activity in response to the needs revealed by the students' answers and discussion (Beatty, 2004; Dufresne et al., 1996; Wenk et al., 1997). We have found this approach to be both pedagogically effective and popular with students. Others have developed strategies for CRS use similar to ours and also reported learning gains (Roschelle et al., 2004), although from our perspective most underemphasize the teacher's real-time formative use of the feedback.

While the evidence is strong that CRSs can enable powerful practice of real-time FA and benefit learning, little research has been conducted on the details of how CRS use affects learning, and even less on how teachers develop mastery of it and how teacher educators can assist them. Most extant wisdom is based on the personal and anecdotal experiences of individuals who have found their own way to mastery or who have assisted others through informal mentoring or isolated workshops (Dufresne et al., 2000; Mazur, 1997; Roschelle et al., 2004).

The only published research we are aware of on teacher learning of TEFA is from Assessing-to-Learn (A2L), a previous NSF-funded project (ESI-9730438) by us on the use of TEFA in high school physics instruction (Feldman & Capobianco, 2003). A2L began with the intention of creating prototype CRS questions for high school physics teachers to use with TEFA. However, we discovered that learning to use TEFA in its fullness is a remarkably difficult and time-consuming process for most teachers. In addition to mastering a complex set of skills, it requires internalizing perspectives and habits of mind that change a teacher's very "way of being a teacher" (Blum, 1999; Davis et al., 2003). Even teachers who identified with the stated goals of FA often practiced it in a way inconsistent with these goals, resulting in "weak practice" as discussed above. Although teachers and students appeared to enjoy doing FA activities and claimed that the activities were valuable learning experiences, most teachers did not use the feedback they received to inform real-time teaching decisions; therefore, the activities were not "formative," at least for the teachers.

As a result, we broadened the scope of the A2L project to study the factors that influence teachers' adoption of FA practices and the kinds of support teachers need to fully implement TEFA. Our findings inform Section 3 of this proposal. In essence, we have a relatively detailed model of the capabilities and perspectives that contribute to TEFA mastery, but a much cruder understanding of how teachers come to acquire these and reach mastery. To develop effective methods of teaching TEFA to pre-service and in-service STEM teachers, and to know when and how to evaluate student learning impacts, a more substantial research base on teacher learning of TEFA is required. No such research exists apart from the Assessing-to-Learn study. This project aims to redress that gap, thus strengthening the practice of STEM instruction by improving teacher education and professional development for TEFA, and preparing the ground for a large study of the scalability and effects of TEFA-based pedagogy.

Because TEFA is a well-defined and focused practice that nevertheless requires a broad range of skills and consonant perspectives to master, we believe it makes a good point of entry for research into the encompassing topics of formative assessment and of teacher learning in general. A research project on teacher learning of TEFA would thus contribute to the study of both effective formative assessment and effective teacher education and development.

By contributing significantly to the research base on teacher learning of formative assessment and essentially founding a research base on teacher learning of TEFA, the project will enable better teacher education and professional development. By creating the foundations for a large IERI-style scaling study of TEFA, it will make possible a concerted, effectual initiative to promulgate the practice of TEFA. The goal of such a study would be to produce data meeting the evidence standards of the What Works Clearinghouse. Since TEFA enables and enhances formative assessment in STEM classrooms, and since formative assessment significantly and often dramatically improves student learning, especially among students who are often "left behind" by traditional STEM instruction, this should be a major advance for STEM education and a step toward closing the achievement gap between socioeconomic groups.

According to the TPC Program Solicitation section on Research Projects, "Of special interest is research around pilot-scale models of STEM teacher professional continuum strategies that show promise for scalability," a description that fits our project well. Classroom response systems are now affordable commercial products appearing in quantity in K-12 schools and universities; one manufacturer claims to have "over 600,000 response pads now being used in all 50 states in thousands of k-12 schools as well as over 450 universities and 10 foreign countries" (Trehub, 1991). The technology and interest for explosive scaling clearly exist, even without the support of research-based pedagogy and teacher professional development.

References