TPACK Leadership Diagnostic Tool: Adoption and Implementation by Teacher Education Leaders

Published: 2019

Source: Journal of Digital Learning in Teacher Education, 35:1, 54-72

(Reviewed by the Portal Team)

The authors write that current and former members of the AACTE (American Association of Colleges for Teacher Education) I&T (Innovation and Technology) Committee conducted two rounds of interviews with leaders from three teacher education institutions that utilized the TPACK (technological, pedagogical, and content knowledge) leadership diagnostic tool.
The TPACK leadership diagnostic tool was developed as a resource for education leaders to assess existing supports for technology adoption and integration within teacher education programs.
The authors state that the purpose of this study was to explore how representative the TPACK leadership diagnostic tool was of education leaders’ concerns and processes when seeking to create and sustain an environment that supports TPACK-based initiatives.

The research questions that guided the researchers in this study were:
1. How was the TPACK leadership diagnostic tool used by education leaders during the implementation of TPACK-based initiatives?
2. In what ways did the TPACK leadership diagnostic tool serve as an opportunity to examine current practices and set realistic goals?
3. What are education leaders’ recommendations for the TPACK leadership diagnostic tool?

Methods
Participants and data collection - Two rounds of semistructured interviews, approximately one year apart, were conducted by the researchers.
During round 1, eight teacher education program leaders described their existing or planned TPACK-based initiative and provided information on key personnel involved.
Participants were also asked to describe their use of the TPACK leadership diagnostic tool, perceptions of the diagnostic tool’s use, and future plans for the tool at their institution.
During round 2, three leaders who had begun their initiatives took part in a follow-up interview to discuss the progress of their TPACK-based initiative.
Interviews included questions and prompts about the TPACK leadership diagnostic tool and how it was used. Interviews for both round 1 and round 2 were digitally recorded and transcribed.
This article focuses on the second round of interviews because the participants discussed progress on their TPACK-based initiatives through the lens of the diagnostic tool.
Data analysis - Content analysis using a priori codes was selected as the method of analysis for determining how participating institutions were using the TPACK leadership diagnostic tool to guide leadership decisions in TPACK initiatives.
The TPACK leadership diagnostic tool (Graziano et al., 2017) itself served as the basis for codes used to analyze the second round of interviews (N = 3).
Constant comparative methods (Glaser & Strauss, 1967) were used to triangulate interview data with additional information from each institution and their TPACK initiatives.
 

Results and discussion
The TPACK leadership diagnostic tool “was developed as a self-assessment tool to serve the individual institution in its decision-making process” (Graziano et al., 2017, p. 378).
The authors report that results indicated that all components in the diagnostic tool were relevant for education leaders as they planned for and implemented their initiatives.
Education leaders used the diagnostic tool to engage with others about their initiatives, to consider how physical spaces and personnel could be repurposed in support of their initiatives, and to think critically about prioritizing competing political, financial, and contextual demands.
They also report that results illustrate that leadership decisions were instrumental in the planning and implementation of TPACK initiatives.
Five important areas for decision making are discussed.
Vision for change - Participants each described a rationale for redesigning their educator preparation programs to embed technology.
However, while they had specific reasons for pursuing their TPACK initiatives, they did not clearly articulate overall visions of how their preparation programs would develop TPACK competent candidates or how their initiatives were aligned to the vision of their respective universities.
Creating opportunities for faculty - Leaders in this study developed, or planned for, opportunities for faculty to engage with each other and school partners around technology adoption, integration, and modeling of TPACK, but they also recognized that scaling up these initiatives would take time.
Engagement with internal/external partners - Participants indicated that TPACK initiatives, including new educational spaces, created opportunities for outreach to external partners such as a technology mentoring program and professional development for area teachers.
Questions remain, however, about how the diagnostic tool was used to improve the internal/external partners’ understanding of their responsibilities and incentives to support the initiative.
Funding - Two of the three institutions funded their TPACK initiatives through external funding sources.
While external funding is often used as a catalyst for initial change, technology initiatives require dedicated budget allocations to sustain progress (ISTE, 2017b; U.S. Department of Education, Office of Educational Technology, 2017).
Only one institution depended on existing personnel, budgets, and resources to enact their TPACK initiative.
Restructure of physical spaces - Each institution mentioned the restructuring of physical spaces as part of their initiatives.
The interrelationship between the restructuring of physical spaces, faculty development for transformational instructional practices with technology, and improved preparation of teacher education candidates appears to be fertile ground for further research.

Conclusion
The authors conclude that based on this case study, participants did not continuously refer to the tool as a “road map” throughout the implementation of their initiatives.
Results indicate that effective use of the tool requires support, scaffolding, or even training.
The authors note that without guidance for leaders to understand and participate in the change process, leaders may be left chasing grant funding for technology or undertaking a vision based on the determination of a single individual.
Neither is optimal if the goal is transformational and sustainable change for effective technology use by teacher education faculty and candidates to enhance P12 learning.
They also note that leaders need to thoughtfully reflect on how competing priorities and resources, faculty time and attention, involvement of school partners, and the ever-critical policy environment can impact the development and implementation of their TPACK-based initiatives.
Making time to consult elements such as those outlined in the TPACK leadership diagnostic tool while leading the change process of TPACK-focused initiatives can help ensure that the initiatives are successful.

References
Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. Chicago: Aldine.
Graziano, K. J., Herring, M. C., Carpenter, J. P., Smaldino, S., & Finsness, E. S. (2017). A TPACK diagnostic tool for teacher education leaders. TechTrends, 61(4), 372–379. doi:10.1007/s11528-017-0171-7
International Society for Technology in Education (ISTE). (2017b). ISTE essential conditions. Retrieved from https://www.iste.org/standards/essential-conditions
U.S. Department of Education, Office of Educational Technology. (2017). Reimagining the role of technology in education: 2017 national education technology plan update. Retrieved from https://tech.ed.gov/files/2017/01/NETP17.pdf

Updated: Jan. 05, 2020