Key competencies: developing an instrument for assessing trainee teachers’ understanding and views

From Section:
Assessment & Evaluation
Countries:
Spain
Published:
Aug. 02, 2021

Source: Teacher Development, 25:4, 478-493

(Reviewed by the Portal Team)

This study aimed to develop and pilot an instrument for primary preservice teachers to measure:
(1) their opinion of the current primary education competency-based educational policy mandate;
(2) how confidently they apply this competency-based mandate in the classroom during their practicum teaching experiences;
(3) how confidently they apply this mandate in the classroom during their practicum evaluation experiences; and (4) how completely they understand the intended rationale for this mandate.

Method

Participants and context
The sample for the third pilot run of the questionnaire reported here consisted of 263 students from a third-year course of the University of Barcelona’s Bachelor’s Degree in Primary Education.
Students were surveyed in their usual classrooms by a member of the research team (not their own professor).
Answering the survey was optional, but participants received 1% extra credit toward their final grade.

Process
The process of generating the items reported here was carried out in the following seven stages:
(1) identification of goal categories and initial item generation;
(2) first draft of revisions by a six-member panel consisting of two academic experts, two active teachers, and two student teachers;
(3) pilot study 1 [n = 295];
(4) revision and modifications of pilot study 1 in order to improve psychometric scale properties;
(5) pilot study 2 [n = 277];
(6) additional revisions and completion of pilot study 2 with the aim of further improving the instrument’s psychometric properties, specifically internal consistency; and
(7) pilot study 3, reported here [n = 263].
In this final pilot stage, the questionnaire’s psychometric properties were assessed and found to be much improved compared with those of the two initial pilot runs.

Results and discussion
The authors’ aim was to develop, test, and validate an instrument for future use as a research tool in longitudinal and multilevel studies across a wide range of educational contexts, and, on a smaller scale, as a way for professors and departments at universities in EU countries to assess their own teacher training.
Initially, it was expected that the proposed variables, based on the theoretical principles, would be similarly reflected in the factor analysis.
For the most part, this was true.
Indeed, the analysis revealed that three hypothesized variables, Opinion of key competency-based educational reform (Opinion of reform, F1), Specific self-evaluation of ability to assess students within a key competency framework (Specific self-evaluation: Ability to assess, F2), and General self-evaluation of the key competency-based teaching understanding (General self-evaluation, F5), clearly corresponded to three of the proposed variables.
However, the expected loading for the third proposed variable, Cognitive understanding of education reform, was not consistent with the factor analysis; the two factors that emerged instead warrant further study.
As previously noted, it is very difficult to change existing knowledge structures, as understanding is based on what we already know.
This could contribute to difficulties in accepting new information (see, for example, Beck, Czerniak, and Lumpe 2000; Smith et al. 1994; Spillane 2000).
Therefore, given the data presented here, the authors believe it is warranted to treat Application of a pre-conceptualization of previous educational policies (Application of pre-conceptualization) and Comprehension of the possibility of an interdisciplinary-focused approach (Comprehension of interdisciplinarity) as separate variables.
Further work clarifying the differences between these two concepts could improve the internal consistency of these two scales, which is a limitation of the current instrument.
One additional limitation of the study is that instrument development has taken place in a specific university context.
Future studies should assess instrument validity and reliability in other university and national contexts.
Longitudinal application could help evaluate how current and future teachers’ understanding of competencies, their beliefs in their ability to apply competencies in future classroom experiences, and their beliefs in the usefulness of the concept change over time.
Longitudinal studies could identify potential changes over time in the five variables proposed here and during the various stages of formal and informal training.
As indicated by Feiman-Nemser (2001), trainee teachers’ initial perspectives should be examined critically before alternatives are presented, as teachers’ beliefs and theoretical knowledge shape their understanding of concepts (see also Buehl and Beck 2015; Ertmer, Ottenbreit-Leftwich, and Tondeur 2015; Korthagen and Vasalos 2005).
Future studies could also compare the opinions of different faculties of education with those of students at different universities to explore potential correlations and include students’ assessments of their university lecturers’ ability to transmit the concept.
To achieve their full potential, the tools should be combined with other, more qualitative and practical evaluations that are also applied longitudinally.

References
Beck, J., C. M. Czerniak, and A. T. Lumpe. 2000. “An Exploratory Study of Teachers’ Beliefs Regarding the Implementation of Constructivism in Their Classrooms.” Journal of Science Teacher Education 11 (4): 323–343.
Buehl, M. M., and J. S. Beck. 2015. “The Relationship between Teachers’ Beliefs and Teachers’ Practices.” In International Handbook of Research on Teachers’ Beliefs, edited by H. Fives and M. G. Gill, 66–84. New York: Routledge.
Ertmer, P. A., A. T. Ottenbreit-Leftwich, and J. Tondeur. 2015. “Teachers’ Beliefs and Uses of Technology to Support 21st-Century Teaching and Learning.” In International Handbook of Research on Teachers’ Beliefs, edited by H. Fives and M. G. Gill, 403–418. New York: Routledge.
Feiman-Nemser, S. 2001. “From Preparation to Practice: Designing a Continuum to Strengthen and Sustain Teaching.” Teachers College Record 103 (6): 1013–1055.
Korthagen, F., and A. Vasalos. 2005. “Levels in Reflection: Core Reflection as a Means to Enhance Professional Growth.” Teachers and Teaching 11 (1): 47–71.
Smith, J. P., III, A. A. diSessa, and J. Roschelle. 1994. “Misconceptions Reconceived: A Constructivist Analysis of Knowledge in Transition.” The Journal of the Learning Sciences 3 (2): 115–163.
Spillane, J. P. 2000. “Cognition and Policy Implementation: District Policymakers and the Reform of Mathematics Education.” Cognition and Instruction 18 (2): 141–179. 


Updated: Feb. 27, 2022
Keywords:
Policy analysis | Europe | Self evaluation | Preservice teachers | Competencies