Linking Student Achievement to Teacher Preparation: Emergent Challenges in Implementing Value Added Assessment

Mar. 01, 2019

Source: Journal of Teacher Education, Volume 70, Issue 2, pages 128-138

(Reviewed by the Portal Team)

In this paper, the authors describe challenges that have emerged in deploying Louisiana's value-added assessment of teacher preparation programs (VAA–TPP), with the aim of illuminating critical challenges and the initial lessons they have learned.
The discussion is aimed at supporting educational policy makers, teacher preparation leaders, and the researchers who may serve as their advisors.
The authors' goal is to shed light on challenges that any state or university system can expect to encounter if it chooses to implement value-added assessment of teacher preparation.
Their discussion is organized around three key themes that emerged in their work: calculation, communication, and change.
In each of the subsequent sections, they identify key challenges that emerged during the development and use of Louisiana’s VAA–TPP, what actions were taken to resolve those challenges, and some discussion of the putative utility of those decisions.

Beyond the statistical treatment of the data, the authors describe two critical data challenges which emerged during the development of the VAA model in Louisiana.
The first challenge surrounded data quality and availability. Once the State reached the point that sufficient data were available to conduct the VAA, the analyst team engaged in active testing of the quality of the data and the reliability of the links.
The authors were able to identify some problems and work with the State toward solutions. Equally importantly, data quality issues emerged in areas which they did not anticipate, and they worked with partners to find acceptable solutions.
The lesson learned from their experience in this case is that data quality problems can readily undermine the credibility and integrity of VAA of TPP.
The analyst team needs to take a leadership role in continuously vetting the quality of the data and reliability of longitudinal links and working toward solutions when problems are encountered.
The second challenge concerned the adequacy of the VAA model to incorporate variables about which teacher preparation leaders and policy makers were concerned, and to overcome the sorting that results in discrepant class compositions.
The authors conducted follow-up studies examining these issues and shared the results with stakeholders. As a result, data-informed decisions were adopted that addressed the relevant concerns.
The experience taught two lessons. First, seriously examining stakeholder concerns through research guided by their questions can help obtain buy-in for the process.
Second, for some issues (e.g., attendance), additional analyses yield muddled results that identify tradeoffs rather than preferred solutions.
In cases where a single optimal choice was not identified, the authors made decisions that increased support for the process by including variables that concerned teacher educators.

The authors’ initial challenge in communicating the VAA results was deciding when to communicate them.
Although the State leadership team struggled with this issue for a time, a valuable lesson learned in Louisiana was that time could be an ally.
Results could be reported over time, as enough data accumulated to support confidence in them. This approach, however, also created tensions when programs with small numbers of graduates did not have results.
Their second major communication challenge was setting standards for judging the meaning of the results.
This is a domain in which they learned what not to do.
For purposes of accountability and communication to the public, complex systems that incorporate multiple features of the data may be statistically appealing; but they are likely to be poorly accepted and yield results that consumers perceive as paradoxical.
It appears that the end system should be as straightforward and easily understood as is practical. A final lesson the authors learned regarding VAA–TPP is that colleges and universities are critical stakeholders with intensive communication needs.
It is critical to budget substantial resources for providing them with information and answering their questions.

The initial reporting of VAA results made clear that VAA–TPP data were sufficient to motivate change, but that they lacked the detail to guide what needed to be changed or how to accomplish that change.
The authors’ experience with the programs that wanted to make changes to improve VAA results hammered home the importance of communicating clearly about what VAA cannot do and being willing to support teacher education leaders as they begin exploring potential solutions for poor VAA results.
Another lesson that emerged from the Louisiana experience is that for TPPs there is a long lag between programmatic changes and when those can possibly show up in the program’s assessment data.
The authors felt that, had they appreciated this more clearly at the time, they might have been able to help leaders at the programs that were making changes to develop a communication plan that explained the delay to their stakeholders in advance.
A final instructive issue that emerged in the change process is that once TPP leaders and faculty have data about their graduates and the students they teach, they will ask for more and more detail.
The number and variety of the requests can quickly become unmanageable unless the analysts develop a plan for responding to those requests and apportion resources to that task a priori.
It is also worth acknowledging that these subgroup, descriptive, and subscale reports may slice the data so thinly that they lead program faculty to begin contemplating programmatic changes in response to transient phenomena and chance variation. Follow-up detailed reports should therefore be accompanied by appropriate cautions regarding their use.

Updated: May. 29, 2019