Metrics and faculty governance

Last year, members of the Board of Trustees expressed concern that the campus would be unable to gauge progress toward its goal of excellence in all facets of faculty work without a defined set of metrics for faculty performance. At the behest of the Trustees, the Provost’s Office drafted a plan to measure faculty research productivity.

Over the last several years, UA leadership has repeatedly pointed out well-documented problems with using certain external metrics, particularly Academic Analytics, to evaluate individual faculty or entire departments (for further reading on these problems, see the list of links below). Members of the administration have shared these concerns, so the Provost’s Office began the process of creating research metrics by having faculty in each unit formally define what they consider legitimate venues for disseminating research in their field, how the unit evaluates grants, how it measures impact, and how it evaluates awards won by its faculty.

Last month, the college deans began the process of having departments and units develop these metrics. Unfortunately, the rollout has not been smooth in all areas of campus. Faculty across campus have reported confusion about the purpose of compiling these metrics and raised concerns about whether the metrics will be used for tenure, promotion, or merit evaluations; what sources the performance data will be drawn from; how those data will be updated or corrected; how the metrics may affect personnel decisions or department-level resource allocations; why service and teaching are excluded; and whether faculty judgment and expertise will remain the primary standard for determining excellence in our respective fields.

We also saw at least one college suggest to its faculty that Academic Analytics is an appropriate tool for compiling information about faculty impact and comparator data. That suggestion, coupled with the article in the Eugene Weekly concerning administrative support for Academic Analytics <https://www.eugeneweekly.com/2018/01/25/questionably-measuring-success/>, has compounded faculty misgivings. We strongly urge any department considering using Academic Analytics to review the information in the links provided below. It is a deeply flawed source of information.

The concerns expressed by faculty raise fundamental questions that need full and coherent answers. At this point, we are convinced that UO academic leadership needs to better explain to faculty the purpose of this directive. While we understand the value of having a clear understanding of the work faculty do and ensuring that faculty work meets the standards of a world-class research university, we do not see the connection between that need and compiling tiered lists of journals and publishers. Academia has tried-and-true methods of determining whether faculty are living up to the standards of the university, including peer and external review. It is true that those methods can always benefit from examination and refinement, but we have been through the process of developing those policies quite recently.

Moreover, we question the speed with which faculty are being asked to do this work. It is our understanding that all of this metrics work is to be completed by mid-March, an extremely short timeframe for units to grapple with a new idea whose rationale is obscure and whose consequences are unspecified.

At our General Membership Meeting in January, members expressed deep misgivings about the apparent direction of this initiative and vigorously called for the administration to engage with faculty on these issues. Executive Vice-Provost Scott Pratt attended the meeting to discuss the new hiring criteria, but he also heard these concerns and responded to many of the issues raised. Union leadership is heartened by his acknowledgment that more clarity would be useful and by assurances that there is no intention to use these metrics in evaluative processes. These are steps in the right direction, though to date we have not seen follow-through resulting in that clarification or assurance.

We will keep pushing the administration to respond to faculty questions and objections. We will pass along what we learn, but we also need your input and involvement. What are your concerns, if any? Are you hearing anything that clarifies, or further obscures, the process? Please stay tuned for updates, and if you wish to learn more, as always, please email us at [email protected].

Some references on recent debates about metrics and the termination of university contracts with Academic Analytics:

https://www.chronicle.com/article/UT-Austin-Professors-Join/242332 

https://www.chronicle.com/article/As-Concerns-Grow-About-Using/238034

https://www.aaup.org/news/statement-urges-caution-toward-academic-analytics#.WnkCqa6nGpo 

https://www.insidehighered.com/news/2015/12/11/rutgers-professors-object-contract-academic-analytics

https://www.insidehighered.com/news/2016/05/11/rutgers-graduate-school-faculty-takes-stand-against-academic-analytics 

https://www.chronicle.com/article/The-Tyranny-of-Metrics/242269