Educating Legislators Beyond the Numbers: Mixed Methods Economic Impact Studies
Community college leaders face increasing pressure to prove a return on government investment in their institutions. Performance-based funding measures and economic impact studies are commonly used to monetize the success of community colleges. These well-intentioned efforts produce a beneficial, but limited, view of community colleges. However, community college leaders are well versed in navigating the needs, and sometimes competing interests, of diverse stakeholders. They understand they need to comply with required evaluation metrics to gain access to indispensable funding, but are also innovative thinkers who seize opportunities to educate decision makers on the important work they do beyond the numbers. In this article, we will use our experience conducting a mixed methods economic impact study on rural community colleges in Pennsylvania to demonstrate how to do both—conform to existing evaluation standards and educate legislators (Saboe et al., 2020). We will share our quantitative and qualitative methodologies and an overview of our research findings to make the case for leaders to combine methods for community college economic impact studies.
Constraining Quantitative Evaluation Metrics
Legislators and higher education commission members are commonly charged with determining community college evaluation metrics. Unfortunately, these individuals do not always understand the unique mission, student populations, and work of community colleges. Regardless, they have increasing power to determine how these institutions will be evaluated and funded. This can lead to challenges for community college leaders (Kadlec & Shelton, 2015).
For instance, federal and state higher education officials commonly require community colleges to tell their stories within existing assessment metrics rather than adapting the measures to better fit the unique mission and vision of the colleges (Ocean et al., 2020). A clear example is performance-based funding, whose metrics are derived from a university model and are exclusively quantitative. These metrics can easily miss success stories, community enhancements, continuing education, and pivotal higher education and industry partnerships. Additionally, performance-based funding can restrict access, further disadvantage under-resourced students and institutions, and encourage a focus on numeric outputs to advance an institution’s standing and associated funding (Ortagus et al., 2020). Presently, performance-based funding also typically ignores power, organizational, and environmental factors (Dougherty & Natow, 2020). Community college leaders and their allies need to think creatively to avoid being reduced to these imposed measures and to fully capture the success of their institutions and people.
Similarly, economic impact studies are one way to evaluate the financial impact of community colleges on local communities, but they were not developed specifically for use in the public community college sector. These studies are typically conducted as exclusively quantitative research which can result in an oversimplified view of the community college financial impact; they tend to cast large organizations in a favorable light (i.e., organizations with large payrolls and budgets make a large economic splash). Conversely, small organizations with skeletal staff and shoestring budgets should not be expected to generate large economic impacts. Given that certain community colleges and their satellite campuses are lean operations, their economic impact would not be large. In such an instance, a mixed methods approach to assessing economic impact is warranted.
Moreover, using private, market-based assessments, like economic impact studies in the public sector, without accounting for the ingrained bureaucracy or the role of public good will lead to an inaccurate assessment (Rouse & Smith, 1999). Higher education researchers advocate for qualitative measures to evaluate the work of community colleges more accurately (Ocean et al., 2020). Unlike performance-based funding, where the metrics are prescribed at the state level, community college leaders have more autonomy to determine the methods employed in a voluntary economic impact study. We believe a mixed methods economic impact study can both provide easily accessible quantifiable outputs and create an opportunity to expand the perspective and understanding of community colleges.
Research Funding and Methods
In 2019, the Center for Rural Pennsylvania solicited grant proposals to conduct an economic impact study on rural community colleges in Pennsylvania. The Center is a bipartisan legislative agency established in 1987 that funds research to better understand people, trends, and conditions in rural Pennsylvania as well as to identify policy options for Pennsylvania state legislators. The authors were awarded a grant and conducted an economic impact study using both quantitative and qualitative research knowing the audience for our findings would be legislators in Pennsylvania.
For the quantitative portion of our research, we used industry standard multipliers to quantify the economic impacts of rural community colleges in Pennsylvania. A multiplier is the numerical relationship between an original change in economic activity and the ultimate change in activity that results as the money is spent and re-spent through various sectors of the economy. We used IMPLAN modeling software to estimate the economic impact of rural Pennsylvania’s community colleges. For the qualitative economic impact research, we conducted telephone interviews with employees and local industry leaders, made in-person site visits with additional employee interviews, and gathered publicly available secondary data from the Internet. We conducted a team thematic analysis of all gathered data to develop our research findings.
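The multiplier logic described above can be sketched in a few lines. This is a minimal, illustrative calculation only: the multiplier value and spending figure below are hypothetical, and IMPLAN’s actual multipliers are derived from detailed regional input-output data that this sketch does not model.

```python
def total_impact(direct_spending: float, output_multiplier: float) -> float:
    """Ultimate change in regional output resulting from an original
    change in spending, per the multiplier relationship."""
    return direct_spending * output_multiplier

# Hypothetical example: a site spends $1.0 million locally, and the
# regional output multiplier is 1.4 (i.e., each dollar of direct spending
# generates an additional $0.40 of follow-on activity as the money is
# spent and re-spent through the local economy).
direct = 1_000_000
total = total_impact(direct, 1.4)
follow_on = total - direct

print(f"Total output: ${total:,.0f}")
print(f"Follow-on activity: ${follow_on:,.0f}")
```

In practice, separate multipliers are applied for output, value added, and employment, which is how estimates such as total jobs and gross regional product are produced alongside total output.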
Economic Impact of Pennsylvania’s Rural Community Colleges
Our sample of rural college locations varied in size but covered the majority of rural Pennsylvania’s community college sites. Some large hubs had budgets over $10 million and more than one hundred employees, but satellite sites with budgets under $5 million and fewer than fifty employees were more typical. Collectively, our sample of institutions contributed an estimated 543 jobs, $31.9 million in value added economic activity (i.e., gross regional product, or GRP), and $50.1 million to the total output of their local economies.
Using IMPLAN and detailed data provided by several rural community colleges, we found that the average rural site (excluding the largest, outlier sites) supports about 15 jobs, contributes $722,000 in value added GRP, and generates about $1.25 million in total output within its local economy. This is a small fraction of a county’s GRP. However, this impact does not include the future impact of students who use the valuable skills and credentials provided by community colleges to pursue further higher education, find employment, and contribute to Pennsylvania’s economy.
While important, these figures are only part of the story. Our qualitative investigation helped identify specific state-funded programs that local industry leaders perceived as having the greatest impact. One such program is the Workforce and Economic Development Network of Pennsylvania (WEDnetPA), created by the Pennsylvania Department of Community and Economic Development in 1999 to provide grants for workforce development training. Many rural community colleges are approved sites for these state grants, and their employees have developed the required relationships with local industry to maximize the funding. For instance, a rural community college administrator was able to coordinate training among ten local companies with similar training needs, saving the industry partners a cumulative $90,000. Had we conducted a quantitative-only economic impact study, this efficient use of government resources benefitting local business owners would have been overlooked. In addition, performance-based funding metrics rarely consider workforce training and, consequently, would have missed these important collaborations and the resulting upskilled employees. The qualitative research in our economic impact study highlighted the important work of community colleges and the financial savings resulting from relationships with local industry leaders. It also allowed us to make a specific recommendation on continued funding for this statewide program.
Moreover, our qualitative research identified the important access rural community college locations offer, their investment in local communities, and challenges that hinder them from having a greater impact. These findings are central to educating legislators. We documented how rural community college locations provide salient financial, educational, and geographic access to higher education and workforce training opportunities. We also documented the lack of sufficient resources for community colleges, the underutilization of these institutions, and the challenges of existing in a decentralized community college system. While those of us in the community college world understand these things as a part of an everyday community college experience, many who have not worked, studied, or researched at a community college need to be educated on the strengths and challenges that exist for these institutions. By conducting a mixed methods economic impact study, we used quantitative research to quantify the financial impact of community colleges on their communities and qualitative research findings to educate legislators, thus presenting a holistic perspective of the mission, work, and impact of community colleges.
Conclusion and Next Steps
Despite the challenges of capturing success quantitatively, quantified outputs will continue to be requested or required of community college leaders. Some legislators and higher education commission members will remain unaware of the important contributions of community colleges unless we find creative ways to integrate education into expected, requested, or required evaluations. Community college leaders and researchers can partner in these efforts to simultaneously comply with present requirements and creatively stretch our evaluation methodologies. We believe mixed methods economic impact studies can do both.
We conducted this study prior to COVID-19 and the election of President Joe Biden, and some of the findings from our research have already shifted. However, our recommendation for a mixed methods approach remains constant. When people learn what community colleges really do, and how they impact countless individuals and local economies, they are inevitably converted to community college champions. We need to use community college ingenuity to find any and every way to share our story and educate decision makers on the importance of investing in our colleges, in our people, and in our communities.
References
Dougherty, K. J., & Natow, R. S. (2020). Performance-based funding for higher education: How well does neoliberal theory capture neoliberal practice? Higher Education, 80(3), 457–478. https://doi.org/10.1007/s10734-019-00491-4
Kadlec, A., & Shelton, S. (2015). Outcomes-based funding and stakeholder engagement. https://www.luminafoundation.org/files/resources/kadlec-shelton-ofb-full.pdf
Ocean, M., McLaughlin, J., & Hodes, J. (2020). “We take everyone”: Perceptions of external assessment and accountability at the community college. Community College Journal of Research and Practice. https://doi.org/10.1080/10668926.2020.1841041
Ortagus, J. C., Kelchen, R., Rosinger, K., & Voorhees, N. (2020). Performance-based funding in American higher education: A systematic synthesis of the intended and unintended consequences. Educational Evaluation and Policy Analysis, 42(4), 520–550. https://doi.org/10.3102/0162373720953128
Rouse, J., & Smith, G. (1999). Accountability. In M. Powell (Ed.), New labour, new welfare state (pp. 235–255). Policy Press. http://www.jstor.org/stable/j.ctt1t89b33.16
Saboe, M., Ocean, M., & Condliffe, S. (2020). The economic impact of rural Pennsylvania’s community colleges. The Center for Rural Pennsylvania. https://www.rural.palegislature.us/publications_reports.html
Mia Ocean is Associate Professor, Graduate Social Work; Matt Saboe is Associate Professor, Economics and Finance; and Simon Condliffe is Professor, Economics and Finance, at West Chester University in Philadelphia and West Chester, Pennsylvania. Erin Spencer, Graduate Social Work, and Keith Hazen, Economics and Finance, were research assistants and are now West Chester University of Pennsylvania alumni.
Opinions expressed in Leadership Abstracts are those of the author(s) and do not necessarily reflect those of the League for Innovation in the Community College.