Of Advocates and Entrepreneurs
Michael B. Horn, “a top proponent of higher-ed disruption,” stepped down from his position as director at the nonprofit, nonpartisan Clayton Christensen Institute, The Chronicle of Higher Education reported on October 12. In a blog post announcing his decision, Horn indicated his intention to “work on the ground with the entrepreneurs shaping the future of education,” though his post does not name specific entrepreneurs.
One such venture, The Chronicle adds, will be Entangled Solutions, a self-described “innovation agency for Higher Education,” where Horn plans “to put his theories into action.” According to The Chronicle, Horn already serves on the board of Fidelis Education, a company offering learning relationship management solutions, described in its literature as an “integrated holistic student success solution…”
Horn’s decision to enter the private sector reflects a growing trend among public advocates and vendors of technology in higher education to offer entrepreneurial solutions for student success. While there has never been a shortage of education consultants and technology vendors to higher education during the past fifty years, the new emphasis on student success entails two decidedly different components: venture capital and data science.
Established education technology vendors already seek to leverage their position in student information systems or learning management systems markets. For example, Ellucian, the market leader of student information systems with the Banner and Colleague brands, has a section of its website devoted to “Student Success” and the software that “helps keep students on track and supports student success.” Likewise Blackboard features its “student success model” as a “cost effective, scalable solution that helps improve student outcomes….”
More notably, start-ups now seek to establish new education technology markets with the aid of venture capitalists. In late September, Civitas Learning announced an investment of up to $60 million by Warburg Pincus in its business model. The announcement by MarketWatch credits Civitas Learning as having “pioneered the use of data science and design thinking to support higher education institutions as they improve learning and increase degree completion.” The article attributes the recent round of investment funds to the growing demand by institutions of higher education to improve learning and degree completion.
The Harvard Business Review reports that “data science,” fashionable since 2008 to characterize analytic advances at LinkedIn and Facebook, gained currency to describe predictive analytics and data mining with large data sets. Civitas Learning promotes its solutions as the intersection of human commitment and data science, explaining that data science identifies “student success and risk factors” and promotes the optimization of “student learning journeys” through institutional policy and practices. The new funding enables Civitas Learning to secure “strategic acquisitions” and expand globally.
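The kind of risk scoring such vendors describe can be sketched in a few lines. The sketch below is a minimal, hypothetical logistic model; the features (GPA, ratio of credits earned to credits attempted, weekly LMS logins) and the weights are invented for illustration and are not drawn from any vendor’s actual model.

```python
import math

# Hypothetical feature weights -- illustrative only, not from any real model.
WEIGHTS = {"gpa": 0.9, "credit_ratio": 1.5, "logins": 0.12}
BIAS = -3.0

def completion_probability(gpa, credit_ratio, logins):
    """Logistic score: predicted probability the student completes a degree."""
    z = (BIAS
         + WEIGHTS["gpa"] * gpa
         + WEIGHTS["credit_ratio"] * credit_ratio
         + WEIGHTS["logins"] * logins)
    return 1.0 / (1.0 + math.exp(-z))

def at_risk(gpa, credit_ratio, logins, threshold=0.5):
    """Flag a student as 'at risk' when predicted completion falls below threshold."""
    return completion_probability(gpa, credit_ratio, logins) < threshold
```

In a production system the weights would be fit to millions of historical student records rather than chosen by hand, but the “risk factor” logic reduces to the same scoring-and-thresholding pattern.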
Similarly, IBM recently entered the new market for student success and data science solutions. In January, IBM announced its $100 million fund for “venture investments to support an ecosystem of entrepreneurs developing” applications for its newly formed IBM Watson Group, a suite of software and services featuring “cloud-delivered cognitive innovations.” In one of the first such ventures into higher education, Deakin University in Australia announced in the past month its partnership with IBM to become the “first university to implement the Watson Engagement Adviser.” Initially, the Watson adviser will serve new students by answering questions on “a range of topics” including admissions, financial assistance, course enrollments, job placement, skills assessment, and academic help.
Data science for student success may be the latest “management fad in higher education,” to cite Robert Birnbaum’s critical assessment of the ever-changing paradigms imported from the business sector.
Unlike prior fads, however, there are no pretensions that higher education institutions have the internal capacity or resources to implement the sweeping changes in technology and management techniques entailed by data science for student success. Whereas prior management fads emphasized the potential of institutions to implement exciting new management techniques independently, the current discourse around data science and student success acknowledges no such autonomy. In this respect, Horn’s decision to focus solely on entrepreneurial solutions also may signal doubt in the effectiveness of advocacy for “disruptive innovation.”
What, then, explains this skepticism for institutional solutions of data science for student success and the associated rise in the investment opportunities for venture capitalists?
The Role and Function of Institutional Research
In March 2014, the National Association of System Heads (NASH) released a report on the state of institutional research (IR) offices in the nation’s state university systems. The report addressed the “unprecedented public demand” for state institutions to demonstrate student access and success while somehow maintaining affordability despite the diminishing fiscal support of state governments. To gain a better handle on how to achieve these conflicting agendas, the association surveyed the 48 system offices and 349 campuses to assess the role, function, and capabilities of the IR offices in their sector of higher education.
The results proved discouraging: “The overall ability of IR offices to use data to look at issues affecting many of the cross-cutting issues of the day — such as the connections between resource use and student success — is nascent at best.” As a consequence, NASH concluded that IR is “a field that is at best unevenly positioned to support change.” In its conclusions, the association recommended “the development and testing of new approaches to IR…. in conjunction with a team of expert advisors including some from outside of higher education, from whom we have much to learn.”
The NASH report possibly marks an end-game for the profession of institutional research in that IR offices trace their origin to the history of management fads. Francis E. Rourke and Glenn E. Brooks wrote in their seminal work on the managerial revolution in higher education during the 1960s, “Institutional research lies at the heart of the trend toward the use of modern management techniques in higher education.” Large state universities, as well as a few large private universities, first established administrative IR offices in the late 1950s and early 1960s to address the need for better planning and access for the enrollment surge of the baby boomer generation. System heads, such as Clark Kerr in California, relied on substantial institutional research operations to envision the state-wide policies and campus plans of the era.
In contrast, for most colleges and universities, IR offices originated in an accountability movement from outside higher education that encroached on the academic freedom and autonomy of the institution. Rourke and Brooks, representing two such institutions, portrayed institutional research as a peril to academic freedom and institutional autonomy, recommending that IR offices be subject to “an academic orientation… bolstering the point of view of the faculty.”
Administrators, including skeptical IR practitioners who favored Rourke and Brooks’s analysis, looked to the particularity and perceived uniqueness of institutional missions and leadership as a sufficient means to contain the threat of institutional research.
Two scholars from Michigan State University, Paul L. Dressel and Joe L. Saupe, defined the practice of institutional research for higher education scholars and the Association for Institutional Research (AIR) in the early 1970s. Dressel, who edited the first “handbook” of IR in 1971, drew sharp distinctions between institutional research and higher education scholarship. IR offices may “probe deeply into the workings” of a single institution, but IR could not contribute to academic scholarship for application outside of the specific institution or system of institutions that housed the particular office. Saupe, in a paper endorsed by the AIR, further clarified IR as “specific and applied” research directed by the needs of the local institution, and therefore “institutional research should not be expected to produce knowledge of pervasive and lasting significance…”
Following Dressel’s and Saupe’s line of thought, most IR professionals subsequently organized their offices to service requests for data by the local officers of their institutions, with no substantive regard for the general accumulation of knowledge about the allocation of institutional resources or effective interventions for student success.
Twenty years later, the AIR flagship journal, Research in Higher Education (RIHE), published an article summarizing the nature and status of institutional research as constituted in this manner. Noting first that “institutional research is more an art than a science,” the article describes a field “in a pre-paradigm stage; no body of scientific theory controls the kinds of questions that can be asked or the kinds of answers sought.” The “politics of higher education” directed the research activities and arrested the development of “a cohesive body of knowledge” for institutional planning and effectiveness.
IR offices depended “on the perceptions of those within a particular institutional context,” hindering the possibility “for a discipline to emerge.” In effect, institutional researchers lacked professional autonomy within their institutions. “[O]thers set the agenda for institutional research,” the article concludes, and resigns IR professionals to the status quo: “There is little incentive for this to change.”[1]
In light of the profession’s history, the NASH survey affirms the predominance of IR offices organized as Dressel and Saupe defined them forty years earlier. Its conclusions merely reiterate the conditions of the profession as described twenty years thereafter in RIHE.
Following the NASH report, it is not surprising that the American Council on Education (ACE) further endorsed the need for external vendors and venture capitalists to provide student success solutions to higher education. In a quick hit paper on predictive analytics, ACE observes, “[T]here is a strong sense among pioneers in this space—and on the part of some prominent funders—that pushing into this new territory will pay significant dividends in improved student success.”
Notably, then, during the past two years the two associations representing the nation’s system heads and its college presidents have become proponents for outsourcing student success solutions as the means for colleges and universities to meet the demands of the future.
Predictive Analytics and the Privatization of Student Success
Whereas the history of the institutional research profession explains the limitations for a data science of student success within a single institution, the prohibition of a federal student unit record system creates the opportunity for the private sector to create a market to supply multi-institutional unit record systems to higher education.
A national unit record system would provide an authentically “democratic” student-level database of records for academic preparedness, progress-to-degree, and degree outcomes for every student at almost every institution in the country. The potential quality and complexity of research with a unit record system for student records and analytics is widely known in both the public and private sectors.
The application of such a system to measure institutional performance and accountability, however, effectively spurred the opposition of associations representing colleges and universities, resulting in the 2008 prohibition. The measurement of degree outcomes at the level of individual students not only would provide evidence-based solutions for student support, but also would offer a window into the differential outcomes of similar students at different institutions. The emperors would have been undressed.
In the absence of a comprehensive national unit record system, entrepreneurs and other vendors now are racing to develop the largest database of student records to serve as resources for the data science of student success.
The Predictive Analytics Reporting (PAR) Framework, “a non-profit provider of analytics as a service,” boasts a database formed from over 350 unique campuses, 2.5 million students, and 24 million student records. Before its latest round of venture capital funding, Civitas Learning claimed 850 campuses and 2.7 million students in its unit records database. Several other established education technology vendors – including Starfish, Blackboard, and Datatel – are developing their own systems of student unit record databases with much less publicity than the PAR Framework and Civitas Learning.
Whichever brand succeeds, predictive analytic solutions for student success present several challenges to institutional autonomy in higher education. The NASH survey of IR offices notwithstanding, the primary challenge for institutional research with regard to student success solutions always has been the mobilization of faculty and staff to consider and adopt new management techniques that encroach on the deference to tradition, academic freedom, and administrative unit autonomy. Thus, despite efforts to market their solutions to faculty and students as systems of decentralization and empowerment, implementing predictive analytics will entail modifying the behaviors of faculty and students that produce non-optimal results for student success.
For instance, Civitas Learning describes its solution: “Our data science process and predictive analytics framework unleash the potential of your institution’s diverse data sources…. to power your student success initiatives….” When an optimal course of intervention is known to be effective within the process and framework, why restrict the act of intervention to the non-optimal will and agency of faculty or staff?
Data science and predictive analytics will, by necessity, become process- or framework-directed. That is to say, the analytics system will need to assume control of optimization without regard for academic freedom or professional autonomy of college and university personnel. Rather than ask or train faculty when to act, the most efficient and effective use of predictive analytics will be the automation of messages, meetings, and mentoring for students.
No doubt, the appearance of faculty control over communications and interventions will facilitate the optimization of student behavior to a degree; students will think faculty sent messages or student support personnel requested meetings. Nonetheless, the point of the data science will be to eliminate non-optimal curriculum choices by faculty and ineffective intervention practices by student support personnel.
To overcome the traditional challenges to new management practices and fads, the data scientists must diminish the agency of faculty, staff, and students in the implementation of optimal decision-making for student success.
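The framework-directed automation described above can be illustrated with a toy rule table that maps predicted risk directly to an action, with no faculty or staff in the loop. This is a hypothetical sketch; the action names and thresholds are invented, not any vendor’s actual logic.

```python
# Hypothetical rule table: (minimum risk score, automated action).
# Ordered from highest to lowest risk so the first match wins.
ACTIONS = [
    (0.8, "schedule_advisor_meeting"),  # highest risk: auto-book a meeting
    (0.5, "send_nudge_email"),          # moderate risk: message sent "from" faculty
    (0.0, "no_action"),
]

def automated_intervention(risk_score):
    """Return the intervention the analytics framework itself would trigger."""
    for floor, action in ACTIONS:
        if risk_score >= floor:
            return action
    return "no_action"
```

The point of the sketch is that once such a table exists, the system, not the instructor, decides when a student is contacted; faculty appear only as the nominal sender of the message.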
In this respect, IBM offers a more transparent solution to the objective of student success analytics than other vendors jostling for proprietary dominance. IBM creates no illusions about what (or who?) will be advising students: IBM’s Watson. Deakin University’s news release suggests, “Watson will revolutionise and simplify student problem-solving: the more questions it is asked, the more informative its answers will become.” IBM Watson is the agent in student problem-solving. As Watson learns to answer questions better, student problem-solving becomes easier – provided students listen to Watson more readily than to faculty or staff. One assumes that, as IBM Watson learns, its Engagement Adviser will be able to provide students with recommendations for programs of study, course schedules, which course/faculty to avoid, how much financial aid loans to accept, etc., etc.
In sum, the entire university experience, the new data science infers, will one day be optimized for individual student success – the best of all possible worlds.
Proprietary Data Science and the Uses of the University
Far from democratizing student data, the private vendors of data science for student success are currently balkanizing the nation’s student records in proprietary clouds of data storage and predictive analytic algorithms. The capacity to optimize college faculty and student behavior within a single college has its value, of course, but the real motivation and return on investment accrue from the opportunity to build multi-institutional systems of data for predicting student success.
Nearly every system to optimize student decision-making at the student unit record level is constrained by the choice of college a student attends. A single institution relying on the data in its student information system is unable to analyze individual student success with a record of choices and behaviors beyond the institution. Retention and progress-to-degree initiatives at each college and university therefore target “at risk” students and provide services designed to address perceived shortcomings in the preparedness or engagement of students.
What if, however, the problem for most students originates in the initial choices regarding application and attendance rather than in the mundane bureaucratic choices made in one institution?
Proprietary data science will begin with sorting and assigning students to courses, programs, faculty, and engagement activities at a single institution because the institutions have the data that student success vendors need. Ultimately, though, data science likely will deliver only marginal improvements in student success within institutions. In the final analysis, the multi-institutional data systems will reveal more substantial differentials in student success between institutions. The most beneficial, and therefore most lucrative, services to students will be those that draw on the immense power of multi-institutional data systems to sort and assign students between institutions, not within institutions.
As IR professionals have long known, the conceit of most institutional interventions is that students who do not succeed at the institution fail because of personal shortcomings – a lack of academic preparedness, student engagement, or the social and economic capital for the institution. The presumption is that students have a fixed and measurable level of college success, and the admissions and enrollment officers have unique insight to determine which students to admit and which to deny. As a negative corollary to the conceit, staff and faculty will point fingers at the admissions officers for recruiting “the wrong students” or “too many unprepared students” to the college when senior administration questions sub-par student outcomes. Bad outcomes, within the logic of academic preparedness, are the responsibility of individual students while good outcomes are attributable to the inherent value of each institution in higher education.
In fact, every student has an array of probabilities of success that varies across every institution in the country. Little is known about how these differential probabilities of success are distributed across institutions because opposition to institutional accountability and multi-institutional research has effectively constrained institutional research to the local outcomes at particular institutions. Nevertheless, institutions vary in their administrative service and academic responsiveness to each student, and the success or failure of students depends a great deal on how well institutions allocate resources to their entire body of students.
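As a concrete illustration of this claim, a multi-institutional system would store, for each student, a distinct predicted probability of success at each institution, and could then recommend the best fit. The students, institutions, and probabilities below are invented for illustration.

```python
# Invented per-student, per-institution success probabilities --
# the kind of matrix only a multi-institutional data set could support.
SUCCESS_PROB = {
    "student_a": {"regional_college": 0.82, "flagship_u": 0.64, "ivy": 0.58},
    "student_b": {"regional_college": 0.45, "flagship_u": 0.71, "ivy": 0.69},
}

def best_fit_institution(student):
    """Institution maximizing this student's predicted probability of success."""
    probs = SUCCESS_PROB[student]
    return max(probs, key=probs.get)
```

Note that the recommendation differs per student: the same algorithm steers one student toward the regional college and another toward the flagship, which is precisely the sorting-between-institutions service described above.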
For example, honors students on full scholarships at regional, tuition-dependent colleges often have degree completion rates similar to all undergraduates at Ivy League institutions. At the same time, to subsidize the honors program, those tuition-dependent colleges charge non-honors students tuition that more closely approaches the list price. In addition to the differential tuition charges, the institution then allocates a greater share of its instructional costs to the honors program to support smaller class sizes, team teaching, honors college programs, and other specialized services for honors students — expenditures entirely subsidized by revenues derived from the tuition of students who are not eligible for the honors program. Thus, while honors students appear to be the most successful due to “academic preparedness,” in effect the hidden subsidies in tuition and instructional services for the honors programs likely play a substantial role in the inequity of student outcomes at a particular institution.
Such insights into the allocation of resources and student success have been inaccessible to institutional researchers without multi-institutional unit record data sets. Consequently, proprietary data science vendors are building unparalleled access and insight into the performance of higher education institutions as they grow their data sets into the tens of millions of student records, as marketed by PAR Framework and Civitas Learning. The proprietary vendors of student success will be able to discern directly how inequities in the allocation of resources to students, controlling for academic records and tuition discounting, influence inequities in student success.
When the data science vendors of student success attain the capacity to project student outcomes for each individual college-goer, their proprietary systems will be in a position to displace the myriad of college ranking and ratings systems that rely on aggregate data submitted to the federal government. College-going students will be able to receive “personalized, real-time recommendations” to optimize each stage of their academic experience in the choice of: school applications, financial aid offers, applicant deposits, course enrollments, program declarations, faculty mentors, internships, job placements, etc., etc.
Initially, the original client institutions of the data science vendors may experience competitive advantages over non-clients as vendors seek to enroll and retain students at their client institutions. Over time, one may expect that data science will diminish attendance at a single institution as optimized pathways to a college degree grow to include MOOCs and proficiency-test offerings.
While these prognostications for data science may seem far-fetched in practice at this time, these claims already exist in the marketing materials of the student success vendors. Civitas Learning has aggressively marketed its “team’s vision of dramatically improving student outcomes by developing an education analytics system.” IBM Watson will “guide students through the maze of decision-making about their future careers to enhance their employment opportunities.” As the data science for student recommendations and academic progress proliferate, the responsibility for college education may gradually shift from higher education institutions to the vendors of student success.
College a la carte
The proprietary systems of data science, in practice, eventually will function as self-contained services directly to students. “Actual vs. predicted outcomes” will become de rigueur in the evaluation of proprietary solutions – every proprietary vendor will boast of its contribution to students’ actual success over predicted success while at the same time protecting the trade secrets of its predictive models. Thus, the claims for student success will not be open to scientific scrutiny and independent verification. The market – revenue and profits – will determine which systems of proprietary data science for student success prove capable and which fold.
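An “actual vs. predicted outcomes” claim reduces to a simple cohort-level comparison, even while the predictive model generating the predictions remains a trade secret. A minimal sketch, with invented numbers:

```python
def lift_over_prediction(predicted, actual):
    """
    Vendor-style 'actual vs. predicted' metric: mean actual completion
    (1 = completed, 0 = did not) minus mean predicted completion across
    a cohort. A positive value is what a vendor would market as its
    contribution to student success.
    """
    assert len(predicted) == len(actual)
    n = len(predicted)
    return sum(actual) / n - sum(predicted) / n
```

Because the predictions themselves come from a closed model, the baseline is whatever the vendor says it is, which is exactly why such claims resist independent verification.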
Management fads in higher education came and went in the past because advocates recommended management techniques designed for implementation and autonomous evaluation by local decision-makers. As the outsourcing of student success to private vendors proves profitable, the university will become an instrument in the proprietary data science of optimal college experiences for students who purchase the services directly from student success vendors. Ultimately, students will come to rely on the recommendations of data science for student success, mixing and matching institutions (dual enrollments), program offerings (self-designed degrees), etc., etc.
In the end, colleges and universities will learn to follow the direction of the data science vendors in order to secure enrollments and remain relevant to their new managers.
Codicils
Outsourcing Student Success (Slight Return) — February 2, 2016
The Honors of Inequality, Part I — March 31, 2016
[1] E. Bernadette McKinney and John J. Hindera, “Science and Institutional Research: The Links,” Research in Higher Education, Vol. 33, No. 1 (Feb. 1992), 19–29.