I cringe a little when I hear the phrase, much bandied about these days: “don’t waste a good crisis.” It seems altogether too crass, too flippant for the painful times we are living in, now defined by the deadly confluence of racism, pandemic, and unemployment. Even in higher education, no ivory tower for anyone who has been paying attention, there is enormous turbulence and distress: to open or not to open? And if we open, how? Stay “remote” or make a full-fledged commitment to going online? The core model of many colleges and universities has been threatened overnight. Some will close, forever.
And yet, the sudden upending of long-held truths — for instance, that the presence of a faculty member in a classroom defines quality education — provides an opportunity to rethink some assumptions that are long past their sell-by date. For example:
- Why is the status quo accepted, when half our students never graduate?
- Why do the worlds of learning and work need to be kept separate?
- Why should students go to college, rather than having college come to them?
Let’s take them one by one.
Only 41% of students graduate from a “four-year” college within four years. If we stretch the window to six years, the average graduation rate rises to 60.4%. The average three-year graduation rate at 2-year institutions is under a third. We can — rightly — quibble with these statistics and how they are calculated — and used — but the US still ranks behind many countries, including Turkey, Spain, and Russia.
In any other industry, these rates would be considered an indictment of the business, not the customers. And yet too often the prevailing attitude is that students are the problem, rather than the obstacle course higher education has itself created, which includes failure-ensuring developmental ed sequences; “gateway” courses in subjects like math that permanently derail students rather than helping them acquire badly needed quantitative literacy; arbitrary and capricious transfer credit rejections; incomprehensible graduation requirements; and so on. To make matters worse, so many students are attempting to navigate this obstacle course while coping with extraordinary life challenges, including hunger and homelessness. Learning itself should be challenging, but surely everything else about college doesn’t need to be.
What if we reimagined college from a backward design perspective, beginning with the key question: what should someone know and be able to do in order to begin their career? Then we could explore what combination of experiences and more formal learning would ensure they had those competencies and build from there.
Sequestering learning from work
I was once on a panel at a higher education conference when a highly respected researcher and writer dramatically rose from his chair and yelled at me from across the room: “Higher education shall not be dictated to by employers!” As he made his way to the door, I could only respond: “but wouldn’t it be nice if they talked once in a while?”
Community colleges, which typically have a workforce-focused division, have long acknowledged that most students go to college to get better jobs. But on the four-year front, many faculty still hold the belief that work and education should be kept separate, that employment-relevant learning is somehow an unfortunate concession to the outside world. But there is no “outside world,” even on the most isolated of campuses. Virtually all college students work, whether they are adults juggling school along with jobs and family or “traditional”-age students nonetheless putting in thirty hours a week or more. Despite the evidence that working can actually improve earning outcomes later, too often it is simply seen as an interference or impediment.
A wonderful phrase has emerged lately to describe new ways to conceptualize professional development and workforce training: “learning in the flow of work.” It reminds us that all learning can — and does — happen “in the flow of life.” This is not a bad thing. What if we embraced work-based learning — whether in the form of employment, internships, degree apprenticeships and so forth — as a key component of education and sought to integrate it into the curriculum?
“Going” to college
Too many media discussions and, unfortunately, too much public policy, still assume that a “college student” is an 18- to 22-year-old studying full-time at a four-year residential campus. Those students are increasingly a rarity; most college students are now “post-traditional,” i.e., older, working, part-time, and commuting and/or taking classes online. In fact, even before the pandemic, a third of all students were taking at least one class online. But when acknowledged at all, such students are presumed to be having an inferior educational experience.
It is important not to draw the wrong lessons from the overnight move to remote instruction in the wake of the pandemic. Admittedly, the sudden switch to distance learning was not a fully satisfactory experience for anyone, though under the circumstances both schools and students did pretty well. But the conclusion should not be that online is bad; it is that higher education, having insisted that educational quality depended on the presence of a teacher in a classroom, was simply not prepared to deploy digital technology in the service of learning.
What if we saw learning in the full context of 21st century life? No other sector of our society treats technology as optional, much less antithetical to its mission. People have complex social lives online; bank, buy, play games, get medical care — and learn — online; and assume they will have 24/7 access on any device to what is important to them. Surely it’s time to recognize that digital technology, employed with skill and creativity, can deepen and extend learning, making it more rewarding, engaging, effective, and convenient. What if we put aside the exceptionalist view of higher education and committed to using all the tools at our disposal to create colleges that work for students?
Cathrael (Kate) Kazin, JD, PhD
Managing Partner, Volta Learning Group