Rethinking AoL—Moving From Compliance to Strategy
- Many schools still approach assurance of learning as a compliance burden, often collecting too much data—or the wrong data. As a result, they produce very little meaningful change.
- By integrating AI into the assessment process, schools can streamline reporting, reduce effort, and generate timely insights that faculty can apply in real time to effect curricular improvements.
- Strategically designed, mission-driven AoL empowers schools to align learning goals with their societal impact and to ensure that students graduate with skills that matter.
“Not another AoL meeting,” you groan to yourself. If you’re a business school dean, chances are you’ve felt the frustration around implementing an assurance of learning (AoL) strategy—that well-intended but often onerous process demanded by accreditation.
Under Standard 5 of AACSB’s Business Accreditation Standards, schools must have “well-documented assurance of learning processes” and produce evidence that these processes lead to curricular improvements. In theory, this makes AoL a powerful tool for improving programs. But in practice, too many of us experience AoL as a paperwork exercise—as mere data in, reports out—without enough positive change to show for it.
In a 2023 article in AACSB Insights, Karen Tarnoff notes that a significant proportion of schools have struggled with effective AoL implementation. During the 2020–21 and 2021–22 academic years, she points out, many institutions seeking AACSB accreditation were cited for AoL deficiencies. Among those that required a second year of continuous improvement review, nearly 68 percent were identified by peer review teams as having shortcomings in their AoL systems.
Juggling Acts—Managing Multiple Frameworks
Part of the problem is that AoL isn’t the only game in town. To satisfy different stakeholders, most business schools are juggling multiple, misaligned assessment systems. We map learning outcomes one way to fulfill AACSB’s requirements, another way to meet national quality assurance mandates, and yet another within internal curriculum committees.
Each system serves a purpose, but together they can fragment our efforts. It’s not unusual, for example, for a U.S. school to run parallel outcomes assessments for regional accreditation (since all regional accreditors now require it) while also collecting data for AACSB.
The result? We duplicate our efforts and trap our data in silos. Our faculty and staff spend countless hours generating reports for various audiences, yet they often struggle to connect the dots between those deliverables. Instead of creating one coherent picture of learning, we get only disjointed snapshots of students’ educational outcomes.
Pain Points—Collecting Data Without Impact
Ask business school deans about the biggest challenges of the AoL process, and you’ll hear them describe these common pain points:
- Inconsistent rubrics and measures. Different departments or instructors use rubrics in different ways (or not at all), making it hard to compare or trust the data. What one course calls “critical thinking—meets expectations” another might call something else altogether.
- Siloed data. Assessment evidence for each program and purpose lives in separate spreadsheets, databases, or exports from a learning management system (LMS). Little of that evidence is integrated, which makes it cumbersome to draw schoolwide insights.
- Manual aggregation madness. Pulling together AoL results often requires heroic manual efforts—copying and pasting scores into Excel, emailing reminders to faculty to submit artifacts. This additional effort invites error and eats up time. By the time you’ve compiled the data, the semester is long over.
- Shallow “loop-closing.” Perhaps the biggest gripe is that, after all that work, the changes schools make based on the data are often superficial. Substantive curriculum improvements are rare, which breeds faculty cynicism and assessment fatigue. When faculty then complain that so much data is collected but nothing really changes, they’re not entirely wrong.
When they see so little payoff for their efforts, faculty become reluctant to engage in AoL at all. They may view it as busywork imposed from above, disconnected from the real teaching and learning happening in their classrooms. It’s telling that, according to Tarnoff, the most significant AoL problem “is not measurement … it is how to use the data to improve student learning.”
In short, we measure for the sake of compliance, not insight. We collect plenty of data, but we do not generate enough impact.
Our AI Toolkit—How Smart Systems Can Help
The good news is that 2025’s technology toolkit offers a way out of this rut. We finally have artificial intelligence (AI) and smart systems that can transform AoL from a compliance chore into an intelligent, streamlined process. Although these tools won’t close the loop for us, they make it possible for us to achieve that objective in a far more meaningful way.
In their 2021 article, Axel Borschbach and Tim Mescon suggest that a well-executed digital AoL system should “integrate with existing LMS to avoid data silos,” bridging data related to faculty, students, and accreditors on one platform. An integrated AoL system—one that plugs into a school’s LMS and pulls assessment data directly from courses in real time—offers several advantages:
- It collects evidence as part of teaching workflows. No more hunting down Excel files from each instructor.
- It maps and aligns outcomes across multiple frameworks. AI’s pattern recognition can ensure that we’re not maintaining separate sets of books for each stakeholder. This means that the same student project could be evaluated once and simultaneously counted toward AACSB learning competencies, national qualifications, and even internal skill benchmarks (see the brief crosswalk sketch after this list).
- It detects trends in real time. With traditional AoL methods, by the time we realize a learning outcome is underperforming, a full assessment cycle (or two) has passed. But AI can quickly crunch large data sets, enable predictive modeling, and flag issues before they fully materialize. Instead of relying on year-old information, deans and faculty can access dashboards of live AoL data. If they see that scores for “ethical reasoning” start dipping this term across sections or that one campus cohort is lagging in quantitative skills, they can intervene promptly—during the program, not after the fact. (A simple trend-flagging sketch appears at the end of this section.)
- It dramatically reduces the grunt work. Automated data capture and aggregation allow faculty and staff to spend less time crunching the numbers and more time analyzing them. AI’s ability to sift text and qualitative feedback could bring insights to the surface (such as common themes in assessor comments or alumni surveys) that a manual process might miss.
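To make the cross-framework mapping concrete, here is a minimal sketch, in Python, of how a single rubric score might be tagged once and then reported to several frameworks. The framework labels, field names, and crosswalk entries are hypothetical illustrations, not a prescribed schema; a real system would draw them from the school’s own outcome maps and LMS exports.

```python
# Illustrative sketch: score one student artifact once, then count it toward
# several frameworks via a single crosswalk table. All names are hypothetical.

# Crosswalk: each internal learning goal maps to tags in external frameworks.
OUTCOME_CROSSWALK = {
    "ethical_reasoning": {
        "aacsb_competency": "Ethical Understanding and Reasoning",
        "national_qualification": "Level 7 - Professional Responsibility",
        "internal_benchmark": "Responsible Leadership",
    },
    "quantitative_skills": {
        "aacsb_competency": "Analytical and Quantitative Skills",
        "national_qualification": "Level 7 - Data Literacy",
        "internal_benchmark": "Evidence-Based Decision Making",
    },
}

def tag_assessment(record: dict) -> dict:
    """Attach every framework tag to a single rubric score so the work is
    scored once but reportable to each stakeholder."""
    tags = OUTCOME_CROSSWALK.get(record["learning_goal"], {})
    return {**record, **tags}

# Example: one project score, pulled from the LMS, can serve three reports.
score = {"student_id": "S1024", "learning_goal": "ethical_reasoning",
         "course": "MGT 501", "rubric_score": 3, "term": "2025-Fall"}
print(tag_assessment(score))
```

The design point is simple: faculty evaluate an artifact once, and the crosswalk, not extra grading, does the work of satisfying each stakeholder’s reporting format.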
When we use today’s technology, the AoL process becomes less about compiling static reports and more about continuously monitoring and improving learning—which is what it was supposed to do all along.
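Returning to the trend-detection idea above, the sketch below shows one simple way live rubric data could be scanned for term-over-term dips in a learning goal. The threshold, field names, and sample scores are illustrative assumptions; a production dashboard would run this kind of check on the school’s actual LMS data.

```python
# Illustrative sketch: compare this term's mean rubric score for each learning
# goal with last term's and flag meaningful drops. Data and threshold are made up.
from collections import defaultdict
from statistics import mean

def flag_dips(records, current_term, prior_term, threshold=0.3):
    """Return learning goals whose mean score fell by more than `threshold`
    between two terms, so faculty can intervene during the program."""
    by_goal_term = defaultdict(list)
    for r in records:
        by_goal_term[(r["learning_goal"], r["term"])].append(r["rubric_score"])

    flags = []
    for (goal, term), scores in by_goal_term.items():
        if term != current_term:
            continue
        prior = by_goal_term.get((goal, prior_term))
        if prior and mean(prior) - mean(scores) > threshold:
            flags.append((goal, round(mean(prior), 2), round(mean(scores), 2)))
    return flags

# Example with hypothetical scores on a four-point rubric.
sample = [
    {"learning_goal": "ethical_reasoning", "term": "2025-Spring", "rubric_score": s}
    for s in (3, 3, 4, 3)
] + [
    {"learning_goal": "ethical_reasoning", "term": "2025-Fall", "rubric_score": s}
    for s in (2, 3, 2, 2)
]
print(flag_dips(sample, current_term="2025-Fall", prior_term="2025-Spring"))
```

Even a crude rule like this, run each week on fresh data rather than once per assessment cycle, is what turns AoL reporting into the kind of early warning system described above.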
A Word of Caution—AI Is Not a Silver Bullet
That said, a critical caveat bears repeating: Technology is an enabler, not a substitute for human judgment. No AI will magically decide how to redesign a course. That responsibility still rests with faculty expertise, exercised under the oversight of curriculum committees.
In fact, rushing to plug in a fancy new tool without a solid process can backfire. Technological solutions can’t ensure an AoL process is well-developed or that faculty are engaged. AI can crunch numbers and point out patterns, but faculty still need to interpret those results, contextualize them, and act.
We must frame AI not as a silver bullet, but as a way to augment the human-driven continuous improvement cycle, says Eve Alcock, director of public affairs at the U.K.’s Quality Assurance Agency for Higher Education. AI, she writes, can provide the quantitative basis, but qualitative activities such as interpreting data in context are “better suited to human oversight.”
In practice, this means that the roles of deans and AoL committees would shift away from merely collecting data to facilitating rich discussions around the data insights: Why are students underperforming on a certain goal? What changes might boost their learning? How will we know if an intervention worked? AI will happily churn out charts and correlations, but it’s up to us to close the loop with wisdom and action.
When leveraged thoughtfully, these smart systems can finally liberate AoL from the doldrums of compliance, free up faculty’s time, and provide greater clarity, allowing a culture of improvement to take root. When faculty no longer see AoL as a check-box exercise but as a source of valuable data, the process becomes faster, richer, and genuinely useful to faculty and students alike.
A New Mindset—Mission-Driven AoL
Perhaps coincidentally or perhaps cosmically, as technology opens new doors for AoL, accreditation standards have evolved to encourage schools to adopt more strategic mindsets. AACSB’s Business Accreditation Standards are explicit that AoL is not intended as a compliance exercise. In fact, peer review teams are advised to look for the “spirit and intent” of Standard 5—namely, to determine whether a school has a culture of mission-driven, evidence-based continuous improvement in which data leads to action.
As AACSB emphasizes, what defines a “strong and mature AoL system” is a “systematic process, informed by the school’s mission and strategies and resulting in meaningful improvements in curriculum and learning.” The goal is not to gather a large quantity of assessment data; the goal is to draw quality information from that data and take relevant action.
This means that we need to align AoL with our missions more closely than ever. For example, if your mission emphasizes strengthening entrepreneurial leadership or developing global mindsets, your AoL system should be tracking and improving competencies in those very areas.
This mission-centric, impact-focused view elevates AoL to a strategic level. No longer is it merely about satisfying Standard 5 for the purpose of accreditation; it’s about using Standard 5 as a lever to achieve Standard 1 (mission fulfillment) and Standard 9 (societal impact). We must show not only that our students can analyze data or write case analyses, but also that our programs make a positive impact on society in line with what we claim to stand for.
Assurance of learning is a key mechanism for ensuring that our graduates have the skills and values to create that impact in the real world. When done right, AoL empowers faculty to “translate assessment results into action, reassess the impact, and drive meaningful change” in curricula, emphasize Desiree Moore, Christina Perry, and Katalin Kovacs in their 2025 article. It becomes a virtuous cycle.
The creation of clearly defined learning goals that are firmly rooted in a school’s mission leads to robust measurement of those goals. That measurement in turn leads to analysis and improvement, which leads to stronger outcomes and greater mission-consistent impact.
In such a culture of continuous improvement, assessment is not about checking boxes. Rather, as Moore, Perry, and Kovacs put it, “it’s about cultivating academic programs that are relevant, responsive, and rigorously designed to serve students, industry, and society.”
A Way Forward—AoL as a Strategic Lever
The path from compliance to strategy requires investing in smarter AoL infrastructure and nurturing the culture to use it well. This doesn’t mean chasing every new tech toy, but it does mean thoughtfully adopting tools that align with your school’s mission and make life easier for your faculty. It means convincing your community that AoL is not just an accreditation checklist, but a strategic feedback loop to ensure academic clarity, relevance, and purpose in everything you do.
How can schools begin creating a more integrated and strategic AoL process? Pragmatically, they should start small:
- Pilot an LMS-integrated assessment tool in one program. Show how automated data collection can save dozens of hours and yield richer reports.
- Provide faculty development on using assessment data. Help them see assessment not as an external imposition, but as their tool for refining curricula.
- Celebrate early wins. For instance, if AI’s analysis of AoL data helps identify a gap in digital skills and you close it with a new module, share that story widely. Nothing builds buy-in like seeing concrete improvements that trace back to assessment insight.
AoL “can and should be the driver of curriculum change,” says AoL expert Kathryn Martell. Our task as leaders is to make that outcome real by equipping and inspiring our teams.
Embracing smarter AoL isn’t about appeasing accreditors; it’s about future-proofing a school’s quality and impact. By leveraging AI and integrated systems, we free our people to focus on creative problem-solving and pedagogical innovation, rather than leave them to drown in data-collection duties. We create environments where AoL truly assures learning and where we can confidently say, “Yes, our students are achieving the learning competencies that matter—here’s the proof, and here’s what we’re doing to get even better.”
That is the kind of assurance that helps schools thrive, whether or not their goal is to achieve accreditation. And that is the promise of moving AoL from compliance to strategy. It’s a promise well worth our effort to fulfill in the years ahead.