It starts like a bad Monday morning.
A company’s systems slow to a crawl. Alerts stack up. Something’s off — and fast. Students don’t read about it. They’re in it. At the UB School of Management, a cybersecurity course has begun to feel less like a classroom and more like a control room, where decisions carry weight and time moves a little too quickly.
When textbooks fall behind
Cybersecurity doesn’t sit still. By the time a traditional case study gets printed, the threat it describes might already feel dated — like trying to study last year’s weather to predict today’s storm. That gap matters. Students can end up learning frameworks without ever feeling the pressure that defines real cyber crises. The stakes stay abstract. The urgency never quite lands.
Kevin Cleary, clinical associate professor of management science and systems, saw that disconnect firsthand. And he didn’t treat it as a minor inconvenience. It was a signal.
So, what if the case wrote itself?
Instead of relying on static materials, Cleary, who has experience as a chief information security officer, flipped the model. In MGS 650, “Information Assurance,” generative AI creates fully realized organizations on demand. Not generic ones, either. Students can shape the company they’re defending — health system, startup, regional bank — down to its tech stack and regulatory environment.
It’s a little like building a sandbox, then setting a storm loose inside it. The AI doesn’t just generate a profile. It builds the kind of world, complete with systems, data flows and vulnerabilities, that instructors used to spend weeks assembling.
Then the attack hits
Once the environment is set, the exercise shifts. Students step into a live, unfolding cyberattack simulation modeled after real tabletop exercises used in industry. An attack vector appears. Teams assess damage, weigh tradeoffs and decide what to protect first.
Then it escalates. New information surfaces. Consequences ripple. Pressure builds. It’s messy. It’s uncertain. It feels real, because it is, in all the ways that matter.
Learning that sticks
Here’s the surprising part: the technology isn’t the main story. It’s what happens to students. They stop acting like students. They start thinking like security leaders: prioritizing risk, defending assets and explaining decisions in plain language. They leave the course having already “done the job,” at least once. That shift builds something harder to measure but easy to recognize: confidence. And maybe more important, identity.
A glimpse of what business education could be
The pilot reached 42 students, a small group, but enough to show the idea holds. What stands out isn’t just the technology. It’s the pace. Courses can now evolve as quickly as the industries they reflect. Scenarios update. Risks change. The learning keeps up. That has implications far beyond cybersecurity. Because the same approach, AI-generated scenarios paired with hands-on simulation, can extend into finance, operations, compliance and more.
The future is here
There’s a tendency to talk about AI in big, abstract terms, but this is real. It’s a classroom where students make decisions under pressure, backed by tools that mirror what they’ll see on the job. And that’s precisely the point.
The distance between research and practice doesn’t disappear overnight. But sometimes, it shrinks in quiet, practical ways: one course, one simulation, one moment where a student stops asking, “What would I do?” and starts acting on it. That’s when learning clicks. And when it does, it tends to stick.
This story was written by AI and edited by a member of the UB School of Management Marketing and Communications Office.
