In Assistant Professor Michael Albert’s MBA data science class at the University of Virginia’s Darden School of Business, students analyze historical usage data for a bike-sharing service to determine a bicycle maintenance schedule. In the past, they would tackle this simulation by writing code in the Python computer programming language — a pain point for many since their interest lies in business, not computer science.
But now that generative AI can produce syntactically correct code from a plain-language prompt, the assignment has transformed from a technical coding problem into an exercise in articulating analytical goals. It is now, Albert says, more about developing critical thinking skills.
“They’re not engineers or mathematicians; what our students care about is effective decision-making — being able to look at a complex … multidimensional problem and come away with the insights that will allow them to make the right decisions,” he says. “I view ChatGPT as a way to accelerate our students’ ability to make those decisions.”
Since the public debut of OpenAI’s ChatGPT generative artificial intelligence platform in November 2022, business leaders have been avidly exploring the benefits and potentials of generative AI. And business schools have accordingly come to see the need to integrate AI training into their curricula to ensure graduates are prepared to succeed in fast-changing workplaces where generative AI may be here to stay.
“I see that generative AI is transforming work across the board, and I think that it has the ability to increase the efficiency and productivity of workers,” says Paul Brooks, department chair and professor in information systems at the Virginia Commonwealth University School of Business. “In order for our students to be competitive in the workforce, they’re going to have to know how to use this technology.”
Brooks’ analytics students use AI for assignments such as analyzing historical data to figure out how many orders a vendor should prepare, a process that requires critically assessing AI’s output and refining prompts to improve the result. Such tasks are going to become ubiquitous as these tools evolve and penetrate all aspects of business operations.
“I can’t think of one corner of the world of work that will not be impacted by this,” says Phillip Wagner, a clinical associate professor at William & Mary’s Raymond A. Mason School of Business. “I think there’s a call for every industry and academic domain to be thinking about it. Without reservation, all business schools need to be doing this.”
Closing the AI gap
While employers around the world move quickly to embrace AI, there is a disconnect between leadership perspectives and employee actions and capabilities. An August 2023 Gallup survey found that 72% of Fortune 500 chief human resources officers anticipate AI will transform their companies’ staffing needs within the next three years. Yet 70% of employees say they never use AI tools, and more than half don’t feel capable of doing so.
Business schools need to help fill this gap. An early formal example of this is Northwestern University’s MBAi Program, an AI-focused degree program run jointly by the university’s Kellogg School of Management and McCormick School of Engineering. But in most cases, business schools are just beginning to integrate AI lessons and concepts into their curricula informally or on a case-by-case basis within the context of faculty discussions.
In October, the University of Virginia’s Darden School of Business received the largest gift in its 68-year history from alumnus David LaCross and his wife, Kathleen. Their $94 million donation, among the 10 largest gifts ever received by any business school, is aimed in part at helping Darden become a pioneer in researching and teaching about artificial intelligence.
In June, the university’s Generative AI in Teaching and Learning Task Force reported that 42% of U.Va. students currently use generative AI tools to assist with coursework. The task force recommended that schools and units within the university mandate syllabus statements around expectations on AI usage. The business school has been holding monthly seminars for faculty covering issues connected to AI, and faculty members who are interested in integrating AI into their courses can consult with instructional designers.
Kushagra Arora, Darden’s chief digital officer, emphasizes that the business school has been using other forms of AI in courses for a decade, so the advent of generative AI need not change anything fundamentally.
“We still want to do things in ways that are ethical and current, and to manage risk and security,” he says. “Almost every business unit at Darden at this point is using AI in some way or another.”
As at U.Va., other Virginia business schools are pursuing their AI plans through ample discussion, information-sharing and education, but with relatively little emphasis thus far on formal policies governing its use.
Creating blanket policies for AI usage is fraught, some experts say, because such policies can impinge on academic freedom and faculty autonomy, and they can quickly become outdated as the technology changes at lightning speed. As a result, most schools are focusing on providing information and guidance on how faculty can integrate AI while maintaining intellectual rigor, ethical standards and academic integrity.
“You don’t want to make a firm response and then have that become immediately outdated,” says Benjamin Selznick, associate professor and adviser of postsecondary analysis and leadership at James Madison University’s College of Business. “You want a dynamic, flexible context that’s going to evolve as the technology itself rapidly evolves.”
Meanwhile, some schools are proactively ensuring that faculty are well-versed on generative AI and equipped with needed resources.
William & Mary’s Mason School of Business has created a task force and a community of practice around generative AI. The school’s Academic Innovation team is publishing information about AI in its newsletter and is teaming up with the school’s McLeod Business Library and Center for Online Learning to hold virtual office hours for faculty on the topic. The team also created an online AI toolkit for faculty and presents updates on generative AI at every faculty meeting.
The overriding message is that faculty should play a key role in helping students learn to make the most thoughtful and effective use of AI tools.
“We need to expose our students to this technology so they can understand its limitations, its biases and the knowledge to critique what they have received as an output,” says Karen Conner, director of academic innovation at the Mason School.
Part of the AI-focused conversation at business schools revolves around how to maintain rigor, assessment standards and honor codes. It's important for MBA programs to confidently certify that their graduates possess requisite skills, but assessing that can require creativity when no tool reliably identifies AI-generated output.
“Everyone’s handling it in their own way,” says Selznick at JMU. “It’s important to acknowledge that effectively collaborating with AI is going to be a valuable skill, but I’m also hearing about instructors who are saying, ‘I’m going back to pen-and-paper final exams because that’s the only way I’m going to know’” students aren’t cheating by using AI to complete assignments.
VCU’s Brooks says some faculty are turning to oral exams or asking students to report on their work in ways AI never could, such as by completing a task and then explaining their thought processes.
“It’s been very much disruptive to the way we’ve been doing things,” he says. “It’s caused us to rethink how we deliver the materials.”
Wagner, at William & Mary, believes that kind of rethinking can be good for faculty and their students. While he recognizes that some classes may have a stronger need for verifiable assessment than others, he encourages professors to reconsider how to appraise students’ learning.
“Instead of dwelling in the land of anxiety, I think it’s an invitation,” he says. “If your courses are ones where your students could plug your homework prompts into a machine and get an output, maybe it’s not necessarily the students’ problem alone. It’s that your teaching method needs some refreshing.”
For educators of all kinds, grappling with AI is ultimately a question of how to reconcile the possibility and disruption of an emerging technology with the age-old tradition of cultivating critical thinking skills.
“Essentially, for me, this goes back to liberal arts education,” says Kenneth Kahn, professor and dean at Old Dominion University’s Strome College of Business, where faculty are focused on helping students understand the potential pitfalls of AI, such as the technology’s tendency toward hallucinations and biases, as well as exploring promising ways that AI can enhance productivity. “You have to have that critical thinking to evaluate what the output is, to decide whether or not it’s important. That’s where our MBA students and our business schools need to go.”
For some MBA faculty, using AI to further a liberal arts education means using the tools to encourage open conversation and robust self-reflection. Tracy Johnson-Hall, a clinical associate professor at William & Mary’s business school, requires her MBA students to explain how they used generative AI on each assignment.
“First and foremost, by encouraging open discussion of the use of generative AI, I want to reduce the perception of any prohibition around it and instead focus on open conversations about where it is useful and how best to leverage it for productivity,” she says. “Asking them to explain how they use it generates conversation.”
William & Mary’s Wagner began requiring his MBA students to use generative AI in the fall 2023 semester, both to teach best practices for its use and to offer them opportunities to reflect on the process and on themselves.
For a course called Diversity in the Workplace, he has students conduct a dialogue of at least 30 messages with a generative AI tool on the subject of “a diversity hang-up,” such as friction between diverse beliefs and religious convictions. The process allows students to learn “to ask better questions and to be questioned, which makes us better, more well-rounded critical thinkers,” he says.
For William & Mary MBA student Skander Lakhal, who spent 10 years in the oil and gas exploration industry before returning to school, such facilitation of critical thought is the most promising aspect of generative AI. He uses AI to help with research, summarize long articles, process recordings and slides from lectures, prepare for exams and generate examples of difficult concepts from his classes.
“Through these experiences, I’ve learned that AI is more than just a tool for efficiency,” he says. “It’s a catalyst for deeper understanding and innovative problem-solving.”
Great possibilities lie in generative AI tools that enable intellectual exploration and new ways of thinking and working, say business school professors who are approaching the tools with an attitude of curiosity and enthusiasm.
“The potential is quite exciting, I think,” says Conner of William & Mary. “I think we’re living in great times.”
This story was written by Katherine Gustafson and originally appeared in Virginia Business on December 31, 2023.