It’s Here: How B-Schools Have ‘Steadily & Cautiously’ Integrated AI Into Their Curricula

Slowly but surely, business schools are integrating generative artificial intelligence into their curricula, according to a major new survey released this week. At many schools, the use of AI tools like ChatGPT is already well underway.

A new survey by the nonprofit Graduate Business Curriculum Roundtable finds a “steady and cautious” integration of the new technology into B-school curricula, backed by a majority of faculty and staff. The survey report, Generative Artificial Intelligence (AI): A Survey Of Business School Faculty And Professionals, captures the view that the new tech is important to learn about and use in the B-school classroom.

“It is already a period of fast and dramatic change for business schools, their programs and curriculum,” says Jeff Bieganek, executive director of GBC Roundtable. “Now with the growth of generative AI, business schools are presented with additional unique challenges and opportunities. It is exciting to see the thoughtful and impressive actions and programs that business schools and their leaders are developing and implementing to integrate generative AI into their programs, curriculum and operations.”

Source: GBC Roundtable


GBC Roundtable, formerly known as MBA Roundtable, is a global association of B-schools whose mission is to advance graduate business education through curricular and co-curricular innovation. In its new survey, 68 B-schools, primarily located in the United States, are represented by 72 faculty and professionals who responded to questions about generative AI and the B-school curriculum.

Three of four respondents (74%) report that their B-school already teaches generative AI as subject matter in the curriculum today. The depth of that coverage is limited, however: only 15% say that generative AI is significantly or fully taught as subject matter within the business school curriculum, and just one in five (19%) report dedicated courses on generative AI. The most common generative AI topics taught in the B-school curriculum include Intro to AI, Ethics/Legal Implications, and Industry Innovation.

Because generative AI tools are so prone to misuse and abuse, B-schools and universities are beginning to craft guardrails. At the time of the survey in August, only three in 10 (30%) reported that the university or B-school had a policy regarding generative AI. One in five (20%) reported a formal group working on crafting a policy, and another third (33%) said discussions were underway.

“Interestingly, there is greater integration of generative AI as subject matter and as an area of faculty research among institutions with a generative AI policy compared to those without a policy — a statistical difference,” the report notes.

Source: GBC Roundtable


ChatGPT and other generative AI are especially pertinent tools for business school students, who can put them to work on tasks such as job interview and resumé prep. In other findings in the new GBC survey, about a quarter (28%) of respondents say that generative AI is significantly or fully integrated as an area of faculty research. The utilization of generative AI for student assessment, content creation, curriculum design and delivery, student recruiting, and the student experience is even more limited. Yet, innovations are taking shape among early adopters as shown below:

  • An exploratory phase is underway to assess the potential of using AI chatbots to handle repetitive questions and potentially provide advising services.
  • Some professors use AI for batch grading similar answers, although this practice is relatively rare.
  • The application of AI in education is still in its early stages, with some experimentation around self-assessment using generative AI.
  • Current uses include cleaning up instructions, generating ideas for discussion board assignments, and assisting business analytics students in writing code.
  • A primary focus is on preparing students for the job market and enhancing their success. There is an emphasis on exploring ways to blend advising and career planning functions to improve student success.

Overall, most respondents “place a high level of importance on integrating generative AI as subject matter in the curriculum and as an area for faculty research, despite concerns about academic integrity, misinformation, and bias, and challenges faced, such as training. Half (50%) also say that business schools should utilize generative AI to personalize the student experience, in such areas as advising, tutoring, and career coaching.”

Source: GBC Roundtable


The GBC Roundtable survey results complement Poets&Quants’ reporting this year. In July, HEC Paris professor Brian Hill published an academic paper based on a study he conducted in a first-year master’s-level class in behavioral economics. Hill told P&Q that the study reveals a reduction in performance when students use AI chatbots to help with their assignments — and suggests that without a better understanding of the tech involved, the professionals of tomorrow may do a significantly worse job when aided by AI than they do working without it.

“Our classroom experiment suggests that there may be situations in which the professionals of tomorrow do a considerably worse job when aided than when working alone — perhaps due to biases that have been long understood, perhaps due to some that remain to be further explored,” Hill said.

The study suggests the need for more research into performance at the human-AI chatbot interface, Hill said — and it argues for “more, rather than less, chatbots in the classroom,” he added. “One of the skills of the future, that we will need to learn to teach today, is how to ensure that they actually help. While the answer task is arguably representative of traditional work practices, the correct task may correspond more closely to many jobs in the future. If AI tools become as ubiquitous as many predict, the human role will be to evaluate and correct the output of an AI — precisely as asked of students in this task.”


Earlier this year, in June, P&Q interviewed three professors at Cambridge Judge Business School who generally see generative AI as a positive in the classroom.

Jaideep Prabhu, professor of marketing: “I think it’s a plus for educators and it’s complementary to what we are already using. For business students, it is an amazing tool. You can’t rely on it but it can be a huge aid to a business you are running. We have to get our students to use it with caution and to sharpen their judgment about it. It is relatively easy to check if it is feeding you BS but more difficult to go behind generic concepts and answers. That’s why the prompts and interrogation you use for ChatGPT are just as important as what it tells you. And if it tells you something you are unhappy with, you can ask in a different way and see how that impacts the answer.”

Thomas Roulet, professor in organization theory and deputy director of the Judge MBA program: “I teach leadership in the MBA program and it’s all about trusting yourself and your ability to manage complex situations. It would be a mistake to ban AI because it is with us now. People will have access to it in organizations all over the world, and it will be integrated with their approach to work. So we have to teach our students how to use AI as a resource for knowledge and decision-making.

“It is clear that students are already using ChatGPT. If you ask it what is the definition of psychological safety, ChatGPT will give you a better definition than Wikipedia. In the workplace, students can use ChatGPT to help them decide how to deal with a certain situation. But in the end, what will matter is their own judgment and the cues they get from their interaction with others.

“We need to make them use it and make them use it in a critical way where they can derive nuggets from ChatGPT, assess them and use them in the workplace with a human and social touch which makes the difference between a good and a bad manager. If you were to ask ChatGPT to manage people, it would be quite bad at it unless it becomes very good at reading emotions and making instinctive decisions.”

Stella Pachidi, assistant professor in information systems: “We want people to think critically. We want them to develop the skills to become able to assess the situations they will face in their organizations and that is not at all impacted by ChatGPT.

“Still, we used it in the classroom this year for the first time. Students were asked to engage in an exercise exploring how organizations can leverage AI. We were already discussing a healthcare case, so I gave them an exercise around sensor technology and asked the students to use ChatGPT to identify the big players in sensor technology. The first company ChatGPT named doesn’t exist anymore because it had to close down. So they immediately started figuring out the limitations.

“What is really important is making people aware of how you have to play around with the prompt to make ChatGPT useful. You can tell if someone has used it to write an essay or a response to a question. It reads like ChatGPT. You figure it out. You obviously cannot prove it, so you cannot punish the person, but it reads as such. Because it reads bland, those essays had already been marked very low because they don’t show that the person has a deep understanding of the issues we discussed in class.”


In February, INSEAD Business School Deputy Dean Peter Zemsky told P&Q that there are plenty of questions to be asked about ChatGPT, but added that it’s “worth the hype,” representing the latest in a series of AI progressions over the last decade. First came image recognition, then AI’s ability to generate images, and now, with ChatGPT, the ability to generate text — an “obviously significant breakthrough new application,” he said.

Key to understanding any technology advancement is learning how, and how not, to use the specific technology at play. With ChatGPT, Zemsky said, the conversation should be less about whether it’s used for cheating on homework or exams — a fair question — and more about its potential as a tool for improving business and society.

We are living through a kind of digital industrial revolution, Zemsky said. Wave after wave of new technology has emerged, and the progression of artificial intelligence (AI), he said, has exploded. One factor: the cost of processing, storing, and sharing data has become relatively inexpensive.

“Fundamentally, what’s driving it is terabytes and terabytes of data and better processors and now, an ecosystem of people working on the algorithms to bring the processing power to data,” Zemsky said.

See the full GBC Roundtable survey report here.
