How B-Schools Can Flip Their Thinking On ChatGPT

by: Marc Ethier on August 19, 2025

As AI continues to transform higher education, an avalanche of articles has appeared in the news media. Story after story has examined how AI can and should be used in the classroom and whether it is ultimately a force for good or bad.

Lately, the coverage has focused on what seems to be a particular kind of bad. In recent weeks, pieces in the New Yorker, Time, and trade publications like Laptop have highlighted concerns about how AI diminishes students' critical thinking and creative abilities, bearing titles like "A.I. Is Homogenizing Our Thoughts" and "This is Your Brain on ChatGPT," the latter accompanied by a photo of an egg frying in a pan.

Those concerns are not new, but they have moved front and center, partly in response to preliminary research at the M.I.T. Media Lab. In electroencephalogram (EEG) tests, students who used AI while composing essays showed measurably less brain activity and fewer connections between brain areas associated with creativity than students who did not use it. Cornell University and Santa Clara University have recently released studies with similar results.
What's more, according to a multinational study of 3,500 educators conducted by the ed-tech company Turnitin, a majority of students themselves worry about the potential loss of their critical and creative thinking skills. In fact, they cited that loss, along with AI overuse, as the top AI risk.

But while such concerns are certainly understandable, the idea that AI will inevitably erode students' ability to think critically and creatively is shortsighted and ultimately inaccurate. In my work as dean and professor of marketing at the Farmer School of Business at Miami University, I've found that more often it is not AI itself but higher education's pedagogical approach to it that is limiting student development in these areas. AI will not be the end of creativity or critical thinking among our students. We just have to get there at a different point and in a new way, through what I call flipped thinking.

Like the flipped classroom concept, flipped thinking changes how we approach teaching and learning. With flipped thinking, the end goal for our students is no longer a product, such as a paper, a workshop design, or some other traditional assignment. Instead of thinking first and producing second, flipped thinking asks students to begin with a product and then think critically and creatively to understand, refine, and apply it, not just before but during and after they use AI.

With AI already generating outlines, drafting essays, creating slides, and the like, students can find solutions so easily that we must train them differently. Their learning should focus on analyzing, questioning, and improving what they receive, not just accepting it. For example, in 60+ Ideas for ChatGPT Assignments, Kevin Yee and other professors at the University of Central Florida suggest, among other exercises, requiring students not only to ask ChatGPT to create an argument but also to develop their own rebuttal or counterargument.
At Berkeley College, Jason Gulya, a professor of English and applied media, asks students to create a short class on a topic that interests them, teach it to a custom chatbot he's created, ask the chatbot for feedback on their teaching, and then critically evaluate whether they agree with that assessment. And in Wael Jabr's data management class at Pennsylvania State University's Smeal College of Business, students build an AI tool to evaluate SQL (Structured Query Language), the standard language used to interact with and manage relational databases, requiring them to work through evaluation metrics and assessment logic.

Those are just a few of the many possibilities along these lines. In our courses at Farmer, we regularly reach out to local community groups to give students real experience using AI as just a first step in solving real-world problems. For instance, David Eyman, who teaches entrepreneurship and innovation, worked with Jennifer Loeb, CEO of Ronald McDonald House Charities of Greater Cincinnati, on an assignment that tasked students with finding ways the organization could engage people of all ages in volunteerism, not just those 50 and older who are most often involved. This coming semester, he plans to develop another such problem-solving student project, facilitated by an accounting consultant, with a large for-profit company.

This kind of pedagogical reframing through flipped thinking meets both students and employers where they are. Students today often see traditional teaching methods and approaches as out of sync with what they are experiencing in the real world and with what will help them succeed in the workplace and in life. Carson Porter, one of our students who worked on the Ronald McDonald House assignment, put it this way in a Poets&Quants story from earlier this year: "It's easy to generate ideas with AI....
But does it actually solve the problem at hand?"

Meanwhile, surveys find that the "soft" skills students learn through flipped thinking, such as the ability to solve problems, are what employers most value in the people they hire, because employers cannot simply rely on technology to automate those skills. Students often come to college to learn specific vocational capabilities, or "hard" skills, to get a job. But now that AI and other technologies can increasingly do much of what once required those hard skills, employers are seeking graduates who stand out in other ways.

As Gillian Oakenfull, professor of marketing and founding director of the Center for KickGlass Skills at our school, has observed, "The biggest challenges in AI aren't technical—they're ethical, strategic, and human centered."

In light of the explosive impact of AI, the center has expanded the concepts of critical thinking and creativity that can set students apart to include a set of key competencies: emotional and cultural intelligence, ethical judgment, adaptability and resilience, and communication and collaboration. For example, in the realm of ethics, our students use interactive simulations to test AI fairness and bias in real-world applications. And this coming fall, the center will offer a "Sharpen Your Ethical Decision-Making Program" in which students can use AI but must also develop their own personal skills to grapple with ethics case studies and dilemmas that businesses have actually confronted.

For colleges to take a flipped thinking approach effectively, we also need to train and encourage our faculty members to teach this way and to integrate these essential competencies into their curricula.
A survey conducted earlier this year by Elon University's Imagining the Digital Future Center and the American Association of Colleges and Universities found that a majority of academic deans and other college leaders think their institutions aren't preparing faculty members to use AI effectively for teaching and mentoring. To meet that need, 25 of our faculty and staff members gathered last summer to explore not only whether we were using AI in the classroom productively, but also how it was affecting specific disciplines, such as accounting, finance, and human capital management. We've also hired a new director of AI initiatives to coordinate our efforts and ensure that we are sharing best practices and effectively integrating AI into the classroom through flipped thinking.

We've learned first-hand that such an approach acknowledges the reality of AI in the classroom, and in the world, while creating a higher-stakes learning environment in which students have to do the most important thinking themselves. By shifting when and how we in higher education ask students to do that, they won't lose skills because of AI; they'll gain and develop them in new ways.

The goal is the same: preparing students with the capabilities that we know employers want and need. But the path to get there has changed, and it's time for our educational approaches to catch up.

Jenny Darroch is dean of the Farmer School of Business at Miami University in Ohio.

© Copyright 2026 Poets & Quants. All rights reserved.