How Harvard Business School Uses Generative AI In Its MBA Classrooms

Harvard Business School’s generative AI policy allows its use for preparation outside of class, but in class and during exams students must use what’s in their brains

Way back in 1984, when personal computers were becoming more popular in both homes and businesses, Harvard Business School became one of the first B-schools to require PCs for some classwork. While students were not required to purchase the IBM Portable Personal Computer recommended by the school, 787 of 800 incoming HBS students arrived on campus with the devices, according to a 1986 Harvard Crimson article.

“That move not only helped students prepare for the coming era, but it helped usher in a wave of teaching and research innovation at the school,” says Mitchell Weiss, the Richard L. Menschel Professor of Management Practice at HBS.

“We felt we were moving through a similar moment in the fall of 2023.”

That moment? The arrival of generative AI.

As part of an ongoing two-year project to review and enhance the required curriculum for first-year students, Harvard Business School has been crafting a policy on generative AI and other tools since early 2023 – shortly after ChatGPT rolled out the previous fall.

HBS’s policy translates roughly to this: Use generative AI for preparation outside of class, but in class and during exams, use what’s in your brain.


Harvard Business School’s Mitchell Weiss: “In class, students are still meant to use only what was in their heads. Of course, there are a growing number of sessions where faculty are inviting the use of tools in the classroom as part of exercises and discussions about generative AI”

ChatGPT officially rolled out to the public in November 2022. It delighted users with original rap lyrics, relationship advice, and jokes about OpenAI cofounder Elon Musk. It awed with its ability to write, debug, and explain computer code. Other generative AI tools like Bard, DALL-E, and Stable Diffusion soon followed.

On the one hand, the capabilities of generative AI open immense opportunities for innovation, creativity, and efficiency – not just in classrooms, but in workplaces, research labs, and all manner of tasks of daily life.

On the other hand, the technology raises real and difficult questions about cheating and about how to evaluate what students are actually learning versus what they are regurgitating from a large language model.

Business schools, in turn, have rolled out AI policies that attempt to strike a balance between harnessing AI’s transformative capabilities and protecting academic integrity.

Shortly after ChatGPT hit the scene, The Wharton School left guidelines for AI use up to individual professors. Operations management professor Christian Terwiesch, who famously fed his final MBA exam into ChatGPT – an exam the chatbot passed – decided he would encourage generative AI for preparation and other work, but forbid it on exams and graded homework. Operations management as a discipline is still more of a bread-and-butter skill certification, he told P&Q at the time.

Another Wharton professor, Ethan Mollick, an associate professor of management who teaches innovation and entrepreneurship, published his own AI policy that essentially requires students to use the tools in his class.

At Columbia Business School, faculty generally will indicate in their course syllabi and assignment instructions whether or not generative AI is permitted. When it is permitted, students are required to cite which platforms they used and how they used them. Using generative AI tools without this permission is considered a violation of the CBS Honor Code.

Stanford Graduate School of Business, meanwhile, follows the general guidelines set up by the university’s Office of Community Standards: Instructors may use AI tools as they see fit, but they must set clear expectations for their use in their syllabi and reinforce that policy in class.


Harvard Business School’s policy can be summed up simply: HBS invites students to use generative AI before class to prep for case discussions and after class to reflect on what they learned.

“But, in class, students are still meant to use only what was in their heads,” Weiss tells P&Q. HBS also has policies on citing and disclosing the use of generative AI, mostly in written work.

“Of course, there are a growing number of sessions where faculty are inviting the use of tools in the classroom as part of exercises and discussions about generative AI,” he says.

The policy is based on two principles: First, HBS wants its MBA students to be inventive, responsible, and flexible leaders in an era of AI. Second, AI should enrich and support the MBA experience and human judgment – not replace it. As the technology – and people’s understanding of it – evolves, the HBS policy will evolve with it.

“We believed, and continue to believe, these tools will have a massive impact in businesses and leadership broadly, and we wanted our students to begin to uncover for themselves how they would use generative AI tools and make a positive difference in the world using them well,” Weiss tells Poets&Quants.


Weiss also serves as the practice chair of the Required Curriculum (RC) Renewal Project, a two-year review of HBS’s foundational first-year courses. (A similar effort is ongoing for HBS’s second-year elective curriculum.) Part of that process means weighing how to stay current with the needs of future business leaders without chasing fads.

Like PCs 40 years ago, generative AI is certainly no fad but a tool that will transform how we work. So, even as HBS strives for balance in the classroom, it is also requiring the use of GenAI in some instances. In fact, the school provides every first-year student with a ChatGPT Plus/Advanced Data Analysis account.

“We felt it signaled how important it was for students to become proficient with these tools overall. More specifically, we also felt it was important for students to have some of the advanced data capacities as they worked to become proficient in the area of data science management,” Weiss says. “We also wanted there to be equitable exposure to these tools across the class of 900-plus.”

The school has also conducted training sessions for students on using the tools to advance learning, push thinking, and test their knowledge.

“We know that they could use the tools to short-circuit their learning, but we also know that they came to HBS to learn, so we showed them ways they could use the tools to increase their learning instead.”

