
Becoming ‘AI Ready’


With its new “AI Readiness” initiative, Lehigh embraces an AI future—where students will both learn about and with AI, and do it ethically.

Photography by Christa Neu, Illustration by Eva Vázquez

No recent issue has roiled education more than artificial intelligence (AI). Especially since the rise of generative AI three years ago, educators have struggled to determine whether and how to allow AI into the classroom, and how it might be taught and used for learning. Even as academics have wrung their hands, however, the business world has had no such compunctions. Across the world, companies have embraced AI, integrating new tools into their workflows to improve innovation and efficiency.

Higher education has come to an inflection point. If universities continue to vacillate on allowing AI into the curriculum, they risk leaving students unprepared for the world they’ll soon enter. Recognizing that fact, Lehigh has been at the forefront of embracing AI as a force reshaping not only what students learn, but also how the university teaches. Beginning in early 2023, the university rolled out resources and tools to help faculty members integrate AI into their classrooms. In October, it announced a new “AI Readiness” initiative to prepare students for an AI-driven future.

The new initiative takes a campus-wide, multidisciplinary approach to AI, preparing every student, regardless of major, to use its tools. From faculty experimenting with AI tutors and other classroom supports to trainings that equip students with the knowledge and skills needed for the working world, Lehigh is embedding AI into every facet of academic life, positioning itself as a leader in AI education and research. According to Nathan Urban, provost and senior vice president for academic affairs, the goal is threefold: to teach about AI, to teach with AI and to teach ethical use of AI.

They’ll be expected to use it and will be competing against people who know how to use it—and so we want to teach them how to use AI tools effectively, while also teaching them to use them ethically.

Provost Nathan Urban

“If we care deeply about the education of our students, and there is a tool that can help facilitate that, we need to figure out how best to use it,” he says.

While today’s specific tools may be obsolete soon, he notes, the key is to teach students to be curious about how such technologies work and how they might shape the future of industries students will enter after Lehigh.

“They’ll be expected to use it and will be competing against people who know how to use it—and so we want to teach them how to use AI tools effectively, while also teaching them to use them ethically.”


Provost Nathan Urban

That means knowing when and how to use AI the right way so that it supports learning rather than replacing it, he says. He borrows an analogy from journalist Kevin Roose that distinguishes “weightlifting,” in which AI is used as a tool to strengthen students’ minds, from “forklifting,” in which it is used to take over tasks entirely.

“If you are trying to accomplish a job of moving heavy things in a warehouse, a forklift is a very useful tool,” Urban says. “But if you take that forklift to the gym to lift weights, you’ve defeated the purpose of working out. So the question of whether it should be used or not depends on the purpose of the activity.”

In that regard, Lehigh has followed a decentralized approach, providing tools for faculty and students to integrate AI into the curriculum, but ultimately leaving it up to individuals how to adopt them. The goal is neither to ban nor to celebrate AI, but to experiment with it, assessing where it benefits and where it harms student learning.

“We’re not assuming that any particular use case is a good one,” Urban says. “We want to assess whether it is aiding learning or impairing learning, and be able to adapt our usage moving forward.”

From Classrooms to Careers

Along with the administration’s push to seed AI around campus, Lehigh’s Center for Career and Professional Development (CCPD) has surveyed dozens of employers—many of them Fortune 100 or 500 companies—to determine the skills they are looking for in graduates.

“AI has transformed how work gets done in almost every industry, and what we’re hearing from our employer partners is that AI isn’t necessarily replacing jobs, but it’s reshaping positions,” says Lori Kennedy, senior director of CCPD, who led the survey.

Among the many examples they found, engineers are using AI to prototype ideas; legal and compliance teams are using it to review documents; and scientific writers are using it to generate literature reviews and first drafts.

“That means AI has become a baseline expectation, not a bonus or ‘nice to have’ skill,” Kennedy says. One employer said they “expect new hires to be using AI tools by their third day on the job.”

AI has transformed how work gets done in almost every industry, and what we’re hearing from our employer partners is that AI isn’t necessarily replacing jobs, but it’s reshaping positions.

Lori Kennedy, senior director of CCPD

For graduates, that means at a minimum arriving with fluency in using large language models (LLMs) such as ChatGPT, Copilot, Gemini and Claude, in engineering effective prompts and in critically evaluating AI output. Depending on the industry, however, certain employers might also look for proficiency in specialized AI tools, such as TensorFlow or PyTorch for developing deep learning models, or Jasper or Parse.ly for creating communications and marketing content.

That doesn’t necessarily mean knowing how to code or understanding the back-end of AI software, so much as being able to effectively use AI interfaces.

“It was refreshing to hear that AI is for everyone,” Kennedy says. “You don’t have to be a computer scientist—they are just looking for people who want to use AI tools with curiosity.”


Lori Kennedy, senior director of CCPD

Just as importantly, Kennedy says, employers stressed the importance of ethical awareness.

“They want graduates who understand bias, privacy concerns and data sensitivities, and who can articulate when to use AI and when not to,” she says. “They are still emphasizing critical thinking and human judgment.”

For Lehigh students hitting the job market, that means being able to explain in interviews how they’ve used AI tools in coursework, projects and internships in ways that demonstrate that fluency and ethical judgment.

Kennedy recommends students start experimenting with AI early in their college careers and stay current as tools evolve; at the same time, she adds, students must always engage with AI to develop their expertise in their field, not to substitute for it or take shortcuts.

“As multiple employers emphasized, you must understand your field deeply before AI can add value,” she says.

To aid in that development, CCPD has unveiled a new suite of LinkedIn Learning pathways, curating the thousands of videos and online classes on LinkedIn’s platform into courses that take learners from beginner through advanced levels of AI proficiency. Upon completion, she says, LinkedIn will post a certification directly on a student’s profile that can serve as independent verification of their knowledge and skills.

Those aren’t the only tools provided to students. Lehigh also offers a central hub called DataCamp where students can take trainings in specific AI tools and earn certifications. In addition, Lehigh recently partnered with the Google AI for Education Accelerator, which offers access to Google’s suite of AI tools beyond Gemini, as well as free access to Coursera courses on generative AI tools, prompt engineering and data science that provide certificates students can include on their resumes.


Bilal Khan, professor of biostatistics and health data science, listens to a student during one of his classes.

Faculty at the Front Lines

It’s not only students that Lehigh is preparing for an AI future, however, but also faculty.

According to Dominic Packer, vice provost for educational innovation and assessment, more faculty members than ever are experimenting with AI in their classes. That includes a group of 35 faculty currently developing AI tutors—chatbots trained on course materials, including the syllabus, slides and readings—to provide students with extra help outside of class.

“They’re essentially available for the first line of questions for students, if they want to work through a problem set not in the textbook or help them study for tests or exams,” Packer says. “Students are reportedly finding them quite helpful in specific instances, though they are not without their glitches.”

Other faculty are using AI to create simulations—for example, in a medical context, a chatbot that presents as a patient with a set of symptoms to help students learn to diagnose ailments. Some are encouraging students to use AI in homework assignments to help them digest higher-level readings.

“You could assign a complex paper to a sophomore you’d usually assign to a senior,” Packer says. “It’s not that we want it to replace students being able to read these papers eventually, but maybe it gives people earlier access to things.”

Lehigh is providing faculty with technical support through Library and Technology Services, as well as a broader framework of pedagogical principles to help them think through when and how to best use AI.


Dominic Packer, vice provost for educational innovation and assessment

“These are not rules, saying ‘you must do this’; it’s more of high-level guideposts, such as ‘We will use AI when it increases the effectiveness of student learning, which means we’ll rigorously assess how effective these tools are,’” Packer says.

In the spirit of experimentation, Packer plans to ramp up assessment next semester in a more formal way, surveying faculty and students to find out where they find AI useful and in what areas it misses the mark.

“I’ve been saying this is semester zero,” Packer says. “We’re trying to provide insight to faculty, and then feed that insight back into the university so we get a bigger picture view of how it’s being used and where it’s effective.”

As part of that experimentation, some faculty are taking it upon themselves to create course offerings that focus even more deeply on AI-based learning. Bilal Khan, professor of biostatistics and health data science, uses machine learning and other AI tools in his own research on “just in time” interventions that use continuous data collection and forecasting models to give behavioral nudges to patients. Intrigued by the growing prevalence of generative AI, he spent the summer writing a book and developing a new course called “The Art of AI Conversation” to help students learn how to engineer prompts to more effectively interact with LLMs.

I’ve been saying this is semester zero. We’re trying to provide insight to faculty, and then feed that insight back into the university so we get a bigger picture view of how it’s being used and where it’s effective.

Dominic Packer, vice provost for educational innovation and assessment

“It’s very clear we have to take generative AI head-on,” he says. “There’s no pretending this is going to go away.”

Yet too many students arrive from high school assuming that using AI in class is prohibited. Or, if they are using it, they approach it haphazardly, without any real training.

“Most people I come across are essentially using LLMs in a ‘seat of the pants’ kind of way without a lot of method to it,” Khan says. “So I decided to systematize it and give students a toolbox.”

One of the first exercises in class is using an LLM to develop hooks for TikTok videos that appeal to a Gen Z audience. Because it is a task whose success or failure most young people can easily judge, students quickly see the strengths and limitations of the tool.

“I’m then able to say, ‘If I gave you a prompt that had to do with biochemistry and you got an answer for an assignment, would you feel as confident sending it to your biochemistry professor?’” Khan says.

From there, Khan teaches students various ways to improve the accuracy of LLM outputs: breaking a complex problem down into a chain of logic, assigning the LLM a role that draws on a particular field of expertise, or even feeding answers back into the AI to check its work. The course ends with students using AI tools to complete research on a complex topic.

By teaching students to use AI better, Khan hopes to help them see it less as a mysterious “oracle” spitting out answers from beyond, and more as a mechanism they can steer to generate trustworthy answers that help them solve problems.

Paola Cereghetti, teaching associate professor of physics, is similarly taking an experimental, inquiry-based approach with a “Big Questions” course on using AI for medical imaging. By analyzing various approaches, she and her students examined areas in which AI might improve diagnoses of medical conditions, as well as where it falls short.


Paola Cereghetti, teaching associate professor of physics, during one of her courses.

The problem with AI, she points out to students, is that it struggles when it confronts something it has never seen before. That difficulty identifying novel patterns makes it, in the end, no substitute for human judgment.

“It seems we may be far away from really having AI replace our radiologist,” she says.

At the same time she and students are exploring the use of AI in medicine, Cereghetti has also experimented with using AI tools in class, giving students assignments to use generative AI to brainstorm presentation topics and analyze scholarly articles. They also interact with an AI tutor she designed.

In all of these practices, Cereghetti reviews transcripts of students’ interactions with AI and critiques the results, using them as a jumping-off point to explore where AI facilitated learning and where it fell short of the mark, so students can develop better judgment and prompting strategies that might help them in other contexts.

I don’t believe in fighting against it. If we want students to think critically, we have to show them how to use AI well—not just tell them not to use it.

Paola Cereghetti, teaching associate professor of physics

Like Khan, Cereghetti believes students will be better served not by prohibiting AI, but by using it with the permission and guidance of faculty, who themselves may be experimenting with how best to use it.

“I don’t believe in fighting against it,” she says. “If we want students to think critically, we have to show them how to use AI well—not just tell them not to use it.”

Beyond the Walls of Campus

Cereghetti’s experimental approach underscores a broader truth at Lehigh: AI isn’t being inserted into the curriculum as a replacement for traditional teaching methods. Rather, it’s being integrated through the kind of inquiry that represents the best practices of academic learning.

In addition to the specific areas in which students will be exposed to AI tools in class or within CCPD, Urban has also spurred a campus-wide conversation around the future of AI for learning.

As part of Lehigh’s 2025-26 Compelling Perspectives series, the university has invited media innovator and Huffington Post founder Arianna Huffington and technology pioneer and Apple co-founder Steve Wozniak to lead conversations on the broader implications of AI for the future, as well as Senator Dave McCormick to discuss the government’s role in supporting artificial intelligence and innovation.


Arianna Huffington, founder and CEO of Thrive Global and founder of the Huffington Post, discussed the human side of AI during the first event in Lehigh’s 2025-26 Compelling Perspectives series in November.

In addition, as part of the Donald M. Gruhn ’49 Distinguished Finance Speakers Series, the university invited Carter Lyons ’97, co-CEO of Two Sigma Investments, to campus in October to discuss AI in the workforce. The events reinforce the idea that AI is no longer confined to computer labs and classrooms, but is now part of Lehigh’s intellectual fiber.

“Many of our larger public lectures this year will be at least in part about AI,” says Urban, “whether that’s from a technical perspective, or an ethical perspective or from an application perspective. That’s very intentional. We’re trying to create an environment in which students, faculty and staff have many opportunities to learn about and hear about different perspectives on these tools and their use.”

In October, Lehigh’s Center for Advancing Community Electrification Solutions (ACES) hosted “Innovating Energy and Water Solutions for Tomorrow’s AI Data Centers,” a day-long symposium that brought together leaders from education, industry and government for discussions centered on data center infrastructure, including water and electricity use, and the growing demands of AI.

Urban is also looking beyond campus to position Lehigh as a hub for AI education in the Lehigh Valley, one that could support local companies, school districts and other organizations in their own use of AI.

“I want us to be a resource to institutions and individuals who are right now wondering if they are even going to be able to keep up,” Urban says. “It’s our responsibility to share our expertise broadly, so others are also able to lead—or at least not fall behind.”

What do I want our students to be? Among the best in the country when it comes to using AI in the context of their discipline, and in solving the problems of the greatest interest.

Provost Nathan Urban

Support from the H.S. Lee Family Foundation, Inc. will enable these goals to take root. Through this funding, all graduate students and faculty will gain access to advanced AI tools, allowing the integration of artificial intelligence into courses and research at an accelerated pace. With this approach, graduate students and faculty can remain on the forefront of AI innovations, making them valuable partners for competitive organizations and industry. This gift will also support an AI prize that incentivizes graduate students to design breakthrough AI applications and cultivate the next generation of entrepreneurial leaders.

Urban says he wants to ensure that no one outcompetes Lehigh students and graduates when it comes to learning and using AI tools.

“What do I want our students to be?” he asks. “Among the best in the country when it comes to using AI in the context of their discipline, and in solving the problems of the greatest interest.”

Story by Michael Blanding
