FEATURED
AI Comes to Campus: Westmont's Digital Frontier
by Caylie Cox ’21 and Nancy Phinney ’74

Generative artificial intelligence (GenAI) is revolutionizing everything from search engines to healthcare to email. Within two months of its introduction, ChatGPT attracted 100 million users. This rapid growth has raised many questions and concerns about GenAI’s impact on society. Are the companies creating it carefully considering the dangers this technology poses to our shared humanity, creativity and dignity? Will it promote human flourishing?
Chief Information Officer Reed Sheard, professors and administrators at Westmont understand the importance of using any technology with wisdom and discernment. As departments implement GenAI in different ways, the campus community engages in thoughtful conversations to develop a uniquely Christian approach to it.
Westmont’s Center for Technology, Creativity and Moral Imagination sponsors some of the discussions underway. Supported by a grant from the Fletcher Jones Foundation, TCMI seeks to help students, as well as leaders in higher education and industry, learn about and engage technology with a moral and ethical perspective.
This initiative raises important questions. How can we use GenAI in a profitable and sustainable way that also contributes to a flourishing society? “Christians need to know how to deploy it in a co-creative way that aligns with the work of God in the world,” Sheard says. “We understand that driving innovation without a context of greater responsibility to humanity and ethical action will likely result in a predictable outcome: Some people will become incredibly rich — and bad actors will grow more capable.”
“Christians need to know how to deploy it in a co-creative way that aligns with the work of God in the world,” Sheard says.
To advise TCMI, Sheard has gathered a board of 25 people with varying perspectives on the GenAI industry. “Part of figuring out how to respond is surrounding ourselves with smart people who bring different backgrounds, experiences and expertise,” he says. “It’s a true liberal arts approach to seeking advice.” The board includes data people, AI scientists, lawyers specializing in privacy and intellectual property, clergy, a philosopher, a mathematician, a business consultant, and even a pundit with an AI podcast for Microsoft.
Professors also seek how best to respond to GenAI and teach students about it. Faculty reading groups and roundtable discussions help professors learn more about GenAI, and those more experienced with it have joined panels to provide their perspectives and recommendations to students and others. The weekly Faculty Forum has dedicated several sessions to presentations about the technology.
Westmont has adopted Gradescope, an online grading tool that evaluates a variety of assignments, including math problems, computer code and even handwritten responses. Professors experimenting with it limit their use to lower-level work such as true/false and multiple-choice tests. They continue evaluating papers and written exams themselves.
“A lot of uncertainty exists about GenAI,” says Sara Skripsky, associate professor of English, who directs the Writing Center and coordinates Writing across the Curriculum. “Some disciplines, such as STEM, show more interest in the technology, with the humanities and social sciences being more reluctant.”
“It’s an evolving situation with an improvisational approach so far,” says history professor Alastair Su. “Trying to orchestrate a systemic approach presents challenges because every discipline is different. No one size fits all.”
GenAI creates major concerns about the integrity of submitted assignments. How can professors determine whether students have written the work they submit? As faculty explore tools for monitoring work, they seek to understand how best to deploy them. Skripsky says Google Docs tracks step-by-step development, with an extension (Draftback) creating a 15-second video reproducing the entire process. While some professors use this tool, Skripsky follows her own practice: meeting with students regularly one-on-one, reviewing outlines, and checking in several times on the work in progress.
“Westmont faculty have a high regard for relationships with students and believe in a personal, high-touch approach,” Skripsky says. “Students who use GenAI miss the mentoring and relationships that occur through engaging with professors.”
An early adopter familiar with GenAI’s capabilities, Su tries to stay four or five steps ahead of students. “It’s transformative tech, and I want to equip students to use it critically,” he says. “I’ve designed some assignments that require its use. History teaches us how to understand different sources of information, how they’re constructed, and how to think critically about them, so it’s part of what we do.
“But I want to avoid a situation where education is reduced to a student pushing a button and a professor pushing a button with no learning taking place. That’s not what we want at Westmont.”
Theresa Covich, instructional services librarian, says more professors now require students to complete writing assignments in class. “Work without assistance from GenAI needs to be done in class,” she says.
“Students face a lot of temptations around efficiency and fatigue with deadlines, creating a prime condition for cheating,” Skripsky says.
Westmont has updated its policy on academic integrity to include the use of GenAI: “A student may not submit AI-generated content or ideas (e.g., from ChatGPT or Grammarly) as one’s own original work.” Professors clearly state this expectation in class.
As a Christian, Su finds two aspects of GenAI troubling. “A lot of these programs depend on intellectual theft and pirated books,” he says. “It’s unjust — AI is illegal tech in some ways. I’m also concerned about the environmental cost, with 98 percent of all electrical use dedicated to server farms. It’s not logical to bleed the environment so we can embrace this technology. As stewards, we need to use it ethically and sustainably.”
Covich has researched statements about GenAI by other institutions and organizations. Biola’s Biblical Principles for Understanding and Using AI lists three guidelines: seek knowledge and wisdom from God and his word first and then from wise Christians; people bear God’s image and are creative and relational beings; and God commands us to steward the resources he has given us. “These apply in many other areas and remind us of what we hold important,” Covich says. “We need to be willing and unafraid to engage GenAI — and willing to be countercultural and trust in the word of God and our biblical training.”
“We need to be willing and unafraid to engage GenAI — and willing to be countercultural and trust in the word of God and our biblical training.”
The University of Notre Dame revised its honor code: “When students use generative AI (such as ChatGPT) to replace the rigorous demands of and personal engagement with their coursework, it runs counter to the educational mission of the University and undermines the heart of education itself.” The Modern Language Association maintains that written communication happens only between human writers and readers.
“We misrepresent ourselves by using GenAI and not our human voices,” Skripsky says. “It breaks our relationships and prevents transparency. I’m concerned about the relational costs of students opting out of human tutoring. They miss out on accurate, contextual information specific to Westmont courses and instructors as well as the relational aspects of tutoring, such as encouragement from their peers.”
The number of students seeking help from the Writing Center has dropped 40 percent this year from a five-year average. The demand for peer tutoring through the library has also declined.
Skripsky cites two likely factors: the use of GenAI and faculty assigning less written work outside of class or converting writing to oral exams. While the technology can be superficially correct, it’s less reliable with details. “Students are in the worst possible place to gauge what is true and what is not when GenAI gets it wrong,” she says.
“We want our graduates to be unpredictable, but GenAI works on predictability,” Covich says. “When students converse with it, their experience greatly differs from speaking to the person sitting next to them.”
AI companies now push the technology on people doing searches and sending email, so it can be difficult to avoid. “Ads prey on students’ vulnerability by emphasizing efficiency and competency, but it’s transactional, not relational,” Skripsky says. “Our professors and student tutors offer much better assistance. GenAI lacks the heart and soul of a Westmont education — and it can’t write you a letter of recommendation.”
Covich expresses concerns about AI-generated summaries, which pop up online. “They may be hard to resist, but they miss so much and fail to provide the deeper reading professors expect from students,” she says. “I worry that institutions and faculty may be too accommodating of GenAI, especially for students at the beginning of their college careers.”
“We should never accommodate shortcuts and misuse of AI,” Skripsky says. “We can embrace technology but hold to ethical boundaries. That’s best for students and aligns with the college’s mission. GenAI will keep changing, and the college needs to continue to learn and adapt. This also applies to tech not yet invented. These issues will persist.”
“We can embrace technology but hold to ethical boundaries. That’s best for students and aligns with the college’s mission. GenAI will keep changing, and the college needs to continue to learn and adapt. This also applies to tech not yet invented. These issues will persist.”
Su notes that Westmont lacks reliable data about the use of GenAI on campus, and he finds students generally wary of it. He’s most concerned about its effect on general education as GenAI has demonstrated the ability to pass such courses. Nevertheless, Su uses it for many things, including his own research and teaching. “It’s a competent assistant,” he says.
Recent research supports Su’s sense about students approaching GenAI with reservations. “Voices of Gen Z | How American Youth View and Use Artificial Intelligence,” a report from the Walton Family Foundation, GSV Ventures and Gallup, finds that “young people are skeptical about the effects generative artificial intelligence will have on their lives while actively considering the risks these products pose. Overall, they are more likely to feel anxious than excited about AI and are less likely to trust work products that were created using these tools.” While they recognize potential benefits and know GenAI will likely be required for their careers, just 10 percent of those surveyed prefer AI customer service representatives over a person.
Students involved with the Center for Applied Technology (CAT) use GenAI in some of their CATLab projects. “This student program offers real-world experience developing technology for Westmont,” Sheard says. “It’s a unique initiative in Christian liberal arts education.” Students are developing a GenAI chatbot to assist their peers with administrative tasks. “I foresee the chatbot benefiting students by providing access to information more quickly and easily than before,” says Connor Rogstad ’26, the principal student developer. “The chatbot may answer questions about scheduling, college policies or campus life.”
CATLab has completed and implemented another AI project predicting electrical usage on campus. The new model estimates energy use and costs based on historical data. “During peak consumption times, we can invite the campus to lower its footprint on energy,” Sheard says. “It’s good for the environment and for managing our expenditures.”
Aware that many companies now use AI to screen applicants’ resumes before any human sees them, Westmont has licensed an AI tool that provides feedback on resumes. “We hope this assessment will lead to successful job searches and lessen the likelihood that a student’s resume gets on the wrong side of an algorithm,” says Lori Ann Banez, who directs the Career Resource Center. Read more about the CRC on page 39.
Sheard also serves as vice president for college advancement, and his staff explore AI-automated communication tools that may free up their time. “Could AI help with some perfunctory things so employees can focus on essential, higher-level functions?” Sheard asks. “While some businesses and universities attempt to replace human workers with GenAI, Westmont is committed to using it to support staff, not the other way around.
“Westmont students will graduate in a world hugely influenced by the reality of artificial intelligence. Does GenAI augment or potentially even replace human contribution? How can we be truly creative in a world with AI?
“Christian institutions like Westmont can make valuable contributions to the worldwide conversation about the ethical use of GenAI and whether it promotes or detracts from human flourishing. We must bring wisdom to this task, not just knowledge, as we become competent with AI and seek to serve humanity and further the work of God. We think deeply about ‘whatever is true, whatever is noble, whatever is right, whatever is pure, whatever is lovely, whatever is admirable’ (Philippians 4:8), and we have something to say.”

This is a story from the Spring 2025 Westmont Magazine