This is the third in a series of blogs by the CSU Student Success Network (Network) focusing on the California State University’s (CSU’s) systemwide approach to generative AI (GenAI). The first blog offered an introduction to CSU’s strategic approach to GenAI. The second described students’ perspectives about GenAI, based on survey findings across CSU campuses.
In a recent blog about GenAI, Nagthall, associate director of the greater LA region for the Success Center for the California Community Colleges, writes that “AI is already embedded in educational practice. The question is no longer whether institutions will respond, but how intentionally and equitably they will do so.” Regarding GenAI adoption and use, Nagthall claims that educators tend to fall within one of three orientations.
Nagthall suggests that representatives from each of these groups should be at the table as campus departments seek to develop policies for GenAI.
This blog on faculty perspectives about GenAI in the CSU draws from interviews with two CSU professors who are members of the Network’s Advisory Board and who are experienced in using GenAI in the classroom: Dr. Emily Acosta Lewis, professor of Communication and Media Studies at Sonoma State University, and Dr. Svetlana V. Tyutina, Spanish professor and graduate program director at CSU Northridge. Acosta Lewis and Tyutina could be called AI pragmatists in that they are clear-eyed about the challenges and potential of GenAI, and they are developing practices that inform and empower students to be in charge of their AI use.
Both Tyutina and Acosta Lewis emphasized that governance policies and teaching practices regarding student use of GenAI need to be developed based on the skills and subject matter being taught. “There is no one-size-fits-all,” Tyutina said. She teaches Spanish translation and Spanish literature, and she uses different approaches for each.
Regarding translation, Tyutina said that Google Translate and similar tools have been improving significantly over the past decade. As a result, the field of translation was already changing before the CSU adopted its systemwide approach to GenAI. According to Tyutina:
“What I see now is a shift toward less translation per se and more focus on high-quality editing, where AI takes on the more mechanical part and provides a base translation, but more editors are needed to improve accuracy. In our translation courses, a full ban on AI would be a disservice to students. It wouldn’t prepare them for the reality of a market that is changing very quickly. Rather, there has to be strategic use of AI to balance students doing the work themselves versus simply claiming AI output as their own, which would be plagiarism and academic misconduct. At the same time, students need to know how to use these tools ethically and understand their limitations. We’re in a very tumultuous period.”
For her literature courses, Tyutina has a more restrictive approach to student use of GenAI. She said, “AI can attempt analysis [including drafting paragraphs or essays], but it still lacks depth. It’s useful for brainstorming, but even there it has its limitations.” She allows students to use GenAI for some assignments, and not for others:
“In my courses, it’s a constant topic of discussion. Some assignments are more prone to unsanctioned AI use, so I include gentle reminders about what is and isn’t allowed. We’re trying to move students into this new era of AI, not just ban it as cheating. Whether we like it or not, it’s here to stay, and we need to teach students how to use it ethically, such as acknowledging when it’s being used. There’s a stigma around it, so there’s a lot to unpack.”
Similarly, Acosta Lewis emphasized that there is no one-size-fits-all policy for addressing GenAI in the classroom and that professors who teach writing, for example, will approach GenAI differently than those who do not. For her communications classes, she said she doesn’t “police” student use of GenAI. Rather, she integrates learning goals and activities about GenAI’s strengths, limitations, biases, and inaccuracies into assignments, discussions, and assessments, so that students gain hands-on experience using it as a support tool.
“I do hope that in a few years this is scaffolded into our curriculum so that all students graduate with a base level of AI skills, which they’re going to need for their career,” Acosta Lewis said. “My biggest concern for students is that they’ll use it as a replacement for thinking. We need to show them how to use AI to advance their thinking.”
In instructing students about GenAI use, Acosta Lewis and Tyutina emphasized the importance of teaching students to take an active, critical-thinking role when working with AI tools. Acosta Lewis said that GenAI tools are likely to improve over the next five years, but meanwhile they’re limited in specific ways, and students need sufficient practice with them to understand how they’re limited:
“ChatGPT is really good at formulaic writing. And it’s good for brainstorming. I think it’s really on us to change up our assignments and assessments to include AI use. Instead of asking students for a summary essay, we can ask them first to collect primary data, analyze it, and turn that in. And rather than writing a traditional paper, maybe have them do a video, or a podcast, or an infographic.”
The idea is to work with students throughout the process, guiding their use of prompts for GenAI tools and assessing them in their analysis of the results, so that they’re sharpening their critical thinking skills.
Likewise, Tyutina incorporates GenAI into her syllabi, learning goals, assignments, and discussions. For the brainstorming phase of a project, for example, she encourages students to consider GenAI as a peer mentor who doesn’t get everything right and who needs ideas, feedback, and direction from the student. “We discuss how to use AI productively so that AI is not brainstorming for you, it’s brainstorming with you,” she said.
As with Acosta Lewis, Tyutina has assignments that require students not just to collect data and produce citations, but to evaluate the results. “We look at the AI results together and I point out the red flags,” she said. “We examine what AI produces and unpack the issues, one by one, together. It’s not punitive. It’s about understanding AI’s limitations on a deeper level.”
According to Acosta Lewis, CSU’s systemwide approach of providing the paid education (EDU) version of ChatGPT to all students helps to protect their privacy, compared with the free version: “If you’re not paying for it, you are the product. They’re using your data in all kinds of ways.” On the other hand, she said that “the EDU version locks down to my campus what I put in there. It’s not using my chats to train their model.”
She also said that the EDU version of ChatGPT is much more useful and powerful than the free version. As a result, its availability for all students represents a huge boost for equity. According to Acosta Lewis, “The students with substantial family resources are already using the paid version, learning how to use it, getting sophisticated with it, which is giving them a leg-up on students who aren’t using it or are using only the free version.”
Tyutina warned that for many students without generational academic knowledge, AI can be another burden they have to surmount. Some of these students are wary of being accused of cheating if they were to use GenAI. “There’s a big learning curve right now,” she said. “Students need more conversation and guidance to support ethical use.”
Acosta Lewis said that she herself was a first-generation college student. She suggested that widespread access to ChatGPT can serve to level the playing field in ways that faculty may not realize:
“First-generation and other students are using AI as a tool to decipher our academic jargon. They’re literally putting our syllabi and our assignment descriptions into ChatGPT and asking, ‘What does this mean? What does my professor actually want from this assignment?’ Or they’re asking, ‘What are the points that I need to pull out of this article in order to synthesize it and put it into a research paper?’ From an equity perspective, it’s leveling the playing field because students don’t necessarily want to ask questions like that in front of the class. They feel like they would be judged, or it’s a stupid question. But they can ask ChatGPT at 2:00 in the morning when they’re actually working on their assignment.”
Acosta Lewis said that faculty can encourage these kinds of uses by students. She also said that faculty can support students in analyzing and discussing ethics and bias issues in GenAI results. For example, she said that ChatGPT:
“Is mainly trained on white, privileged, male perspectives. If you go to ChatGPT and say, ‘Show me 10 business professors,’ they’re almost all going to be white and older, and maybe there’s one white woman included. One of the examples we used yesterday was ‘Give me a room of parolees,’ and they were all young African-American men. For the students, seeing the actual images makes the bias in the results very apparent, compared with text output. These outputs by ChatGPT are problematic, yes, but they can lead to important discussions about ethics and bias.”
Both Tyutina and Acosta Lewis provide GenAI workshops or trainings to support their colleagues. They said that integrating GenAI into the classroom involves revising syllabi, assignments, and assessments, which can be daunting. Acosta Lewis suggested that faculty make these changes incrementally:
“I was hosting an AI workshop yesterday with faculty who were working to redesign at least one assignment to incorporate AI in some way. One of the first things we suggested was to upload your syllabus, which includes your learning outcomes, into ChatGPT, and upload your assignment descriptions. Ask it to evaluate how well the assignments match up with your learning outcomes.”
“I do that with some of my own assignments, maybe something that I’ve used for years and needs to be updated. I would maybe ask, ‘How can I tailor this for the Trump era? How can I tailor this for first-gen college students? Or how can I make this more accessible for students who were in high school during COVID and might be lacking some interpersonal skills?’ So I use it as a tool for brainstorming. Whatever it suggests, whether it’s good or bad or in between, it helps me move forward, and then maybe I ask it for ideas on incorporating a five-minute presentation related to this assignment.”
She said that the brainstorming process can be refreshing and that the results can spur faculty to cultivate new ideas or develop new methods.
There is no effective one-size-fits-all policy for faculty to encourage student use or nonuse of GenAI in higher education. Governance policies and teaching practices need to be developed by discipline, based on the subject matter and skills being taught. If students are encouraged to use GenAI tools, they need support in developing critical thinking skills as they use them. Currently, one of the best ways to do this involves developing assignments and assessments in which students gain practice using AI as an assistant in brainstorming, so as to understand its limitations and biases. Similarly, faculty might consider using GenAI tools to brainstorm ways to revise their own assignments and assessments to include student AI use.
In terms of equity, CSU’s approach of providing the EDU version of ChatGPT to all students helps to level the playing field in terms of who has access to the best GenAI tools. The tools themselves are changing quickly, however, and over time the equity implications may depend on how effective faculty are in supporting all students in developing their critical thinking skills.
The Network is facilitated by the Education Insights Center (EdInsights) at Sacramento State University.