The Social Impact of Artificial Intelligence
Long-time CSU Monterey Bay lecturer George Station argues that when it comes to Artificial Intelligence (AI), “nobody could afford not to have this conversation.”
Station is focused on the “social impact of the data, unknown and known implications of the technology.” He observes that AI has been “basically released into the wild” without regulation and with only minimal internal controls built into the technology we depend on daily.
“It’s impacting the entire campus community—students, staff, faculty, and administrators—because it’s embedded everywhere and hard to turn off. That’s because it’s in Microsoft, it’s in Google, and it’s within our learning management systems. So basically, even the tools we want students to use,” said Station.
“AI is basically a large language model that is predictive. It puts words together and tells you what the words are,” explains Station. “When it creates images or videos, it’s making stuff up. And that’s the problem. Trusting AI and letting it go into our systems is a problem.”
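For readers curious what “predictive” means in practice, here is a minimal, purely illustrative sketch, not Station’s example and far cruder than any real large language model: it counts which word tends to follow which in a scrap of text, then strings words together by always picking the most likely next word. The names (`training_text`, `predict_next`) and the tiny sample text are invented for illustration.

```python
# Toy illustration of next-word prediction: a hypothetical, greatly simplified
# stand-in for the basic idea behind large language models. It counts which
# word tends to follow which, then always picks the most likely continuation.
from collections import Counter, defaultdict

training_text = (
    "the faculty teach the students and the students trust the faculty"
).split()

# Count how often each word follows each other word (a bigram table).
next_word_counts = defaultdict(Counter)
for current, nxt in zip(training_text, training_text[1:]):
    next_word_counts[current][nxt] += 1

def predict_next(word):
    """Return the most frequent follower of `word`, or None if unseen."""
    followers = next_word_counts.get(word)
    return followers.most_common(1)[0][0] if followers else None

# Generate a few words. The output sounds plausible, but it is assembled from
# statistics alone: there is no memory of sources and nothing is cited.
word = "the"
sentence = [word]
for _ in range(5):
    word = predict_next(word)
    if word is None:
        break
    sentence.append(word)

print(" ".join(sentence))
```

Real systems do this at vastly larger scale with neural networks, but the core move is the same: predict the next word, with no memory of sources and no citations, which is exactly why Station cautions against trusting the output.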
The technology has been troubled from the beginning and remains problematic.
In 2016, Microsoft released a Twitter chatbot called “Tay.” It quickly reflected the worst and most routine parts of Twitter (now X). Within 16 hours, it had morphed into a racist, sexist bot tweeting enthusiastically about Hitler, and Microsoft shut it down.
Amazon workers have complained that AI bots fired them without any human interaction. Surveillance apps track how often workers leave their desks. AI tools for resume screening have been notorious for discriminating against candidates from marginalized groups and tend to show a particular bias against people with darker skin.
Station explained that AI image generators have created Black and APIDA Nazis because the technology has no moral compass and no way to filter what it produces. A picture of the “founding fathers” might be composed of Hamilton graphics. AI doesn’t have a memory to recall from, nor does it cite its sources. It co-mingles disparate pieces of our collective knowledge to generate information and respond to inquiries.
Station’s interest in teaching about tech and science dates back to his days teaching physics at the US Naval Academy in Annapolis, Maryland. Happy to land in California, Station has been a lecturer at CSU Monterey Bay for almost 25 years. He has taught a broad range of courses over his career, with a current focus on lower-division social science and upper-division teacher-ed courses.
Station first became an active CFA member through what was then called the Council on Affirmative Action, which later became the Council for Racial and Social Justice. Station was often one of the few Black faculty members wherever he worked, so CFA’s efforts to bring Black faculty together have meant a great deal to him; he now serves as Co-chair of the Black Caucus. For many of us, unions are where we find our people. Through CFA, Station has found a place for solidarity and friendship, and a way to make the university accessible and successful for the variety of communities that make up the CSU.
CFA Associate Vice President of Lecturers, North, and Monterey Bay Lecturer Meghan O’Donnell has been friends with Station for 14 years and has seen firsthand how Station’s fierce compassion inspires many people, herself included.
“He lives and breathes compassion and empathy, and he has this very clear understanding that both the university and our union are nothing without a strong culture of care that acknowledges the whole person,” said O’Donnell. “You can see it in everything from his concerns over AI to his career-long commitment to anti-racism and social justice, his leadership in disability rights, and his steadfastness in addressing the enduring impact of COVID. George just brings a soulfulness and a human-centered approach to all aspects of our work.”
“Knowing George and having him be part of my CFA family has impacted me in profound ways and I continue to learn and gain so much from his wisdom and friendship,” she reflected.
At Monterey Bay, Station also supports faculty in their teaching and professional development by serving as a faculty associate with the Center for Teaching, Learning, and Assessment. He always thinks about pedagogy, technological innovations, and the intersection of social change and social justice.
From the perspective of the instructor, Station is concerned with how AI is “affecting the trust relationship and affecting pedagogy.”
“Where the technology is either misused or misunderstood in its use breaks down that relationship of trust between faculty and students and faculty and staff and administrators, for that matter,” said Station.
Ultimately, teaching is about people and relationships. Station wants to remind folks that while technology is essential, we can’t overlook what students bring with them into the classroom. For Station, the effects of the pandemic are persistent and visible.
“People have lost family members. People are taking care of grandparents,” said Station. “There’s stuff happening that wasn’t necessarily going on the same way in 2019 that the CSU has yet to recognize.” It may be 2024, but many people are still bearing the wounds of the pandemic. For some, their lives collapsed completely, and there won’t be a recovery.
Station notes that our interactions with technology changed dramatically during the pandemic. Increasingly intrusive tech was deployed to proctor student testing, shifting the pedagogical relationship toward monitoring and policing student compliance with the technology.
“It’s actually laying bare any trust issues that may have already existed with police and the surveillance state overall,” said Station.
When the Internet was first deployed, it was described triumphantly as a public forum for exchanging ideas, the “information superhighway.” Now the web economy is consolidated in the hands of billionaires. Platforms like Salesforce, Adobe, and Facebook dominate, extract most of the profit, and surveil our behavior. Station does not want AI to follow the same path.
AI is here and is inevitable, but its progress is unevenly distributed. Even access to AI is gentrified, mirroring the inequities in our society.
“The free tools that are out there are far less reliable and helpful than the tools people can pay for, which means that CSU students are already behind. The best and newest advancements will go to private enterprises or better-funded universities.”
There is already talk of higher education administrators using AI to cover teaching sections or assignments, thereby forcing fundamental changes in our curriculum.
The contentious SAG-AFTRA negotiations signaled to the broader labor movement that AI needs to be treated as a subject of bargaining. Rather than merely urging the employer to give workers authority over AI, we can bargain protections and rights around the emerging technology into our contract.