At the California Institute of the Arts, it all started with a videoconference between the registrar’s office and a nonprofit.
One of the nonprofit’s representatives had enabled an AI note-taking tool from Read AI. At the end of the meeting, it emailed a summary to all attendees, said Allan Chen, the institute’s chief technology officer. They could have a copy of the notes, if they wanted — they just needed to create their own account.
Next thing Chen knew, Read AI’s bot had popped up in about a dozen of his meetings over a one-week span. It was in one-on-one check-ins. Project meetings. “Everything.”
The spread “was very aggressive,” recalled Chen, who also serves as vice president for institute technology. And it “took us by surprise.”
The scenario underscores a growing challenge for colleges: Tech adoption and experimentation among students, faculty, and staff — especially as it pertains to AI — are outpacing institutions’ governance of these technologies and may even violate their data-privacy and security policies.
That has been the case with note-taking tools from companies including Read AI, Otter.ai, and Fireflies.ai. They can integrate with platforms like Zoom, Google Meet, and Microsoft Teams to provide live transcriptions, meeting summaries, audio and video recordings, and other services.
Higher-ed interest in these products isn’t surprising. For those bogged down with virtual meetings, a tool that can ingest long, winding conversations and spit out key takeaways and action items is alluring. These services can also aid people with disabilities, including those who are deaf.
But the tools can quickly propagate unchecked across a university. They can auto-join any virtual meetings on a user’s calendar — even if that person is not in attendance. And that’s a concern, administrators say, if it means third-party products that an institution hasn’t reviewed may be capturing and analyzing personal information, proprietary material, or confidential communications.
“What keeps me up at night is the ability for individual users to do things that are very powerful, but they don’t realize what they’re doing,” Chen said. “You may not realize you’re opening a can of worms.”
The Chronicle documented both individual and universitywide instances of this trend. At Tidewater Community College, in Virginia, Heather Brown, an instructional designer, unwittingly gave Otter.ai’s tool access to her calendar, and it joined a Faculty Senate meeting she didn’t end up attending. “One of our [associate vice presidents] reached out to inform me,” she wrote in a message. “I was mortified!”
THIS HAPPENED AT WORK!!!
One of the parties in a grievance mediation had Otter.ai installed on his computer from a previous meeting. He thought (and, honestly, had been led to believe by the company) that he controlled when it was triggered; he had wanted it to provide captions and a transcript for one other meeting, and he intended to use it once. Unbeknownst to him, it activated on EVERY MEETING. The worst part is that no one noticed, so it’s actually unclear how many of his meetings the AI had been active in. But in this particular meeting, it sent the meeting host (my colleague) an email saying that it was RECORDING the proceedings (which is illegal in this line of work, highly illegal; there are hearings in Congress right now over someone recording a negotiations meeting).
The goal of the email was for her to see how “helpful” a tool it was so that she could download it as well and enable it in her meetings. It sent her 1) an attendance summary (private); 2) a transcript of the meeting so far (illegal); and 3) a snippet of audio from the meeting (highly illegal). They had to stop the mediation entirely and switch to old-school phones to figure out where the issue was and who had this enabled on their computer. The man was horribly embarrassed and had to get help from his IT department to uninstall the program from his computer.
Genuinely, these AI tools are viruses. Because of this, we’ve started asking external people at the start of meetings whether anyone else is present off-screen (a different story) or whether anyone has AI programs installed on their computer. But most people don’t KNOW, because Copilot is now installed behind their backs, and it’s sneakier than other programs (Microsoft isn’t going to email someone and say, “Hey, by the way, we’ve been listening in on your meetings”), but that doesn’t mean it isn’t doing the same things.
If you are downloading and using these programs, please be aware of this and please fucking uninstall them.