Are AI notetakers legal in two-party consent states?
- Claire Baker
- Dec 28, 2025
- 2 min read
Is Granola even legal?
I think maybe not.
Spybots... I mean, 'AI notetakers' freak me out.
Not because I’m paranoid. Because I sit in on a lot of meetings that people wouldn’t want recorded.
Luckily, I live in California, which is a two-party consent state.
Meaning those nosey little spybots need to announce themselves.
(Okay, I know those little “this meeting is being recorded” disclosures happen everywhere. But who do you think started it, huh?)

“Ugh! I wish I’d written it down when he told us in the meeting,” I said to my colleague.
“I’ll check the transcript,” she said.
I was surprised.
I hadn’t noticed the disclosure. I'm in the habit of checking for one every time someone says something that should be kept private.
I learned to check after What Happened that one time everyone forgot.
“You recorded it?” I asked.
“I use Granola. It gives me a transcript of all of my meetings. But I just read the bulleted summary.”
“Wait. Can you do that without telling people?”
I went down a rabbit hole.
🤫 Yes. In California, transcripts for AI notetakers still require consent from all parties.
🎤 Yes. AI notetakers count as a “recording.”
😬 Yes. Rules still apply even if it’s only for personal use.
As of January 1, 2025, California’s AB 2905 requires all users of virtual notetakers to disclose their use.
Violations can result in a $500 fine per occurrence.
Once there’s a law about an online tool in California, the rest of the country isn’t far behind.
Recording consent laws usually revolve around a “reasonable expectation of privacy.”
In our line of work, nearly every conversation comes with a reasonable expectation of privacy.
So when someone mentioned Granola at an AI-in-HR conference in San Francisco a few weeks later, I brought it up.
“I don’t think that thing’s legal,” I said to the HR leaders around me.
“It doesn’t share it with anyone if you tell it not to.” They waved me off and turned back to the landing page. “Cool! I need this!”
“No really, guys. People have a reasonable expectation of privacy when they talk to us. These notes can be discoverable.”
“It’s only for personal use,” they said, turning away.
Mark my words.
There’s a Coldplay-jumbotron-style “oops, ChatGPT made my roadmap discoverable” scandal coming for these silent spybots. Don’t say I didn’t warn you.
👋 I'm Claire. I'm not doing anything nefarious. I just understand that there are situations where people need to speak plainly. If you're looking for someone with a healthy level of paranoia to anticipate risks and handle the most sensitive data in your organization...