Happy Thursday, everyone. I'm Frank Richardson, an organisational psychologist observing the workplace with curiosity and care. Each week, I share insights to help HR leaders better understand the people behind the processes and build cultures where both individuals and organisations can thrive.
This week in workplace whiplash 🌀
A few workplace stories from the past week that are worth paying attention to:
🧠 Meta’s layoffs may be part of a longer wave
Meta’s reported May cuts are being framed as just the beginning, with ongoing restructuring expected as the company doubles down on AI and efficiency. The shift reflects a broader pattern across tech, where headcount is being recalibrated alongside growing investment in automation.
👉 The HR Digest
⚖️ PepsiCo settles disability discrimination case
PepsiCo has settled an EEOC lawsuit alleging it failed to accommodate an employee’s disability before terminating them, reinforcing that employers may need to go beyond baseline leave policies when considering reasonable adjustments.
👉 HR Dive
🎤 Employee benefits are getting a celebrity glow-up
A piece in HR Executive highlights how celebrities like Tom Brady and Lady Gaga are influencing workplace perks, pushing companies toward more personalised, lifestyle-driven offerings. It points to a broader shift away from standardised benefits and toward experiences that feel more curated, branded, and, in some cases, aspirational.
👉 HR Executive
Work is becoming easier to track, measure, and document. But is it also becoming a lot more performative?
🤝 This edition is kindly brought to you by Metaview
AI has blown up inbound applications. More volume. More noise. More right candidates getting missed.
Metaview's Application Review surfaces the 8% worth your time — and gets sharper with every decision your team makes.
"I picked it up fast and it kept getting sharper as I gave it feedback. It's a no-brainer." — Amandeep Shergill, TA Leader at Automattic
There used to be an unwritten rule: meeting note-taking fell to the most junior person in the room (or at least, the most junior woman 🙄).
They were expected to half-listen, half-participate, and capture “the key points” from a conversation that had usually derailed by about minute 12. If they were really on it, you might get a few dot points in an email afterwards. But let’s be honest, that rarely happened.
Most meeting notes were vague, incomplete, and largely confined to the headlines. Which, in hindsight, was part of what made meetings work. Conversations could be messy. People could test a half-baked idea, contradict themselves, and go off on a tangent. Not every thought had to survive beyond the room.
But now, increasingly, every thought does.
A recent piece in The Times looked at the rise of AI meeting assistants quietly joining calls, transcribing everything, and producing clean summaries of what was said and what needs to happen next. What used to be fleeting is now searchable and shareable.
On paper, it sounds efficient, but it also changes something more subtle. Once a meeting stops being temporary, people can start to speak a little differently.
🧠The behavioural science lens
This isn’t about avoiding accountability. It’s about recognising what different conditions encourage people to say:
We perform differently when we feel observed: The Hawthorne effect captures a simple truth: people adjust their behaviour when they know they are being watched. A running transcript might feel passive, but it’s enough to make people more careful, more measured, and less willing to take conversational risks.
We start managing how we’ll be read, not just how we’re heard: Research on impression management shows that people actively shape how they come across to others. When contributions are captured and circulated, the audience extends beyond the room. Over time, ideas can become more polished and more aligned to how they will appear in the summary rather than how they are being worked through in real time.
Messy thinking becomes harder to do out loud: In user research, we actively rely on people thinking out loud. Methods like think-aloud protocols are designed to capture how people reason in real time, including hesitations, contradictions, and half-formed ideas, because that is often where the most useful insight sits (and where things actually get interesting). When everything is being recorded and summarised, there is more pressure to speak in finished thoughts. Conversations become clearer, but they can also lose the very moments where understanding actually develops.
Psychological safety becomes more fragile when everything is recorded: Amy Edmondson’s work on psychological safety highlights how important it is for people to feel able to speak up without fear of judgement. When conversations feel more permanent, the perceived stakes rise, and that willingness can shrink.
🚀 What this means for leaders
As AI note-taking becomes standard, this shift doesn’t get announced; it just starts to show up in how people speak:
Be intentional about which conversations need precision and which need space: Not every meeting is about capturing decisions. Some are just about figuring things out. Treating both the same can push teams toward clarity at the expense of exploration.
Actively create room for unfinished thinking: Once people know their words will be recorded, they may naturally edit themselves. Leaders who want better ideas need to signal that rough thinking and uncertainty are part of the process.
Pay attention to what disappears from conversations: It is rarely the confident contributions that drop away. It is the tentative ones. Over time, their absence can make discussions feel a lot thinner.
Be wary of summaries that feel a little too neat: AI tools may capture everything that’s said, but what they produce is still a version of events. Summaries are designed to be clear, structured, and resolved, which often means smoothing over disagreement, compressing uncertainty, and presenting conversations as more aligned than they felt in the moment. That can be useful, but it can also be misleading.
Do you speak differently in meetings when you know they’re being recorded or transcribed word for word?
💬 Final thoughts
There’s a difference between capturing a conversation and actually having one.
AI note-taking does the first extremely well. The second relies on something messier.
When everything is recorded, people tend to adjust. Not dramatically, just enough to sound a little more certain, a little more polished, and a little less like they are still figuring things out.
And that’s often where the best thinking begins, before it’s fully formed.
How's the depth of today's edition?
If something here speaks to you, I’d love to hear it.
Until next week,
Frank
P.S. If you’d like your own story featured, reply to this email. If you’d like to reach our newsletter audience (founders, creators, and marketers), click the button below.
If you’re new here, I’m over the moon you’ve joined us! To help me craft content that’s actually useful (and not just noise in your inbox), I’d love it if you took 1 minute to answer this quick survey below. Your insights help shape everything I write.
✨ Insane Media is more than one voice
💡 Dive into our other newsletters - where psychology meets the founders, creator economy, e-commerce marketing, and AI founders.