As Microsoft cuts thousands of jobs from its global workforce, one company executive suggested that laid-off workers turn to the AI chatbot Microsoft Copilot for emotional support and career counseling.
After seeing the post, critics called the executive insensitive to the needs of the former employees.
What the Now-Deleted Post Says
In a now-deleted LinkedIn post spotted by Aftermath, Xbox Game Studios Publishing Executive Producer Matt Turnbull urged laid-off workers to use tools such as Copilot or ChatGPT to cope with the mental stress of unemployment.
"I know these types of tools engender strong feelings in people, but I'd be remiss in not trying to offer the best advice I can under the circumstances," Turnbull said in the post. "I've been experimenting with ways to use LLM AI tools (like ChatGPT or CoPilot) to help reduce the emotional and cognitive load that comes with job loss."
Microsoft Layoffs Persist Amid AI Industry Transformation
Microsoft opened July with an immediate round of layoffs affecting roughly 4% of its workforce, around 9,000 people who lost their jobs a week ago. It ranks among the biggest layoff cycles in the software maker's history.
This comes after a series of layoffs in recent months, including 6,000 cuts earlier in 2025 and another 10,000 in 2023. The latest reductions hit the Xbox division especially hard.
According to Mashable, Microsoft justified the layoffs as a strategic move to adapt to a "dynamic marketplace," a sentiment echoed across the tech sector as companies restructure operations in response to the rise of generative AI.
Executives at firms such as Meta and Klarna have been even more outspoken, publicly announcing plans to replace human work with AI automation. Microsoft, in a similar vein, is doubling down on Copilot and other AI-enabled tools to improve in-house efficiency and spur customer adoption.
In the subreddit r/Games, many users criticized Turnbull's post. One commenter sarcastically suggested that becoming an executive at such a corporation requires abandoning one's moral compass.
"He was working at Wendy's when he got hired at griptonite back in like, 99. Known the guy forever and we knew he was a tool bag but we never imagined what he would pupate into. I sent this link to his old roommates and the quote was 'He hasn't aged a day. Good for him. Bad for everyone else' so yeah I guess it fits," another Reddit user replied.
"I cannot imagine saying anything more heartless and soulless than 'Yeah, we just kicked you all out on the street, partly because we have a large budget allocated to AI and we need the money to go to that instead, but if you're in need of help, you can consult with Copilot! You know... the thing that's partially responsible for why we can't afford to continue paying you. Please help train our AI model while receiving nothing in return for that.' Obviously, he didn't say it in those words, but he might as well have," another Redditor said.
Is Copilot Something More Than Just an Office Assistant?
Originally marketed as a productivity aid, Microsoft Copilot has evolved into a virtual companion with emotionally supportive capabilities. In May, Microsoft AI CEO Mustafa Suleyman told Fortune that the firm now markets Copilot as a trusted friend capable of sensing user discomfort, diagnosing emotional pain, and offering personalized advice.
For many Gen Z and millennial users, talking to a chatbot feels like confiding in a trusted digital friend. Microsoft has reportedly made Copilot use mandatory for its own employees, part of an internal initiative to model the tool's advantages and promote them externally.
Critics Caution Against Using AI as a Therapeutic Substitute
Regardless of Microsoft's marketing, mental health professionals have warned against relying on chatbots for emotional support. Last January, the American Psychological Association called on the Federal Trade Commission to crack down on chatbots that present themselves as providing therapeutic assistance.
Experts cautioned that AI applications, even those with emotionally intelligent features, lack the clinical judgment and ethical safeguards of trained therapists.
Be warned: if you're sharing your deepest secrets with chatbots, you're also sharing your personal data with the companies behind them. As AI expert Mike Wooldridge said, it's "extremely unwise" to trust AI with all your thoughts, especially those best kept private.
Originally published on Tech Times