OpenAI says it will strengthen its safety protocols and notify law enforcement earlier when it detects credible threats of violence on its platform.

The move follows scrutiny from Canadian officials after it was revealed that the suspect in the 2025 mass shooting in Tumbler Ridge, British Columbia, had operated a second ChatGPT account. OpenAI had previously banned the suspect’s original account over potential warning signs of real-world violence but did not notify authorities at the time.

According to reporting from Politico and The Washington Post, OpenAI has now pledged to adjust its internal policies.

What will change

OpenAI’s vice president of global policy, Ann O’Leary, reportedly told Canadian officials that the company will:

  • Improve detection systems to prevent banned users from creating new accounts
  • Notify law enforcement of “imminent and credible” threats
  • Share information even if a user does not specify a target, method, or timing
  • Establish a direct point of contact for Canadian law enforcement

Under the revised approach, OpenAI says it would have alerted police when the suspect’s account was first banned in 2025.

The company identified the shooter’s second account only after the suspect’s name became public, at which point authorities were informed.

Regulatory pressure mounts

Canadian officials viewed the failure to report the original account as a serious lapse. Political leaders, including British Columbia Premier David Eby, have pressed OpenAI executives for answers. CEO Sam Altman has reportedly agreed to meet with provincial leadership.

The Canadian government has signaled that if AI companies cannot demonstrate adequate safeguards, regulation of AI chatbots could follow.

It remains unclear whether OpenAI will implement identical reporting policies in the United States and other countries. However, this shift marks a significant tightening of how the company handles credible threats of violence detected through ChatGPT conversations.


Last Update: February 28, 2026
