Family Sues OpenAI, Alleges Company Knew Shooter Planned Tumbler Ridge Massacre and Didn't Alert Police


10 March 2026 · Technology and Science · 5 sources

Key Takeaways

  • Family sued OpenAI alleging it knew shooter planned attack but failed to alert police.
  • Twelve-year-old Maya Gebala suffered catastrophic brain injuries and remains hospitalized.
  • Mass shooting at Tumbler Ridge on February 10 became one of Canada's worst school shootings.

Civil suit over school shooting

Families of victims have filed a civil lawsuit in British Columbia Supreme Court accusing OpenAI of having foreknowledge of the Feb. 10 Tumbler Ridge school shooting.

The BBC reports that the family of a girl critically injured during a mass shooting at a Canadian school is suing ChatGPT-maker OpenAI, claiming it was aware the suspect had been planning an attack but failed to alert the authorities.


The lawsuit accuses OpenAI of failing to notify police about the alleged foreknowledge.


The claim is described as being brought on behalf of Maya and Dahlia Gebala and their mother.

It alleges OpenAI 'had specific knowledge' that the shooter used ChatGPT to plan the attack.

Reporting also states that a family affected by the shooting has filed a civil claim against OpenAI in B.C. Supreme Court.

Alleged OpenAI warnings

The complaint alleges specific internal warnings and operational failures.

According to the plaintiffs, about 12 OpenAI employees flagged the shooter’s prompts as presenting an 'imminent risk of serious harm' and recommended calling police.


The plaintiffs say leadership rebuffed those warnings and banned only the shooter's initial account.

The Globe and Mail summary likewise points to 'flagged interactions with its ChatGPT system' as the basis for the family's contention that OpenAI had notice of violent intentions.

Lawsuit alleges shooting injuries

The suit describes 12-year-old Maya as having been shot three times, including a bullet to the head.

It says she suffered "catastrophic traumatic brain injury, permanent cognitive and physical disability, right-sided hemiplegia, scarring, and PTSD; currently hospitalized with an uncertain prognosis."

The suit says her sister Dahlia endures "PTSD, anxiety, depression, sleep disturbances."

The Globe and Mail notes the lawsuit was filed to "uncover the truth, hold parties accountable, obtain redress, and help prevent future shootings."

Other reports summarize Maya as "left with life-altering injuries" and having "a catastrophic brain injury causing permanent cognitive and physical disabilities."

OpenAI response to shooting

News Arena India reports OpenAI acknowledged flagging the activity as early as June 2025 for the 'furtherance of violent activities' but said it did not report to the RCMP because it did not meet the threshold for doing so.

The Hindu says the company considered but did not alert police before the shooting, and only notified authorities after the attacker killed eight people and then herself.


Reports also say the shooter’s ChatGPT account had been closed but that she had evaded the ban using a second account.

The matter has drawn political backlash, with coverage noting criticism of the company's handling from B.C. Premier David Eby and federal ministers.

The available articles spell the attacker's name differently (for example, "Jesse Van Rootselaar" and "Jesse Van Roostselaar"), so the exact spelling of the shooter's name remains uncertain across sources.
