Premier Eby calls AI warnings before Tumbler Ridge shooting “disturbing” as police gather evidence from digital platforms.
AI Warnings Before Tragedy Spark Concern
British Columbia Premier David Eby has called reports that AI systems may have flagged troubling behaviour before the Tumbler Ridge mass shooting “profoundly disturbing.” The comments come after a Wall Street Journal report suggested employees at the artificial intelligence company OpenAI considered alerting authorities months before the tragedy.
The incident on February 10, 2026, left eight people dead, including six students at the local secondary school, and shocked the tight-knit community.
Police Act on Digital Evidence
Eby confirmed that RCMP are actively pursuing legal orders to preserve potential evidence held by digital service companies, including social media and AI platforms. “We are ensuring that any information that can help the investigation is protected and properly reviewed,” Eby said.
RCMP officials added that OpenAI contacted police only after the shooting. Investigators are now collecting, prioritizing, and processing both digital and physical evidence as part of the ongoing probe.
AI Systems Flagged Troubling Content
According to the Wall Street Journal, shooter Jesse Van Rootselaar interacted with ChatGPT about violent scenarios last June. These interactions were automatically flagged by OpenAI’s monitoring system. The report raises questions about whether earlier intervention could have prevented the tragedy.
Communities Mourn and Reflect
Vigils and memorials continue in Tumbler Ridge as families and neighbours grieve. At one service, a carved piece of wood inscribed with “TR” and the victims’ names was displayed, symbolizing the community’s mourning and resilience.
Premier Eby emphasized that the allegations are “disturbing” not only for the victims’ families but for all British Columbians, pointing to the growing debate around AI responsibility, online warnings, and timely intervention.