Canada’s privacy watchdog is investigating Elon Musk’s X. Here’s what you need to know

Investigation into X highlights the limits of Ottawa's power to regulate U.S. Big Tech

The world’s richest man found himself in the middle of another Canadian-led protest last month when Canada’s privacy commissioner, Philippe Dufresne, announced that the agency had opened an investigation into X after receiving a complaint from NDP MP Brian Masse.

Canada’s privacy watchdog is now set to “examine whether X is meeting its obligations” under the country’s federal privacy law, the Personal Information Protection and Electronic Documents Act (PIPEDA), which lays out how companies can collect, use and disclose Canadians’ personal information. The privacy office’s investigation comes as Canada and the U.S. spar over tariffs and digital taxes on U.S. tech firms, alongside growing scrutiny of how tech companies use online and personal data to develop their artificial intelligence (AI) models. Here’s what you need to know about the privacy probe.

What happened?

Under PIPEDA, when a complaint is received, Canada’s privacy office will carry out an investigation unless the legislation specifies a reason not to, explained Teresa Scassa, the Canada Research Chair in Information Law and Policy at the University of Ottawa.

The privacy office launched its investigation into X after receiving Masse’s complaint. It will probe X’s compliance with federal privacy laws, focusing on the company’s collection, use and disclosure of Canadians’ personal information to train its AI models. The office will produce a report of its findings within one year of the complaint being filed, though investigations of complex matters can take longer — and this case is set to be a complex one, Scassa said.

Why are they investigating?

In February 2023, Canada banned TikTok from government devices over national security and privacy concerns, and later ordered the app to shut down its Canadian offices. TikTok’s parent company, ByteDance Ltd., is headquartered in Beijing, and Ottawa raised concerns that TikTok could be compelled to share sensitive Canadian user data — from location and search history to biometrics — with the Chinese authorities. TikTok said that it has “never provided user data to the Chinese government, nor would we if asked.”

Musk, meanwhile, came under fire in November 2024 when X’s updated terms of service stated that user data would be used to train AI models — operated by X or other third parties — without giving users the choice to opt out. A 2023 update also allowed the platform to collect users’ encrypted messages and biometric data. X’s approach “poses a serious risk to individual rights,” said an Amnesty International report at the time.

X’s AI models trained on content that has not been moderated for hate speech, misinformation and disinformation could also be used to “optimize engagement by promoting the most polarizing content, even if it’s false or harmful, (which) has huge implications for election manipulation. AI-driven microtargeting could be used to push political interests,” said Sonja Solomun, the deputy director of McGill’s Centre for Media, Technology and Democracy.

X last year agreed to suspend its use of personal data collected from EU users for training its AI chatbot Grok after the Irish Data Protection Commission — the lead EU regulator for major U.S. internet firms — issued proceedings against the company in the Irish High Court. The EU has strict data privacy rules set out in its General Data Protection Regulation (GDPR).

What happens next?

Canada’s privacy regime relies on voluntary compliance, meaning that even if the commissioner finds that X violated PIPEDA, the office’s final recommendations will likely lack teeth.

If the office finds that X was non-compliant, it will explain the company’s breaches in its final report and provide recommendations to address those breaches. But the watchdog’s “findings and recommendations are just that — they are not enforceable (as) PIPEDA does not give the commissioner order-making powers,” Scassa said.

To obtain an enforceable order, the office would need to apply to the Federal Court. But even if a Canadian court makes an order against X, it is “not automatically enforceable against the company in the U.S. A company that is not interested in complying with Canadian law has many tools at its disposal to challenge or delay any enforcement,” Scassa explained.

Canada’s approach to regulating technology companies is still evolving, and the country lacks a comprehensive digital governance strategy, Solomun said. In 2022, the Liberal government introduced the now-defunct Bill C-27, which sought to strengthen Ottawa’s power to regulate tech firms, including by giving the privacy commissioner the authority to order a company to stop collecting or using personal information and by establishing fines for non-compliant companies. The bill also included an act that aimed to set guardrails on the development of AI, which was criticized by tech firms for being too broad and by civil society groups for not being tough enough.

The widening policy gap between the U.S. and jurisdictions like the EU — which has led the charge in regulating American Big Tech in a bid to constrain its power — poses challenges for Canada’s own approach to regulation, since Ottawa has historically followed in U.S. footsteps, said Vivek Krishnamurthy, director of the Technology Law and Policy Clinic at the University of Colorado Boulder and a former law professor at the University of Ottawa who served on Canada’s online safety expert advisory group.

As Scassa puts it, “Action is needed on privacy reform and AI governance. Unfortunately, right now, we are in limbo.”
