"Extraordinary claims require extraordinary evidence" – Carl Sagan

Navigating recent media coverage in search of concrete information on the Canadian government’s decision to block TikTok from operating a business in Canada feels like a wild goose chase: there is a striking absence of clarity behind the claim of mysterious ‘national security risks’.

Since I’m composing this article on Carl Sagan’s birthday, the quote above seemed in order.

The Honorable François-Philippe Champagne, Canada's federal Minister of Innovation, Science and Industry, is well known for his efforts to “decouple” the North American economy from China. When pressed by the CBC for details on where that leaves Canadians looking to understand the move, he simply said that Canadians will have to "draw their own conclusions".

While that answer is about as opaque as they get, back in 2023 TikTok pre-emptively created a Transparency and Accountability Center to offer authorities a behind-the-scenes view into its algorithms and content moderation practices, even as U.S. lawmakers were pressing the company to disclose its information access and processing practices.

Last year, when Canada announced a ban of TikTok on government devices, I obviously supported the move, asking why any work devices even had access to such distractions as social media applications in the first place.

TikTok has further offered transparency through Project Texas, a program to relocate data to American servers and undergo third-party audits. Canada, however, has not engaged in or acknowledged such transparency efforts, possibly bypassing a cooperative solution in favor of more drastic restrictions.

The Seeds of Fear, Uncertainty and Doubt

I've never been a TikTok user, and I have no more interest in the platform than I have in the dumpster fire of toxic bile that is the outfit formerly known as Twitter. But from where I sit, the Canadian government's handling of TikTok raises critical concerns, from its reliance on secrecy to its potential human rights implications.

The government claims, without offering any substantial evidence, that the national security risks are so severe that they cannot even be shared with those most impacted by them; citizens are simply advised to feel free to continue using the app, ostensibly at their own risk (wink). Such an obvious appeal to FUD seems intentionally crafted to create cognitive dissonance, not only by reinforcing an authoritarian stance but, more importantly, by eroding everyone’s understanding of security, risk and privacy.

Secrecy is Nothing More Than Security by Obscurity

By opting for a secretive national security review, Canada has avoided releasing specifics about the alleged risks. Such actions set a dangerous precedent, promoting a “guilty until proven innocent” mindset. This opaque approach could also foster a chilling effect, dissuading foreign investments in Canada, especially in digital sectors. But we can safely assume that, of all people, Canada's federal Minister of Innovation, Science and Industry has worked these risks into his calculus.

The secrecy surrounding this decision raises questions about its underlying motivations, suggesting a potential inclination toward controlling information of public interest rather than sharing it with stakeholders. Whether this was intended to send a message to other Chinese companies in Canada remains to be seen, but such firms currently operate in the retail, e-commerce, banking, energy and resources sectors and are no doubt closely watching the proceedings, particularly as five other China-linked companies were just as unceremoniously shut down in the past two years.

As a result, it seems more likely than not that Canadian companies operating in China, such as Magna, Bombardier, Saputo and the Bank of Montreal, among others, may soon face retaliatory headwinds when doing business there.

Setting a Precedent Against Human Rights

By censoring a platform primarily due to foreign ownership, Canada could be setting a precedent that threatens global standards for internet freedom. Such actions risk empowering governments worldwide to impose restrictions on platforms and services in the name of security, potentially stifling freedom of expression and access to information.

When I previously wrote about Zoom and how its obscure development and IP-access practices pose a particular risk to the privacy and confidentiality of children and students during the pandemic, I said:

China’s understanding of privacy is vastly different: the data belongs to the organizations that collect it and any such organizations must grant unfettered access for government inspection, in the name of safety and security. Article 77 of its Cybersecurity Law ensures that data is collected and stored in China where full transparency and access must be provided to the Ministry of Public Security. Period.
It is unclear just how much Zoom data is stored or archived in China, but if it can be inspected, decrypted or accessed by/for Chinese authorities as Citizen Lab’s research indicates, chances are that storage would also be taking place in that country.

Naturally, Zoom does not have offices in Canada, so it’s arguably more difficult to request support, information and evidence of privacy compliance. As such, once TikTok’s offices are shut down and hundreds of employees are laid off, it will likely be difficult for Canadians to get access to information about the company’s safety procedures, ask about online moderation, or initiate Privacy Commissioner investigations, simply because those offices will no longer exist in our country.

Misdirection with Claims of Addiction

In some cases, officials and critics have emphasized TikTok's addictive nature as part of the rationale for scrutiny. This argument appears misdirected; issues of app design and usage belong in the domain of consumer protection, not national security. Conflating concerns about app addiction with national security risks may dilute focus from legitimate security discussions, confusing the public about the core issues.

That said, I assume this is not the case, and any invocation of national security could be construed to hint at the possibility of recruiting Canadians for nefarious purposes, influencing users ‘en masse’ to have a theoretical impact on elections, or ‘seeding’ the public consciousness with narratives that diverge from the official stance of the Canadian government. Either way, these are not matters that should be left to the public to fantasize about, for the precise reason that Canadians are owed clarity and transparency simply as a matter of principle.

Impact on Government Credibility

I can’t help but wonder whether Canada’s current approach to TikTok reflects a troubling mix of secrecy, questionable justifications, and possibly authoritarian motivations that do not align with the values of a free and open society. Embracing transparency, considering cooperative solutions, and offering concrete evidence would not only bolster credibility but also reassure citizens that policies truly serve public interests.

I certainly do not expect to have access to privileged information, but the opaque nature of Canada’s expulsion of TikTok (or is it its parent company, ByteDance?) risks undermining public trust in government decisions, at a time when the episode could instead serve as an ideal opportunity to raise Canadians’ awareness of genuine security concerns.

If the public perceives this move as an excessive, disrespectful overreach under the guise of security, it may bring into question foreign policy decisions and corporate law enforcement practices. Ultimately, the manufactured dichotomy between a heavy-handed approach to urgent corporate expulsion and the resulting inability for government agencies to conduct future privacy investigations on behalf of Canadians appears both intentional and calculated.

In closing, I feel compelled to again quote Carl Sagan (who in turn channelled Rees, Wright, Housman et al.) when he wrote that “absence of evidence is not evidence of absence”. As such, it is important to note that the head of Canada’s spy agency has been quoted as characterizing TikTok as “a real threat”, telling CBC: "My answer as director of [the Canadian Security Intelligence Service] is that there is a very clear strategy on the part of the government of China ... to be able to acquire ... personal information from anyone around the world."

While this is objectively true of all social media companies, it is also factually true that TikTok has, at least by all publicly available measures, demonstrated a degree of transparency easily comparable with that of its industry peers.

A recent paper from Georgia Tech and Emory University sought to assess the extent of the harms to political accountability and trust in media posed by the liar's dividend, claiming that “strategic and false allegations of misinformation (i.e., fake news and deepfakes) benefit politicians by helping them maintain support in the face of information damaging to their reputation. This concept is known as the ‘liar's dividend’ (Chesney and Citron 2018) and suggests that some politicians profit from an informational environment saturated with misinformation.” Not to be outdone, a study from Cambridge University indicates that, by using a counterintuitive approach to "crying wolf", "politicians may seek to undermine confidence in the informational environment, by invoking informational uncertainty".

More to the point, the idea that, in the modern age of misinformation and AI, one can claim anything is scary, threatening or fake simply because of the indisputable prevalence of lies and the liars who tell them (to quote the venerable Al Franken) is a slippery slope, one that desensitizes the public and renders it more compliant and more likely to provide consent. That consent may be neither informed nor genuine, but it is increasingly difficult to argue that it is not meaningful.

An abridged version of this article appears in The Conversation.