FCC commissioner Brendan Carr has asked Apple and Google to remove TikTok from their app stores, and I don't think he's crazy.
Carr called the app a "serious" national security threat, alleging that parent company ByteDance is allowing its China-based employees to access troves of sensitive US user data and funnel that data to the Chinese government. Now, even if this is true, I would be shocked if either Apple or Google actually removed the app; the FCC has no authority over app stores, so it can't force either company to comply, and I wouldn't expect either to voluntarily take such an unpopular step.
But if these claims are true — that the Chinese government indeed has access to troves of US TikTok user data — we should be treating this situation as a code red.
I think a lot of Americans, particularly younger Americans who might use TikTok, are pretty laissez-faire about what sort of information companies know about them. They ask questions like, "Why should I care that Facebook knows my birthday?" or "Why does it matter that TikTok knows I like videos of epileptic dogs?" But the danger here isn't in what information you might be sharing with the Chinese government but in how the Chinese government might eventually use that data to decide what information to share with you.
TikTok's For You algorithm is arguably the most powerful content personalization algorithm in the world. It knows what you like before you know you like it; put another way, it knows exactly what type of content you're susceptible to. It can pinpoint who might enjoy before-and-after videos of car detailing, and it can locate with surgical precision who might be weirdly satisfied by watching someone's morning coffee routine. So it's no stretch of the imagination that it can also identify exactly who might be vulnerable to anti-American misinformation or pro-Chinese propaganda. If we're not careful, this could be the QAnon conspiracy or the Russian election interference scandal on steroids.
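To make the mechanism concrete: any recommender that learns a "taste vector" per user can be run in reverse as a targeting tool. Here's a minimal sketch of that idea, with entirely hypothetical names and plain cosine similarity standing in for what would, in a real system like TikTok's, be a vastly more sophisticated model:

```python
import numpy as np

# Hypothetical example: taste vectors a recommender might learn for each
# user from watch time, likes, rewatches, shares, and so on.
user_embeddings = {
    "user_a": np.array([0.9, 0.1, 0.3]),
    "user_b": np.array([0.2, 0.8, 0.5]),
    "user_c": np.array([0.7, 0.2, 0.9]),
}

def cosine(u, v):
    # Standard cosine similarity between two vectors.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def most_receptive(content_vector, embeddings, top_k=2):
    # Rank users by how closely their learned tastes match a piece of
    # content. The same ranking that surfaces coffee-routine videos to
    # the people who love them would surface propaganda to the people
    # most receptive to it.
    scores = {user: cosine(vec, content_vector) for user, vec in embeddings.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

# A hypothetical embedding for a piece of targeted content.
propaganda_vector = np.array([0.8, 0.1, 0.4])
print(most_receptive(propaganda_vector, user_embeddings))
```

The point of the sketch isn't the math, which is trivial; it's that nothing about the pipeline distinguishes a harmless video from a piece of propaganda. The targeting capability comes free with the personalization.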
I'm not saying this is actively happening. I'm just saying there is a clear line of sight to how it could happen if the allegations in Carr's letter are true, and if they are, we need to protect ourselves from this scenario.
In other words, don't underestimate the danger in taking a powerful content personalization algorithm and equipping an authoritarian government with all the user data it needs to exploit it. I think we need to take Carr's letter very seriously.