Eight Academics Investigated TikTok Censorship Claims and Found Nothing

Eight academics analyzed viewership metrics across more than 100,000 TikTok videos and found no evidence of systemic top-down political censorship on the platform following its U.S. ownership transition. The study, published in Good Authority and reported on February 4, 2026, examined trending patterns for videos about ICE raids, the fatal shooting of Alex Pretti in Minneapolis, the killing of Renee Good by an ICE agent, and keywords like “Trump” and “Epstein.” Instead of targeted suppression, the researchers concluded that a data center outage appeared to have disrupted all categories of posts equally, whether political content or food recipes.

The findings matter for investors because the censorship narrative had already begun influencing user behavior and regulatory scrutiny. California Governor Gavin Newsom and EU lawmakers called for investigations, users downloaded TikTok alternatives in droves, and the hashtag #TikTokCensorship gained traction on X. For anyone holding positions in Oracle, social media competitors, or the broader digital advertising space, the distinction between a technical outage and deliberate content suppression carries real financial weight. This article breaks down what the researchers actually found, who owns TikTok now, why the censorship claims emerged, and what the study’s limitations mean for the platform’s credibility going forward.

What Did Eight Academics Find When They Investigated TikTok Censorship Claims?

The research team compared how often TikTok's algorithm recommended political content versus non-political posts such as food recipes and Oscar-related videos. They looked at specific high-profile topics that users had flagged as being suppressed, including coverage of ICE enforcement actions and the names of individuals involved in those events. Their method was straightforward: if TikTok were systematically throttling political content, you would expect political videos to underperform relative to non-political content in a measurable, consistent way. That pattern did not materialize. The data showed that a data center outage disrupted all categories of posts equally, meaning political videos, cooking tutorials, and entertainment content all experienced the same dip in visibility. There was no selective targeting.
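The logic of that test can be sketched in a few lines. The snippet below is an illustrative toy, not the researchers' actual code, and all the view counts in it are invented for demonstration: it computes the relative drop in average views per category and compares political against non-political categories. An outage should produce a near-zero gap between the two groups; selective throttling would produce a large one.

```python
# Hypothetical sketch of the category-comparison logic. All numbers below are
# made up for illustration; they are not the study's data.

baseline_views = {  # average views per video before the outage (invented)
    "ice_raids": 50_000,
    "trump": 48_000,
    "epstein": 45_000,
    "food_recipes": 52_000,
    "oscars": 47_000,
}

outage_views = {  # average views per video during the disputed window (invented)
    "ice_raids": 30_000,
    "trump": 28_500,
    "epstein": 27_000,
    "food_recipes": 31_500,
    "oscars": 28_000,
}

POLITICAL = {"ice_raids", "trump", "epstein"}

def relative_drop(before: float, after: float) -> float:
    """Fractional decline in average views, e.g. 0.40 means a 40% drop."""
    return (before - after) / before

drops = {cat: relative_drop(baseline_views[cat], outage_views[cat])
         for cat in baseline_views}

political = [drops[c] for c in drops if c in POLITICAL]
nonpolitical = [drops[c] for c in drops if c not in POLITICAL]

# A near-zero gap is consistent with an infrastructure outage; a large gap
# would suggest selective throttling of political content.
gap = abs(sum(political) / len(political) - sum(nonpolitical) / len(nonpolitical))

print(f"political avg drop:     {sum(political) / len(political):.1%}")
print(f"non-political avg drop: {sum(nonpolitical) / len(nonpolitical):.1%}")
print(f"gap between groups:     {gap:.1%}")
```

With these invented inputs every category falls by roughly 40%, so the gap between political and non-political drops is a fraction of a percentage point, which is the signature of an across-the-board disruption rather than targeted suppression.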

This is a critical distinction for investors tracking the TikTok ownership story. A technical outage is a routine operational issue. Deliberate censorship, on the other hand, would represent a governance crisis with regulatory, legal, and reputational consequences for Oracle and the rest of the new ownership consortium. The researchers were careful to note that their finding is not a blanket exoneration. It means the available data does not support the specific claim that the new owners flipped a switch to suppress political speech. That is a narrower conclusion than “nothing is wrong,” and the difference matters.

Who Owns TikTok Now, and Why Did Censorship Allegations Surface?

The censorship claims emerged after a consortium led by Oracle’s Larry Ellison took control of TikTok’s U.S. business. The new ownership structure also includes Silver Lake, the private equity firm, and MGX, an Emirati investment company. ByteDance retains a minority stake and still owns the underlying algorithm, though the plan calls for it to be retrained on American data under Oracle’s supervision. A TikTok spokeswoman said no changes have been made to the algorithm since the new investors took over. The ownership transition itself is what made users suspicious.

Because the TikTok divestiture law effectively gave the president the power to select the new owners, critics worried the platform could be tilted in favor of the current administration. Anupam Chander, a professor of law and technology at Georgetown University, put it plainly: “The new owners will have to earn the trust of Americans. Because the TikTok law effectively gave the president the power to select the new owners, TikTok U.S. will have to show that it is not biased in his favor.” For an investor evaluating Oracle’s expanded role here, however, the perception problem is almost as consequential as the reality. Even though researchers found no evidence of censorship, the speed at which users migrated to alternatives and politicians called for probes shows how fragile platform trust can be. Oracle now carries reputational risk that extends well beyond its traditional enterprise software business.

Chart: Video Categories Analyzed in the TikTok Censorship Study
ICE Raids: 20%
Alex Pretti / Renee Good: 15%
Trump Keyword: 25%
Epstein Keyword: 15%
Non-Political (Food/Oscars): 25%
Source: Good Authority study of 100,000+ TikTok videos (February 2026)

The Data Center Outage That Explains the Visibility Drop

The researchers identified a data center outage as the most likely explanation for the drop in video visibility that users interpreted as censorship. This is not an unusual event for platforms operating at TikTok’s scale. What made it politically charged was the timing. The outage coincided with the ownership transition, and users who were already primed to look for censorship naturally interpreted the disruption through that lens. The key evidence was that the outage affected all categories of posts.

If TikTok had been deliberately suppressing political content, you would expect food videos and Oscar commentary to maintain normal reach while political posts cratered. That did not happen. Every content category experienced a comparable decline, which is consistent with an infrastructure problem rather than an editorial decision. For investors, this is a useful case study in how narrative can outrun evidence on social media platforms. The #TikTokCensorship hashtag spread on X and triggered real-world consequences, including regulatory calls for investigation, before anyone had done a systematic analysis of the data. The stock implications of platform controversies often front-run the facts, which means the correction can be just as tradeable as the initial panic.

What Researchers Could and Could Not Measure

The study has meaningful limitations that investors and analysts should not ignore. The researchers acknowledged that it is still possible the new owners have begun to reconfigure content rules in ways that would not show up in aggregate trending data. Small numbers of posts could have been removed or shadowbanned in a way that is not visible in the overall trends. The study captures the macro pattern but cannot rule out targeted actions against individual accounts or specific pieces of content. One claim the researchers could not investigate at all was the allegation that the word “Epstein” was being blocked in private direct messages. That data is simply inaccessible to outside researchers.

This is not a minor caveat. If content moderation is happening at the DM level, it represents a different kind of intervention than algorithmic suppression of public posts, and it would be invisible to any study relying on public viewership metrics. The tradeoff here is between breadth and depth. Analyzing more than 100,000 videos gives the study strong statistical power for detecting broad, systematic censorship. But it lacks the granularity to catch surgical interventions. Both types of content moderation matter, and the study only addresses one of them.

TikTok’s Transparency Problem and Its Implications for Trust

Researcher Guinaudeau told NPR something that should concern anyone with a financial interest in TikTok’s ecosystem: “Right now, TikTok can say just about anything related to algorithm changes and we can’t verify it.” He added that “until they make more extensive data available to researchers it’s nearly impossible to detect subtle changes to their ‘For You’ recommender system.” This is the core problem. TikTok does not grant researchers the type of data access needed for comprehensive content moderation reviews. This lack of transparency creates a structural risk that no single study can resolve. Even a well-designed analysis covering 100,000 videos can only test what it can see.

If TikTok’s recommendation algorithm is adjusted in subtle ways, such as slightly reducing the boost given to certain political keywords without removing videos outright, it would be nearly undetectable with publicly available data. The platform’s opacity means that every future controversy will play out in the same information vacuum. For investors in Oracle, Silver Lake, or companies competing with TikTok, this is a warning. The platform’s credibility will remain vulnerable to censorship allegations precisely because there is no mechanism for independent verification. Until TikTok opens its data to researchers in a meaningful way, each new political cycle will bring fresh accusations that are difficult to either prove or debunk conclusively.

How the Censorship Narrative Affected the Competitive Landscape

The censorship allegations had tangible effects beyond hashtags and headlines. Many users downloaded TikTok alternatives in response to the claims, representing real user acquisition for competitors and potential churn for TikTok. For investors tracking companies like Snap, Meta’s Instagram Reels, or YouTube Shorts, these migration events are worth monitoring. They rarely result in permanent shifts in market share, but they can accelerate adoption of competing platforms during a narrow window when users are actively shopping for alternatives.

The regulatory response also carries competitive implications. When Governor Newsom and EU lawmakers call for investigations, the compliance costs and executive attention required to respond are resources diverted from product development and growth. Oracle, as the primary U.S. partner, absorbs some of that burden, which was likely not priced into its original deal calculus.

What Comes Next for TikTok Under New Ownership

The path forward depends largely on whether TikTok’s new ownership consortium takes concrete steps toward transparency. The algorithm is set to be retrained on American data under Oracle’s supervision, but the details of that process remain unclear. If Oracle can demonstrate verifiable independence in how the recommendation system operates, it would go a long way toward defusing future censorship claims. If the process remains opaque, the platform will continue to be a lightning rod for political suspicion from all sides.

Chander’s observation about the new owners needing to earn trust is the investment thesis in miniature. TikTok’s value to its new owners depends on maintaining its user base, which depends on trust, which depends on transparency that does not yet exist. The eight academics who investigated the censorship claims found nothing this time. Whether that finding holds up over the next political cycle is a question that only better data access can answer.

Conclusion

The study of more than 100,000 TikTok videos by eight academics found no evidence of systemic political censorship following the platform’s U.S. ownership transition. A data center outage, not deliberate suppression, appears to explain the visibility drop that triggered the controversy. The new ownership consortium led by Oracle, including Silver Lake and MGX, has stated that no algorithmic changes have been made, and the aggregate data supports that claim at the macro level.

But the study’s limitations are as important as its findings. Small-scale content removals, DM-level filtering, and subtle algorithmic adjustments remain undetectable without the kind of data access TikTok does not currently provide. For investors, the takeaway is that the immediate censorship crisis appears to have been a false alarm, but the structural conditions that made the false alarm possible, and damaging, have not changed. Oracle’s expanding role in TikTok’s operations adds a layer of reputational and regulatory risk that the market is still learning to price.

