CNBC reported on July 21, 2025, that Russian and Chinese state actors run coordinated networks of YouTube channels to spread propaganda, skirt sanctions, and finance influence operations targeting global audiences. At least 350 channels from the two countries collectively draw millions of monthly viewers, exploiting YouTube's algorithmic recommendations and monetisation features to push geopolitical agendas and generate revenue.

Key Tactics Exposed:
Bypassing Bans through Stealth Networks
Russian state networks like RT and Sputnik, officially banned in the EU, have resurfaced under disguised identities. Duplicate accounts repost archived content under innocuous handles (e.g., "Science Today" or "History Explorers"), slipping past platform filters. CNBC identified channels sharing identical thumbnails, metadata, and cross-promotion tactics.
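To illustrate the kind of duplicate-channel detection this reporting implies (this is not CNBC's or YouTube's actual methodology, and the channel data below is entirely hypothetical), a minimal sketch could flag channel pairs whose video titles and tags overlap heavily, using Jaccard similarity over metadata tokens:

```python
def jaccard(a: set, b: set) -> float:
    """Jaccard similarity: |intersection| / |union|."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def metadata_tokens(channel: dict) -> set:
    """Flatten a channel's video titles and tags into a lowercase token set."""
    tokens = set()
    for title in channel.get("titles", []):
        tokens.update(title.lower().split())
    tokens.update(tag.lower() for tag in channel.get("tags", []))
    return tokens

def likely_duplicates(channels: dict, threshold: float = 0.6) -> list:
    """Return (name_a, name_b, similarity) for channel pairs above the threshold."""
    names = list(channels)
    pairs = []
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            sim = jaccard(metadata_tokens(channels[names[i]]),
                          metadata_tokens(channels[names[j]]))
            if sim >= threshold:
                pairs.append((names[i], names[j], round(sim, 2)))
    return pairs

# Hypothetical data: a banned channel and a suspected clone reposting its content.
channels = {
    "RT_News": {"titles": ["Ukraine conflict explained", "Sanctions hurt the West"],
                "tags": ["geopolitics", "russia", "news"]},
    "Science Today": {"titles": ["Ukraine conflict explained", "Sanctions hurt the West"],
                      "tags": ["geopolitics", "russia", "news"]},
    "Cooking Fun": {"titles": ["Easy pasta recipes"],
                    "tags": ["food", "cooking"]},
}
print(likely_duplicates(channels))  # → [('RT_News', 'Science Today', 1.0)]
```

Real detection pipelines would also compare perceptual hashes of thumbnails and upload timing, but the same clustering idea applies: clones of a banned channel tend to reuse the original's metadata nearly verbatim.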
China’s “Pink Propaganda” Blitz
State-sponsored Chinese accounts avoid overt politics, instead flooding YouTube with entertainment and culture content (panda vlogs, cooking shows, travelogues). This "soft power" strategy normalises CCP-linked accounts without provoking enough scrutiny to trigger removal. One Beijing-funded channel posing as a "language learning" community pushed South China Sea territorial claims to 500,000 viewers.
AdSense: Paying for Deception
In violation of YouTube's guidelines, state-run accounts carry Google AdSense advertising. CNBC traced payments to U.S.- and EU-sanctioned Russian entities routed through cryptocurrency transactions and shell companies. One network reportedly earned more than $120,000 a month in ad revenue from pro-Kremlin war videos.
Platform Shortcomings:
Algorithmic Amplification: YouTube's recommendation algorithm amplifies state-produced content, pushing it to users interested in geopolitics, often without labels. Pro-Russian channels saw 65% more algorithmic recommendations after the invasion of Ukraine.
Moderation Blind Spots: Offending videos are often deleted hours after upload, but the channels themselves remain active rather than being banned. Others use AI voiceovers and pirated material to evade copyright strikes.
Monetisation Loopholes: Gaps in Google's ad systems' ability to detect state-backed operators have allowed sanctioned actors to monetise content.
Findings: Case studies show government-linked propaganda operations monetising on YouTube:
1. Russia's "News Network 24", a rebranded RT operation drawing 2.4M monthly views across India, Brazil, and Hungary and monetising pro-war disinformation.
2. China's "Dragon Culture TV", a CCP front that reached 800,000 subscribers with kung fu tutorials designed to "stealthily indoctrinate" viewers with Beijing's "national rejuvenation" narrative.
YouTube's Response: YouTube confirmed it removed 47 channels flagged in CNBC's evidence and acknowledged broader challenges: "We've invested in AI detection and human review, but actors constantly adapt." Critics argue enforcement remains reactive rather than preventive.
International Implications:
These campaigns erode trust in democratic processes, hijack diaspora communities, and discredit humanitarian work. Pro-Kremlin outlets pushed disinformation narratives during Germany's 2025 elections, while Chinese accounts flooded the comment sections of critical articles, drowning out dissenting voices and burying discussions of human rights abuses.
Closing
YouTube's reach and accessibility have made it a preferred target for digital authoritarianism. Until YouTube intervenes more actively, for example through clearer state-affiliation labels on such material, stricter policing of ad revenue for state-linked channels, and closer scrutiny of algorithmic amplification, it will remain a tool of geopolitical information manipulation. As state actors escalate their tactics and tools, the article closes with a call for greater platform responsibility, moving beyond reactive takedowns, to protect the integrity of the global information space.