North Koreans Know Kim Jong Un Is A Liar, They Just Don’t Know How Much

(THIS ARTICLE IS COURTESY OF BUSINESS INSIDER)

 

North Koreans understand their government lies, but there’s one thing they don’t know, according to a defector

Servicepersons of the Ministry of People’s Security met on August 10, 2017, to express full support for the Democratic People’s Republic of Korea (DPRK) government statement. (Reuters/KCNA)
  • North Korean defector Kim Young-il is the 39-year-old founder of People for Successful Corean Reunification (PSCORE). He escaped North Korea at 19 years old.
  • Kim said that it is obvious to North Koreans that the government of Kim Jong Un is lying to the people about the country’s true situation.
  • The one thing it is impossible for North Koreans to understand, however, is how big the difference in prosperity is between their country and developed nations like the US and South Korea.

North Koreans understand that their government regularly lies to them and feeds them propaganda that contradicts their current situation, but few understand the true discrepancy between their country and the outside world, according to North Korean defector Kim Young-il.

Kim, the 39-year-old founder of People for Successful Corean Reunification (PSCORE), escaped North Korea when he was 19 years old. PSCORE is a nonprofit that promotes reunification, raises awareness about human rights issues in North Korea, and helps defectors adjust to life in South Korea.

In 1997, Kim and his father left the country in the midst of a four-year-long famine and economic crisis that some estimates suggest claimed the lives of between 240,000 and 3.5 million North Koreans, out of a population of 22 million.

The dire situation made it obvious to North Koreans at the time that the government was not telling the truth about the country, Kim told Business Insider in a recent interview. Kim, whose organization helps defectors escape North Korea and China and assists them once they reach South Korea, said that, even now, the situation is much the same; North Koreans know their government is lying.

“The people know these are all lies because it’s obvious. When the government says there is prosperity in terms of food and rice, we see it ourselves and see that there is a drought and there is no food for us,” Kim said.

“When they see that what they say doesn’t match with what is actually happening, they understand the government is lying.”

The one thing that North Koreans can’t know, according to Kim, is the actual disparity between the country and other nations like the US, South Korea, or China.

“They know [those countries are more prosperous and developed], but they don’t know at what level and how different the countries are. They have no frame of reference. All the government says are lies,” Kim said. “They have no way to obtain information about what South Korea or the United States look like.”

As Kim told the International Business Times last year, he and his family thought it was normal to “have our freedoms restricted.” It was only upon arriving in South Korea that Kim said he realized “how unhappy we were.”

China’s Xi Jinping Is A Master Of Propaganda, Which Is “Fake News”

(THIS ARTICLE IS COURTESY OF VOX)

 

China is perfecting a new method for suppressing dissent on the internet

America should pay attention.

Chinese leader Xi Jinping. (Getty Images)

The art of suppressing dissent has been perfected over the years by authoritarian governments. For most of human history, the solution was simple: force. Punish people severely enough when they step out of line and you deter potential protesters.

But in the age of the internet and “fake news,” there are easier ways to tame dissent.

A new study by Gary King of Harvard University, Jennifer Pan of Stanford University, and Margaret Roberts of the University of California, San Diego, suggests that China is the leading innovator on this front. Their paper, titled “How the Chinese Government Fabricates Social Media Posts for Strategic Distraction, Not Engaged Argument,” shows how Beijing, with the help of a massive army of government-backed internet commentators, floods the web in China with pro-regime propaganda.

What’s different about China’s approach is the content of the propaganda. The government doesn’t refute critics or defend policies; instead, it overwhelms the population with positive news (what the researchers call “cheerleading” content) in order to eclipse bad news and divert attention away from actual problems.

This has allowed the Chinese government to manipulate citizens without appearing to do so. It permits just enough criticism to maintain the illusion of dissent and only acts overtly when fears of mass protest or collective action arise.

To learn more about China’s internet propaganda machine, I reached out to Roberts, one of the authors of the paper. I asked her how successful China has been at manipulating its population and, more importantly, if she thinks this brand of online propaganda will become a model for authoritarianism in the digital age.

You can read our full conversation below.


Sean Illing

How does China use the internet to manipulate its population?

Margaret Roberts

With this particular study, we were motivated by rumors of what’s called the “50 Cent Party” in China [more on this below]. People were convinced that China was engaged in a widespread online propaganda campaign that targeted its own population. But we never had direct evidence that this was ongoing.

Then in 2014, there was a massive leak that revealed what China was doing and how they organized their propaganda machine. So that gave us an opportunity to look at the actual posts the Chinese government was producing and spreading on the web for propagandistic purposes.

We gathered up all the data from the leaked email archive, and that allowed us to explore the content of the propaganda, which is something that no one had done before.

Sean Illing

And what did you find?

Margaret Roberts

We had always thought that the purpose of propaganda was to argue against or undermine critics of the regime, or to simply persuade people that the critics were wrong. But what we found is that the Chinese government doesn’t bother with any of that.

Instead, the content of their propaganda is what we call “cheerleading” content. Basically, they flood the web with overwhelmingly positive content about China’s politics and culture and history. What it amounts to is a sprawling distraction campaign rather than an attempt to sell a set of policies or defend the policies of the regime.

Sean Illing

I want to dive deeper into that, but I want to make sure we don’t glide past the “50 Cent Party” reference. Can you explain what that is?

Margaret Roberts

The 50 Cent Party is a kind of informal branch of the Chinese government that carries out its online propaganda campaign — so these are the foot soldiers who post the content, share the posts, etc. The name stems from the rumor that the members were each paid 50 cents for every post that helped the government. We didn’t find evidence that people were being paid 50 cents, however. It turns out posting online propaganda is just part of a government job.

Sean Illing

Do we have any idea how many members there are or how many people occupy these posts?

Margaret Roberts

The rumor before we started studying this was that it’s something like 2 million people, but we simply don’t know for sure. We do estimate, however, that the government fabricates and posts 448 million social media comments a year.

People stage a rare large-scale protest not far from Tiananmen Square in Beijing on July 24, 2017, in connection with a recent crackdown on a company suspected of being involved in a pyramid scheme. (Getty Images)

Sean Illing

So let’s talk about China’s strategy. In the paper, you point out that China’s government actively manipulates its population, but that it doesn’t necessarily appear that way to its citizens. Part of the reason for this is China’s unusual approach to propaganda, which is to avoid refuting skeptics or defending policies and instead flood the digital space with happy news. What’s the strategic logic behind this approach?

Margaret Roberts

We think the purpose is distraction, because these posts are highly coordinated within certain time periods and the posts are written uniformly over time. They’re actually really bursty (meaning lots of similarly themed posts at the same time). The basic idea seems to be to flood the internet with positive noise in order to drown out bad news and distract from more serious or problematic issues.

Sean Illing

And they believe this is the most effective way to control political discourse?

Margaret Roberts

I think they realized that politics is about controlling the narrative and setting the agenda. Politicians and government officials in China want people to talk about the issues that reflect well on them. Their calculation is pretty simple: If they engage critics on issues that are complicated or reflect poorly on the government, they only amplify the attention those issues receive. So their approach is to ignore the criticisms and shift attention to other topics, and they do that by deluging the internet with positive propaganda.

Sean Illing

Are these positive stories actually true, or are we talking about “fake news”?

Margaret Roberts

This is a really interesting question. A lot of what we found in the leaked archive isn’t fake news. What they’re creating are stories that promote patriotism. They want people talking about and responding to content that favors the regime. But they also want people to think that content is coming from civilians and not from the government, which is why most of this is presented as someone’s opinion.

Sean Illing

What form does this cheerleading content take? What kinds of stories do they promote?

Margaret Roberts

The most common articles we found discussed how great it is to live in China or how wonderful Chinese culture is or how dominant China’s sports teams are — that kind of stuff. We’re not really talking about fact-based material here. It’s just positive stories that flatter the regime and the country.

Again, the point isn’t to get people to believe or care about the propaganda; it’s to get them to pay less attention to stories the government wants to suppress.

Sean Illing

Something else that jumped out at me in the paper was this idea that they want to permit just enough criticism to offer the illusion of dissent, but they want to make sure that there’s never enough criticism to spark collective action.

Margaret Roberts

China monitors the online information environment in order to collect information about the public and what they’re thinking. In that sense, they want people communicating freely. But a problem arises when you have too many people criticizing the government at the same time. There’s a constant risk of collective action or mass protest.

China’s government does its best to distinguish between useful criticisms (the kinds of criticisms that help them figure out how to satisfy the citizenry) and dangerous criticisms (the kinds of criticisms that might lead to mass protest events). They usually wait until there is a possibility for major mobilization against the government before they engage in overt censorship.

Sean Illing

Is China’s use of the internet unique or new? Are other governments doing similar things?

Margaret Roberts

I think there are aspects of the Chinese model that are new and unique, and certainly they’ve been at the forefront of trying to figure out how to control the internet. There is some evidence that other countries are learning from China, but nothing definitive.

Sean Illing

In the paper, you suggest this research might lead us to rethink the notion of “common knowledge” in theories of authoritarian politics — what does that mean?

Margaret Roberts

I think historically a lot of people have thought that common knowledge about things the government maybe has done wrong is detrimental to the regime. This is the idea that any criticism is detrimental to the regime. What we find in China is that criticism can be very helpful to the regime because it can allow them to respond.

But the type of common knowledge that’s really dangerous to the regime is knowledge of protests or other forms of collective action activity. That’s a major threat because it can spread so easily. We’ve seen this over and over throughout world history: Regimes are most vulnerable when small protests escalate into something much broader. This is what China’s government is determined to prevent.

People lie on the ground in Beijing on July 24, 2017, in protest against police for closing the road to a gathering where at least several thousand people staged a rare large-scale rally not far from Tiananmen Square in connection with a recent crackdown on a company suspected of being involved in a pyramid scheme. (Getty Images)

Sean Illing

To be clear, you call China’s approach “strategic distraction,” but it’s really about undercutting the possibilities for organized dissent. Regimes have always tried to capture people’s attention and redirect it in less dangerous directions. The only thing new about China’s operation is its use of the internet.

Margaret Roberts

I think that’s exactly right on.

Sean Illing

Do you think China’s approach to suppressing dissent is uniquely effective in an age of “fake news” and “post-truth”?

Margaret Roberts

The internet has created an environment in which there is a vast amount of information. That means it’s difficult for people to separate out “good” and “bad” information. Because many people have short attention spans online, they can easily be affected by information that looks like something it is not. That’s what China’s online propaganda and fake news have in common — they both take advantage of our short attention spans on the web.

Sean Illing

Is this a model for authoritarianism in the digital age? Should we expect more of this from other governments?

Margaret Roberts

The difficulty with online propaganda, and we’re seeing this in the US and other democracies around the world right now, is that it doesn’t function overtly like traditional forms of censorship. Most people object to blatant censorship. But online propaganda is a form of participation as well as a form of censorship, so it’s difficult to know what the right policy is.

People want to introduce information on the web en masse, and that means a lot of noise and opinions and bots and commentators. Are there ways of regulating all of this without censoring ourselves? I think that’s a really hard question, and I don’t have the answers. But I think the world will have to struggle with this new reality of online propaganda, because it isn’t going away.

Russian propaganda effort helped spread ‘fake news’ during election, experts say

(THIS ARTICLE IS COURTESY OF THE WASHINGTON POST)

November 24, 2016, at 8:27 PM
The flood of “fake news” this election season got support from a sophisticated Russian propaganda campaign that created and spread misleading articles online with the goal of punishing Democrat Hillary Clinton, helping Republican Donald Trump and undermining faith in American democracy, say independent researchers who tracked the operation.

Russia’s increasingly sophisticated propaganda machinery — including thousands of botnets, teams of paid human “trolls,” and networks of websites and social-media accounts — echoed and amplified right-wing sites across the Internet as they portrayed Clinton as a criminal hiding potentially fatal health problems and preparing to hand control of the nation to a shadowy cabal of global financiers. The effort also sought to heighten the appearance of international tensions and promote fear of looming hostilities with nuclear-armed Russia.

Two teams of independent researchers found that the Russians exploited American-made technology platforms to attack U.S. democracy at a particularly vulnerable moment, as an insurgent candidate harnessed a wide range of grievances to claim the White House. The sophistication of the Russian tactics may complicate efforts by Facebook and Google to crack down on “fake news,” as they have vowed to do after widespread complaints about the problem.

There is no way to know whether the Russian campaign proved decisive in electing Trump, but researchers portray it as part of a broadly effective strategy of sowing distrust in U.S. democracy and its leaders. The tactics included penetrating the computers of election officials in several states and releasing troves of hacked emails that embarrassed Clinton in the final months of her campaign.

“They want to essentially erode faith in the U.S. government or U.S. government interests,” said Clint Watts, a fellow at the Foreign Policy Research Institute who along with two other researchers has tracked Russian propaganda since 2014. “This was their standard mode during the Cold War. The problem is that this was hard to do before social media.”

Watts’s report on this work, with colleagues Andrew Weisburd and J.M. Berger, appeared on the national security online magazine War on the Rocks this month under the headline “Trolling for Trump: How Russia Is Trying to Destroy Our Democracy.” Another group, called PropOrNot, a nonpartisan collection of researchers with foreign policy, military and technology backgrounds, planned to release its own findings Friday showing the startling reach and effectiveness of Russian propaganda campaigns.

The researchers used Internet analytics tools to trace the origins of particular tweets and mapped the connections among social-media accounts that consistently delivered synchronized messages. Identifying website codes sometimes revealed common ownership. In other cases, exact phrases or sentences were echoed by sites and social-media accounts in rapid succession, signaling membership in connected networks controlled by a single entity.
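
The “rapid succession” signal the researchers describe lends itself to a simple illustration. The sketch below is not their actual tooling; it is a minimal, hypothetical Python example of one way to flag groups of accounts that echo identical text within a narrow time window. The post data, window size, and threshold are all invented for illustration.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical input: (account, timestamp, text) triples collected from social media.
posts = [
    ("acct_a", datetime(2016, 9, 11, 10, 0), "Clinton collapses at 9/11 memorial"),
    ("acct_b", datetime(2016, 9, 11, 10, 2), "Clinton collapses at 9/11 memorial"),
    ("acct_c", datetime(2016, 9, 11, 10, 3), "Clinton collapses at 9/11 memorial"),
    ("acct_d", datetime(2016, 9, 12, 8, 0), "Unrelated post about the weather"),
]

WINDOW = timedelta(minutes=10)   # how close together posts must land ("rapid succession")
MIN_ACCOUNTS = 3                 # how many distinct accounts make a group suspicious

def find_synchronized_groups(posts, window=WINDOW, min_accounts=MIN_ACCOUNTS):
    """Group posts by normalized text, then report any text echoed by
    several distinct accounts within `window` of its first appearance."""
    by_text = defaultdict(list)
    for account, ts, text in posts:
        by_text[text.strip().lower()].append((ts, account))

    suspicious = []
    for text, hits in by_text.items():
        hits.sort()                # chronological order
        first_ts = hits[0][0]
        accounts = {acct for ts, acct in hits if ts - first_ts <= window}
        if len(accounts) >= min_accounts:
            suspicious.append((text, sorted(accounts)))
    return suspicious

for text, accounts in find_synchronized_groups(posts):
    print(f"Possible coordinated network {accounts} echoing: {text!r}")
```

A real analysis would also fold in the other signals the article mentions, such as shared website code and near-duplicate rather than strictly identical phrasing.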

PropOrNot’s monitoring report, which was provided to The Washington Post in advance of its public release, identifies more than 200 websites as routine peddlers of Russian propaganda during the election season, with combined audiences of at least 15 million Americans. On Facebook, PropOrNot estimates that stories planted or promoted by the disinformation campaign were viewed more than 213 million times.

Some players in this online echo chamber were knowingly part of the propaganda campaign, the researchers concluded, while others were “useful idiots” — a term born of the Cold War to describe people or institutions that unknowingly assisted Soviet Union propaganda efforts.

The Russian campaign during this election season, researchers from both groups say, worked by harnessing the online world’s fascination with “buzzy” content that is surprising and emotionally potent, and tracks with popular conspiracy theories about how secret forces dictate world events.

Some of these stories originated with RT and Sputnik, state-funded Russian information services that mimic the style and tone of independent news organizations yet sometimes include false and misleading stories in their reports, the researchers say. On other occasions, RT, Sputnik and other Russian sites used social-media accounts to amplify misleading stories already circulating online, causing news algorithms to identify them as “trending” topics that sometimes prompted coverage from mainstream American news organizations.

The speed and coordination of these efforts allowed Russian-backed phony news to outcompete traditional news organizations for audience. Some of the first and most alarming tweets after Clinton fell ill at a Sept. 11 memorial event in New York, for example, came from Russian botnets and trolls, researchers found. (She was treated for pneumonia and returned to the campaign trail a few days later.)

This followed a spate of other misleading stories in August about Clinton’s supposedly troubled health. The Daily Beast debunked a particularly widely read piece in an article that reached 1,700 Facebook accounts and was read online more than 30,000 times. But the PropOrNot researchers found that the version supported by Russian propaganda reached 90,000 Facebook accounts and was read more than 8 million times. The researchers said the true Daily Beast story was like “shouting into a hurricane” of false stories supported by the Russians.

This propaganda machinery also helped push the phony story that an anti-Trump protester was paid thousands of dollars to participate in demonstrations, an allegation initially made by a self-described satirist and later repeated publicly by the Trump campaign. Researchers from both groups traced a variety of other false stories — fake reports of a coup launched at Incirlik Air Base in Turkey and stories about how the United States was going to conduct a military attack and blame it on Russia — to Russian propaganda efforts.

The final weeks of the campaign featured a heavy dose of stories about supposed election irregularities, allegations of vote-rigging and the potential for Election Day violence should Clinton win, researchers said.

“The way that this propaganda apparatus supported Trump was equivalent to some massive amount of a media buy,” said the executive director of PropOrNot, who spoke on the condition of anonymity to avoid being targeted by Russia’s legions of skilled hackers. “It was like Russia was running a super PAC for Trump’s campaign. . . . It worked.”

He and other researchers expressed concern that the U.S. government has few tools for detecting or combating foreign propaganda. They expressed hope that their research detailing the power of Russian propaganda would spur official action.

A former U.S. ambassador to Russia, Michael A. McFaul, said he was struck by the overt support that Sputnik expressed for Trump during the campaign, even using the #CrookedHillary hashtag pushed by the candidate.

McFaul said Russian propaganda typically is aimed at weakening opponents and critics. Trump’s victory, though reportedly celebrated by Putin and his allies in Moscow, may have been an unexpected benefit of an operation that already had fueled division in the United States. “They don’t try to win the argument,” said McFaul, now director of the Freeman Spogli Institute for International Studies at Stanford University. “It’s to make everything seem relative. It’s kind of an appeal to cynicism.”

The Kremlin has repeatedly denied interfering in the U.S. election or hacking the accounts of election officials. “This is some sort of nonsense,” Dmitry Peskov, press secretary for Putin, said last month when U.S. officials accused Russia of penetrating the computers of the Democratic National Committee and other political organizations.

RT disputed the findings of the researchers in an e-mail on Friday, saying it played no role in producing or amplifying any fake news stories related to the U.S. election. “It is the height of irony that an article about ‘fake news’ is built on false, unsubstantiated claims. RT adamantly rejects any and all claims and insinuations that the network has originated even a single ‘fake story’ related to the US election,” wrote Anna Belkina, head of communications.

The findings about the mechanics of Russian propaganda operations largely track previous research by the Rand Corp. and George Washington University’s Elliott School of International Affairs.

“They use our technologies and values against us to sow doubt,” said Robert Orttung, a GWU professor who studies Russia. “It’s starting to undermine our democratic system.”

The Rand report — which dubbed Russian propaganda efforts a “firehose of falsehood” because of their speed, power and relentlessness — traced the country’s current generation of online propaganda work to the 2008 incursion into neighboring Georgia, when Russia sought to blunt international criticism of its aggression by pushing alternative explanations online.

The same tactics, researchers said, helped Russia shape international opinions about its 2014 annexation of Crimea and its military intervention in Syria, which started last year. Russian propaganda operations also worked to promote the “Brexit” departure of Britain from the European Union.

Another crucial moment, several researchers say, came in 2011 when the party of Russian President Vladimir Putin was accused of rigging elections, sparking protests that Putin blamed the Obama administration — and then-Secretary of State Clinton — for instigating.

Putin, a former KGB officer, announced his desire to “break the Anglo-Saxon monopoly on the global information streams” during a 2013 visit to the broadcast center for RT, formerly known as Russia Today.

“For them, it’s actually a real war, an ideological war, this clash between two systems,” said Sufian Zhemukhov, a former Russian journalist conducting research at GWU. “In their minds, they’re just trying to do what the West does to Russia.”

RT broadcasts news reports worldwide in several languages, but the most effective way it reaches U.S. audiences is online.

Its English-language flagship YouTube channel, launched in 2007, has 1.85 million subscribers and has had a total of 1.8 billion views, making it more widely viewed than CNN’s YouTube channel, according to a George Washington University report this month.

Though widely seen as a propaganda organ, the Russian site has gained credibility with some American conservatives. Trump sat for an interview with RT in September. His nominee for national security adviser, retired Lt. Gen. Michael T. Flynn, traveled to Russia last year for a gala sponsored by the network. He later compared it to CNN.

The content from Russian sites has offered ready fodder for U.S.-based websites pushing far-right conservative messages. A former contractor for one, the Next News Network, said he was instructed by the site’s founder, Gary S. Franchi Jr., to weave together reports from traditional sources such as the Associated Press and the Los Angeles Times with ones from RT, Sputnik and others that provided articles that often spread explosively online.

“The readers are more likely to share the fake stories, and they’re more profitable,” said Dyan Bermeo, who said he helped assemble scripts and book guests for Next News Network before leaving because of a pay dispute and concerns that “fake news” was crowding out real news.

In just the past 90 days — a period that has included the closing weeks of the campaign, Election Day and its aftermath — the YouTube audience of Next News Network has jumped from a few hundred thousand views a day to a few million, according to analytics firm Tubular Labs. In October alone, videos from Next News Network were viewed more than 56 million times.

Franchi said in an e-mail statement that Next News Network seeks “a global perspective” while providing commentary aimed at U.S. audiences, especially with regard to Russian military activity. “Understanding the threat of global war is the first step to preventing it,” he said, “and we feel our coverage assisted in preventing a possible World War 3 scenario.”

Correction: A previously published version of this story incorrectly stated that Russian information service RT had used the “#CrookedHillary” hashtag pushed by then-Republican candidate Donald Trump. In fact, while another Russian information service, Sputnik, did use this hashtag, RT did not.
