
Fake News Writers Are Exploiting Americans For Profit

The war on fake news has a number of fronts. Governments, companies, professionals, and regular people are figuring out ways to identify and combat disinformation, deep fakes, and other kinds of manipulation. Some approaches may have consequences for free expression, while others are less intrusive. Disinformation and propaganda have been around longer than the internet, but we’re seeing firsthand just how quickly fake news can spread online and, with a little effort, be debunked. A Macedonian fake news writer provided a window into how one publication created and spread false stories, and why it’s such a big business. A doctor in the US is taking a more professional approach to fighting health-related disinformation, and Twitter recently announced it has acquired an AI startup to analyze how disinformation is shared.

Last week, the BBC released an interview with a woman in Veles, Macedonia, to shed light on how fake news writers are recruited and how they create and disseminate disinformation. The writer, referred to as Tamara, was hired to take articles from right-wing US publications and rewrite them for copycat websites targeting US audiences. Tamara is a self-described liberal and said she was horrified by the content of the articles she rewrote. She shared the spreadsheets of stories she had to rewrite and the sites she pulled the content from. According to the BBC, Tamara said, “As you can see I just typed ‘Muslim attacks’ and there are so many articles about Muslims attacking people. Many of these I believe are not even true, they are just making it up.”

One of the sites Tamara used had almost a hundred pages of results for “Muslim attacks,” many of which contained inaccuracies and images pulled from entirely different incidents. Tamara claimed she was directed to pull images from Google, even if they weren’t related to the story she was writing. Although many of the stories she produced were based on real events, they were framed to provoke fear and anger and to play into readers’ prejudices. Tamara explained that even the fabricated stories contained elements of truth.

She said, “That thing happened, the people were there, the place was there. So it was never fake stories” in the sense of fabricating every detail. “It was propaganda and brainwashing in the way of telling the story.”

Tamara doesn’t believe that the people who wrote the original stories believed what they were publishing either. “To even make up an article like this, you have to be very aware of what you are writing. This can’t come out of stupidity… I don’t think they believe in the stories they are writing, they know it is fake news, they know they are producing a lie. How delusional do you have to be to think that this is real?”

The stories were written to be short and easy to share on social media, which is where these pages generated revenue before being removed by Facebook. Her boss, referred to as Marco, ran two fake news sites which Tamara said had a combined following of two million on Facebook. According to a 2017 CNN interview with another owner of a fake news site, a typical million-follower sensational clickbait news page made around two thousand dollars per day. Tamara made about three euros per post, around twenty-four euros per day, which she claims is triple what she would have made working a local job. You’d think that even for that kind of money the nature of the work would weigh on her conscience, but Tamara justifies it by putting the blame on the readers. She told the BBC, “I try to split myself and my own beliefs from the stuff I was writing. So I tried to stay as out of it as I can. I just saw it as writing words. I tried not to think about writing propaganda. My take was that if people are stupid enough to believe these stories, maybe they deserve this. If they think this is the truth, then maybe they deserve this as a way of punishment.”

The site Tamara wrote for was taken down in the Facebook fake news purge. She said that her boss, Marco, was shaken up over having his pages and personal account shut down. She didn’t hear from him for a long time, and then he called her up asking if she wanted to write for another one of his fake news sites. She declined.

Twitter is expanding its approach to combating fake news by acquiring Fabula AI, a startup that examines how the sharing patterns of disinformation differ from those of genuine news. Twitter CTO Parag Agrawal announced in a blog post on Monday, “We are excited to announce that, to help us advance the state of machine learning, we have acquired Fabula AI, a London-based start-up, with a world-class team of machine learning researchers who employ graph deep learning to detect network manipulation. Graph deep learning is a novel method for applying powerful machine learning techniques to network-structured data. The result is the ability to analyze very large and complex datasets describing relations and interactions, and to extract signals in ways that traditional ML techniques are not capable of doing.”
The Fabula AI team is joining an internal research group led by Twitter’s head of Machine Learning and AI engineering. Fabula’s chief scientist and co-founder, Michael Bronstein, says the group is focusing on strategic areas such as natural language processing, reinforcement learning, machine-learning ethics, recommendation systems, and graph deep learning. Twitter says this strategic investment in Fabula’s capabilities will be a key driver in helping people feel safe and see relevant information on Twitter. The team believes that analyzing the millions of tweets, retweets, and likes shared on Twitter every day will help them improve both the health of the conversation on the platform and Twitter’s features.

Fabula AI focuses on how disinformation spreads and who is spreading it, rather than on the content itself, which is how most others approach the issue. Fabula’s patented algorithms use geometric deep learning to detect misinformation in datasets so massive and complex that traditional machine learning methods struggle with them. It’s not yet clear whether the tools will be used exclusively for Twitter or made available to other platforms as well, though Fabula had said it intended to offer an API for other publishers and platforms later this year. Fabula’s founders decided that being acquired by Twitter gives them an opportunity to have a much larger and deeper impact than if they were to stay decentralized and open. Twitter’s plans for the technology are not clear either, but it still has the option of making it available to other platforms. When asked by TechCrunch whether it would share the technology, a Twitter spokesperson said, “There’s more to come on how we will integrate Fabula’s technology where it makes sense to strengthen our systems and operations in the coming months. It will likely take us some time to be able to integrate their graph deep learning algorithms into our ML platform. We’re bringing Fabula in for the team, tech and mission, which are all aligned with our top priority: Health.”
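
Neither Twitter nor Fabula has published the details of those algorithms, so the snippet below is only a rough illustration of the general idea behind spread-based detection, not Fabula’s actual system. It builds a tiny share cascade as a graph in plain Python/NumPy, mixes each account’s features with its neighbors’ through two simple graph-convolution layers, and pools the result into a single score. Every feature and weight in it is a hypothetical placeholder; a real system would learn the weights from labeled cascades of genuine and fabricated stories.

```python
import numpy as np

def normalized_adjacency(edges, num_nodes):
    """Symmetrically normalized adjacency matrix with self-loops,
    the usual preprocessing step before a graph convolution."""
    A = np.eye(num_nodes)
    for src, dst in edges:
        A[src, dst] = A[dst, src] = 1.0
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
    return d_inv_sqrt @ A @ d_inv_sqrt

def graph_conv(A_norm, X, W):
    """One graph-convolution layer: each node mixes its features
    with its neighbors' before a ReLU nonlinearity."""
    return np.maximum(A_norm @ X @ W, 0.0)

def cascade_score(edges, node_features, W1, W2, w_out):
    """Score one share cascade: two conv layers, mean-pool over
    all accounts, then squash to a 0-1 'suspicious spread' score."""
    A_norm = normalized_adjacency(edges, node_features.shape[0])
    h = graph_conv(A_norm, node_features, W1)
    h = graph_conv(A_norm, h, W2)
    pooled = h.mean(axis=0)
    return 1.0 / (1.0 + np.exp(-pooled @ w_out))

# Toy cascade: account 0 posts a story, account 1 reshares it, and
# accounts 2 and 3 reshare from account 1. The three per-account
# features stand in for signals such as account age, follower count,
# and posting rate (all hypothetical here).
rng = np.random.default_rng(0)
edges = [(0, 1), (1, 2), (1, 3)]
X = rng.normal(size=(4, 3))
W1, W2 = rng.normal(size=(3, 8)), rng.normal(size=(8, 8))  # untrained weights
w_out = rng.normal(size=8)
print(f"spread-pattern score: {cascade_score(edges, X, W1, W2, w_out):.3f}")
```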

Health is also a top priority for others combating fake news, this time within the medical field. Dr. Austin Chiang is an Assistant Professor of Medicine at Jefferson Health in Philadelphia. He also serves as the first Chief Medical Social Media Officer, leading the fight against fake news online. Dr. Chiang believes the best way to combat disinformation is to counter untrustworthy content with posts from actual medical experts whom Americans can relate to and trust. On Instagram, he has over twenty thousand followers, making him a valued medical influencer.

Still, his following isn’t as large as those of other medical influencer accounts, like “Medical Medium,” a psychic with nearly two million followers who praises vegetables as a cure for diseases ranging from depression to diabetes.

According to CNBC, Dr. Chiang said, “This is the biggest crisis we have right now in health care. Everyone should be out there, but I realize I’m one of the few.” According to Chiang, doctors have been reluctant to build their own followings on social media for many reasons: most either view it as a waste of time, lack the skills to run a media account properly, or worry about saying the wrong thing and facing consequences. Most online consumers do not examine the latest scientific literature, which is why Dr. Chiang is calling on health professionals to take the time to connect with people where they spend most of their time: social media. He is currently recruiting a group of physicians, nurses, patient advocates, and other health professionals to get online and help fight misinformation. He is also creating guidance for doctors and other health professionals on how to use digital tools in that fight, including how to disclose any conflicts of interest; greater transparency about industry ties will help doctors build trustworthy relationships with the public.

Dr. Chiang has set up a new non-profit organization for health professionals, the Association for Healthcare Social Media, and is building campaigns to drive public awareness, including “#VerifyHealthcare,” which promotes disclosure, and a “don’t go viral” campaign countering anti-vaxxer content. Measles cases are currently rising in the US, and health professionals blame parents who refuse to vaccinate their children. Dr. Chiang wants this to change; his goal is to get doctors more active on social media so that people who want accurate information have a reliable source.

Who dictates reliability online is quickly becoming a hot issue, which is why it’s important for individuals to make those distinctions for themselves, alongside, rather than instead of, companies and institutions. We will continue to cover developments on the fake news front, so be sure to check back here for further updates. We’ll have new videos every Monday through Thursday, and soon every day of the week.

Now Hiring: Content Creator

Senior Producer / Content Creator – Salary negotiable

Responsibilities:

  • Produce and host video content
  • Pitch project ideas
  • Participate in live events
  • Community engagement

Skills:

  • Writing and production
  • Ability to work solo or with a team
  • Ethical journalistic standards

*Research and on-the-ground content may expose you to violent and offensive imagery.

We are striving for the highest standard in journalistic ethics. We will not tolerate content pushing a political agenda. Our mission is to promote critical thinking by presenting our audience with facts from which they can form their own informed opinion or analysis.

Send your cover letter and resume to pitch@subverse.net

Now Hiring: Junior Researcher / Production Assistant

Junior Researcher / Production Assistant – $15/hr

Responsibilities:

  • Daily research and fact checking
  • Writing articles and scripts for a teleprompter
  • Pitch story ideas
  • Coordinate interviews
  • Community engagement
  • Assist on shoots and live events
  • Potential to host content

Skills:

  • Media literacy and writing proficiency
  • Ethical journalistic standards
  • Ability to work on solo and team projects
  • Must be fit for on-the-ground shoots that can be physically demanding
  • Camera and/or production skills a plus (Adobe Creative Suite preferred)
  • Fluency in other languages a plus

*Research and on-the-ground content may expose you to violent and offensive imagery.

We are striving for the highest standard in journalistic ethics. We will not tolerate content pushing a political agenda. Our mission is to promote critical thinking by presenting our audience with facts from which they can form their own opinion or analysis.

Send your cover letter and resume to pitch@subverse.net
