<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	xmlns:georss="http://www.georss.org/georss"
	xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#"
	>

<channel>
	<title>Mark Zuckerberg &#8211; Faith Matters</title>
	<atom:link href="https://www.faith-matters.org/tag/mark-zuckerberg/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.faith-matters.org</link>
	<description>Working with Faith Communities Countering Extremism, Supporting Integration &#38; Challenging Hatred. Founded by Fiyaz Mughal</description>
	<lastBuildDate>Tue, 26 May 2020 10:25:45 +0000</lastBuildDate>
	<language>en-GB</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.7.1</generator>

<image>
	<url>https://www.faith-matters.org/wp-content/uploads/2023/01/favicon.png</url>
	<title>Mark Zuckerberg &#8211; Faith Matters</title>
	<link>https://www.faith-matters.org</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">95725945</site>
	<item>
		<title>US: Social media platforms dismantle disinformation campaigns</title>
		<link>https://www.faith-matters.org/us-social-media-platforms-dismantle-disinformation-campaigns/</link>
		
		<dc:creator><![CDATA[Faith Matters]]></dc:creator>
		<pubDate>Wed, 22 Aug 2018 13:00:39 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[politics]]></category>
		<category><![CDATA[bots]]></category>
		<category><![CDATA[cybersecurity]]></category>
		<category><![CDATA[Facebook]]></category>
		<category><![CDATA[fake news websites]]></category>
		<category><![CDATA[FireEye]]></category>
		<category><![CDATA[Google]]></category>
		<category><![CDATA[hacking]]></category>
		<category><![CDATA[Instagram]]></category>
		<category><![CDATA[Iran]]></category>
		<category><![CDATA[Mark Zuckerberg]]></category>
		<category><![CDATA[military intelligence]]></category>
		<category><![CDATA[Politics]]></category>
		<category><![CDATA[propaganda]]></category>
		<category><![CDATA[Russia]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[Syria]]></category>
		<category><![CDATA[Twitter]]></category>
		<category><![CDATA[U.S]]></category>
		<category><![CDATA[Ukraine]]></category>
		<category><![CDATA[Washington]]></category>
		<category><![CDATA[YouTube]]></category>
		<guid isPermaLink="false">https://www.faith-matters.org/?p=7823</guid>

					<description><![CDATA[Facebook Inc., Twitter Inc. and Alphabet Inc. collectively removed hundreds of accounts tied to an alleged Iranian propaganda operation on Tuesday, while Facebook took down a second campaign it said was linked to Russia. Facebook CEO Mark Zuckerberg said the accounts identified on his company&#8217;s platform were part of two separate campaigns, the first from [&#8230;]]]></description>
										<content:encoded><![CDATA[<p style="text-align: justify;">Facebook Inc., Twitter Inc. and Alphabet Inc. collectively removed hundreds of accounts tied to an alleged Iranian propaganda operation on Tuesday, while Facebook took down a second campaign it said was linked to Russia.</p>
<p style="text-align: justify;">Facebook CEO Mark Zuckerberg said the accounts identified on his company&#8217;s platform were part of two separate campaigns, the first from Iran with some ties to state-owned media, the second linked to sources which Washington has previously named as Russian military intelligence services.</p>
<p style="text-align: justify;">Officials in Iran, where it is a holiday to mark the Muslim Eid al-Adha festival, were not immediately available to comment. Moscow has repeatedly denied using hacking or fake social media accounts to influence foreign elections. The Russian embassy in Washington did not immediately respond to a request for comment.</p>
<p style="text-align: justify;">The move by Facebook and others is the latest attempt by global social media giants to guard against political interference on their platforms. It comes as concerns are rising about foreign attempts to disrupt the U.S. midterm elections in November.</p>
<p style="text-align: justify;">The United States earlier this year indicted 13 Russians for alleged attempts to meddle in U.S. politics, but the latest alleged Iranian activity, exposed by cybersecurity firm FireEye Inc. suggests the problem may be more widespread.</p>
<p style="text-align: justify;">&#8220;It really shows it&#8217;s not just Russia that engages in this type of activity,&#8221; Lee Foster, an information operations analyst with FireEye, told Reuters.</p>
<p style="text-align: justify;">FireEye said the Iranian campaign used a network of fake news websites and fraudulent social media personas spread across Facebook, Instagram, Twitter, Google Plus and YouTube, to push narratives in line with Tehran&#8217;s interests.</p>
<p style="text-align: justify;">The Iranian mission to the United Nations did not respond to a request for comment.</p>
<p style="text-align: justify;">The activity was aimed at users in the United States, Britain, Latin America and Middle East up through this month, FireEye said, and included &#8220;anti-Saudi, anti-Israeli, and pro-Palestinian themes&#8221; as well as advocacy of policies favorable to Iran such as the U.S.-Iran nuclear deal.</p>
<p style="text-align: justify;">FireEye said the Iranian activity did not appear &#8220;dedicated&#8221; to influencing the upcoming election, though some of the posts aimed at U.S. users did adopt &#8220;left-leaning identities&#8221; and took stances against President Donald Trump.</p>
<p style="text-align: justify;">That activity &#8220;could suggest a more active attempt to influence domestic U.S. political discourse&#8221; is forthcoming, Foster said, but &#8220;we just haven&#8217;t seen that yet.&#8221;</p>
<p style="text-align: justify;"><strong>&#8216;DISTINCT CAMPAIGNS&#8217;</strong></p>
<p style="text-align: justify;">Facebook said the Russia-linked accounts it removed were engaged in &#8220;inauthentic behavior&#8221; related to politics in Syria and Ukraine. It said that activity did not appear to be linked to the Iranian campaign.</p>
<p style="text-align: justify;">&#8220;These were distinct campaigns and we have not identified any link or coordination between them. However, they used similar tactics by creating networks of accounts to mislead others about who they were and what they were doing,&#8221; the company said in a statement.</p>
<p style="text-align: justify;">Facebook last month removed 32 pages and accounts tied to another misinformation campaign without describing its origins, but which U.S. lawmakers said likely had Russian involvement.</p>
<p style="text-align: justify;">Microsoft said this week that hackers linked to the Russian government sought to steal email login credentials from U.S. politicians and think tanks, allegations the Russian foreign ministry described as a &#8220;witch-hunt.&#8221;</p>
<p style="text-align: justify;">FireEye said the U.S.-focused Iranian activity ramped up last year, just months after Trump took office, with websites and social media accounts posting memes and articles, some of which were apparently copied from legitimate U.S. and Iranian news outlets.</p>
<p style="text-align: justify;">In some cases, the domains for the fake websites like &#8220;US Journal&#8221; and &#8220;Liberty Free Press&#8221; were originally registered years before the 2016 election, in 2014 and 2013, but most remained inactive until last year, FireEye said.</p>
<p style="text-align: justify;">Arabic-language, Middle East-focused websites appear to be part of the same campaign, the company added.</p>
<p style="text-align: justify;">The technology companies variously said they linked the accounts to Iran based on user phone numbers, email addresses, website registration records and the timing of account activity matching Iranian business hours.</p>
<p style="text-align: justify;">FireEye expressed &#8220;moderate confidence&#8221; about the Iranian origins, but said it has not been able to tie the accounts back to a specific organization or individuals.</p>
<p style="text-align: justify;">Hundreds of thousands of people followed one or more of the Facebook pages implicated in the campaign, Facebook said.</p>
<p style="text-align: justify;">It shared examples of removed posts, including a cartoon depicting an Israeli soldier executing a Palestinian and a fake movie poster showing President Trump embracing North Korean leader Kim Jong-un.</p>
<p style="text-align: justify;">Postings cited by FireEye expressed praise for U.S. politicians and other Twitter users who criticized the Trump administration&#8217;s decision in May to abandon the Iranian nuclear pact, under which Iran had agreed to curb its nuclear weapons program in exchange for loosening of sanctions.</p>
<p style="text-align: justify;">Some Twitter and Facebook accounts were designed to appear as if they were real people in the U.S., Britain and Canada, according to FireEye. The accounts used a combination of different hashtags to engage in U.S. culture, including &#8220;#lockhimup,&#8221; &#8220;#impeachtrump&#8221; and &#8220;notmypresident.&#8221;</p>
<p style="text-align: justify;">Twitter, which called the effort &#8220;coordinated manipulation,&#8221; said it removed 284 accounts.</p>
<p style="text-align: justify;">Facebook said it removed 254 pages and 392 accounts across its flagship platform as well as its Instagram service. Some of the accounts had events and groups associated with them.</p>
<p style="text-align: justify;">The accounts spent about $12,000 to advertise through Facebook and Instagram using a variety of currencies, Facebook said. The company said it had notified the U.S. Treasury and State departments of the purchases, which may potentially violate sanctions.</p>
<p style="text-align: justify;">Alphabet, which includes Google and YouTube, did not respond to a request to comment.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">7823</post-id>
	</item>
		<item>
		<title>Myanmar: Why Facebook is losing the war on hate speech in Myanmar</title>
		<link>https://www.faith-matters.org/myanmar-why-facebook-is-losing-the-war-on-hate-speech-in-myanmar/</link>
		
		<dc:creator><![CDATA[Faith Matters]]></dc:creator>
		<pubDate>Thu, 16 Aug 2018 10:56:48 +0000</pubDate>
				<category><![CDATA[hate speech]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Burma]]></category>
		<category><![CDATA[Burmese speakers]]></category>
		<category><![CDATA[Ethnic cleansing]]></category>
		<category><![CDATA[Facebook]]></category>
		<category><![CDATA[Mark Zuckerberg]]></category>
		<category><![CDATA[Myanmar]]></category>
		<category><![CDATA[repatriation]]></category>
		<category><![CDATA[Rohingya]]></category>
		<guid isPermaLink="false">https://www.faith-matters.org/?p=7749</guid>

					<description><![CDATA[In April, Facebook founder Mark Zuckerberg told U.S. senators that the social media site was hiring dozens more Burmese speakers to review hate speech posted in Myanmar. The situation was dire. Some 700,000 members of the Rohingya community had recently fled the country amid a military crackdown and ethnic violence. In March, a United Nations [&#8230;]]]></description>
										<content:encoded><![CDATA[<p style="text-align: justify;">In April, Facebook founder Mark Zuckerberg told U.S. senators that the social media site was hiring dozens more Burmese speakers to review hate speech posted in Myanmar. The situation was dire.</p>
<p style="text-align: justify;">Some 700,000 members of the Rohingya community had recently fled the country amid a military crackdown and ethnic violence. In March, a United Nations investigator said Facebook was used to incite violence and hatred against the Muslim minority group. The platform, she said, had &#8220;turned into a beast.&#8221;</p>
<p style="text-align: justify;">Four months after Zuckerberg&#8217;s pledge to act, here is a sampling of posts from Myanmar that were viewable this month on Facebook:</p>
<p style="text-align: justify;">One user posted a restaurant advertisement featuring Rohingya-style food. &#8220;We must fight them the way Hitler did the Jews, damn kalars!&#8221; the person wrote, using a pejorative for the Rohingya. That post went up in December 2013.</p>
<p style="text-align: justify;">Another post showed a news article from an army-controlled publication about attacks on police stations by Rohingya militants. &#8220;These non-human kalar dogs, the Bengalis, are killing and destroying our land, our water and our ethnic people,&#8221; the user wrote. &#8220;We need to destroy their race.&#8221; That post went up last September, as the violence against the Rohingya peaked.</p>
<p style="text-align: justify;">A third user shared a blog item that pictures a boatload of Rohingya refugees landing in Indonesia. &#8220;Pour fuel and set fire so that they can meet Allah faster,&#8221; a commenter wrote. The post appeared 11 days after Zuckerberg&#8217;s Senate testimony.</p>
<p style="text-align: justify;">The remarks are among more than 1,000 examples Reuters found of posts, comments, images and videos attacking the Rohingya or other Myanmar Muslims that were on Facebook as of last week. Almost all are in the main local language, Burmese. The anti-Rohingya and anti-Muslim invective analysed for this article – which was collected by Reuters and the Human Rights Center at UC Berkeley School of Law – includes material that&#8217;s been up on Facebook for as long as six years.</p>
<p style="text-align: justify;">The poisonous posts call the Rohingya or other Muslims dogs, maggots and rapists, suggest they be fed to pigs, and urge they be shot or exterminated. The material also includes crudely pornographic anti-Muslim images. The company&#8217;s rules specifically prohibit attacking ethnic groups with &#8220;violent or dehumanising speech&#8221; or comparing them to animals. Facebook also has long had a strict policy against pornographic content.</p>
<p style="text-align: justify;">The use of Facebook to spread hate speech against the Rohingya in the Buddhist-majority country has been widely reported by the U.N. and others. Now, a Reuters investigation gives an inside look at why the company has failed to stop the problem.</p>
<p style="text-align: justify;">For years, Facebook – which reported net income of $15.9 billion (£12.5 billion) in 2017 – devoted scant resources to combat hate speech in Myanmar, a market it dominates and in which there have been regular outbreaks of ethnic violence. In early 2015, there were only two people at Facebook who could speak Burmese reviewing problematic posts. Before that, most of the people reviewing Burmese content spoke English.</p>
<p style="text-align: justify;">To this day, the company continues to rely heavily on users reporting hate speech in part because its systems struggle to interpret Burmese text.</p>
<p style="text-align: justify;">Even now, Facebook doesn&#8217;t have a single employee in the country of some 50 million people. Instead, it monitors hate speech from abroad. This is mainly done through a secretive operation in Kuala Lumpur that&#8217;s outsourced to Accenture, the professional services firm, and codenamed &#8220;Project Honey Badger.&#8221;</p>
<p style="text-align: justify;">According to people familiar with the matter, the project, which handles many Asian countries, hired its first two Burmese speakers, who were based in Manila, just three years ago. As of June, Honey Badger had about 60 people reviewing reports of hate speech and other content posted by Myanmar&#8217;s 18 million active Facebook users. Facebook itself in April had three full-time Burmese speakers at a separate monitoring operation at its international headquarters in Dublin, according to a former employee.</p>
<p style="text-align: justify;">Honey Badger employees typically sign one-year renewable contracts and agree not to divulge that the client is Facebook. Reuters interviewed more than a half-dozen former monitors who reviewed Southeast Asian content.</p>
<p style="text-align: justify;">A Facebook official said outsourcing its content monitoring is more efficient because the companies it uses are specialists in ramping up such operations. He declined to disclose how many Burmese speakers the company has on the job worldwide, saying it was &#8220;impossible to know, to be definitive on that.&#8221;</p>
<p style="text-align: justify;">&#8220;It&#8217;s not enough,&#8221; he added.</p>
<p style="text-align: justify;">For many people in this emerging economy, Facebook is the internet: It&#8217;s so dominant, it&#8217;s the only site they use online. Yet, the company ignored repeated warnings as far back as 2013 that it faced trouble.</p>
<p style="text-align: justify;">Researchers and human rights activists say they cautioned Facebook for years that its platform was being used in Myanmar to promote racism and hatred of Muslims, in particular the Rohingya.</p>
<p style="text-align: justify;">&#8220;They were warned so many times,&#8221; said David Madden, a tech entrepreneur who worked in Myanmar. He said he told Facebook officials in 2015 that its platform was being exploited to foment hatred in a talk he gave at its headquarters in Menlo Park, California. About a dozen Facebook people attended the meeting in person, including Mia Garlick, now the company&#8217;s director of Asia Pacific policy, he said. Others joined via video. &#8220;It couldn&#8217;t have been presented to them more clearly, and they didn&#8217;t take the necessary steps,&#8221; Madden said.</p>
<p style="text-align: justify;">In a statement, Garlick told Reuters: &#8220;We were too slow to respond to the concerns raised by civil society, academics and other groups in Myanmar. We don&#8217;t want Facebook to be used to spread hatred and incite violence. This is true around the world, but it is especially true in Myanmar where our services can be used to amplify hate or exacerbate harm against the Rohingya.&#8221;</p>
<p style="text-align: justify;">She added that Facebook is focussed on addressing challenges that are unique to Myanmar &#8220;through a combination of people, technology, policies and programs.&#8221; The company also said it has banned several &#8220;hate figures and organizations&#8221; on Facebook in Myanmar.</p>
<p style="text-align: justify;">Facebook&#8217;s struggles in Myanmar are among much broader problems it faces. Zuckerberg&#8217;s congressional testimony in April primarily focussed on the company&#8217;s mishandling of user data, whether it censors conservative views and Russia&#8217;s exploitation of Facebook to meddle in the 2016 U.S. presidential election.</p>
<p style="text-align: justify;">Of all of Facebook&#8217;s travails, though, Myanmar may be the bloodiest. The Myanmar military stands accused by the U.N. of having conducted a brutal campaign of killings, mass rape, arson and ethnic cleansing against the Rohingya. The government denies the allegations.</p>
<p style="text-align: justify;">The social media giant doesn&#8217;t make public its data on hate speech in Myanmar. It says it has 2.2 billion global users and each week receives millions of user reports from around the world about problematic content.</p>
<p style="text-align: justify;">In compiling examples of hate speech for this article, Reuters found some that Facebook subsequently removed. But the vast majority remained online as of early August.</p>
<p style="text-align: justify;">After Reuters alerted Facebook to some of the derogatory posts included in this story, the company said it removed them. &#8220;All of it violated our policies,&#8221; it said.</p>
<p style="text-align: justify;">Reuters itself sometimes flags to Facebook threats posted on the platform against its reporters. These include the Burmese journalists Wa Lone and Kyaw Soe Oo, who are on trial in Myanmar on charges of violating a state secrets law. The two were arrested in December while reporting on the massacre of 10 Rohingya men and have received a deluge of death threats on social media over their story. Facebook has removed such content several times at the news agency&#8217;s request.</p>
<p style="text-align: justify;"><strong>&#8216;SENDING FLOWERS&#8217;</strong></p>
<p style="text-align: justify;">Myanmar emerged from decades of military rule in 2011, but religious violence has marred its transition to democracy. In 2012, clashes in Rakhine State between ethnic Rakhine, who are Buddhists, and the Rohingya killed scores of people and left 140,000 displaced – mostly Muslims.</p>
<p style="text-align: justify;">Facebook&#8217;s extraordinary dominance in Myanmar began taking root around the same time. But not by design.</p>
<p style="text-align: justify;">As recently as six years ago, Myanmar was one of the least connected countries on earth. In 2012, only 1.1 percent of the population used the internet and few people had telephones, according to the International Telecommunication Union, a U.N. agency. The junta that had ruled the country for decades kept citizens isolated.</p>
<p style="text-align: justify;">That all changed in 2013, when a quasi-civilian government oversaw the deregulation of telecommunications. The state-owned phone company suddenly faced competition from two foreign mobile-phone entrants from Norway and Qatar.</p>
<p style="text-align: justify;">The price of SIM cards dropped from more than $200 to as little as $2 and people purchased them in droves. By 2016, nearly half the population had mobile phone subscriptions, according to GSMA Intelligence, the research arm of the industry&#8217;s trade association. Most purchased smartphones with internet access.</p>
<p style="text-align: justify;">One app went viral: Facebook. Many saw it as an all-in-one solution – offering a messaging system, news, and videos and other entertainment. It also became a status symbol, said Chris Tun, a former Deloitte consultant who advised the government. &#8220;If you don&#8217;t use Facebook, you&#8217;re behind,&#8221; he said. &#8220;Even grandmas, everyone was on Facebook.&#8221;</p>
<p style="text-align: justify;">To capture customers, Myanmar&#8217;s mobile phone operators began offering a sweet deal: use Facebook without paying any data charges.</p>
<p style="text-align: justify;">&#8220;Facebook should be sending flowers to me, because we have been an accelerator for bringing the penetration,&#8221; said Lars Erik Tellmann, who until July was chief executive of Telenor Myanmar, part of Norway&#8217;s Telenor Group. &#8220;This was an initiative we took fully on our own. And this was extremely popular.&#8221;</p>
<p style="text-align: justify;">In Myanmar today, the government itself uses Facebook to make major announcements, including the resignation of the president in March.</p>
<p style="text-align: justify;"><strong>&#8216;GENOCIDE ALL OF THE MUSLIMS&#8217;</strong></p>
<p style="text-align: justify;">In the fall of 2013, Aela Callan, an Australian documentary filmmaker studying at Stanford University, began a project on hate speech and false reports that had spread online during conflicts between Buddhists and Rohingya Muslims the prior year. In June 2012, at least 80 people had died in riots and thousands of Rohingya were moved into squalid internment camps. Anti-Rohingya diatribes appeared on Facebook. One Buddhist nationalist group set up a page called the &#8220;Kalar Beheading Gang.&#8221;</p>
<p style="text-align: justify;">In November 2013, she met at Facebook&#8217;s California headquarters with Elliott Schrage, Vice President of Communications and Public Policy. &#8220;I was trying to alert him to the problems,&#8221; she said.</p>
<p style="text-align: justify;">Emails between the two show that Schrage put Callan in touch with internet.org, a Facebook initiative to bring the internet to developing countries, and with two Facebook officials, including one who worked with civil-society organizations to assist the company in coping with hate speech.</p>
<p style="text-align: justify;">&#8220;He didn&#8217;t connect me to anyone inside Facebook who could deal with the actual problem,&#8221; she said.</p>
<p style="text-align: justify;">Asked for comment, Schrage referred Reuters to a press person at Facebook. The company didn&#8217;t comment on the meeting.</p>
<p style="text-align: justify;">Matt Schissler, a doctoral student at the University of Michigan, said that between March and December 2014, he held discussions with Facebook officials in a series of calls and online communications. He told them how the platform was being used to spread hate speech and false rumours in Myanmar, he said, including via fake accounts. He and other activists provided the company with specific examples, including a Facebook page in Burmese that was called, &#8220;We will genocide all of the Muslims and feed them to the dogs.&#8221; The page was removed.</p>
<p style="text-align: justify;">Schissler belonged to a private Facebook group that was set up so that Myanmar human rights activists, researchers and company employees such as Asia Pacific policy chief Garlick could discuss how to cope with hate speech and other issues. The activists brought up numerous problems with Facebook&#8217;s multi-step reporting system for problematic content. As one example, they cited a photograph of an aid worker in Rakhine State in a post that called him &#8220;a traitor to the nation.&#8221; It had been shared 229 times, according to messages reviewed by Reuters.</p>
<p style="text-align: justify;">One of the private group&#8217;s members had reported it to Facebook as harassment of an individual but later received a message back: &#8220;We reviewed the photo you reported for containing hate speech or symbols and found it doesn&#8217;t violate our Community Standards.&#8221; After multiple complaints by activists over six weeks, a Facebook employee finally explained to the activists that the takedown request was rejected because the photo had been reported, but not the comment above it. It eventually was taken down.</p>
<p style="text-align: justify;">In March 2015, Schissler gave a talk at Facebook&#8217;s California headquarters about new media, particularly Facebook, and anti-Muslim violence in Myanmar. More than a dozen Facebook employees attended, he said.</p>
<p style="text-align: justify;">Two months later, Madden, the tech entrepreneur, gave a talk at Facebook headquarters about tensions and violence between Buddhists and Muslims. He said he showed a doctored picture that had spread on Facebook of the country&#8217;s de facto leader, Aung San Suu Kyi, who is Buddhist, wearing a Muslim head scarf. The image, Madden said, was meant to imply she was sympathetic to Muslims – a &#8220;very negative message&#8221; in Myanmar.</p>
<p style="text-align: justify;">&#8220;The whole point of this presentation was really just to sound the alarm, to show very vividly the context in which Facebook was operating, and already the evidence of how it was being misused,&#8221; he said. He left the meeting thinking his audience took the talk seriously and would take action.</p>
<p style="text-align: justify;">Madden had founded a technology hub and start-up accelerator in Yangon called Phandeeyar. He said he and others involved with the venture interacted with Facebook &#8220;many dozens&#8221; of times over the next several years, including via email, in the private Facebook group and in person, showing how the network&#8217;s systems for detecting and removing dangerous content were ineffective. He isn&#8217;t sure what steps the company took in response. &#8220;The central problem is that the mechanisms that they have to pull down hate speech in a timely way, before it does real world harm, they don&#8217;t work,&#8221; he said.</p>
<p style="text-align: justify;">Madden and Jes Kaliebe Petersen, Phandeeyar&#8217;s chief executive, said Facebook was still relying too much on their group and other volunteers to report dangerous posts. &#8220;It shouldn&#8217;t be incumbent on an organisation like ours or people who happen to be well-connected with folks inside Facebook to report these things,&#8221; Petersen said.</p>
<p style="text-align: justify;">In April, shortly before Zuckerberg&#8217;s Senate testimony, Phandeeyar and five other Myanmar groups blasted him for claiming in an interview with Vox that Facebook&#8217;s systems had detected and removed incendiary messages in September last year. &#8220;We believe your system, in this case, was us,&#8221; they wrote. Zuckerberg apologised.</p>
<p style="text-align: justify;">Back in 2014, tech organizations and researchers weren&#8217;t the only ones sounding alarms with Facebook. So was the Myanmar government.</p>
<p style="text-align: justify;">In July of that year, riots broke out in the central city of Mandalay after false rumours spread online, on Facebook and elsewhere, that a Muslim man had raped a Buddhist woman. A Buddhist man and a Muslim man were killed in the fighting.</p>
<p style="text-align: justify;">The Myanmar government asked Tun, then a Deloitte consultant, to contact the company. He said he didn&#8217;t succeed at first, and the government briefly blocked Facebook.</p>
<p style="text-align: justify;">Tun said he eventually helped to arrange meetings between the government and Facebook. &#8220;What they promised to do was, when you spot fake news, you could contact them via email,&#8221; Tun said of Facebook. &#8220;And they would take action – they were willing to take down pages after their own verification process.&#8221;</p>
<p style="text-align: justify;">The government began reporting cases to Facebook, but Tun said he quickly realized the company couldn&#8217;t deal with Burmese text. &#8220;Honestly, Facebook had no clue about Burmese content. They were totally unprepared,&#8221; he said. &#8220;We had to translate it into English for them.&#8221;</p>
<p style="text-align: justify;"><strong>&#8216;I DON&#8217;T KNOW THE LANGUAGE&#8217;</strong></p>
<p style="text-align: justify;">In August 2013, Zuckerberg announced a plan to make the Internet available for the first time to billions of people in developing countries.</p>
<p style="text-align: justify;">&#8220;Everything Facebook has done has been about giving all people around the world the power to connect,&#8221; he said. The company would now work, he added, to make &#8220;internet access available to those who cannot currently afford it.&#8221;</p>
<p style="text-align: justify;">But in Myanmar, the language barrier would cause trouble. Most people here don&#8217;t speak English. Although Myanmar users at the time could post on Facebook in Burmese, the platform&#8217;s interface – including its system for reporting problematic posts – was in English.</p>
<p style="text-align: justify;">Making matters worse, the company&#8217;s operation for monitoring content in Burmese was meagre.</p>
<p style="text-align: justify;">In 2014, the social media behemoth had just one content reviewer who spoke Burmese: a local contractor in Dublin, according to messages sent by Facebook employees in the private Facebook chat group. A second Burmese speaker began working in early 2015, the messages show.</p>
<p style="text-align: justify;">In Manila – the original site of the outsourced Project Honey Badger – there were no content reviewers who spoke Burmese. People who reviewed Myanmar content there spoke English.</p>
<p style="text-align: justify;">&#8220;In cases like hate speech where we didn&#8217;t understand the language, we would say, &#8216;I don&#8217;t know the language,'&#8221; said a person who worked there. &#8220;So the client had to solve that,&#8221; the person said, referring to Facebook.</p>
<p style="text-align: justify;">By 2015, Facebook had around four Burmese speakers reviewing Myanmar content in Manila and Dublin. They were stretched thin: that year Facebook had 7.3 million active users in Myanmar.</p>
<p style="text-align: justify;">Accenture slowly began to hire more Burmese speakers. With the help of volunteer translators, Facebook also introduced a Burmese-language interface.</p>
<p style="text-align: justify;">By 2016, the Honey Badger project had moved to Kuala Lumpur after Accenture convinced Facebook it would be easier to recruit Burmese and others to work in Malaysia&#8217;s capital than in further-off Manila, according to a person familiar with the matter.</p>
<p style="text-align: justify;">In an office tower in Kuala Lumpur, teams of content monitors are assigned to handle different Asian countries, not just Myanmar. They are paid around $850 to $1,000 a month and are often employed by temporary staffing agencies, according to ex-employees and online recruitment ads.</p>
<p style="text-align: justify;">Facebook said in a statement: &#8220;We&#8217;ve chosen to work only with highly reputable, global partners that take care of their employees, pay them well and provide robust benefits &#8211; this includes Accenture in Asia Pacific.&#8221;</p>
<p style="text-align: justify;">A spokesperson for Accenture confirmed it partners with Facebook. &#8220;The characterization of our operations as &#8216;secretive&#8217; is misleading and confidentiality is in place primarily to protect the privacy and security of our people and the clients we serve,&#8221; the spokesperson said.</p>
<p style="text-align: justify;"><strong>THE COMMUNICATIONS MAN</strong></p>
<p style="text-align: justify;">Former content monitors said they often each had to make judgements on 1,000 or more potentially problematic content items a day, although that number is now understood to be lower.</p>
<p style="text-align: justify;">Facebook&#8217;s complete rules about what is and isn&#8217;t allowed on its platform are spelled out in its internal community standards enforcement guidelines, which the company made public for the first time in April. It defines hate speech as &#8220;violent or dehumanising speech, statements of inferiority, or calls for exclusion or segregation&#8221; against people based on their race, ethnicity, religious affiliation and other characteristics.</p>
<p style="text-align: justify;">In response, Facebook said: &#8220;Content reviewers aren&#8217;t required to evaluate any set number of posts … We encourage reviewers to take the time they need.&#8221;</p>
<p style="text-align: justify;">A Facebook official also told Reuters the community standards policy is global, &#8220;but there are local nuances,&#8221; such as slurs, that content reviewers who are native speakers can consider when making decisions. But former content monitors told Reuters the rules were inconsistent; sometimes they could make exceptions and sometimes they couldn&#8217;t.</p>
<p style="text-align: justify;">Former content monitors also said they were trained to err on the side of keeping content on Facebook. &#8220;Most of the time, you try to give the user the benefit of the doubt,&#8221; said one former Facebook employee.</p>
<p style="text-align: justify;">The ex-monitors said they sometimes had as little as a few seconds to decide if a post constituted hate speech or violated Facebook&#8217;s community standards in some other way. They said they didn&#8217;t actually search for hate speech themselves; instead, they reviewed a giant queue of posts mostly reported by Facebook users.</p>
<p style="text-align: justify;">Many of the millions of items flagged globally each week – including violent diatribes and lurid sexual imagery – are detected by automated systems, Facebook says. But a company official acknowledged to Reuters that its systems have difficulty interpreting Burmese script because of the way the fonts are often rendered on computer screens, making it difficult to identify racial slurs and other hate speech.</p>
<p style="text-align: justify;">Facebook&#8217;s troubles are evident in a new feature that allows users to translate Burmese content into English. Consider a post Reuters found from August of last year.</p>
<p style="text-align: justify;">In Burmese, the post says: &#8220;Kill all the kalars that you see in Myanmar; none of them should be left alive.&#8221;</p>
<p style="text-align: justify;">Facebook&#8217;s translation into English: &#8220;I shouldn&#8217;t have a rainbow in Myanmar.&#8221;</p>
<p style="text-align: justify;">In response, Facebook said: &#8220;Our translations team is actively working on new ways to ensure that translations are accurate.&#8221; The company said it uses a different system to detect hate speech.</p>
<p style="text-align: justify;">Guy Rosen, vice president of product management, wrote in a blog post on Facebook in May about the problems the company faced in identifying hate speech. &#8220;Our technology still doesn&#8217;t work that well and so it needs to be checked by our review teams,&#8221; he wrote.</p>
<p style="text-align: justify;">Facebook officials say they have no immediate plans to hire any employees in Myanmar itself. But the company does contract with local agencies for tasks unrelated to content monitoring. One is Echo Myanmar, a communications firm whose managing director is Anthony Larmon, an American.</p>
<p style="text-align: justify;">Larmon has expressed strong opinions on the Rohingya. Toward the end of 2016, the Myanmar army launched an onslaught across some 10 villages after Rohingya militants attacked border posts. At the time, a U.N. official accused the government of seeking &#8220;ethnic cleansing&#8221; of the Rohingya.</p>
<p style="text-align: justify;">In November 2016, Larmon wrote that an article about the U.N. allegation was &#8220;misleading.&#8221; He cited what he said were claims by multiple &#8220;local journalists&#8221; that the ethnic minority &#8220;purposely exaggerate (lie about)&#8221; their situation to &#8220;get more foreign aid and attention.&#8221;</p>
<p style="text-align: justify;">He also wrote: &#8220;No, they aren&#8217;t facing ethnic cleansing or anything remotely close to what that incendiary term suggests.&#8221; He said he later removed the post.</p>
<p style="text-align: justify;">A Facebook spokesperson said that Larmon&#8217;s post &#8220;does not represent Facebook&#8217;s view.&#8221;</p>
<p style="text-align: justify;">Larmon told Reuters: &#8220;It was overly-emotional, under-informed commentary on a highly nuanced subject that I do regret. My view on the Rohingya, same today as then, is that they should be safely repatriated and protected.&#8221;</p>
<p style="text-align: justify;">The platform on which he aired his views about the Rohingya? Facebook.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">7749</post-id>	</item>
		<item>
		<title>Facebook deletes hundreds of posts under German hate-speech law</title>
		<link>https://www.faith-matters.org/facebook-deletes-hundreds-of-posts-under-german-hate-speech-law/</link>
		
		<dc:creator><![CDATA[Faith Matters]]></dc:creator>
		<pubDate>Tue, 31 Jul 2018 09:10:33 +0000</pubDate>
				<category><![CDATA[Facebook]]></category>
		<category><![CDATA[Germany]]></category>
		<category><![CDATA[hate speech]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Hate]]></category>
		<category><![CDATA[Hate Speech]]></category>
		<category><![CDATA[Hatred]]></category>
		<category><![CDATA[Mark Zuckerberg]]></category>
		<guid isPermaLink="false">https://www.faith-matters.org/?p=7535</guid>

					<description><![CDATA[Facebook said it had deleted hundreds of offensive posts since a law banning online hate speech came into force in Germany at the start of the year that foresees fines of up to 50 million euros (51.53 million pounds) for failure to comply. The social network received 1,704 complaints under the law, known in Germany [&#8230;]]]></description>
										<content:encoded><![CDATA[<p style="text-align: justify;">Facebook said it had deleted hundreds of offensive posts since 
a law banning online hate speech came into force in Germany at the start of the year that foresees fines of up to 50 million euros (51.53 million pounds) for failure to comply.</p>
<p style="text-align: justify;">The social network received 1,704 complaints under the law, known in Germany as NetzDG, and removed 262 posts between January and June, Richard Allan, Facebook&#8217;s vice president for global policy solutions said in a blog.</p>
<p style="text-align: justify;">&#8220;Hate speech is not allowed on Facebook,&#8221; Allan said, adding that the network had removed posts that attacked people who were vulnerable for reasons including ethnicity, nationality, religion or sexual orientation.</p>
<p style="text-align: justify;">Complaints covered a range of alleged offences under Germany&#8217;s criminal code, including insult, defamation, incitement to hatred and incitement to crime, the report said. Of the posts that were blocked, the largest number was for insult.</p>
<p style="text-align: justify;">Facebook is less popular in Germany than in other European countries, with only around two in five internet users logging on each month, according to researchers eMarketer.</p>
<p style="text-align: justify;">That&#8217;s in part due to collective memories of hate-filled propaganda that date back to Germany&#8217;s 20th century history of Nazi and Communist rule that don&#8217;t always sit well with Facebook&#8217;s broad view on freedom of speech.</p>
<p style="text-align: justify;">Chief Executive Mark Zuckerberg faced criticism in Germany after saying in a recent interview that Facebook should not delete statements denying that the Holocaust happened &#8211; a crime in Germany. He later clarified his remarks.</p>
<p style="text-align: justify;">Facebook has a dedicated team of 65 staff handling complaints under the NetzDG, Allan said, adding that this could be adjusted in line with the number of complaints.</p>
<p style="text-align: justify;">From January to June, Facebook removed a total of around 2.5 million posts that violated its own community standards designed to prevent abusive behaviour on the platform.</p>
<p style="text-align: justify;">&#8220;We have taken a very careful look at the German law,&#8221; Allan wrote in his blog, which was published in German.</p>
<p style="text-align: justify;">&#8220;That&#8217;s why we are convinced that the overwhelming majority of content considered hate speech in Germany, would be removed if it were examined to see whether it violates our community standards.&#8221;</p>
<p style="text-align: justify;">A lawmaker for Chancellor Angela Merkel&#8217;s ruling Christian Democratic Union (CDU), Tankred Schipanski, said the NetzDG law &#8211; which requires social platforms to remove offensive posts within 24 hours &#8211; was doing the job for which it was intended.</p>
<p style="text-align: justify;">($1 = 0.8579 euros)</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">7535</post-id>	</item>
		<item>
		<title>Facebook Founder’s Post Shows Why Jewish Muslim Relations Are Fundamental to Protect</title>
		<link>https://www.faith-matters.org/facebook-founders-post-shows-why-jewish-muslim-relations-are-fundamental-to-protect/</link>
		
		<dc:creator><![CDATA[Tell Mama]]></dc:creator>
		<pubDate>Fri, 11 Dec 2015 18:54:50 +0000</pubDate>
				<category><![CDATA[anti-Muslim]]></category>
		<category><![CDATA[Antisemitism]]></category>
		<category><![CDATA[Donald Trump]]></category>
		<category><![CDATA[Facebook]]></category>
		<category><![CDATA[Islamophobia]]></category>
		<category><![CDATA[Mark Zuckerberg]]></category>
		<category><![CDATA[Muslims]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Opinions]]></category>
		<category><![CDATA[Islam]]></category>
		<category><![CDATA[Jewish]]></category>
		<category><![CDATA[Jews]]></category>
		<category><![CDATA[Muslim]]></category>
		<guid isPermaLink="false">https://tellmamauk.org/?p=7021</guid>

					<description><![CDATA[<p>Muslim and Jewish communities have so much in common. From food, to religious practices and to the fact that both communities live so closely in various parts of the globe, the history of Muslim and Jewish communities have always been intertwined. Sometimes these relationships have been strained because of Israel and Palestine, but on the</p>
<p>The post <a rel="nofollow" href="https://tellmamauk.org/this-is-why-jewish-muslim-relations-are-fundamental-to-protect/">Facebook Founder&#8217;s Post Shows Why Jewish Muslim Relations Are Fundamental to Protect</a> appeared first on <a rel="nofollow" href="https://tellmamauk.org/">TELL MAMA</a>.</p>]]></description>
										<content:encoded><![CDATA[<div class="_1dwg">
<div id="js_2" class="_5pbx userContent" data-ft="{&quot;tn&quot;:&quot;K&quot;}">
<p style="text-align: justify;">Muslim and Jewish communities have so much in common. From food, to religious practices, to the fact that both communities live so closely in various parts of the globe, the histories of Muslim and Jewish communities have always been intertwined. Sometimes these relationships have been strained because of Israel and Palestine, but on the issue of tackling prejudice and bigotry, Jewish communities have stood with Muslim communities for decades.</p>
<p style="text-align: justify;">Over the last week, the comments of Donald Trump have ignited a debate about what constitutes hate speech and have led to hundreds of thousands of fellow Brits asking Parliament to discuss banning Trump from the shores of our tiny island.</p>
<p style="text-align: justify;">Yet amid all of these comments, the actions of one father, a Jewish father, have captured our hearts and, no doubt, the hearts of many others. He is Mark Zuckerberg, the founder of Facebook, who wrote a message of hope after hearing Trump’s bigoted and dangerous statements about Muslims. This is his inspirational and personal statement:</p>
<blockquote>
<p style="text-align: justify;">I want to add my voice in support of Muslims in our community and around the world.</p>
<p style="text-align: justify;">After the Paris attacks and hate this week, I can only imagine the fear Muslims feel that they will be persecuted for the actions of others.</p>
<p style="text-align: justify;">As a Jew, my parents taught me that we must stand up against attacks on all communities. Even if an attack isn’t against you today, in time attacks on freedom for anyone will hurt everyone.</p>
<p style="text-align: justify;">If you’re a Muslim in this community, as the leader of Facebook I want you to know that you are always welcome here and that we will fight to protect your rights and create a peaceful and safe environment for you.</p>
<p style="text-align: justify;">Having a child has given us so much hope, but the hate of some can make it easy to succumb to cynicism. We must not lose hope. As long as we stand together and see the good in each other, we can build a better world for all people.</p>
</blockquote>
<p style="text-align: justify;">Groups who purport to tackle Islamophobia, yet berate Tell MAMA for working with Jewish communities and the CST (a partnership that exists simply so that we can provide the best service to Muslims who have suffered anti-Muslim prejudice), should reflect carefully on their actions. They should realise that their personal prejudices, antisemitism and internalised bigotry have no place in our society and communities. If anything, reading Zuckerberg’s comments should shame them. Those comments inspire us to stand up against anti-Muslim prejudice and redouble our efforts to stand with other communities when they are attacked. Yesterday, Jews were targeted and virtually annihilated across Europe. Today, they stand with and for us.</p>
</div>
</div>
<p>The post <a href="https://tellmamauk.org/this-is-why-jewish-muslim-relations-are-fundamental-to-protect/" rel="nofollow noopener" target="_blank">Facebook Founder’s Post Shows Why Jewish Muslim Relations Are Fundamental to Protect</a> appeared first on <a href="https://tellmamauk.org/" rel="nofollow noopener" target="_blank">TELL MAMA</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">1547</post-id>	</item>
	</channel>
</rss>
