[kictanet] A "light" hearted look at how ICT permeates all sectors

Margaret Favour margaret.nyambush at gmail.com
Tue Dec 6 14:31:13 EAT 2016


Dear listeners... has anyone read this? What is your take on the move it
describes?
Facebook needs to hand over its algorithm if it really wants to end fake
news
What is truth? (Stephen Lam / Reuters)
Written by Michael J. Coren <http://qz.com/author/mcorenqz/>

On Sept. 5, 2006
<http://mashable.com/2013/03/12/facebook-news-feed-evolution/#8xlfjXFoskqf>,
Facebook transformed from a technology platform into a media company. That
was the day the social media site replaced its chronological list of
friends’ updates with its algorithmic news feed.

Since then, Facebook has decided everything its users—now totaling 1.8
billion on a monthly basis—see in their unique feeds. “Once [Facebook] got
into the business of curating the newsfeed rather than simply treating it
as a timeline, they put themselves in the position of mediating what people
are going to see. They became a gatekeeper and a guide,” wrote
<https://medium.com/the-wtf-economy/media-in-the-age-of-algorithms-63e80b9b0a73#.hf8t71r2r>
Tim O’Reilly, a Silicon Valley investor and publisher, in a Medium post. “It’s
their job. So they’d better make a priority of being good at it.”
Screenshot of Facebook’s News Feed at launch in September 2006. (Facebook)

Facebook now stands as the world’s most influential editor, and perhaps its
most reluctant. Its CEO, Mark Zuckerberg, has repeatedly denied Facebook is
more than a technology company. The company’s main role, he argued in a
Nov. 12 Facebook post
<https://www.facebook.com/zuck/posts/10103253901916271>, is “helping people
stay connected with friends and family,” not sharing news or media. He
defined the company as one that builds software and maintains servers, and
only incidentally manages billions of people’s content. “We are a
technology company because the main thing we do across many products is
engineer and build technology to enable all these things,” he wrote.

This is certainly a message investors like to hear. Being an endlessly
scalable platform is far preferable to the messy business of media.
And as a blameless technology company, Facebook can float above any
responsibility for content shared on its network. Its sole job is to
maximize users’ “engagement” for the purpose of generating advertising
revenue. But as a company making conscious editorial decisions, Facebook
would risk its bottom line by favoring things such as decency, accuracy,
and other considerations that may depress engagement.
Facebook’s internal poster asks its employees an existential question.
(Facebook)

This contradiction was in high relief during the US presidential election.
A global cottage industry sprang up to feed fake news to Trump supporters
who proved the most profitable
<https://www.washingtonpost.com/news/the-intersect/wp/2014/10/21/this-is-not-an-interview-with-banksy/>
targets
for bogus information. BuzzFeed reports
<https://www.buzzfeed.com/craigsilverman/viral-fake-election-news-outperformed-real-news-on-facebook?utm_term=.ndb5D8nGD#.icODaoXEa>
17
of the 20 top-performing false election stories on Facebook were pro-Donald
Trump or anti-Hillary Clinton. In the last month of the presidential campaign,
the number of shares, reactions and comments to fake news significantly
outpaced
<https://www.buzzfeed.com/craigsilverman/viral-fake-election-news-outperformed-real-news-on-facebook>
that of mainstream reporting.

Facebook was—and remains—a very active editor in all this. Each day, the
company’s algorithms weigh clicks and likes to decide what appears in
users’ News Feeds <http://newsroom.fb.com/news/category/news-feed-fyi/>.
Thousands of people contracted by Facebook delete banned content such as
graphic violence. Content predicted to “engage” people is heavily
prioritized, while the rest sinks from view.

This process reveals a fundamental truth about being an editor, as noted by
Brian Phillips at MTV: it’s not an intention, it’s something you do
<http://www.mtv.com/news/2955021/shirtless-trump-saves-drowning-kitten/>.
And doing little or nothing to curb misleading or fake news is still an
editorial decision, says Sarah Roberts
<https://gseis.ucla.edu/directory/sarah-roberts/>, an information studies
professor at UCLA. “Letting bogus news flow through their pipes is actually
an editorial decision,” said Roberts in an interview. “No action is not
neutral.”
Time to open up

For Facebook to avoid this unwanted position, it may need to stop acting as
the sole editor. Opinion in Silicon Valley is drifting in that direction.
Paul Graham, co-founder of the Y Combinator startup fund, suggested
independent parties gain access to the News Feed to help inform how
information is prioritized.

The News Feed algorithm is largely a mystery to outsiders, but Facebook
product managers told Quartz it relies heavily on users’
preferences—shares, likes and reports—to serve up content that will excite
(or incite) friends and connections. Facebook uses “community feedback
<https://techcrunch.com/2016/11/10/facebook-admits-it-must-do-more-to-stop-the-spread-of-misinformation-on-its-platform/>”
to avoid prioritizing misleading stories, but engagement is the overriding
priority. News stories appear alongside memes, hoaxes, and conspiracy
theories in a stream of information, whether it comes from Fox News, the
Washington Post, or a freshly minted WordPress site. As people engage with
content they want to hear, and share with their friends, the flywheel of
fake news gathers speed.
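
As a concrete illustration of that flywheel, here is a minimal sketch of an
engagement-first ranker; the weights and field names are invented for the
sketch and bear no relation to Facebook’s actual, proprietary model:

    # Hypothetical sketch only: weights and field names are invented,
    # not Facebook's real (proprietary) ranking model.
    def engagement_score(post):
        score = (2.0 * post["shares"]       # signals that predict interaction
                 + 1.5 * post["comments"]
                 + 1.0 * post["likes"])
        score -= 5.0 * post["reports"]      # "community feedback" as a penalty
        return score

    def rank_feed(posts):
        # Note what is absent: no term for accuracy or source quality.
        return sorted(posts, key=engagement_score, reverse=True)

The point is what such an objective omits: nothing in it distinguishes a
hoax site from the Washington Post, so whatever engages best rises.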

That wasn’t a big deal when Facebook was one of dozens of social
networks. Rules that applied when Facebook had 100 million users have broken
down as it approaches 2 billion. Facebook’s reach now exceeds every other
player in
the media industry, giving it unprecedented power over the distribution of
information. Researchers have found that even small tweaks in presenting
information online can sway voter preferences. A 2015 study covered in Science
<http://www.sciencemag.org/news/2015/08/internet-search-engines-may-be-influencing-elections>
reordered search rankings in an experiment to place one candidate above
another. The researchers shifted voter preferences by the equivalent of a 2%
change in the electorate (or 2.6 million votes) in a hypothetical US election.
A similar
effect is possible
<http://www.sciencemag.org/news/2016/10/could-google-influence-presidential-election>
on Facebook and other social media. Clinton, it’s worth noting, lost three
crucial swing states in the Rust Belt by a combined 107,000 votes
<http://www.usatoday.com/story/news/politics/elections/2016/11/21/election-results-electoral-popular-votes-trump-clinton/94214826/>
or so.
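
For scale, the quoted figures are mutually consistent: 2.6 million votes is
2% of a 130-million-voter electorate, roughly the turnout of recent US
presidential elections, and about 24 times Clinton’s combined deficit in
those three states.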

Facebook’s only chance to escape this kind of responsibility may be to open
its News Feed to others, says Jonathan Zittrain
<http://hls.harvard.edu/faculty/directory/10992/Zittrain>, a computer
science and law professor at Harvard University. “The key is to have the
companies actually be the ‘platforms’ they claim they are,” he wrote on
Twitter <https://twitter.com/zittrain/status/798597027449815041>. “Facebook
should allow anyone to write an algorithm to populate someone’s feed.”
Giving up absolute control over the news feed would relieve Facebook of its
monopoly over users’ content, as well as the company’s responsibility to
manage it exclusively.

Zittrain suggested third parties build on top of the existing algorithm,
offering the News Feed as a customizable stream that others modify to suit
their audiences. By allowing anyone to help populate someone’s feed,
Facebook’s audience would gain a choice in, and transparency into, how their
news is sourced, much as they do today when choosing a media source. The most
egregious
clickbait and hoaxes can be eliminated by repurposing existing spam and
malware tools, while the hardest problem of “fake news”—biased or
misleading stories from popular sites—is better left to third parties
rather than a single Facebook feed, Zittrain said in an interview.

Facebook would serve as a sort of app store for groups like the
Massachusetts Institute of Technology, Ralph Nader, the Republican National
Committee or The New York Times to build “feed rankers” that weight news
surfaced in users’ feeds. Users could select combinations that suit their
tastes. While this does not solve the filter bubble problem (people
self-selecting groups they already agree with), he admits, it avoids making
Facebook, along with Twitter and Google, the arbiters of what’s true or
not. “That’s way too big a burden to put on one company, and it strikes me
as dangerous,” he said.
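
A minimal sketch of what such pluggable “feed rankers” might look like; every
interface below is hypothetical, since Facebook exposes no such API:

    # Hypothetical sketch of Zittrain's "feed rankers"; no such Facebook
    # API exists. Third parties supply scoring functions, and each user
    # picks a weighted blend of them.
    from typing import Callable, Dict, List

    Post = dict                       # stand-in for a feed item
    Ranker = Callable[[Post], float]  # a third-party scoring function

    def blended_feed(posts: List[Post],
                     rankers: Dict[str, Ranker],
                     user_weights: Dict[str, float]) -> List[Post]:
        def blended(post: Post) -> float:
            return sum(user_weights.get(name, 0.0) * rank(post)
                       for name, rank in rankers.items())
        return sorted(posts, key=blended, reverse=True)

    # A user might blend, say, a newspaper's ranker with a fact-checker's:
    # feed = blended_feed(posts, {"nyt": nyt_rank, "fact": fact_rank},
    #                     {"nyt": 0.7, "fact": 0.3})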

Others are hoping that third parties can label content in the News Feed
rather than sort it directly. One approach developed by Aviv Ovadya
<https://www.linkedin.com/in/avivovadya>, an MIT-trained software engineer
in San Francisco, is a non-partisan, fact-based database for news. “There
needs to be a third-party organization doing impartial credibility rating,”
said Ovadya in an interview. The non-profit would compile long-term analyses
of news sources. Machine learning and crowdsourced fact checking can
categorize content and verify falsifiable claims. Social media companies that
want to use this data source can label news with a credibility ranking rather
than screen it out entirely.
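
One way to picture the difference between labeling and screening out, with
every source name, score, and threshold invented for the sketch:

    # Hypothetical sketch of credibility labeling: an external database
    # scores sources, and the feed annotates rather than removes.
    CREDIBILITY_DB = {                     # stand-in for the third-party DB
        "example-news.com": 0.9,
        "freshly-minted-blog.example": 0.2,
    }

    def label_post(post):
        score = CREDIBILITY_DB.get(post["source"], 0.5)  # unknown -> neutral
        if score < 0.4:                        # invented threshold
            post["label"] = "disputed source"  # shown to users, not hidden
        return post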
Change is inevitable

So far, Zuckerberg has shown little willingness to let others into
Facebook’s inner sanctum. Academics and other researchers have been rebuffed
<http://www.slate.com/articles/technology/bitwise/2015/05/facebook_study_why_silicon_valley_s_incursion_into_academic_research_is.html>
from studying the company’s full data sets. Facebook is spending some of
its $23 billion
<https://techcrunch.com/2016/07/27/facebook-earnings-q2-2016/> cash pile to buy
up
<http://venturebeat.com/2016/11/11/facebook-buys-crowdtangle-a-social-analytics-tool-for-publishers/>
independent companies such as CrowdTangle that can offer insights into how
content performs on Facebook. The company has also not given much time to
rethinking its approach to news, reports Zittrain, who has had multiple
conversations with Facebook employees. “I think it’s something they care
about a lot, but it’s also one of those things where the important and urgent
diverge,” he said.

But it appears handing over control to outsiders isn’t a complete
nonstarter. The New York Times reported on Nov. 22
<http://www.nytimes.com/2016/11/22/technology/facebook-censorship-tool-china.html?mtrref=undefined&gwh=A2F7382EFE2E08725B76B4EDFCA15F21&gwt=pay&_r=0>
(paywall) that the social network is testing software to suppress news feed
content in specific regions of China. The feature appears to be a
requirement for Facebook’s entrance into the Chinese market, where it is
now blocked. Rather than censor posts itself, Facebook could reportedly let
third parties (most likely Chinese companies) monitor and restrict access
to popular stories.

No matter what the outcome, changes will happen anyway, predicted UCLA’s
Roberts. “There is going to be a big shakeup,” she says. Facebook’s position
is untenable. The idealistic idea, rooted in the early days of the
internet, that online conversation could happen without moderation from the
network designers has always been “nonsense,” she says. “We are stuck in a
paradigm that is out of date, may never have been true, and does not
scale.”

Public outcry is already pushing Facebook to act. The company banned fake
news sites from accessing its ad network
<http://qz.com/837474/facebook-fb-is-banning-fake-news-publishers-from-its-ad-network/>
on Nov. 14. Although that doesn’t affect the News Feed, Facebook has said “we
take misinformation on Facebook very seriously,” in a Nov. 10 statement
<https://techcrunch.com/2016/11/10/facebook-admits-it-must-do-more-to-stop-the-spread-of-misinformation-on-its-platform/>.
“We understand there’s so much more we need to do.”

Facebook may be facing its own version of Google’s fight against “content
farms” in 2011. At the time, Google was battling a rash of sites that paid
writers to churn out low-quality, often inaccurate content threatening to erode
trust
<https://techcrunch.com/2011/01/01/why-we-desperately-need-a-new-and-better-google-2/>
in
Google’s search results. Matt Cutts <https://www.linkedin.com/in/mattcutts>,
Google’s head of search spam, says Facebook’s struggles mirror Google’s at
the time. “When *external commentary* started to mirror our own
internal discussions and concerns, that was a real wake up call,” Cutts
wrote
<https://medium.com/the-wtf-economy/media-in-the-age-of-algorithms-63e80b9b0a73#.hf8t71r2r>
recently on Medium. “I see a pretty direct analogy between the Panda
algorithm and what Facebook is going through now.”

How did Google respond? The Panda update
<http://searchengineland.com/library/google/google-panda-update>. It
identified low-quality pages and dropped them lower in the rankings, a
change estimated to affect 12% of US search queries
<https://webmasters.googleblog.com/2011/04/high-quality-sites-algorithm-goes.html>.
The new algorithm shaved off enough revenue that it had to be disclosed in
later earnings calls, but it restored quality and confidence in the
company’s search results, Cutts writes: “It was the right decision to
launch Panda, both for the long-term trust of our users and for a better
ecosystem for publishers.”
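
Google never published Panda’s actual signals, but in spirit the change
demoted rather than deleted. A toy version of that kind of re-ranking, with
an invented quality signal:

    # Toy sketch of Panda-style demotion; Google's real quality
    # classifier was never published. Low-quality pages are pushed
    # down the rankings, not removed.
    def demote_low_quality(results, quality_of, penalty=0.5):
        def adjusted(result):
            score = result["relevance"]
            if quality_of(result) < 0.3:   # invented quality threshold
                score *= penalty           # demote, don't delete
            return score
        return sorted(results, key=adjusted, reverse=True)
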
Facebook has a harder problem. It can start by following Google’s lead and
suppressing clickbait and “low-quality” results. That was ultimately a
solvable problem for the search giant. Prioritizing accuracy in the
content, and not just engagement, will take more—and likely some outside
assistance.

On Tue, Dec 6, 2016 at 1:06 PM, Mildred Achoch via kictanet <
kictanet at lists.kictanet.or.ke> wrote:

> Hi Walubengo,
>
> Yes, that is indeed another way to look at the situation. But think about
> it: The mulika mwizi torch was much more readily available than say people
> going to look for candles, kerosene lamps etc. In a situation where seconds
> could be the difference between life and death, the handy mulika mwizi was
> transformed from a (usually disdained) mobile phone into a lifesaving
> device.
>
> Perhaps my screenwriter's mind is at fault. To me, the best comedy can
> come from the most tragic situations. I apologize for making light of a
> serious situation.
>
> Regards,
> Mildred Achoch.
>
> Check out the Rock 'n' roll film festival, Kenya TV Channel!
> http://kenyarockfilmfestivaljournal.blogspot.com
>
>
>
> On Tue, Dec 6, 2016 at 9:49 AM, Walubengo J <jwalu at yahoo.com> wrote:
>
>> @ Mildred,
>>
>> this is not 'light' hearted. This is scary stuff.
>>
>> walu.
>>
>> ------------------------------
>> *From:* Mildred Achoch via kictanet <kictanet at lists.kictanet.or.ke>
>> *To:* jwalu at yahoo.com
>> *Cc:* Mildred Achoch <mildandred at gmail.com>
>> *Sent:* Tuesday, December 6, 2016 12:20 AM
>> *Subject:* [kictanet] A "light" hearted look at how ICT permeates all
>> sectors
>>
>> "Surgeon used a 'Mulika Mwizi" for a caesarean section"
>>
>> http://www.the-star.co.ke/news/2016/10/10/surgeon-used-a-
>> mulika-mwizi-for-a-caesarean-section_c1434643
>>
>> Regards,
>> Mildred Achoch.
>>
>>
>>
>> Check out the Rock 'n' roll film festival, Kenya TV Channel!
>> http://kenyarockfilmfestivaljournal.blogspot.com
>>
>>
>>
>> _______________________________________________
>> kictanet mailing list
>> kictanet at lists.kictanet.or.ke
>> https://lists.kictanet.or.ke/mailman/listinfo/kictanet
>>
>> Unsubscribe or change your options at https://lists.kictanet.or.ke/m
>> ailman/options/kictanet/jwalu%40yahoo.com
>>
>> The Kenya ICT Action Network (KICTANet) is a multi-stakeholder platform
>> for people and institutions interested and involved in ICT policy and
>> regulation. The network aims to act as a catalyst for reform in the ICT
>> sector in support of the national aim of ICT enabled growth and development.
>>
>> KICTANetiquette : Adhere to the same standards of acceptable behaviors
>> online that you follow in real life: respect people's times and bandwidth,
>> share knowledge, don't flame or abuse or personalize, respect privacy, do
>> not spam, do not market your wares or qualifications.
>>
>>
>
> _______________________________________________
> kictanet mailing list
> kictanet at lists.kictanet.or.ke
> https://lists.kictanet.or.ke/mailman/listinfo/kictanet
>
> Unsubscribe or change your options at https://lists.kictanet.or.ke/
> mailman/options/kictanet/margaret.nyambush%40gmail.com
>
> The Kenya ICT Action Network (KICTANet) is a multi-stakeholder platform
> for people and institutions interested and involved in ICT policy and
> regulation. The network aims to act as a catalyst for reform in the ICT
> sector in support of the national aim of ICT enabled growth and development.
>
> KICTANetiquette : Adhere to the same standards of acceptable behaviors
> online that you follow in real life: respect people's times and bandwidth,
> share knowledge, don't flame or abuse or personalize, respect privacy, do
> not spam, do not market your wares or qualifications.
>

