[kictanet] [should the victims be blamed? aren't platforms responsible as enablers and amplifiers?] Child marriage on facebook

Patrick A. M. Maina pmaina2000 at yahoo.com
Tue Nov 20 20:39:29 EAT 2018


 Then it looks like you have everything under control. :-)
Hope the problem gets solved soonest, for the sake of us all. 
Best wishes! 
    On Tuesday, November 20, 2018, 8:30:08 PM GMT+3, Ebele Okobi <ebeleokobi at fb.com> wrote:  
 
 The problem set is a platform with 2.4 billion people posting billions of pieces of content. We have some proactive review of things like child exploitation content, but it is really difficult to pre-review every single piece of content at this scale. It's also not even what a majority of users want. I know that I personally use FB a great deal, and I would not want that.
To ensure that *no* bad content is ever posted, that kind of pre-clearance is actually what would be necessary. This is an issue we have spent a great deal of time thinking about, with multiple experts, and while we keep finding more ways to identify content, a state of no bad content ever being posted, or a system that would enable FB itself to be aware of every single piece of content: this is a genuine question: how do you think that would work?
Saying "private tech platform" doesn't answer the question: detecting crime is still detecting crime/bad actors, and it's still not clear from this phrase how you think this would work in real life.
We pay multiple consultants and confer with thousands of rights activists, safety advocates, CSOs, law enforcement, etc., and it's not apparent from this exchange that you are an expert in any of the topics discussed, so the notion that my genuine interest in your opinion amounts to a request for free consulting is odd. 
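For concreteness, a very rough sketch of the hybrid model described above (a narrow proactive check for the highest-severity categories, with everything else published immediately and routed to reviewers only when the community reports it) might look something like the following. All class, function, keyword and threshold names here are hypothetical illustrations, not Facebook's actual systems.

# Illustrative sketch only (assumed names, not Facebook's real pipeline).
from dataclasses import dataclass, field

@dataclass
class Post:
    post_id: int
    text: str
    reports: int = 0

@dataclass
class ReviewQueue:
    items: list = field(default_factory=list)

    def enqueue(self, post, reason):
        self.items.append((post.post_id, reason))

def looks_like_high_severity(text):
    # Stand-in for trained classifiers / known-content hash matching;
    # a keyword check is used here purely for illustration.
    keywords = ("auction", "for sale", "bride price")
    return any(k in text.lower() for k in keywords)

def handle_new_post(post, queue):
    # Proactive path: only a narrow, high-severity slice is screened up front.
    if looks_like_high_severity(post.text):
        queue.enqueue(post, "proactive_flag")
        return "held_for_review"
    return "published"  # the vast majority of content is never pre-reviewed

def handle_report(post, queue, report_threshold=1):
    # Reactive path: community reports route already-published content to human review.
    post.reports += 1
    if post.reports >= report_threshold:
        queue.enqueue(post, "user_report")

if __name__ == "__main__":
    queue = ReviewQueue()
    p1 = Post(1, "Family photos from the weekend")
    p2 = Post(2, "Bride auction this Friday, highest bidder wins")
    print(handle_new_post(p1, queue))  # -> published
    print(handle_new_post(p2, queue))  # -> held_for_review
    handle_report(p1, queue)           # a user later flags the first post
    print(queue.items)                 # -> [(2, 'proactive_flag'), (1, 'user_report')]

The point of the sketch is only the shape of the trade-off: pre-clearing everything would mean running the proactive path on every single post, which is the scale problem described above.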




On Nov 20, 2018, at 5:16 PM, Patrick A. M. Maina <pmaina2000 at yahoo.com> wrote:


Sawa (okay), in-line answers below...
On Tuesday, November 20, 2018, 7:52:11 PM GMT+3, Ebele Okobi <ebeleokobi at fb.com> wrote:

You haven’t answered any of my questions.  
I have repasted them for reference:


Would it be preferable to have a platform where every single post, picture, comment is subject to pre-clearance by Facebook?
A: Are those the only options? Is that the only alternative to the status quo? What happened to open innovation? Ideas for better *technical* solutions exist, but the good ones are not free. 


How do you think policing works in society?
A: Policing on FB is not analogous to policing in society. FB is a private tech platform. 
Are there police assigned to each individual, actively monitoring each?
A: Please see the answer above.
How do you think actual communities work?
A: Please see the answer above.
If not for community, exactly how should a platform of 2.4 billion people posting billions of pieces of content per hour, the vast majority of which is completely innocuous, work, in your view?
A: I could tell you, but that would be free consulting. :-) 

What is an "educating" model?
A: Again, I can't do free consulting for billion-dollar companies. 
FB could try an "innovation competition" to get a cheap/free brainstorm on the issue, but I think people globally are wising up to the odds around such events, and so the quality of ideas is going down. Probably another area that needs new thinking.  

Enjoy your evening! :-)
Patrick.



On Nov 20, 2018, at 4:45 PM, Patrick A. M. Maina <pmaina2000 at yahoo.com> wrote:


On the "educating model", I can do some ad-hoc paid consulting for you guys if you haven't thought of it. Let's discuss offline if interested.
I don't understand the society argument... Facebook is not "society". It is a *for-profit business entity* founded on what looks like a predatory business model, one which exploits human and societal weaknesses (e.g. narcissism, personal insecurities, reward mechanisms in the brain, etc.) for monetized data and engagement.  
There's an interesting pattern that I hadn't originally picked up on... It looks like mega corporations embrace pseudo-communist ideals to avoid owning problems that *they* created or exacerbated: "Beloved users, the problem belongs to all of us, because we need each other as a community. So each one of you should give mega-corp a free lunch because it's good for you"... but when it comes to profits, they revert to pure capitalism: "Our profits belong to shareholders only. We are capitalists. No free lunches!" 
Take ownership.
Opinion today: Facebook's excessive focus on profits
Regulation looms for social media — much as big banks after the financial crisis

On Tuesday, November 20, 2018, 6:53:35 PM GMT+3, Ebele Okobi <ebeleokobi at fb.com> wrote:

What is an “educating” model?

How do you think policing works in society?
Are there police assigned to each individual, actively monitoring each?
How do you think actual communities work?
If not for community, exactly how should a platform of 2.4 billion people posting billions of pieces of content per hour, the vast majority of which is completely innocuous, work, in your view?
On Nov 20, 2018, at 3:43 PM, Patrick A. M. Maina <pmaina2000 at yahoo.com> wrote:


It kinda looks a bit like a *nudged* duty... :-) Otherwise, why would low levels of reporting be an issue if the system does not heavily *rely* on community reporting (yay! free labor!)? Doesn't the community have its own engagements to focus on, and should Facebook not respect that?
Are you sure that FB truly supports free expression, or is it simply that it's cheaper (more profitable) to offload policing to the community?

If the FB platform truly supported "free expression", things would be more complicated because, instead of takedowns, you would use an *educating model*, which is much harder to pull off. 
Sidenotes:
a. I'm curious how FB defines "free expression".
b. On the child marriage, FB acted after it was too late (the girl had been sold off). This suggests heavy reliance on community policing. Is this a form of wilful negligence on the platform's part, given that by now FB is aware that it is being misused for anti-social purposes?
Good evening.
Patrick.
"We continue to evolve our ability to detect violations on our platforms, but YES, it is YOU, the community, who helps to police the content on Facebook. 
I continue to be struck by the incredibly low levels of reporting across our Continent. We continue to develop educational materials, but I am always surprised at how few people, even in circles like this, know to report bad content into the platform."


On Tuesday, November 20, 2018, 6:14:11 PM GMT+3, Ebele Okobi <ebeleokobi at fb.com> wrote:


At no point did I say that the community has a duty to report or that the community is to blame. Facebook responds to the community when they report. This gives the community the power to let us know when something is wrong. Why would anyone *not* want the ability to report? 

 

 Would it be preferable to have a platform where every single post, picture, comment is subject to pre-clearance by Facebook? I find it odd that anyone interested in free expression would want such a model.





 

From: kictanet <kictanet-bounces+ebeleokobi=fb.com at lists.kictanet.or.ke> on behalf of "Patrick A. M. Maina via kictanet" <kictanet at lists.kictanet.or.ke>
Reply-To: "Patrick A. M. Maina" <pmaina2000 at yahoo.com>, KICTAnet ICT Policy Discussions <kictanet at lists.kictanet.or.ke>
Date: Tuesday, November 20, 2018 at 2:44 PM
To: Ebele Okobi <ebeleokobi at fb.com>
Cc: "Patrick A. M. Maina" <pmaina2000 at yahoo.com>
Subject: Re: [kictanet] [should the victims be blamed? aren't platforms responsible as enablers and amplifiers?] Child marriage on facebook

 

Some responses on this topic raise some interesting and important issues:

 

1. Do social media/messaging platforms play a role in crime as amplifiers and enablers?

 

2. Would crimes be harder to pull off if such platforms, through enhanced technical functionality (which might not necessarily be profitable), could not be easily used for organized criminal purposes? 

 

3. Does the community owe the platform a duty to report (as alluded to here, such that the community can be blamed for platform misuse)? How much blame does the community share?

 

4. If indeed the community has a duty to help FB police its platform, will FB also share its revenues with the community, seeing as they are its informal "employees" as well? Or are they only buddies in bad times but strangers in good times?

 

5. Do (or should) victims of social media-enabled harm (including, say, businesses that lose sales due to chaos, or governments whose economies are effectively sabotaged) have recourse against the platform owner? To what extent? Who else should own the problem, and why?

 

I think the "deflect blame to the victims" script is unwise and could backfire. It would probably cause an uproar if used in more assertive parts of the world (i.e. in developed countries/regions). 

 

Good day listers,

Patrick.

 

 

On Tuesday, November 20, 2018, 3:52:31 PM GMT+3, Wainaina Mungai via kictanet <kictanet at lists.kictanet.or.ke> wrote:

 

 

Hi, 

 

Facebook has increased its staff significantly to help police what is posted. We may not want to blame the medium used, and instead focus more on addressing the culture of marrying off children of any gender in any country. That way, we remain focused on 'children's rights'.

 

The main offenders in this case are the "sellers" and "buyers" who took part in the auction.

 

In the end, the extent of regulation will depend on multistakeholder negotiations on the balance between an open Internet for all and the need to protect privacy, security and human rights online. 

 

Wainaina

 

On 20 Nov 2018 15:18, evelyne wanjiku via kictanet <kictanet at lists.kictanet.or.ke> wrote:


Hi listers, 

 

I'm following a debate on CNN about this South Sudanese 'baby bride' who was auctioned on FB. 

 

It brings me back to this question: who should regulate Facebook? Some argue FB is too big to regulate all the things that happen on its platform. 

 

Who should police FB? Is it us? We have the power to shut down our pages if we don't agree with what goes on in there... but we don't. Why?

 

Is it Facebook? Do they care about being responsible, especially in Africa?

 

Is it government? And just how far can the government reach? 

 

Or should we just relax and face the beginning of the end, with an attitude of anything goes as long as we have internet? 

 

Nice day everyone. 

 

Sent from Yahoo Mail on Android


 

_______________________________________________
kictanet mailing list
kictanet at lists.kictanet.or.ke
https://lists.kictanet.or.ke/mailman/listinfo/kictanet
Twitter: http://twitter.com/kictanet
Facebook: https://www.facebook.com/KICTANet/
Domain Registration sponsored by www.eacdirectory.co.ke

Unsubscribe or change your options at https://lists.kictanet.or.ke/mailman/options/kictanet/pmaina2000%40yahoo.com

The Kenya ICT Action Network (KICTANet) is a multi-stakeholder platform for people and institutions interested and involved in ICT policy and regulation. The network aims to act as a catalyst for reform in the ICT sector in support of the national aim of ICT enabled growth and development.




KICTANetiquette : Adhere to the same standards of acceptable behaviors online that you follow in real life: respect people's times and bandwidth, share knowledge, don't flame or abuse or personalize, respect privacy, do not spam, do not market your wares or qualifications.




  
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <https://lists.kictanet.or.ke/pipermail/kictanet/attachments/20181120/cdb5568c/attachment.htm>


More information about the KICTANet mailing list