A Moral Dilemma: WePay shuts down adult entertainer’s medical campaign


While technology is often hailed as an agnostic and apolitical tool that can do wonders, technological platforms are increasingly intertwined with the dynamics of human behavior and are thus venturing into moral territory. Case in point: WePay, a crowdfunding payments processor, shut down an adult entertainer’s campaign to raise money for emergency medical care because it alleged she offered pornographic pictures in exchange for donations.

Here is the story according to TechCrunch:

Adult entertainer Eden Alexander had an allergic reaction to a medication, causing her to need serious medical care. In Alexander’s account, doctors wrote off her condition as stemming from hard drug use, due to her occupation, and did not give her proper treatment, which made her even sicker. Alexander then set up a crowdfunding campaign for her medical bills on GiveForward (now removed), which processes payments with WePay. Unfortunately, one of the campaign’s supporters tweeted offering nude pictures in exchange for donations to Alexander’s campaign, and Alexander retweeted the offer.

That retweet was deemed to violate WePay’s terms of service, which states “you will not accept payments or use the Service in connection with the following activities, items or services: Adult or adult-related services, including escort services, adult massage, or other adult-entertainment services; Adult or adult-related content, including performers or “cam girls”; and Obscene or pornographic items.”…WePay subsequently withheld some of the funds donated to Alexander and eventually had her GiveForward page shut down.


WePay said it is beholden to contractual obligations with banks and credit card companies, and did not want to risk being slapped with a fine or shut down because of improper use of its platform. The moral questions embedded in this story demonstrate a point Dov made in the Wall Street Journal: that “the more we open up access to participation in two-way dialogues on [technological] platforms, the harder it is to influence the behavior that occurs on them. We will need to wrestle with where on the spectrum from morally agnostic to moral, or values-neutral to values-inspired, these platforms will fall.”

Yet, as TechCrunch writer Josh Constine notes, there was something amiss in how WePay enforced the rules in this situation: it lacked humanity and consideration for the serious circumstances Alexander was in. Constine writes that this incident “should teach companies that enforcing terms of service shouldn’t be done robotically,” and we agree. Rules are impersonal and blind, which means they do not discriminate, but it also means they cannot take into account the human nuances of a situation. WePay could have explored a variety of alternatives, including reaching out to Alexander to remind her of the Terms of Service (TOS) and asking her to retract her retweet.

We do disagree with Constine, however, when he writes that “it might not always be scalable, but we must remain humane in how we treat each other, even online.” Business loves to systematize and scale: ERP, HRIS, TQM, Kaizen, and Six Sigma are all processes we have systematized and scaled. The next frontier is systematizing and scaling our humanity, being rigorous, deliberate, and systematic in how we embed human values in the behaviors of every employee. It will be difficult, as Constine suggests, because it requires deep work, but it is not impossible.

How does this sort of scaling happen? It starts with metrics, because you cannot scale something without measuring it, and, more importantly, with a recognition that it is just as important to measure “how” someone behaves as it is to measure “how much” or “how many.” In the case of WePay, instead of just counting “how many” infractions of its TOS occurred, it could have cataloged and tracked “how” its employees communicated with and enlisted Alexander, or any similar customer, to bring her to a place of shared understanding. It could have focused on whether employees communicated with empathy and understanding, or acted with a sense of what one should do and not just what one is or is not allowed to do. These are all “how” issues.

To scale humanity, WePay needs first to decide how its values ought to manifest in specific behaviors, and then embed the metrics that capture those behaviors in employees’ performance reviews (for more information, see lrn.com/howmetrics). Scaling humanity is much harder than scaling computing power (there is no Moore’s law for humanity), but that only makes it more urgent that companies begin the hard work.
