Today we are seeing virtually every industry being transformed by new digital business models, but in some instances this transformation can include an unwelcome side effect: digital discrimination.
It’s just targeted advertising…
A former colleague recently shared a Slashdot article that
caught my attention about targeted advertising on Facebook that allows advertisements to be
filtered based on “race preference.” The sheer volume and volatility of
comments on this post attest to how controversial this topic is.
From the point of view of a marketer, of course, I always want
any advertisement I pay for to reach the ideal audience with maximum efficiency.
There is not an inherent problem with advertising a product tailored to meet
the needs of a specific ethnicity, or religion, or gender or sexual preference.
Nor is it a problem to advertise in a publication or site that caters to a
niche audience.
This practice breaks down when you target ads across a
broader public network (e.g., Facebook, Google, LinkedIn, Twitter) and
somehow exclude individuals based on the above demographics for things like
mortgages, housing or jobs, which carry very strong equal-access
protections in the United States and elsewhere.
The Atlanta Black Star news site did a great job
demonstrating how Facebook advertisements targeting on race can create a clear path
for discrimination in practice. Can the consequences of inappropriate advertisement targeting be left to
the ad buyer alone? Perhaps for business categories where equal-opportunity
laws apply, Facebook should build in the guard rails and disallow
some of these advertising controls.
Update as of Nov 11: Facebook will disable race-based targeting for specific industries (housing, employment and credit), via The Verge.
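One way such guard rails could work, as a hedged sketch: reject protected-class targeting options whenever an ad falls in a regulated category. The category names, attribute names and function shape below are all invented for illustration, not any real ad platform’s API.

```python
# Hypothetical platform-side guard rail: strip protected-class targeting
# filters from ads in legally regulated categories. Names are illustrative.

REGULATED_CATEGORIES = {"housing", "employment", "credit"}
PROTECTED_ATTRIBUTES = {"race", "ethnic_affinity", "religion",
                        "gender", "sexual_orientation"}

def sanitize_targeting(ad_category: str, targeting: dict) -> dict:
    """Remove protected-class filters when the ad is in a regulated category."""
    if ad_category.lower() in REGULATED_CATEGORIES:
        return {k: v for k, v in targeting.items()
                if k not in PROTECTED_ATTRIBUTES}
    return targeting
```

The point of doing this at the platform level, rather than trusting each ad buyer, is that the restriction cannot be quietly opted out of.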
We’re just focusing on selecting the right customers …
A practice known as “redlining” is not uncommon in the
financial and insurance business. Basically, an institution can assess the
suitability of a customer for a loan by looking at the location of current and
past residences, job and credit references, and other factors. Though the applicants who don’t make the cut may not be specifically defined by race, a level of discrimination
can result.
The increased prevalence of social media and public data
sources is a two-edged sword for redlining practices. Powerful big data
analytics tools can be applied to better monitor and guard against
discriminatory practices, but they can also encourage discrimination through certain
types of filtering.
In a 2013 study in the Proceedings
of the National Academy of Sciences (PNAS), “Private Traits
and Attributes Are Predictable from Digital Records of Human Behavior,”
scientists from the University of Cambridge and Microsoft Research were able to
combine data on Facebook “Likes” and limited survey information to determine
the following: they could accurately predict a user’s sexual orientation 88% of
the time for men and 75% for women; predict a user’s ethnic origin (95%) and
gender (93%) with a high degree of accuracy; and predict whether a user was
Christian or Muslim (82%), a Democrat or Republican (85%), used alcohol,
drugs or cigarettes (65% to 75%), or was in a relationship (67%).
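The mechanics behind such predictions are simple to sketch: treat each Like as a binary feature and score it against a per-trait model. The toy Python below is purely illustrative (the page names, weights and single-trait logistic model are invented; the actual study used dimensionality reduction over tens of thousands of Likes followed by regression):

```python
import math

# Toy sketch: each Like is a binary feature; a per-trait model
# (here, hand-set logistic weights) turns Likes into a probability.
# Page names and weights are invented for illustration only.

WEIGHTS = {"PageA": 1.2, "PageB": -0.8, "PageC": 0.5}  # hypothetical
BIAS = -0.3

def predict_trait(likes: set) -> float:
    """Return the model's estimated probability that the user has the trait."""
    score = BIAS + sum(w for page, w in WEIGHTS.items() if page in likes)
    return 1 / (1 + math.exp(-score))  # logistic (sigmoid) link
```

The unsettling part is not the math, which is decades old, but that the input data (public Likes) is so easy to collect at scale.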
If I’m selling a big-ticket item like jumbo jets, that’s
handled by sales professionals who call directly on a handful of named,
qualified buyers. The customer selection process is still 99% manual,
and deals are closed face-to-face.
Now look at any digital marketing model worth its salt and
you will find it includes a much higher degree of targeted advertising, personalization
and 1-on-1 nurturing to cultivate exactly the right customers. Especially when it comes to B2C business
models, a laser focus is too expensive to sustain on a personal basis – you
need the assistance of a lot of data and automation at every step of the
customer journey to capture and service demand at scale. Advanced marketing and
sales systems are a frontier where the ethical and legal aspects of
discrimination will be debated.
The Sharing Economy, or the Selfish One?
When it comes to the economy of ride-sharing and home-sharing,
personal bias can play out at a platform-wide level.
Big city taxi drivers have long been famous for not picking
up riders based on race (the Lenny Kravitz tune “Mister Cab Driver” comes to
mind here). With ride sharing systems like Uber or Lyft, a driver can either
refuse or cancel a ride – perhaps based on the rider’s profile picture, or name.
Even if the platform does not promote discrimination, it can make it a lot
easier for a seller or sharer to do so.
A recent study of 1,500 rides in Boston and Seattle on these services showed that African American males were three
times as likely to have their rides canceled, and on average waited 30%
longer for rides than white males.
For their part, Uber has responded by saying they are always
looking for ways to improve their performance and to implement features that
reduce discrimination. In the long run, there could be less discrimination than in the analog version
of having a taxi pass someone by on the street, because the platform can
monitor usage patterns to discipline the “bad actors” who unfairly drop rides.
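A minimal sketch of what such monitoring could look like, assuming the platform logs each ride with a driver, a rider group and a cancellation outcome. The field names and the flagging threshold are invented for illustration, not Uber’s or Lyft’s actual systems:

```python
from collections import defaultdict

# Hypothetical monitoring pass: compare each driver's cancellation rate
# across rider groups and flag drivers whose rates diverge sharply.
# Field names and the 3x threshold are invented for illustration.

def flag_drivers(rides: list, ratio_threshold: float = 3.0) -> list:
    """rides: dicts with 'driver', 'rider_group' and boolean 'canceled' keys."""
    stats = defaultdict(lambda: defaultdict(lambda: [0, 0]))  # driver -> group -> [canceled, total]
    for r in rides:
        s = stats[r["driver"]][r["rider_group"]]
        s[0] += r["canceled"]
        s[1] += 1
    flagged = []
    for driver, groups in stats.items():
        rates = [c / t for c, t in groups.values() if t > 0]
        if len(rates) > 1:
            lo, hi = min(rates), max(rates)
            if hi > 0 and (lo == 0 or hi / lo >= ratio_threshold):
                flagged.append(driver)
    return flagged
```

A real system would also need statistical care (small sample sizes, confounders like pickup location), but the raw signal is cheap to compute from data the platform already holds.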
On the house-sharing side, Airbnb has made strides to get
ahead of selection bias issues. Their response to fair housing complaints, even
if considered a little late by some, was well thought out and sent to all users. They are putting more training
and agreements in place for hosts, encouraging more instant booking units, and proactively
following up on guest discrimination complaints with assistance finding
alternative accommodations.
“… We are all committed to doing everything we
can to help eliminate all forms of unlawful bias, discrimination, and
intolerance from our platform. We want to promote a culture within the Airbnb
community—hosts, guests and people just considering whether to use our
platform—that goes above and beyond mere compliance."
-- Excerpt from Airbnb’s Non-Discrimination Policy as of November 2016
I travel a lot, and especially on family trips, I probably
use peer-based rentals slightly more than conventional hotels by now. I have
noticed far fewer Airbnb properties in less populated or remote areas, where
the demand for temporary quarters is much less elastic. Since a vacation destination
is usually booked well in advance, most of these owners stick to systems like
VRBO/HomeAway. While these sites also have policies, your booking request is often just that: a request
awaiting the seller’s approval. The property owner may impose stricter cancellation
policies and settle an advance prepayment off-system.
An off-site approval and transaction process seems to push
the liability for discrimination out to the property owner rather than having
selection bias played out in the platform itself. I can’t say that this
approach is any less discriminatory; in fact, it also forgoes the instant
gratification everyone expects of a digital customer experience.
blueFug Net: What should you do about it?
I believe the issue of digital discrimination is just
starting to get the attention it deserves, and it will likely grow in
importance for any business that sells or brokers goods and services to the
public. Here are three ways to get ahead of it for starters:
- Conduct a
discrimination audit. Examine the end-to-end customer journey in your
company. Where are you left open to discrimination issues? Are you in
compliance for the communities and countries you do business in? Make this a
regular part of a risk management or security group’s purview, if such a group
exists. Consult a civil-rights-oriented attorney for advice if you do not have
such a specialist on retainer.
- Look for biased
usage patterns in your solution, and address potential discrimination issues
at both the platform and the execution level. Simply changing some wording or
selection buttons in a user interface will not eliminate discrimination in
practice. Examine the outcomes of customer interactions over time to ensure
they are not trending in a direction that suggests discrimination.
- Align your
digital transformation toward inclusion, not exclusion. Everyone in your
organization, as well as your business partners and vendors, has the potential
to be a model citizen or a bad actor as a representative of your company. Get
broad agreement on this alignment, and perhaps conduct anti-bias training. Everyone
can make an impact on bringing diversity and fairness to the overall digital
customer experience, even if they operate behind the scenes.
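The second point above, examining outcomes over time, can be made concrete with a simple disparate-impact screen such as the “four-fifths rule” used by US regulators in employment contexts: flag any group whose selection rate falls below 80% of the best-off group’s rate. A hedged Python sketch, with an invented data shape:

```python
# Hedged sketch of one metric a discrimination audit could track: the
# "four-fifths rule" disparate-impact screen. Group labels and outcome
# records are invented for illustration.

def selection_rates(outcomes: list) -> dict:
    """outcomes: (group, approved) pairs -> approval rate per group."""
    totals = {}
    for group, approved in outcomes:
        t = totals.setdefault(group, [0, 0])
        t[0] += approved
        t[1] += 1
    return {g: a / n for g, (a, n) in totals.items()}

def passes_four_fifths(outcomes: list) -> bool:
    """True unless some group's rate falls below 80% of the highest rate."""
    rates = selection_rates(outcomes)
    return min(rates.values()) >= 0.8 * max(rates.values())
```

Running a check like this monthly, over whatever “approval” means in your funnel (loan decisions, ad delivery, booking acceptances), turns a vague commitment into a measurable one.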
Communities and governments develop and enact laws to limit
discrimination for good reason. Just because a Silicon Valley-style industry
disruption is under way in your neck of the woods doesn’t mean you can ignore
the scrutiny a conventional business would face in the communities it operates
in.
Indeed, the posted sign saying “We reserve the right to refuse
service to anyone” you might see in a restaurant or bar won’t take you far in
the digital realm, especially if you leverage a platform that automates or facilitates discrimination at scale. Outrage travels fast – and bad publicity,
legal problems, lost business and forced resignations can quickly follow. Best
to get ahead of digital discrimination before it gets ahead of you.
Need on-point technology marketing strategy and messaging that clarifies value and cuts through the fog of competitor claims? Contact blueFug Ventures today and find out how we can partner with you.