Think back to the last time you had to call a real, live person in order to complete a purchase or have a problem resolved. How did it go? Did you and the customer service representative you spoke to have trouble understanding one another in some fundamental way? Or was it a smooth interaction, almost as if the CSR you spoke with was carefully hand-picked for you by robots?
If it’s the latter, it might be because the CSR you spoke with was in fact carefully hand-picked for you by robots.
That’s what the call-center tool Afiniti is out to do, the Wall Street Journal reports.
Afiniti describes its mission as “transform[ing] the way humans interact” by using tech to “discover, predict, and affect patterns of interpersonal behavior.” And you’ve probably given it more to work with than you think.
Because the thing about life in the 21st century is, unless you go to extreme and deliberate efforts to prevent it, you’re leaving a giant digital wake behind you. Everywhere you go, everything you do — it can all be tied back to a unique individual, and that individual is you.
How? Well, we make it easy: most Americans right now have voluntarily acquired a ten-digit unique identifying code tied to a single device in their purse or pocket. It’s your mobile phone number, and it goes everywhere with you and belongs to only you.
Any activity linkable to that number is something that can be collected, sorted, traded, bought, and sold — and is. There are data brokers out there building giant databases about, well, everything that anyone with enough money can buy and use.
Your name may not be unique, but who needs a name when your ZIP code, estimated income, estimated net worth, employer, last hundred credit card purchases, date of birth, presumed gender, parenting status, “ethnic affinity,” and public social media history are all right there for the mining?
According to the WSJ, Afiniti pulls in your entire history with the business you’re calling (sensibly) and also accesses a credit profile assembled on you. After that, it scours your public Facebook, Twitter, and LinkedIn posts to see what kind of a person you are… or at least what kind of words and tones you use. In total, it can scope out up to 100 databases for information linked to you basically the second you dial into the system.
If the idea of that kind of creeps you out a little, you’re not alone. Even the folks at the top think it’s a little bonkers: “It’s a little overwhelming, sometimes scary, to know how much information can be accumulated about you,” Larry Babbio, an Afiniti board member and former Verizon exec, told the WSJ.
So Afiniti has access to a robust wealth of information about you, and it can put that all together to build a fairly comprehensive picture of who you are. Feed it to the right algorithm, and it can spit out suggestions and predictions about the way you are likely to interact with the world and all the customer service agents in it.
Afiniti, meanwhile, is also collecting data on all of the call center employees it has on hand. It knows the outcomes of their previous work with previous clients, and can track what went well and what didn’t. If a customer went away happy and spent more money with the business, great! If not, well, less great.
Those interactions add up into a pattern, Afiniti CEO Zia Chishti told the WSJ. “You have a very accurate prediction of likely behavior … The machine-learning aspect allows us to tease out patterns … in a fashion that’s more effective than chance.”
Based on those profiles, the software then seeks to match you with the best call center employee for your specific needs. So you don’t just go into the queue and get the next available worker; you wait until the software determines that your magic CSR is available, and a match is made.
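To make the difference concrete, here’s a minimal, entirely hypothetical sketch of how score-based routing diverges from a plain first-available queue. Afiniti’s actual model is proprietary; every name, segment, and score below is invented for illustration.

```python
# Hypothetical sketch: score-based caller/agent pairing vs. first-available routing.
# All profiles and scores are made up; this is not Afiniti's actual algorithm.

def first_available(agents):
    """Plain queue routing: hand the caller to whichever free agent is next."""
    free = [a for a in agents if a["free"]]
    return free[0]["name"] if free else None

def best_match(caller, agents, scores):
    """Score-based routing: pick the free agent with the highest predicted outcome."""
    free = [a for a in agents if a["free"]]
    if not free:
        return None  # in the real system, the caller would wait for a match
    # scores maps (caller_segment, agent_name) -> predicted chance of a good outcome
    return max(free, key=lambda a: scores.get((caller["segment"], a["name"]), 0.0))["name"]

agents = [
    {"name": "alice", "free": True},
    {"name": "bob", "free": True},
]
scores = {
    ("budget-conscious", "alice"): 0.42,
    ("budget-conscious", "bob"): 0.71,
}
caller = {"segment": "budget-conscious"}

print(first_available(agents))             # alice: simply first in line
print(best_match(caller, agents, scores))  # bob: highest predicted outcome
```

The key behavioral difference is the last branch: a first-available system never keeps a caller waiting while an agent sits idle, while a match-based system may do exactly that if the “right” agent is busy.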
The folks at Afiniti — and clients that use it — are right that there can be advantages to doing it this way: getting customer service from someone who understands you and your problem, and whose style meshes well with yours, is often a more pleasant experience than the alternatives. Pleasant experiences in a potentially high-tension situation can calm down customers, retain their business, and leave them with a positive feeling about the interaction, which is good for both the consumer and the business.
But there are huge problems with it, too. For one thing, software can discriminate without caring. Maybe the algorithm thinks that people like you only ever do well when talking to men — do women employees then get an equal chance to make your day? Maybe you’ve said nasty things on Twitter about people for whom English is not their first language — do bilingual employees then not get an equal opportunity to perform their jobs for you?
It also can reinforce subconscious bias: you may be willing to talk to anyone, but if you’re routinely paired with one specific type of customer service employee, you may start to feel on some level that only that kind of person can or should do the job.
And then of course there’s the other glaring problem that plagues all of our modern big data systems: it’s a big fat black box. Consumers have no way of knowing what goes in or what’s coming out.
Afiniti promises it only buys databases that are “legitimately available for purchase,” according to the WSJ, like the ones Experian and Acxiom assemble. And the only social media data it can access is public — if all of your Facebook posts are friends-only, they’re not included.
But you cannot easily verify whether the data that those other companies acquire and assemble is, itself, particularly valid. There’s no way of knowing what data brokers know about you, despite FTC efforts to increase transparency.
The credit bureaus — including Experian — are in particular known to be notoriously error-prone and widely complained-about. And yet the use of data-driven profiles for all kinds of predictions keeps growing.
“There’s a process of discrimination going on,” Joseph Turow, a University of Pennsylvania professor who studies digital marketing, told the WSJ. “Companies are bringing data together that we have no knowledge about, and it may discriminate against [us] in a prejudicial sense or a positive sense, depending on who we are.”
Afiniti keeps a client list on its site that names some of its largest global clients. In the U.S., publicly-listed companies using the service include Sprint, T-Mobile, and Caesars Entertainment — but there are almost certainly many more. On that same page, Afiniti estimates it has so far assisted in 719 million calls and counting.
Oh, and by the way: its next move is bricks-and-mortar retail. Chishti wants to take AI into physical stores to use facial-recognition software to identify you as soon as you walk into a shop, so the right employee on duty can get the ping to go over and sell you things in the way you most prefer.
by Kate Cox via Consumerist