
When AI automates relationships | TIME



When we assess the risks of AI, we are overlooking a crucial threat. Critics typically highlight three main risks: employment disruption, bias, and surveillance/privacy. We hear that AI will cost many people their jobs, from dermatologists to truck drivers to marketing professionals. We hear how AI turns historical correlations into inequality-reinforcing predictions, as when sentencing algorithms predict more recidivism for Black men than for white men. We hear that apps help authorities keep tabs on people, as when Amazon tracks which of its drivers look away from the road.

However, what we are not talking about is equally vital: what happens to human relationships when one side is mechanized?

The conventional story of AI’s dangers is blinding us to its role in a mounting “depersonalization crisis.” If we are concerned about increasing loneliness and social fragmentation, then we should pay more attention to the kind of human connections we enable or impede. And those connections are being transformed by an influx of technology.

As a researcher studying the impact of technology on relationships, I spent five years observing and talking to more than 100 people employed in interpersonal jobs such as counseling and teaching, as well as the engineers who automate their work and the administrators who supervise them. I found that the injection of technology into relationships renders this work invisible, forces workers to prove they are not robots, and encourages companies to overwork them, compressing their labor into ever smaller increments of time and space. Most importantly, no matter how good the AI is, there is no human relationship when half of the pair is a machine.

At the heart of this work is witnessing. “I think every child needs to be seen, really seen,” Bert, a teacher and private-school principal, told me. (All names in this article have been changed to protect privacy.) “I don’t think a child really learns on a deep level. I don’t think they really take in the information or the content until they feel seen by the person they’re learning from.”

Many people depend on seeing others clearly to do their jobs well: clients healing, students learning, employees staying motivated and engaged, customers leaving satisfied. I have come to call this witnessing work “connective work,” and it creates value and, for many, is deeply meaningful. Pamela, an African-American teacher in the Bay Area, recalled how her own high school teacher took the time to discover that her selective mutism was a response to her family’s incessant moving. “I thought, ‘I want to be that teacher for my kids in this town. I want to be the teacher I wanted, needed, and finally got.’”

However, this work is threatened by automation and AI. Even therapy, one of the professions most dependent on emotional connection between people, has seen the advance of automated bots, from Woebot to MARCo. As Michael Barbaro exclaimed on The Daily when ChatGPT responded to his question about being too judgmental: “Ooh, I feel seen – really seen!”


Technologists argue that social-emotional AI addresses problems of human performance, access and availability, which is a bit like the old joke about guests at a Catskills resort complaining that the food was terrible – and such small portions! It’s certainly true that human connective work is fraught, full of risks of judgment and misrecognition—as Pamela repeatedly faced until she met the high school teacher who finally listened to her. However, the conditions of connective work shape people’s ability to see others.

“I don’t invite people to open up because I don’t have time,” said Jenna, a pediatrician. “And that’s a disservice to patients. My hand is on the doorknob, I’m typing and I’m thinking, ‘Let’s get the meds and get out the door because I have a lot of other patients to see this morning.’”

Veronica’s story illustrates some of the costs of social-emotional AI. A young white woman from San Francisco, she was hired as an “online coach” at a therapy-app startup to help people engage with the app. She was barred from giving advice, but clients seemed happy to think of coaches as their personal counselors. “I loved feeling like I had an impact,” she said.

However, despite the personal meaning and emotional impact of the work, Veronica’s own language minimized its effects. She “loved feeling like I had an impact” but quickly added, “Even though I wasn’t actually doing anything. I was just rooting for them and helping them get through some tough things sometimes.” Just as AI obscures the invisible armies of humans who label data and transcribe audio, automation erases the connective work of the humans it relies on.

Veronica also found herself facing a new existential task: proving that she was human. “A lot of people asked, ‘Are you a robot?’” she told me. I asked her how she countered that impression. “Basically, I tried to talk to them a little, ask them another question, maybe share a little about myself if it was appropriate.” In essence, Veronica’s connective work – normally the human activity par excellence – was not enough to convey her humanity, which she had to prove to a clientele accustomed to machines.

Finally, Veronica may have found the work moving, humbling, and empowering, but she left because the company grew its client list to unsustainable levels. “In the end, they were trying to model everything using algorithms, and it’s like you can’t explain the real emotional toll of the work in those moments.” Having already decided that coaches were nothing more than adjuncts to the app, the company carelessly piled on new clients.

In the midst of a depersonalization crisis, “being seen” is already scarce. The feeling of being invisible is pervasive, animating working-class anger in the U.S. and abroad, and rife in the social crises of “deaths of despair,” suicides, and overdose deaths that have radically reduced life expectancy.

While many remain close to family and friends, one type of relationship has changed: the “weak ties” of civic life and commerce. Research shows that these bonds help knit our communities together and contribute to our health. A 2013 study entitled “Is Efficiency Overrated?” found that people who chatted with their barista reported greater well-being than those who hurried through the transaction.

The solution that Big Tech offers to our depersonalization crisis is what it calls personalization, as in personalized education or personalized healthcare. These advances seek to counter the alienating invisibility of standardization, so that we are “seen” by machines. But what if it matters – for us and for our social fabric – not just to be seen, but to be seen by other people?

If so, then the working conditions of jobs like Bert’s, Jenna’s, and Veronica’s matter. Policies limiting client loads, student caseloads, and hours worked would ease the burden for many groups, from medical residents to public school teachers to domestic workers, as would the National Domestic Workers Bill of Rights recently proposed in Congress.

We must also temper some of the widespread enthusiasm for data analytics, since its data-entry requirements routinely fall to the very people tasked with making connections. Equally important is scrutinizing the looming imposition of new technologies aimed at connective work. At the very least, social-emotional AI should be labeled as such, so that we know when we are talking to a robot and can recognize – and choose – human-to-human connection. Ultimately, though, we all need to take responsibility for protecting the human bonds between us, because these are the unheralded costs of the AI spring.

AI is often sold as a way to “free up” humans for other, presumably more meaningful work. Yet connective work is among the most deeply meaningful work humans do, and technologists are nonetheless intent on automating it. While human beings are certainly imperfect and sometimes judgmental, we also know that human attention and care are a source of purpose and dignity, the seeds of belonging, and the foundation of our communities; yet we set this knowledge aside in service of an industry that contributes to our growing depersonalization. What is at risk is more than any individual or her job – it is our social cohesion, the connections that are a mutual achievement between humans.



This story originally appeared on Time.com.
