Truth, Consent, and AI: Data Harm Is Exactly What It Is Like to Be a Woman

The introduction of generative AI for the mass market once again confronts the traditional gatekeepers of mass knowledge production – the press – with the choice of reflecting on their own position or defending it unreflectively. Journalist Keiko Tanaka attempts the former, not least by criticizing the latter.

*

“Your mommy went to the laundromat. Don’t worry now; go to sleep,” my mother-in-law said to my first-grade daughter, who woke up in the middle of the night to find me gone. My mother-in-law thought the story perfectly reasonable, because in Japan some school supplies have to be washed the same day they are used, and keeping them clean and tidy is the mother’s job. Then, when her son came home, she told him I was still at work, since it was customary for me to work very late. In reality, she had no idea where I was; she made it all up. I was down the street having my own personal mini-celebration of completing my second semester towards a degree in education. I can’t improvise by cleverly tailoring stories the way my mother-in-law does, but hers are certainly more calming for the two of them than my likely version: “Don’t worry, I’ll just be down the street having a few drinks!”

In a 2019 interview, I asked Lawrence Lessig what he thought about how we use apps and click “agree” to terms and conditions without ever reading them. What struck me most was that he never had to lie as a child; he was an obsessive truth-teller. “Today, you can’t go through life in a tiny way without telling a lie all the time. Like ‘yes, I read your terms and conditions.’ It’s a petty lie. You teach a generation that the way to get along with life is to lie.” For Lessig, this is reason enough to abandon an infrastructure of consent as the way to regulate access to such services and their terms. “If you can’t actually be expecting the people to be honest, you should not be asking.”

As a woman, I have mixed feelings about lying and consent. My belated mandatory academic training module on campus safety, which I first took at the age of 36, taught me about clearly communicating boundaries and levels of consent, and that one has the right to say no at any point in a relationship – only to leave me reflecting on my family life and my own trajectory, which I feel I never had a hand in planning. I had always wanted to go to a Ph.D. program to bring together my bachelor’s degree in journalism and my master’s degree in information technology. Friends recommended fellowship opportunities to me more than once – only for my young husband, who had little clue about any of it but was burdened by the idle pressures of childbearing, to scuttle the idea. Back then, when I was a newlywed, my husband urged me: “No secrets!” I thought I was simply marking a day on a calendar for his reference. Little did I know that he was feeding that information into an app he had installed, to make his own calculations and predictions.

News content by generative AI?

Japan’s Newspaper Association recently published its position on the use of news content by generative AI, saying it was concerned about personal information, copyright infringement, and, most importantly, a potential threat to democratic discourse. For a moment, I thought I could root for them. But I was soon saddened to see that, on top of the pitfalls of the 2018 copyright reform, the details were full of nonsense.

They argue that (1) AI may cause chaos in public discourse through its ability to spread misinformation and disinformation at scale; (2) generative AI may contain “sensitive personal information” protected under Japan’s Act on the Protection of Personal Information; (3) they were not warned about the negative aspects of AI when the copyright law was amended in 2018, which now allows the use of any copyrighted material in the development phase of AI; (4) generative AI poses challenges to publishers’ copyrighted content; and (5) there is a lack of transparency in how these systems operate.

Unfortunately, this shows a poor understanding of what AI is. The statement is well written in Japanese, but even the subject and the object are unclear: do they want to problematize the data collection or the output? What upset me most is that the News Association is acting like a “publisher,” a term used by technology companies, when it should be presenting itself as the press, not as a publisher. It almost seems to me that the News Association is about to ask for ancillary rights, now that the tech giants are taking away its advertising revenue, just as European publishers did with Article 11 of the European Copyright Directive.

The dilemma of “free” technology services

“Nothing is more expensive than what is offered for free.” My great-grandmother, who called herself a proud hyakusho – a peasant with a hundred jobs – lived to be almost 100. She passed this family precept on to me along with another she often repeated: “Circumstances may justify a lie,” a saying derived from the Sanskrit upāya, meaning pedagogy or skill in means, a term commonly used to explain broader Buddhist teachings across Asia. The convergence of the two precepts has been on my mind as I face modern problems, especially when signing up for and using free technology services whose business model is our attention and data. I pretend to read the terms and conditions and agree to them – I mean, what are the choices? You are expected to make an informed decision about accepting terms and conditions if you want to interact with any technology at all.

I wonder if the press knew what it was getting into when the tech companies started to take over and become the gatekeepers of knowledge. The press, which was supposed to carry social responsibilities, functions, and ethics, instead recast itself as a publisher on platforms, optimizing its headlines to suit the algorithms of search engines. Forget provenance; digitization meant becoming a leading player in the attention economy. They must have believed they were doing what they were supposed to do and choosing the right direction, but this time they are playing by the rules of the tech giants, and there is no winning that game.

Did “Napalm Girl” have a choice?

A long time ago, newspapers were in a different position. They were the gatekeepers and the agenda-setters. They held special privileges and powers, balancing the claims of ethical journalism against exposing and invading privacy, sometimes against the will of the subject. As Phan Thi Kim Phuc, also known as “Napalm Girl,” says: “As a child, I was so embarrassed; to be honest, I didn’t like that picture at all. Why did he take my picture? I never wanted to see it. I couldn’t go to school. I couldn’t fulfill my dreams. And so, I kind of hated it.”

Phan Thi Kim Phuc, the subject of that iconic image of the horrors of the Vietnam War, recalls that she never consented to being photographed. She had to give up her dream of becoming a doctor because of the photo that made her famous as “Napalm Girl.” No journalist paid attention to her struggle. It took her many years to reconcile with the past.

Now read the association’s line again: “Generative AI training uses vast amounts of information on the Internet and is likely to include ‘sensitive personal information.’ […] In principle, the collection of sensitive personal information itself is prohibited without the consent of the individual concerned, but there is a risk that AI may include such information in its responses.” And another, on copyright: “Generative AI (GAI) developers and implementers should not use publishers’ intellectual property (IP) without permission, and publishers should have the right to negotiate fair compensation for the use of their IP by these developers,” and: “Without permission and specific licenses, GAI systems are not simply using publishers’ content, they are stealing it.”

The last colony

In “Why Love Hurts,” sociologist Eva Illouz quotes Simone de Beauvoir: “Even in love, men retain their sovereignty, while women aim to give themselves up.” What traditional news businesses are going through is what women have gone through on a daily basis: the long history of shadow work at home, long before the cheaper labor in Africa that today does the data cleansing and content moderation for AI companies; the blurring and liquidation of boundaries; the question of having a say in what gets done – autonomy, or the lack of it, in decisions about production. Not to mention the lack of wealth flowing from a system that owns the intellectual property of the women who sing the blues, knit, or cook. “The last colony,” as Sawako Ariyoshi called it in 1979 when she translated “Ainsi soit-elle” by French feminist Benoîte Groult.

So I want to welcome you all to the digitized last colony where defiance is the best new skill, but I know I should not let the past intrude on the future.
