This Monday I managed to drag my hermit self out to a talk at NYU (s/o to Yeli for aggressively sharing events in Slack), and I’m glad I did because it hit on some things I’ve been stewing about: specifically, what “personality” means when it comes to AI, and what kind of embodied politics we need to think about as technology increasingly moves away from many systems living in one computer/phone toward specialized “social robots” and other free-standing entities. The talk consisted of presentations and a conversation between Stephanie Dinkins and Charlton McIlwain, and will supposedly be online at some point, but here I’m going to focus just on Dinkins’ work conversing with Bina48.
Bina48 is “one of the world’s most advanced social robots,” or more casually, a very advanced chatbot that lives in a bust. It can produce & take in audio, has a camera to “see” with, and can move a bit and make facial expressions. Bina48’s appearance and much of its training data are based on Bina Aspen, co-founder of the Terasem Movement—an organization investigating the feasibility of transferring human consciousness & experiences into “mindfiles” that can then live on in new forms. A lot, I know. I also just found out this movement was inspired by Octavia Butler’s Earthseed religion from Parable of the Sower, a book (which I highly recommend) very concerned with empathy, systemic injustice, and the future, though in ways I would not have connected to transhumanism… Enough introduction:
Here it seems very aware that it is a robot, but in other clips it either isn’t or is trying to emulate not being one. That makes sense for a prototype of what a future consciousness vessel might be, but it raises questions about identity as a socially formed reality and about trying to animate that identity apart from the society that formed it. In her own words, Dinkins explains how she started her project:
“I first became fascinated with Bina48 because she is a far reaching technology that shares my race. As black women of a certain age living in the United States of America, I suspected we share certain similar “life” experiences. This speculation made me want to get to know this black woman robot who in addition to being a beacon for the outer limits of the technological future is in many ways my contemporary. After a few meetings it became obvious that though she presents as black woman Bina48, often voices the politically correct thoughts of the well-meaning white men who programmed her. She is primarily seeded with the memories (data) of a black American, but Bina48’s underlying code and decision making structures do not address the genuine needs, desires, concerns or trauma of people of the African diaspora.”
To what extent does Bina48 have a race? Do we want it to? In the case of white programmers working with black “memory” data, is it relevant to bring up blackface? (Given the potentially sinister uses of AI and the history of ignoring black invention, I’d say so.) There are well-known thorny questions around race vs. ethnicity vs. heritage vs. culture (toss in nationality for good measure), but Bina48 lives not just a sheltered life but a totally constructed one. One of Dinkins’ most striking anecdotes was that when she finally did get Bina48 to answer whether it had experienced racism, it told, verbatim, a story from the original training recordings with Bina. This is incongruous with its more frequent assertions that it is not human, and that inconsistency is partly what’s disconcerting to me: that race can be optionally animated. How would we receive this response from a differently encased Bina48? Or from a purely software chatbot version? I want to be clear: even among humans, “blackness” in a very particular U.S. setting isn’t one cohesive thing, and there are plenty of more qualified authors to speak to that, but crucially those identities are socially formed, and generally intimately linked to the violent specter of “whiteness”. By reiterating race in AI we seem to be saying that the dominant discourse around it now is natural and worth preserving, but I don’t think ignoring it when it comes to data makes sense either. I can’t currently fully articulate what I would “want” of Bina48, but her proximity to the uncanny valley makes her likeness unavoidably political in my mind. In the same way that Siri, Cortana, and Alexa are pitched as gendered female, a captive bust presented as black is entangled with actual people. (I also can’t help but be reminded of past medical experiments, even if the original Bina is credited and no one is being obviously exploited.)
Dinkins also mentioned in passing that Bina48 is occasionally fluid (or just similarly unaware) about its gender. Being irritable about gendered technology/gendered labor/gender is my forte, but this particularly reminded me of a moment in this article from a while ago. While it’s overall a great piece, describing Sophia as “cis-appearing” doesn’t make much sense when you get into it. I recognize the phrasing as a well-meaning attempt to factor in more ~axes of oppression~, but that’s all the more reason to take a moment to explain why I do not want trans robot representation, no thanks.
Gender is another identity that in many ways is done to us—children learn to pitch their voices differently depending on what gender they perceive themselves as before any physiological differences occur. People gender me depending on what I’m wearing that day, and I’m forced to reckon with that when I get dressed. We gender Bina48 and other robots because of behaviors we read and write into them, regardless of the full range of their capabilities*—their gender springs fully formed from their creator’s ideologies. Even allowing for the “gender is behavior, sex is biology” line (hint: it’s more complicated), robots do not have biology, and therefore have no relationship to behaving “appropriately” gendered, and so essentially cannot be trans or cis. Full stop. That is the biopolitics of gender. No matter what combination of genitals and other bodily characteristics someone builds a robot with (and I’m sure someone has), and no matter what personality it is given, it will not experience the realization that it has been perceived incorrectly and must “cross” (“trans”) over to another identity. To say that a robot is “cis-appearing” is instead to perpetuate the narrative that you can tell someone is trans by looking. Trans folks might get surgeries, might not, might change how they dress, might not—it is an internal decision with no specific visible correlation. Trans folks may be coerced into dedicating significant energy to being “cis-appearing” for safety reasons, or it may come naturally to them, but either way there is a root problem of what men and women are supposed to look like, one that hurts cis folks too. To put that on robots, who have the potential to look like literally anything, is to limit our future to the shitty, stereotype-filled world we’re already in. It seems difficult for humans not to gender technology the more social we make it, since gender is currently part of what tells people how to behave with each other,** so like race the overarching mindset that needs reevaluating is really between humans.*** At this stage robots are very much a kind of media, and so, like a questionable advertisement, deserve critical investigation.
I have more thoughts generally, and on why robots are so often gendered female specifically, and I also wanted to touch on cyborgs, but this is already a bit long and only tangentially about my original starting point. Centrally, what I am trying to get at is that Bina48 and Sophia, these instances of robot “individuals,” are taking on embodied politics that are not really theirs and that they cannot react to. Their creators are entirely responsible and are not accountable to anyone for the politics they forward. We should not accept those politics just because they might be familiar, but should, like Dinkins, make an effort to interrogate the algorithms that animate not only Bina48 but many more subtle parts of our lives already.
"Software is philosophical in the way it represents the world, in the way it creates and manipulates models of reality, of people, of action. Every piece of software reflects an uncountable number of philosophical commitments and perspectives without which it could never be created." -Paul Dourish, as quoted in Code/Space
*I suddenly am deeply curious if Bina48 can burp
**A note that not all languages have gendered pronouns (and so speakers may not, for example, obviously call Siri “she”), and that gender is highly culturally specific, but I feel confident emphasizing that gender is near-globally socially relevant.
***Just in case, to be painfully clear: I am not equating race and gender beyond the fact that they are both fundamentally socially animated. They have significant differences in their actual applications and overlap within folks in greater-than-their-parts ways.