    The Street Finds Its Stories: Southside Chicago Digital Humanities

    This talk was delivered on January 29, 2024 to guests of the English Department at the University of Illinois Urbana-Champaign in Urbana, IL.

    ABSTRACT

    What is at stake when data becomes story, and story becomes action? These questions have major implications for society’s future as generative artificial intelligence becomes part of everyday life, but the Black body as dataset was exposed for analysis long before generative AI was consumer technology. In this talk, I consider Chicago’s Southside as a locus of algorithmic narrative, reading contracts and associated documents forged between analytics contractor Palantir Technologies and the Chicago Police Department to trace one pathway of Black data to algorithmic story to anti-Black action. Where Palantir’s CEO offers that “software is the language of our time,” I counter that a data story is only as telling as the set of assumptions it is written on top of. Finally, I offer erotic data poetics as an alternative to algorithmic intelligence, building upon poet Audre Lorde’s theory of the erotic to vision Black data ontology beyond state capture.


    Good afternoon everyone, and thanks for taking time out of your day to attend my talk. I am really glad to be here with you all today to share some of my Black digital humanities work. 

    In this talk, I consider Chicago’s Southside as a locus of algorithmic narrative, considering what is at stake when community data is algorithmically processed into an actionable narrative. I do so in three parts. The first part of my talk offers a brief overview of Black digital humanities, touching on some of the scholarly and public-facing work that I have done in this field to consider some of the uses of this theoretical framework. Part two opens up the Southside as a Black DH case study, reading contract documents and emails to trace one pathway from data to narrative to anti-Black action. Finally, part three turns to queer Black poetics in search of an alternative.  


    This story begins on the Southside of Chicago. West Englewood, my neighborhood, rarely gets to speak for itself, but it is constantly spoken through data. To be such a tiny slice of the city, West Englewood is known around the world primarily through breaking news alerts and crime data logs. When politicians call Chicago a murder capital, they’re talking about neighborhoods like West Englewood. Folks don’t talk about its rich history of activism and community engagement in the same way. 

    This, of course, has a negative effect on the community. As I was once reminded by community journalism trainer Jhmira Alexander, we citizens have been taught that violent crime is the true narrative of this community, and we have been taught to replicate that data truth in our constant vigilance against such violence, but we know as Englewood residents that this is not our reality. So vigilant must we be that it becomes too easy to miss the bright green trees blowing in the wind, the carefree kids playing on the sidewalk, the summer block parties bursting with barbecue and backpacks full of school supplies to give away. This is a microcosm of the stakes when Black data becomes algorithmic story becomes anti-Black action. Here, though, the street finds its own stories.  

    “…the street finds its own uses for things…”

    Samuel R. Delany
    “Black to the Future: Interviews with Samuel R. Delany, Greg Tate, & Tricia Rose”

    In the 1993 interview where the word “Afrofuturism” is said to be first used, science fiction writer Sam Delany pushes back on the idea that Afrofuturist impulses are an easy alignment between nineties cyberpunk culture and Black technoculture.

    For Delany, cyberpunk’s interest in technoculture’s dirty complexities—summed by the phrase “the street finds its own uses for things,” a line used by early cyberpunk writers and adopted as a core sensibility—is little more than ironic white middle-class angst. This is meaningless in the face of “the anger, the rage, the coruscating fury from the streets toward the traditional use (which is, after all, lying to the people) of that technological armamentarium that is the referent for that cool and breezy word ‘things.’”

    He was referencing the 1992 acquittal of the four LAPD officers recorded savagely beating Rodney King and the riots that resulted, causing “millions and millions of dollars of devastation in Los Angeles”; to Delany, the idea that there could be any irony in that technological reality of Black control and destruction moved “into the lunatic.” 

    “…specific miss-use and conscientious desecration of the artifacts of technology and the entertainment media…”

    Samuel R. Delany
    “Black to the Future: Interviews with Samuel R. Delany, Greg Tate, & Tricia Rose”

    However, I have always felt that the phrase “the street finds its own uses for things,” stripped of its tongue-in-cheek cyberpunk trappings, perfectly speaks to the “specific miss-use and conscientious desecration of the artifacts of technology and the entertainment media” with which Black digital humanists are often concerned. Delany’s attention to naming this oppositionality “miss-use,” spelled M-I-S-S dash U-S-E rather than its usual spelling, points to the complex nexus of forces acting upon Black technoculture.

    It is not a matter of us misusing digital artifacts out of ignorance or cynical art practice. Despite being integral to mainstream digital infrastructure, we hack new worlds and code new stories precisely because our attempts at inclusion are violently denied.

    “…a call to study how we read, how we think, and how we become invested in anti-Black systems…”

    Kimberly Bain

    The street finds its stories. This is what Black digital humanities means to me: it is the intertwining of methodology and theory in service of “interrogating and mapping” and refashioning the complicated relationship between technology and the African Diaspora.

    To quote literature scholar Kimberly Bain from our article “The Street Finds Its Uses: a Black Digital Humanities Call and Response,” Black DH is a “call to study how we read, how we think, and how we become invested” in anti-Black systems of thinking and knowing. 

    “…working to try to secure a situation for our babies, to be able to secure our future babies, no matter where they have to go, wherever they end up.”

    Taryn Randle

    To quote Grow Greater Englewood steward Taryn Randle from our audio story “Growing a Greater Englewood,” Black digital humanities means “working to try to secure a situation for our babies, to be able to further secure our future babies, no matter where they have to go, wherever they end up.” My Black digital humanities practice involves using and understanding technology in relation to and in service of Black self-narrative, Black community (re)creation, and Black world making. For example, the above analysis comes from an article written with a dear friend; as is a common theme in my Black DH community, that article began as a way for us to better understand our own relationship to and personal histories of the technologies we use and teach. 

    I have also had the pleasure of advising two scholars in their personal explorations of imagining and creating digital worlds for Black data. One project works with institutional and community archives for Black literary data in the DMV, the other works with a personal archive of family photographs and textiles. Both projects are examples of what it looks like to narratively build toward a new data future while keeping an eye toward the present; both scholars, though working with different data sets, were deeply concerned about representing their information in a way that honored rather than exploited their subjects. 

    Black digital humanities forms the core of my doctoral research, which uses queer Black poetics to work toward a transformative reading of the database. I offer a view of technology where users construct the self with and through a speculatively expansive yet technologically grounded vision of data and algorithms, blooming into subversive forms at the intersection of information technology and culture. For Black women, these subversive forms shift broader American technology and culture. In this way, my dissertation theorizes data analysis as socially constructed, something we move with and through rather than something that only ever moves us. 

    I begin in chapter one by contextualizing data analysis ideology through Black women’s surveillance history, arguing that data-based platforms are driven by some of the same principles as federal and corporate data surveillance networks. I accomplish this through a close reading of FBI dossiers targeting Black women writers for surveillance. In chapter two, I read together two of poet Audre Lorde’s groundbreaking essays from the Sister Outsider collection, “Uses of the Erotic: The Erotic as Power” and “Poetry Is Not a Luxury,” to think about poetry as an erotic algorithmic narrative, an alternative to that presented by the data state, which I will discuss at the end of this talk. This pattern plays out in Lorde’s biomythography, Zami: A New Spelling of My Name, as she comes to understand herself and the world through the women she encounters. These encounters constitute embodied data. In contrast to the data state narrative, poetry functions as a record and analysis of that embodied information. 

    I return to the data state narrative in chapter three, reading data analysis company and government contractor Palantir Technologies against itself and its own claims of narrative ethics; this is the chapter from which I draw the second section of today’s talk. Finally, I move to multi-hyphenate creator Janelle Monáe’s 2018 visual album Dirty Computer in chapter four as an example of erotic data analysis in practice. This film (or emotion picture, as she calls it) mirrors an American culture rooted in surveillance, capture, erasure, and reformatting of its citizens, featuring a company that operates a lot like Palantir Technologies, if Palantir had the ability to erase its data objects’ minds and upload an algorithmically verified narrative instead. However, the protagonists’ embodied truths expand beyond what the system can capture, producing a poetic antidote to their erasure. 

    My dissertation is a first iteration of my research on the materiality of information, the craft of information production, and the infrastructure of information technologies. I take craft literally to mean an engagement with both this material nature of data alongside the art of its production. For example, I take on craftwork in a measure between, a digital project that uses Lorde’s poetry as a framework for thinking about the user’s desire within algorithmic discourse.

    This project combines fiber arts, hardware, and critical digital interface design to consider the idea of calculation and measurement as vectors for desire.

    In the project, I consider “measurement” both in terms of rhythm and meter and in terms of regimented algorithms.

    Self and world are key entry points into the work of (re)creating community through narrative. I had the opportunity to enact this work in co-producing an audio story spotlighting the community justice work being done by Grow Greater Englewood and the Englewood Village Farms network, a collective of organizations that grow produce locally and teach residents how to grow their own food.

    This is a photo from a foraging walk I attended along the Englewood Nature Trail, a two-mile stretch of former train tracks re-wilded into a green space for residents to enjoy. Its entrance is the Grow Greater Englewood marketplace.

    Specifically, Grow Greater Englewood uses storytelling to demystify and teach agricultural methods for self and community sustenance, reminding participants of the great agricultural knowledge built within the African Diaspora before that knowledge was exploited in enslavement, securing a future for our Englewood babies. In this sense, story becomes a technology through which to store and share information, and our collectively produced audio story a means to articulate a theory of this information share.

    We can talk more about how Grow Greater Englewood uses Octavia Butler in this mission during Q&A, but I will say that their Backyard Gardens program aided my mom in growing some delicious Swiss chard last summer. 

    A mural on the side of the Grow Greater Englewood farm located at 58th and Halsted in Chicago. The mural features a smiling elderly Black man in a yellow shirt next to the Grow Greater Englewood logo. Between them is the Adinkra symbol Aya, representing endurance and resourcefulness. Behind them is a background of greenery that matches the greenery of the farm behind it.

    This is how the street finds its stories. Organizations like the aforementioned Grow Greater Englewood are critical because they write and act back against normative data narratives about Englewood and the Southside. When surveillance data becomes the community story, the resulting actions are inevitably anti-Black. 

    Of course, Black bodies are not new to surveillance and datafication; in many ways, we are the original American data-based technology. In her foundational text Dark Matters: On the Surveillance of Blackness, Black surveillance scholar Simone Browne makes the case that existing surveillance technologies were refined during transoceanic African enslavement, a system well suited to executing anti-Black oppression. Moving from the slave ship to predictive policing, Browne argues that “certain surveillance technologies installed during slavery to monitor and track blackness as property (for example, branding, the one-drop rule, quantitative plantation records that listed enslaved people alongside livestock and crops, slave passes, slave patrols, and runaway notices) anticipate the contemporary surveillance of racialized subjects”.

    Many scholars, famously legal scholar Michelle Alexander in The New Jim Crow: Mass Incarceration in the Age of Colorblindness, trace myriad policing ideologies and practices to their roots in enslavement. In Race after Technology: Abolitionist Tools for the New Jim Code, sociologist Ruha Benjamin critically outlines how algorithms work akin to oppressive social codes such as those that produced enslavement, particularly as they play out across Black bodies. Data-based technologies “are sold as morally superior because they purport to rise above human bias, even though they could not exist without data produced through histories of exclusion and discrimination”. Lest we believe that our technology is colorblind and non-political, scholars remind us that modern data-based technologies were built with human oppression and dehumanization as a development sandbox. The only difference is that modern surveillance technologies accomplish this work exponentially faster, and use a lot less paper in the process. 


    For example: consider ShotSpotter, the highly controversial audio surveillance platform contracted since 2018 by the City of Chicago from the analytics company SoundThinking.

    ShotSpotters are pole-mounted microphones, actively listening for loud noises and algorithmically deducing which noises are gunshots, with the aim of triggering a rapid police response to a gun crime in progress. Despite the parent company’s titular claim that sound analysis produces actionable thinking, the Office of the Inspector General reported in 2021 that “fewer than ten percent of ShotSpotter’s gunshot alerts led to evidence of a gun-related crime.”

    What ShotSpotter does lead to is increased policing; a South Side Weekly article notes that “police showed a pattern of stopping and frisking people more often in areas they considered prone to ShotSpotter alerts”. Even though the city’s own data shows the ineffectiveness of imagining a community through its sounds, and even though Mayor Brandon Johnson campaigned on a promise to cancel the contract, there is an $8 million line item in the 2024 city budget to continue funding ShotSpotter. 

    This brings me to the second section of my talk, where I consider the narrative work being done by Denver-based data analytics company Palantir Technologies and its flagship software Palantir Gotham. With its software portfolio, Palantir attempts to solve massive information problems with even more massive amounts of analysis.

    The company operates in both the public and private sector but has become immensely profitable through its government contract work, at one point garnering the company a $41 billion valuation. Where other tech companies “are built on advertising dollars,” as a 2020 CNBC article notes, Palantir licenses “software…used to target terrorists and keep soldiers safe,” working to support companies and departments whose “technological infrastructure has failed them”. In a nutshell, Palantir creates software to help its clients best leverage their data. 

    I am interested in Palantir not because they are one of the most connected data analytics companies in this country, nor because one of the company’s co-founders is a University of Illinois alum, nor because the company heavily recruits from U of I’s computer engineering programs, nor because citizens of Champaign-Urbana are spoken by visual information narratives written by the extensive surveillance network that encompasses this campus, as I learned at dinner yesterday, nor because the name of the company is a highly ironic literary reference, though we can certainly discuss that in Q&A.

    Though I had heard of the company in passing before beginning this research, I didn’t start focusing on Palantir until I stumbled upon a trove of documents released under the Freedom of Information Act. These documents relate to Palantir’s 2012 contract with the Chicago Police Department, including user guides for Palantir Gotham and email threads between Palantir, CPD, and the Cook County Sheriff’s Office. I am interested in Palantir because I and my community are subject to the surveillance dragnet created by this platform and its software afterlives. 

    I became fully intrigued when I began to see the gap between the company’s theoretical underpinnings and the realities of its work. CEO and co-founder Alexander Karp is a scholar of social theory and criticism, a PhD graduate of Goethe University Frankfurt, and brings a highly moral philosophical viewpoint to his company’s work. I am indebted here to comparative literature-trained digital humanist Moira Weigel and her article “Palantir Goes to the Frankfurt School,” in which she close-reads Karp’s dissertation (which is written in German) to trace his journey from doctoral study to big data analytics. 

    To vastly oversimplify, Karp’s dissertation deconstructs the concept of jargon as language to understand how aggressive drives can power the use of communal language. His ultimate goal is to prove that “the social is constituted through acts of unconscious aggression, and that this aggression becomes legible in specific linguistic interactions”. According to Weigel, this framework “anticipates the software tools Palantir would develop. By tracing the rhetorical patterns that constitute jargon…Karp argues that he can reveal otherwise hidden identities and affinities—and the drive to commit violence that lies latent in them.” In other words, by algorithmically patterning our linguistic data, Palantir knows our subconscious truth better than we do. It was this deep confidence in the intelligence of his company’s algorithms that drove Karp to claim in a 2020 investor call that “software is the language of our time”. 

    If software is language, working with the Palantir Gotham analysis software, commonly referred to as PG, is akin to creating a new grammar. PG is built around a data schema structure called “dynamic ontology,” drawing upon the study and nature of being to derive a first-level classification system for all data that enters the system. As a Palantir developer explains in a lecture on the topic, their ontological view is that every single piece of information that exists can be classified as an object, a property, or a relationship.

    A review of Palantir’s work with the Marine Corps in Afghanistan provides a much more telling definition: dynamic ontology is a “means by which data from multiple sources are transformed and integrated from their raw storage formats into data object and associated properties that represent real objects in the world”. Ultimately, Palantir’s data narratives can only ever be notional at their absolute best, enclosing people, places, things, events, and their interconnectedness as the representational notions of themselves, using software to pattern-match a grammatical structure by which a data narrative can be written. 
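    To make the shape of this schema concrete, here is a minimal sketch of an object-property-relationship structure. Everything in it (the class names, fields, and example values) is my own illustration of the general idea, not Palantir’s actual data model:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of an object-property-relationship schema.
# These names and fields are illustrative only, not Palantir's model.

@dataclass
class DataObject:
    object_type: str                      # e.g. "Person", "Phone", "Event"
    properties: dict = field(default_factory=dict)

@dataclass
class Relationship:
    source: DataObject
    target: DataObject
    relation_type: str                    # e.g. "subscriber_of"

# Every datum entering the system is classified as an object,
# a property attached to an object, or a relationship between objects:
person = DataObject("Person", {"name": "Jane Doe"})
phone = DataObject("Phone", {"number": "555-0100", "carrier": "ExampleTel"})
link = Relationship(source=person, target=phone, relation_type="subscriber_of")
```

    The point of such a schema is that any record from any raw source can be flattened into these three categories, which is precisely what lets a client-specific classification hang on top of it.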

    As outlined in the PG Object Explorer manual, the point of a data ontology is to “drill down” to “the outcomes you desire”.

    A user can perform a phone search, which returns “information related to a phone number, including subscriber name (person or business), address, and carrier data,” and “may also include applicable consumer bureau data and utility records.”

    A user could also perform a person search, which returns “information related to a subject using public record proprietary data” as well as “consumer bureau data, phone numbers, utility records, death filings, associated drivers licenses, health care provider information, and any criminal or arrest records”.

    Just in these two searches lies a wealth of information through which a dynamic ontology is meant to sift, with the purpose of revealing the information pattern the analyst desires. 

    Furthermore, “information returned will include the subject’s name, aliases, social security number, dates of birth, address and phone history,” the potential for “consumer bureau agency” information, and the software will comb state and federal databases to summarize the subject’s “Federal and State Sanctions, national provider identifier records, Assets, Driver Licenses, Nationwide Healthcare Licenses, Business Associations, Property Ownership…Vehicle Registrations, and much more”.

    The most basic report returned from both types of search lists “primary subject information” and “information on 1st degree relatives, neighbors, and associates, which include listed phone numbers and most recent addresses,” and the most comprehensive report is a multifaceted and highly networked subject data biography. 

    Palantir’s clients can then create their own data classification structures to hang on top of the objects-properties-relationships framework. In the case of the Chicago Police Department, this means using the software for predictive policing, the theory that one’s potential “risk of becoming a victim or a possible offender” in a crime, such as a “shooting or homicide,” can be calculated. This is the very task for which Palantir was built: seeking out subconscious or otherwise hidden relationships between propertied objects, regardless of the surface-level object reality, and writing those relationships into an actionable narrative.

    This level of civilian analysis has resulted in the Strategic Subject List, or SSL, a “computerized assessment tool” by which individuals are “identifie[d] and rank[ed]” according to their risk score, an analysis of a number of factors, including “[t]he number of times an individual was the victim of a shooting…the number of times the individual was a victim of aggravated battery or assault…[or] gang affiliation”.
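    To see how blunt an instrument such a score can be, consider a toy linear risk score. The factors echo those listed above, but the weights are invented for illustration; the actual SSL model and its weights were never fully public:

```python
# Toy illustration of a linear "risk score" that reduces a person
# to a handful of counts. The weights are invented for illustration;
# this is not the actual SSL model.

def toy_risk_score(times_shot: int, times_assaulted: int,
                   gang_affiliated: bool) -> int:
    return 10 * times_shot + 5 * times_assaulted + 20 * int(gang_affiliated)

# The function has no input for socioeconomic need, family support,
# or generational trauma; only countable "object relations" register.
print(toy_risk_score(0, 0, False))  # 0
print(toy_risk_score(1, 2, True))   # 40
```

    Note what the function cannot see: nothing in its signature admits context, only counts.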

    A snapshot of the SSL database appears above. In one email, a CPD commander says that the Intel group at the Cook County Sheriff’s Office “love [Palantir],” and notes that CPD is interested in the platform particularly “for violent crimes / gun / gang crimes”. It is impossible not to read this as code for low-income Black and Latinx Chicagoans based on my experience of seeing Black and Latinx family, friends, and acquaintances be heavily profiled and captured along these lines. 

    What happens when the software pattern is wrong?

    With the meteoric rise of commercial generative AI software, it is already passé to joke about how badly AI can hallucinate. It is one thing for AI to hallucinate that I am a scientist named Mark, as ChatGPT did when I produced an algorithmic biography preparing for a class activity last semester and the platform gave me a name and profession it found to be most logical—we can talk more about how that activity went in Q&A. It is a very different thing for an algorithm to hallucinate that I and my community are key strategic targets for increased policing; it is another thing entirely when those algorithmic hallucinations are a feature, not a bug.

    If language is a technology through which to sublimate human experience, software as language returns a narrative result that can be verified in data patterns, faster and more precisely than a human could verify. The ability to shift grammar depending on a client’s needs makes it possible to write the story you need no matter the data reality. 

    The SSL bears out these discrepancies; according to independent analysis, “fifty-six percent of Black men under the age of thirty in Chicago have a risk score…more than one third of individuals on this list have never been arrested or a victim of a crime, and almost seventy percent of that cohort received a high risk score”. A look at these data points’ lived experiences would likely reveal a greater need for socioeconomic resources, child and family support, and mental and emotional healing from generations of social trauma. But the algorithm intentionally doesn’t account for these nuances, only object relations. Black people are overrepresented in this object narrative, and yet these deep insights into Black and brown ontology don’t seem to match the reality of Chicago citizens. 

    So whose narrative reality is this? A thread of emails written between one CPD commander and a deputy sheriff from the Cook County Sheriff’s Office offers clues.

    Bored during a demo meeting for a Palantir competitor software, the two officers started to email their feelings back and forth.

    The deputy sheriff joked that the meeting “might end with [her] punching this woman in her face :),” to which the commander replied that “[he] wish[ed he] hadn’t left [his] Taser at home”.

    Later the deputy sheriff complained that she couldn’t leave the meeting because “Olivia Pope won’t stop talking,” and the conversation continued from there.

    Not only are these two state officials casually joking about being violent toward a citizen, but it also becomes clear from the “Olivia Pope” title that they are talking about being violent toward a Black woman. The casualness of their conversation, plus the fact that they are IT professionals joking around on their work email accounts (which, as government correspondence, is part of the public record), shows that they aren’t worried about being caught. This probably was not the first time such sentiments were shared in this way. 

    Nevertheless, these are the very people in charge of Chicago’s countywide surveillance. The deputy sheriff played a large support role in Palantir’s implementation at the CCSO; other emails show her organizing administrative elements like officer software training. The commander she was emailing was the point person for the Palantir contract at the CPD; horrifyingly enough, he is now a Supervisory Policy Analyst at the Office of Community Oriented Policing Services.

    Imagine my surprise when I realized from the unredacted emails that the deputy sheriff joking about punching a Black woman out of boredom was the same person who helped coordinate my policy debate tournaments with the Chicago Urban Debate League when I was in high school—but we can save that for Q&A. 

    More to the point, whose data story is this not? Of the 398,684 Chicagoans on the Strategic Subject List, 33% of those citizens had no actual relation to any criminal activity despite being spoken by the software as such. Granted, a 67% success rate might not be terrible analysis returns, but that rate hits very differently when one’s life and freedom hang in the balance. When Karp claims that Palantir “makes sure the data is actually preserved in a way that guarantees its veracity,” he belies a desire to preserve data for its veracity according to the analyst’s truth, not the truths of those under watch. If Karp were meaningfully invested in the consequences of surveillance, Palantir would have to contend with at least one hundred thousand lives negatively affected by predictive policing at the hands of its software and one of its clients. Of course, the contract states that Palantir isn’t liable for the reality it helps to create; it’s just software as a service. 

    None of this is theoretical to me; CPD could easily craft and implement a strategic data ontology that includes my siblings and me. Beyond our being Black and working class (revealed in census data and all consumer financial data tied to our immediate family), our earliest forms of state identification, first voter registration, and tax returns all list an address in a low-income, high-violent-crime neighborhood in gang territory. We could come up in an associates search for any of our neighbors (mostly underemployed Black men between twenty-five and forty, surveillance targets whether warranted or not), and my cell phone associates me with them and other young Black men in Chicago via carrier location data and messaging data. Absent fathers being a longstanding trope used to pathologize Black families, our father’s Cook County death certificate could be a big flag.

    What would likely save my sister and me is the gender we were assigned at birth, and our educational and work affiliations. Though, as a CPD officer demanding documentation reminded our then-teenage brother, soon to be a first-year at the University of Illinois, my brother’s fancy Catholic prep school ID made him little more than a “smart nigger”. 

    Once upon a time, my older brother was a Black man under thirty in Chicago—a strategic subject profile—which is how that boy in the photo on the left was mistaken for someone else on that Westside block and accosted by police. He’s now a college-educated Black man over thirty in Southern California, but whenever he comes home he links up with our neighbors like they’re still ten years old playing outside.

    A Provider Comprehensive Report for him would return not only the CPD data narrative, but also the Los Angeles Police Department narrative. These most complex Palantir reports return “information across all states associated with the subject,” and the LAPD is another one of Palantir’s clients (many metropolitan police departments are). Anytime he drove his truck or his motorcycle from the oceanside city of Redondo Beach to the city of “Los Angeles since 2015, the police [could] see where [my brother’s vehicles were] photographed, when [they were] photographed, and then click on [his] name to learn all about [him]” as tracked through two states.


    However, in the face of the state’s algorithmic “technological armamentarium,” the street finds its own stories.

    In the final section of my talk, I turn to Black lesbian poet Audre Lorde and her theory of the erotic to vision Black data stories beyond capture and exposure, more nuanced and reflective than even the clearest Palantir data story.

    Why Audre Lorde? For one, she was under federal surveillance while developing her standing in the world as a writer and activist. Her biomythography Zami: A New Spelling of My Name is striking because she recounts multiple moments where her communist activism brings her into contact with the American surveillance network, first as a student in Harlem, New York and then as an expat in Mexico. Some of these moments of contact appear in her FBI dossier as well, in addition to descriptions of Lorde that she would have never been privy to. It is jarring to read two competing descriptions of the same person’s life, one of which purports to be an official record of that person and the other of which is actually written by that person. 

    Moreover, what would it mean to consider data aggregation and processing in the same genealogy as poetics? I want to imagine a framework for algorithmic processing that is not tied to mastery and control over information, that is not tied to seeking some ultimate analytical truth, but that instead treats data analysis like building a palette of colors from which to paint a painting, or building a corpus of words from which to write a poem, or creating a movement vocabulary and language from which to choreograph a dance, or gathering a series of soundbites from which to produce an audio story. These cultural forms work as information interfaces that express and encapsulate the creator’s experience of the world; an algorithm is no different. 

    “The erotic is a measure between the beginnings of our sense of self and the chaos of our strongest feelings. It is an internal sense of satisfaction to which, once we have experienced it, we know we can aspire.”

    Audre Lorde

    “Uses of the Erotic: The Erotic as Power”

    As Lorde writes in her essay “Uses of the Erotic: The Erotic as Power,” the erotic “is a measure between the beginnings of our sense of self and the chaos of our strongest feelings. It is an internal sense of satisfaction to which, once we have experienced it, we know we can aspire.” We are sold the idea that algorithms are an intelligent programmatic path, generating a map that moves us from the chaos of the world to the truth of the matter. My sense is that an erotic understanding of algorithms instead moves us from ourselves through the satisfaction of navigating chaotic experience. The satisfaction of navigating chaos is not an inherently negative desire, but the erotic does not let us ignore this desire in our search for mastery over information.

    For, ultimately, we are based in these desires. Each of us is one process for ordering the chaos of the world, and we bring our individual processing mechanisms together to create culture. These cultural forms help us navigate the chaos of experience and observation without claiming ownership of it, co-existing with others without trying to dominate them; they are forms that can be edited and returned to for improvement and expansion (as opposed to optimization).

    To think about culture as data visualization means taking seriously that which we take in through our senses as data, and the erotic as a process of moving with and through that data. Here, aggregation and analysis are looking, listening, touching, smelling, and tasting, data condensed into revelation.

    This pattern of expansion and refinement is a rhythm in itself; indeed, rhythm operates as an “answer to chaos…there is rhythm wherever there is a transcoded passage from one milieu to another,” creating a melodic path by which the unpredictability of chaotic data aggregation can be navigated.

    This is a rhythm that could not be mistaken for the droning dogma of meter, “the sterile word play that, too often, the white fathers distorted the word poetry to mean — in order to cover a disparate wish for imagination without insight.” Cultural forms offer analysis without sterile word play, sensual and multidirectional instructions for transcoding feeling data and moving it to a new context; an erotic algorithmic narrative. 

    The street finds its own stories. In the movement from the self to chaos, we pattern our data toward revelation, point to pen to pavement as we write back, producing erotically algorithmic returns that reflect our information realities in spite of our federally contracted datafication. For, as Grow Greater Englewood steward Taryn Randle reminds us, no infrastructure is permanent; we must take it upon ourselves to build a base for ourselves that nourishes rather than neglects.

    According to the late Mecca Bey, GGE steward and co-founder of grower collective Sistas in the Village, the word on the street is, “when y’all comin’?” 


    Citation (Chicago):

    Alexander, Elizabeth M. “The Street Finds Its Stories: Southside Chicago Digital Humanities.” Paper presented at the University of Illinois Urbana-Champaign, Urbana, IL, January 29, 2024.

  • The Street Finds Its Uses

    “The quandary before me—a blessing or a vexation, depending on one’s perspective—is a deceptively simple one: how do we read?”

    “Literary scholars such as myself are often guilty of naming everything a “text” that can be “read,” and despite recognizing the temptation for what it is, I nonetheless find the question of how (and why) we read (whether literary texts, the world around us, or the dreams we collectively share with others) a compelling one. This is in no small part because the reading methods we deploy are the same we use to make, create, and build the world around us. But I find the question an urgent one for the simple reason that it offers us an entry point into asking how do we study? That, perhaps, is at the crux of what we call Black digital humanities.”

    Co-authored with Kimberly Bain, Ph.D.

    Citation (Chicago)

    Bain, Kimberly, and Elizabeth Murice Alexander. “The Street Finds Its Uses: A Black Digital Humanities Call and Response.” Studies in Romanticism 61, no. 1 (2022): 161–174. https://doi.org/10.1353/srm.2022.0016

  • Growing a Greater Englewood

    Synopsis

    Where do you get food when the grocery stores aren’t open, or when their shelves have gone bare? Many Chicagoans asked themselves this question for the first time during the 2020 Covid pandemic, but for those living in the food deserts of the city’s Westside and Southside, the question wasn’t a new one. Join Change Agents producers EM Alexander and Dylan Cohen as they take a look at the urban farming collective Grow Greater Englewood, which works in partnership with community stakeholders to develop local food economies and land sovereignty that empower residents to thrive.

    Produced with Dylan Cohen for Change Agents the Podcast (Juneteenth Productions)

  • Dirty Computer Data: Erotic Data Poetics

    “Since the beginning, I have been fascinated by the ways in which Black women acknowledge and self-critically subvert structures of power without giving themselves over to those structures.”

    “I have also been fascinated by the ways in which power crystallizes around technology, particularly technologies of capture and narrative. […] I did not set out to write a Lordian erotic reading of codes, data surveillance, and digital technologies, and I do not have a set of UI/UX best practices for erotic digital platform design. I don’t even know if I want that—is the answer to Facebook’s myriad ideological issues simply an erotically structured Facebook? Rather, I learned through experiencing this project that the best way for me to read, analyze, and understand digital technology was through the erotic. Where the surveillance structure pins down a datalogical answer, I want to locate the excess context around that answer and see how the narratives compare…”

    ABSTRACT

    Corporate actors increasingly use networked technology as a storytelling device, drawing on a long federal history of writing citizens through surveillance, data aggregation, and analysis. Social and juridical infrastructures are racing to keep up with the rapid pace of data-based technological development, but existing guardrails around aggregation serve instead to foster an environment of algorithmic data storytelling. In this context, narratives written by data are deemed more truthful than the stories told by the data’s subject.

    Dirty Computer Data: Erotic Data Poetics turns to Black womanist poetics to glean an alternative ideological framework for collecting, analyzing, and using data. In this dissertation, I argue that queer womanist writer Audre Lorde’s concept of the erotic allows us to think about information capture and analysis as a generative collective poetics rather than institutional datafication and itemization.

    In chapter one, “Surveillance States,” I explore and contextualize federal surveillance ideology through Black women’s surveillance history. In chapter two, “Zami: Erotic Data,” I analyze Lorde’s biomythography Zami: A New Spelling of My Name as an erotic analysis of her life. The text distills the facts of her young adulthood to their most salient and revelatory elements—largely stories about loving other women—leaving out that which was not an erotic measure. Zami stands in stark contrast to the federal narrative of Lorde’s young adulthood as it is captured in her FBI file.

    The project then turns to compare the narratives produced by surveillant analytics and erotic analytics. In chapter three, “Data States,” I read data analysis company and federal contractor Palantir Technologies against itself, focusing on its contracts with the Chicago Police Department to analyze CPD surveillance data. In chapter four, “Riot: Erotic Analysis,” I explore poet Gwendolyn Brooks’s analysis of the 1968 MLK assassination riots in her chapbook Riot. Though the text predates Palantir Technologies by decades, the two offer an interesting comparison on the topic of capturing, reading, and writing data. Both work to read and speak data about Black Chicago citizens in crisis, but they arrive at critically different ends.

    In chapter five, “Pynk: Erotic Objects,” I conclude my analysis by considering how Janelle Monáe and her contemporary singers negotiate control over their surveillance. The entire project uses Monáe’s Dirty Computer (2018) and its dirty computer protagonist Jane 57821 as a theoretical starting point for visioning escape from surveillance, and this chapter returns to that starting point. The mysteriousness of her information narrative ultimately functions as a pathway of escape from the surveillance network. Finally, in the “Return,” the project explores the idea of technocultural choreography from two endpoints: the platform company end and the culturally grounded movement end. Platform companies design technologies that attempt to datafy and predict user behavior in order to influence it, akin to teaching users the choreography of a new dance. However, an exploration of movement choreography through the lens of virtual reality technology elucidates a level of embodied and grounded nuance that cannot easily be algorithmically delimited.


    Citation (Chicago)

    Alexander, Elizabeth Murice. “Dirty Computer Data: Erotic Data Poetics.” PhD diss., Cornell University, 2021.