Two New Possibility Lunches! - Originally Posted October 16, 2016

We're keeping up our HSCollab tradition of monthly meetings to discuss topics in digital cultures. Jessica Rajko, Marisa Duarte, and I would like to invite you to our upcoming Possibility Lunch: The Internet of Creepy Things, October 19th, 11:30-1:00, in ISTB4-492. Lunch will be provided.

We host these lunches once a month to discuss aspects of how data and digital technologies shape lived experiences from within the body. We enjoy open conversations from anti-racist, feminist, and decolonial perspectives on topics like social justice, arts and performance, the quantified self movement, cyber-security and surveillance, digital labor, social networks, and more.

To RSVP please email Gloria.Espinosa.1@asu.edu

We'll be hosting our November event, "Decolonizing Technology," on the 16th (same time/location). We've already received a lot of interest, so mark your calendars and let Gloria know if you can make that one as well.

~Jacque

Decolonial and Performing Algorithms - Originally Posted April 19, 2016

We held the last session in our "Critical Conversations" series on Monday, April 18th, focusing the discussion on algorithmic accountability and algorithmic bias. We also spent a final hour discussing what the next steps might be for the collective that has emerged from these conversations.

We began with a discussion of what exactly algorithms are and do - how algorithmic thinking works in both computational and everyday decision-making practices (several helpful resources appear at the end of the post). Drawing on this conversation, we began thinking a bit about just how much tacit knowledge is actually expressed in the process of making algorithms and algorithmic decisions. Touching lightly again on our theme of the displacement of knowledge from the body, we noted that algorithms obfuscate personal knowledge. We spent time mulling over the challenges of transparency with respect to that knowledge, as well as those of trying to document a process that is often iterative, multiply authored, and very often reliant on hunches and guesswork. We dreamed of a visualization of the volume of decision making that goes into harvesting or marshaling a single point of data.

We also talked at length about relationality and the ways that algorithmic approaches often obscure the relationships between networks of people and things. We wondered about ways to treat the mess or the cacophony of relational information as an asset rather than a barrier. Even as we championed the value of good metadata for discovery and use in systems, archives, and the like, we wondered if we could get away from the disciplinary processes of colonial knowledge systems. As one participant observed, all cultures seem to want to "tame knowledge," so categorization and abstraction show up in a multitude of epistemes, not only those of western enlightenment thought. This led us to ask about the possibilities of "co-prioritization" of knowledge systems - thinking of a "many paths" approach to knowledge and decision making that would create space for a plenitude of algorithms, each derived from and suited to different knowledge systems. We wondered, would this offer us the possibility of decolonial algorithmic cultures?
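To make the "many paths" idea a little more concrete, here is a minimal sketch of what co-prioritization might look like in code. Everything in it - the records, the two "paths," and the scoring - is invented for illustration; the only point is that each knowledge system keeps its own ranking rather than being merged into one authoritative order.

```python
# A hypothetical sketch of "co-prioritization": the same records are
# ranked by several knowledge systems side by side, and no single
# ordering is treated as the default. All names and values are invented.
from typing import Callable

records = [
    {"title": "Oral history interview", "community_use": 9, "citations": 2},
    {"title": "Peer-reviewed article", "community_use": 3, "citations": 40},
    {"title": "Ceremony documentation", "community_use": 8, "citations": 1},
]

# Each "path" is its own ranking principle, derived from a different
# way of valuing knowledge.
paths: dict[str, Callable[[dict], float]] = {
    "community relevance": lambda r: r["community_use"],
    "scholarly citation": lambda r: r["citations"],
}

def co_prioritize(records, paths):
    """Return one ranked list per knowledge system, deliberately unmerged."""
    return {
        name: sorted(records, key=key, reverse=True)
        for name, key in paths.items()
    }

for name, ranking in co_prioritize(records, paths).items():
    print(name, "->", [r["title"] for r in ranking])
```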

We noted that public awareness of how algorithms work is relatively low. We also talked a bit about the techno-utopian approaches to algorithmic optimization and the myth of the apolitical/neutral algorithm. Even as we lamented the sense in many spaces that algorithmic processes are neutral, one of our participants observed that the military has a clear mandate to keep human action linked to algorithmic work, even if only to "have someone to fire" when things go wrong. So while it may seem that the law has a long way to go in terms of understanding who might be held accountable for algorithmic bias, we do have a model that demonstrates a clear-eyed understanding of accountability even at a distance.

This then led us to daydream about a research project that we might call Performing Algorithms, which would allow us to demonstrate to everyday users just how powerful algorithmic personalization can be for our individual and collective experiences in digital cultures. We imagined a set of user profiles that could then be activated in an installation where a single search string would produce a range of results from the same search engine. We imagined it as a kind of gesture to the multiplicity of the web and the powers of "customization" and "optimization" for shaping an experience that many people assume is universal rather than highly crafted and, in some sense, personal. As one person put it, this project would allow us to demonstrate the ways that a person's life is "vectored" by largely hidden algorithmic processes.
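Since the installation idea hinges on personalization, a toy model may help show the mechanics. The sketch below is not any real search engine's method; the documents, profiles, and weights are all invented. It only demonstrates how a profile-weighting step lets one query return differently ordered results for different people.

```python
# A toy model of profile-based personalization: one query, many rankings.
# Documents, profiles, and topic weights are all hypothetical.

documents = {
    "Local farmers market guide": {"local": 0.9, "shopping": 0.6},
    "Stock market analysis": {"finance": 0.9, "news": 0.5},
    "History of market squares": {"history": 0.8, "local": 0.4},
}

profiles = {
    "commuter_finance": {"finance": 1.0, "news": 0.7},
    "neighborhood_organizer": {"local": 1.0, "history": 0.5},
}

def personalized_search(query: str, profile: dict[str, float]) -> list[str]:
    """Rank documents by how well their topics match the profile."""
    # In this toy, the query "market" matches every document equally;
    # the profile alone does the differentiating work - which is the
    # point the installation would perform for its visitors.
    scores = {
        title: sum(profile.get(topic, 0.0) * weight
                   for topic, weight in topics.items())
        for title, topics in documents.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

for name, profile in profiles.items():
    print(name, "->", personalized_search("market", profile))
```

Activating each profile in turn and projecting the divergent result lists side by side would make the usually invisible "vectoring" visible at a glance.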

We concluded with an exciting discussion of future work...but that will have to wait for another day. :)

A handful of resources on algorithmic culture and studies

Ted Striphas "What is an Algorithm?"

Algorithm Reading Club

also @algorithm_club and #algclub

"Information Intermediaries Playlist" (includes Soraya Chemaly and Catehrine Buni on the "Unsafety Net")

Soraya Chemaly and Catherine Buni "The Secret Rules of the Internet"

A collaborative statement on "Intersectional Data" (contact J. Wernimont for more info)

Healthy Data - Originally Posted March 23, 2016

Our third HSCollab luncheon focused on "Healthy Data: health, data and healthy practices in the age of the quantified-self" and we spent some time playing with our growing collection of pedometers, including some of the first mechanical devices. We also laughed and commiserated a little. It feels important to me to report out on this quality of our conversation too. We're developing a community that cares with and for one another, and that is an important feature of the kind of work HSCollab wants to do.

Shared care is also an important part of what people are doing when they are engaging in community efforts to quantify their activities/bodies in pursuit of improved health. As one participant noted, it's no fun if you're the only one with a Fitbit.

We talked a bit about very analogue pace counters, or step counters. The one I have is a lot like this, although the cord on mine is in desert camo. As one participant noted, we also know these as borderland technologies for diabetes care - such pace counters are often handed out to members of communities and become part of a collective effort to increase walking as a way to combat diabetes and diabetes-related illnesses. This includes friends peeking at one another's beads, with competition and uplift becoming part of the behavior change practice.

We also talked about the Healthy Active Natives FB groups, which encourage greater activity within the communities they serve. Important in our discussion was a recognition that efforts such as this include Native-specific practices and advertisements. This is a decolonizing quantifying effort, which has really interesting implications for how we can think about the politics of technologies and techno-social networks.

We also discussed at length the ways that quantifying and health surveillance technologies are often invested in developing a set of consumers with a set of life-long dependencies. We talked about what knowledge is lost and gained when displacing knowledge production onto a device, and how agency is an important part of interfacing with any tech.

In the case of the pace beads, there was a general sense that they allowed a certain kind of agency - that it was a memory aid, rather than an automation device - and that the aesthetics of beads on a string mean differently than numbers on a digital device. This leaves us attuned to the differences between receiving and creating the information, or receiving and creating meaning. This also has a set of really important implications for what we mean when we say someone is performing with data devices - the what and the how are different depending on the media affordances/restrictions. One of our participants noted that the agency of working on the world with a tool (as in the pace counter) is entirely different from that enacted by putting on an electronic device and being worked upon yourself.

In terms of agency, one of our participants shared her work on data monitoring in health care settings and she echoed that agency is important. She noted that people tend to slow usage if they are not empowered to do something with the data and that just monitoring doesn’t change patient quality of life outcomes. More needs to be done to empower people who are using tracking devices for acute health situations.

This led us to talk at length about affect and wearable technologies, particularly around health. As several people observed, routine monitoring is important, even life-saving for some, but it is also a reminder that the person doing the testing is ill. In the case of mental health or non-critical health conditions, this rubs up against different understandings of what is "healthy" and may put a person in the position of regularly being reminded that others think she's sick when she disagrees. Shaming, negative affect, and the domination of daily activities are problems for those who are being asked or compelled to use health monitoring tools.

This brought us again to the question of whether there are ways for health tracking devices/tools NOT to displace human knowledge/expertise. We also returned to a favorite theme of ghosts - although this time it was expressed as an awareness of the presence of others - designers, manufacturers, etc. - in the technology: an acute sense that "this was not made for me" - often without "me" ever in view.

We also talked briefly about psychotherapeutic “mood logs” and the ways in which there have been tracking technologies for a very long time, many of which were textual and highly contextual. Based on our conversations, that contextuality matters a great deal to a full understanding of what the data means.

Our next and last luncheon is scheduled for April 18 on the theme of Algorithmic Bias: subjectivity and implicit biases in algorithm and tech design.

We also want to make time to scope out our future work as a group. Among the ideas we discussed are:

  • Bringing in speakers for public events, possibly even curating a series of related events across the ASU campuses and in our communities.
  • Writing together and sharing our work with one another - including possibly writing something in the vein of "As We Now Think" and/or a manifesto or statement of research objectives ("Urgent Questions") for the future of HSCollab.

To make room for all of this, we'll schedule a two-hour session for our last luncheon. If you can only make one hour, we'll do Algorithmic Bias in the first hour and then transition to the Next Steps conversation in the second. We hope you'll join us!

Wearable Research Charrette at HASTAC - Originally Posted February 25, 2016

HSCollab is excited to announce a partnership with the UC Davis ModLab to host a Wearable Research Charrette as part of HASTAC 2016. 

This half-day event will take place Saturday, May 14th, from 9 a.m. - 2 p.m.; registration is available through the HASTAC2016 registration portal. Space is limited, so please register early and let us know if your plans change.

We are using the term charrette here to signal a session that is collaborative and participatory with the goal of shaping and extending how we engage with concepts around wearable technologies. With this half-day retreat we hope to collectively imagine alternative methods for making, designing, and using wearables. We may also develop future areas of collaborative work.

Fragile, flowing data and aspirational labor - Originally Posted February 24, 2016

Our Feb 22nd HSCollab conversation asked "who has the rights?" with respect to data ownership, invisible labor in digital economies, and agency in data production, analysis, and more.

Fragility was a major theme of the discussion. Data is fragile in a number of ways:

  • It is subjective and therefore subject to discrepancies across similar datasets.
  • The data creators and curators are often single individuals or small groups of people whose work is subject to time, energy, and resource constraints. Very powerful datasets sometimes just stop getting collected because someone - a person - just can't anymore.
  • Classification schemes are contingent and therefore subject to variation over time and across communities.
  • Data flows are subject to gatekeeping practices - making data flow itself fragile.
  • Data as information captured/maintained/transmitted through material media is subject to decay and deformation.
  • Data can be lost, hidden, and/or destroyed.

We also spent a fair bit of time talking about "prosumption" and "prosumers" in the context of invisible and often precarious labor. Examples included young women who are using social media platforms (like Pinterest) to promote themselves in beauty industries, often without knowledge of how that work can be appropriated. In this context we talked about the many sides of "aspirational labor" - work that gets your name out there but might also make you vulnerable to threat and/or appropriation. This includes the largely unstudied psychological effects this can have on aspirants themselves.

A really fascinating discussion followed on digital lives and selves, and on security, as matters of public health. This included noting that trans people can be particularly insecure in a data-based, social media culture. We talked about people who don't want to be documented - whether to promote "safety" or to "fit" existing legal frameworks. In many cases this arises from long histories of documentation-as-violence. This led us to want to know more about the general histories of disenfranchised people's relationships to data. We also discussed the ways in which efforts around sports injury testing might themselves be harmful and imagined what a more holistic approach to thinking through issues like head injury might look like (hint: it's not all accelerometers).

We had several really productive forays into the idea that data is not an artifact (or not only an artifact) but an episteme or scenario. Every piece of data is an act of theory or ideology. This dovetails nicely with efforts elsewhere to think about intersectional or deeply multivariate data. In this context we noodled around thinking about how data lives in our own lives - often as stories. This got us thinking about the temporality of narrative or story and a comparison to the "allatonceness" of the visual representation of data. We also found ourselves intrigued by the richness of "scenario" given some of our interests in performing data - thinking about scene and scenic repetition. This, of course, has certain resonances with our conversation about data as evidence. In particular, we spent some time thinking through the questions we'd like to ask of any data: produced by whom? for whom? to what end? and in which theater of proof?
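One way to hold onto those questions is to treat them as required fields that travel with any dataset, so that data never circulates as a bare artifact. The sketch below is a speculative encoding, not an existing metadata standard; every field name is our own invention.

```python
# A speculative sketch: the questions we'd ask of any data, encoded as
# required provenance fields. Field names are invented, not a standard.
from dataclasses import dataclass

@dataclass
class SituatedData:
    values: list            # the "artifact" itself
    produced_by: str        # produced by whom?
    produced_for: str       # for whom?
    purpose: str            # to what end?
    theater_of_proof: str   # in which theater of proof?

steps = SituatedData(
    values=[5231, 6120, 4875],
    produced_by="wearer of a clip-on pedometer",
    produced_for="a community walking challenge",
    purpose="encouragement, not clinical diagnosis",
    theater_of_proof="peer conversation, not a courtroom or clinic",
)
print(steps)
```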

We also touched on time as suggested above - balancing an observation that some data uses are a way for us to stop time, to take a slice of life and reflect/subject it to scrutiny, with the idea that humanistic data actually flows diachronically, through time. Perhaps this is one of the many ways in which humanists can intervene in our collective understandings of data -- by bringing the diachronic back to what has largely been an effort to freeze/fix and study.

Our third luncheon will be held on Monday March 21 - topic: Healthy Data: health, data and healthy practices in the age of the quantified-self. To RSVP send an email to jwernimo@asu.edu

Ghosts, Tangles, and Data Ethics - Originally Posted February 5, 2016

Monday Feb 1 we officially launched HS Collab with the first Critical Conversations research luncheon. Our theme for the first event was: (Un)Corporeal Technologies: how data, algorithms and interfaces rub up against the body. It’s fair to say that thanks to the cadre of researchers in attendance, we began this series with an intensely rich and insightful discussion. We covered a lot of ground; below are some of the topics that emerged.

  • Detangling data from the Age of Enlightenment by challenging epistemologies and their hold on current data practices
  • Recognizing long-term, data-based histories of oppression and discrimination
  • Digital/Physical Ghosts: Recognizing and reconciling traces of trauma borne by people and data, as well as thinking about the traces of trauma residue left in machines and in human bodies
  • Imagining data, algorithms, and interfaces outside of colonialist paradigms
  • Pornography, Identity, and Query: How do online queries make visible oppressive, misogynistic, racist perceptions of identity?
  • Borderland Technologies: personal technologies, non-digital technologies, technologies of necessity (border modding/Rasquache tech)
  • What does it mean to thrive, and what does thriving look like to different individuals/communities?
  • How does the Quantified Self movement impact how humans relate to their own bodies?
  • Performing Data: How do we perform data and how does data perform us?
  • The ethical implications of using data for the making of art: Appropriation and exploitation in the name of art.
  • What happens when art is seen as not being ethically liable?
  • The ethical implications of profiting off of another’s personal data: Who has the agency?
  • Embodied Agency: The right to use wearable technology and not give personal data away. The right to keep data to oneself. The right to choose to participate.

At this point we are continuing to identify/generate common areas of interest. As the series continues, we'll eventually work toward topics for further exploration and next steps.

We'll be talking again Feb 22nd (Who Has the Rights?: data ownership, invisible labor, and agency) and we welcome folks who are interested in the above or related topics to join us.

Critical Making - Originally Posted December 17, 2015

HASTAC Research Charrette: HS Collab will be hosting an open wearables/personal data research charrette during the HASTAC (Humanities, Arts, Science, and Technology Alliance and Collaboratory) Conference in May 2016.

International Research Retreat: HS Collab will be hosting an international team of researchers (Dr. Aimee Morrison, University of Waterloo; Dr. Fiona Barnett, Brown University; and Dr. Padmini Ray Murray, Srishti Institute of Art, Design & Technology) working on digital technology and human behavior in January of 2016 for a research retreat.

Critical Conversations - Originally Posted December 15, 2015

We are hosting a series of lunchtime conversations about human security research, policy, law, and advocacy, held monthly during the spring of 2016.

Critical Conversations will be organized around emerging and important areas of interest, although conversation may range within any given topic to include concerns with indigenous digital rights, the quantified self, community-based tech development, the “Internet of Things” and the “Internet of Bodies,” biosensing technology and security design, medicine and tech equity, and more.

Planned HS Collab Critical Conversations Topics:

  • Feb 1 (Un)Corporeal Technologies: how data, algorithms and interfaces rub up against the body
  • Feb 22 Who Has the Rights?: data ownership, invisible labor, and agency
  • March 21 Healthy Data: health, data and healthy practices in the age of the quantified-self
  • April 18 Algorithmic Bias: subjectivity and implicit biases in algorithm and tech design

As part of our planning we considered separate sessions on “performing with/through data” and “caring for people/caring for data” – rather than keep those as one-off sessions, we’ve opted to weave the thematics of performance and care throughout the series.

Please contact Jacqueline Wernimont at jwernimo@asu.edu for more information.
