• Chelsea Thompto

Routine Insecurity

Image showing test profile page titled "Bruce Wayne" on CAA website.
(Screenshot of CAA conference website.)

Last week, I had the opportunity to take part in CAA 2021 as a panelist and session chair. The conference, like so many in the last year or so, was hosted virtually in response to COVID-19. As is often the case, the virtual version of the conference didn’t stack up well against its in-person predecessors. This is to be expected, though: even the most well-run, technologically sophisticated conference platform wouldn’t be able to replace meeting folks in person, grabbing a drink or a bite to eat with friends and colleagues old and new, and just generally feeling connected to a broader conversation.

This conference however was not technologically sophisticated.

It was in fact the clunky nature of the platform (provided by Confex) that led me down a rabbit hole ending in the realization that Confex was exposing the personal information of CAA panelists (from 2019 through 2021) to the Internet at large through the conference website as a result of careless coding and a lack of oversight.

This started when I realized that CAA was using the same platform from its previous in-person conferences to run this virtual one, and that it was still just as slow now as it had been when it functioned merely as an online guide/calendar. My frustration and curiosity were amplified when I was unable to access the conference's pre-recorded content, despite logging in and knowing my password. Having contacted IT and not yet received a response, I was aimlessly browsing the conference site to see if I had missed something when I stumbled upon this:

Screen of live CAA website showing test entry labeled "test :P"
(Screenshot of CAA conference website.)

“test :P”

Clearly this was an artifact of building and testing the system, overlooked and left in place in the final release. It is, of course, an innocent and harmless mistake, but it got me thinking about what else they might have left behind. So I did what I encourage my students to do: I took a closer look. Using the developer tools in my web browser, I examined the source code for the page (all of this, for the record, is freely available; no special access or software was required). As I dug, I found comments in the source code itself about its inefficiency and about decisions yet to be made regarding its functionality. Like these two:

Coding comments stating dissatisfaction/uncertainty with the code.

Coding comments stating dissatisfaction/uncertainty with the code.
(Screenshots of comments found in Confex source code.)

To be clear, it's not that these code comments are in and of themselves super damning, but rather that every one of them spurred me forward until I got to something I truly wasn’t expecting to find:

Sitting in my locally stored cache was the personal data (name, address, clientID, member number, gender, race, and sometimes phone number) of every panelist whose panel page I had visited during my little exploration.

This information was downloaded to my machine by the platform's normal operation. In other words, the programmers at Confex had designed their system to transmit this sensitive data, unencrypted, to anyone who navigated to the site. When the system pulled a person’s name from the database, all the information it held on that person came along with it.
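The over-fetching pattern described above can be sketched in a few lines. This is a hypothetical illustration, not Confex's actual code; the record fields and function names are invented for the example:

```python
# A stand-in for a membership database record (field names invented).
FAKE_DB = {
    17: {
        "name": "Bruce Wayne",
        "address": "1007 Mountain Dr.",
        "member_number": "0042",
        "phone": "555-0100",
    },
}

def panelist_payload_leaky(panelist_id):
    # Anti-pattern: serialize the entire record, shipping every stored
    # field to the visitor's browser just to display a name.
    return dict(FAKE_DB[panelist_id])

# An explicit allow-list of fields that are safe to show publicly.
PUBLIC_FIELDS = ("name",)

def panelist_payload_minimal(panelist_id):
    # Safer: only the fields needed for display ever leave the server.
    record = FAKE_DB[panelist_id]
    return {k: record[k] for k in PUBLIC_FIELDS}
```

With the allow-list version, a visitor's cache would only ever hold the panelist's name; the address and phone number never cross the wire in the first place.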

Shocked at having discovered such an obvious data leak, I contacted CAA and let them know. Within 24 hours, Confex had patched the system (kinda) and CAA had sent out an email letting its current members know about the data breach (kinda).

What do I mean by kinda?

Well, on the CAA front two things:

  • First, they only contacted current members despite the leak impacting 2019 through 2021.

  • Second, while they explained that the leak had happened they didn’t attend to the scope of the issue or provide context on why/by whom the data might be accessed.

On the Confex front:

  • First, while they roughly patched the problem, the system itself is still clunky and still transmitting more data than is required for the system to function.

  • Second, the architecture of their system is ripe for SQL injection techniques and other common web vulnerabilities. It is through a version of this that I was able to find the Mr. Wayne test account featured at the top of this article.

  • Third, in trying to contact the company I received no response. This is especially troubling as they host a large number of conferences every month and a brief search revealed this same vulnerability might exist across their entire system.
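To illustrate the class of vulnerability mentioned in the second point above, here is a generic sketch of SQL injection and its standard fix, using Python's built-in sqlite3 module. The table and column names are invented; this says nothing about Confex's actual schema or code:

```python
import sqlite3

# A throwaway in-memory database with one invented table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE panelists (id INTEGER, name TEXT)")
conn.execute("INSERT INTO panelists VALUES (1, 'Bruce Wayne')")

def lookup_unsafe(name):
    # Vulnerable: user input is spliced directly into the SQL string,
    # so input like "' OR '1'='1" rewrites the query's meaning and
    # dumps every row in the table.
    return conn.execute(
        f"SELECT id, name FROM panelists WHERE name = '{name}'"
    ).fetchall()

def lookup_safe(name):
    # Parameterized query: the driver binds the input strictly as data,
    # so the same injection string just matches no rows.
    return conn.execute(
        "SELECT id, name FROM panelists WHERE name = ?", (name,)
    ).fetchall()
```

The fix is decades old and built into every mainstream database driver, which is part of why finding this sort of thing on a production conference platform is so unsettling.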

To be clear, CAA rightfully expected that Confex would know how to do its job (especially given Confex’s history and client volume), but it's these “kindas” that left me unsettled and led me to write this. We live in a time when our data is becoming more and more valuable, while at the same time there is a constant stream of news about data breaches and leaks. It's easy to become numb to it; as with many big systemic problems, it's hard for our brains to fully grapple with these issues. But when I stumbled into this leak, it really got its hooks in me, and I haven’t been able to stop thinking about it. I think this is because I am an artist, not a security expert, and I found this leak simply by doing one of the core things of my practice:

Looking (oftentimes at things people would rather I not look at)

It might also be because my data is part of the leak, or because my current body of work relates to data and the body. For whatever reason, it has really grabbed my attention. So when I see a response full of “kinda” solutions, I can’t let it go. Instead I think:

  • What would a response look like that contextualized the leak with respect to people’s personal security and the monetization of data more broadly?

  • What would a response look like that acknowledged the different levels of risk a person might face from a leak like this depending on their race, gender, and/or research agenda?

  • What would a response look like that wasn’t invested in saving face?

  • What would a system look like that actually cared for people and their privacy in the first place?

I keep thinking of these questions and about that stupid Bruce Wayne test page, still viewable at the time of writing this at:


I fixate on the black box nature of all the systems being built around me and that I interact with online every day. From Confex dumping my address into the caches of every CAA conference goer, to ad trackers trying to sell me endless varieties of food and clothes, to facial recognition software programmed to misgender me, I am left wanting to know:

What would these systems look like if they were programmed with a care motive instead of a profit motive?

For more about my thoughts on tech and the ways our bodies are caught up in it, you can explore my new/in-progress interactive artwork Landmarks here:

Live Site:


Artist Statement and Source Code:

