New Humans

In New Humans, emergent gatherings of synthetic humans rise from the surface of a black ferrofluid pool.

Appearing to morph like a supernatural life form, these dynamic clusters of magnetic liquid, produced by machine learning processes, are images of communities of synthetic people: hybrid profiles modeled from actual DNA, fitness, and dating profile data sets sourced from public and leaked caches. The work questions how we can radically reconceive the "user profile" to embody a self whose bounds are indefinable and multiple. Generative algorithm using machine learning (GAN, t-SNE) and fluid simulation (Navier-Stokes), contour generation (OpenCV), user profile data caches (DNA, fitness, and dating), software production (Processing), ferrofluid, custom electromagnet matrix, custom PCB control system, computer, steel, wood, aluminum. Hardware developed by Brooklyn Research. Installation at the Okayama Art Summit 2019, Tenjinyama Plaza.

Artist Statement

I have been using various algorithmic and machinic processes in recent work to interrogate how technology is being used to shape our present: our desires, decisions, and daily life. I want to target the senses and emotions of the viewer and make them acutely aware of this manipulation. The projects become a metaphor for an existence thoroughly shaped by technology and by the soft power that regulates us as subjects from within our psyche. Often working in tandem with direct coercion, this new form of control is immaterial and abstract, yet it has a palpable presence in our lives. We can feel it shaping our decisions, behaviors, and desires, especially in the predictive algorithms we encounter in our everyday and digital lives.

I’m always thinking about how one could escape being scraped, detected, or coded, and how to achieve a freedom that includes being unknowable (i.e., not reducible to a profile or data). During our discussions about the possibilities presented by new technologies, we would also talk about the tension between the notion of “freedom” and control under these new conditions. While new forms of freedom are introduced by harnessing technologies that allow us to communicate, express ourselves, create new forms, buy things, and travel, our data is used both with our knowledge and beyond what we realize or understand. New forms of control emerge with these new possibilities.

The question is really: is escape possible? Would escaping detection look like a moving shadow in a room? Is it more about obfuscation, like hiding within noise? Or is it something that can only be seen in fleeting moments? As technology seeks to understand human behavior, desires, and who we are, the connection between data and identity formation became the central focus. As Jacob, Christy, and I began talking about the fluidity of identity in real life and how the notion of constant change relates to generative processes in technology, the project began forming around machine learning used in tandem with various clustering algorithms. The technology itself would be a metaphor for perpetual morphing and a vehicle for chasing its own shadow.

We used available data sets that represented robust characteristics of individuals: genomic, biometric, and subjective data taken from a DNA testing company, a fitness app, and a dating website. From here, we experimented with various clustering algorithms and GAN processing to “see” these newly produced subjects and their fluid affinities.
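
As a rough illustration of the clustering step described above, the sketch below projects a placeholder set of combined profile features into two dimensions with t-SNE so that nearby points suggest affinities. It is a minimal sketch in Python with scikit-learn rather than the Processing-based production software, and the data, feature counts, and parameters are hypothetical stand-ins, not the project's actual pipeline.

```python
# Illustrative only: random placeholder data standing in for merged per-person
# feature vectors (genomic, fitness, dating-profile); not the studio's code.
import numpy as np
from sklearn.manifold import TSNE
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_people, n_features = 500, 64
profiles = rng.normal(size=(n_people, n_features))  # hypothetical feature matrix

# Normalize the features, then embed them in 2D; clusters in this plane are
# the kind of "fluid affinities" the text describes wanting to see.
embedding = TSNE(n_components=2, perplexity=30, init="pca",
                 random_state=0).fit_transform(
                     StandardScaler().fit_transform(profiles))

print(embedding.shape)  # (500, 2): one x/y position per synthetic profile
```
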
The next question was how this technology would visually manifest. The last step of the project was to invent and connect the algorithmic processes with the material and hardware that would give the concept and the data output a physical form. Fluidity and liquid itself became the material representation: our identities are in continual transformation, with new unknown possibilities, affinities, and communities. What I imagined was a black pool of magnetic ferrofluid that would move in ways that defy nature, driven by data sent from our customized program. New life emerging from a primordial liquid abyss.
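
One way to picture the bridge from data to material is sketched below: a hedged, minimal example of rasterizing 2D cluster positions into drive levels for a grid of electromagnets. The 8x8 grid, the 0-255 duty-cycle range, and the function name are assumptions made for illustration; the installation's custom PCB control system is not described here.

```python
# Hedged sketch: maps embedded profile positions onto a hypothetical magnet grid.
import numpy as np

def embedding_to_magnet_grid(points, grid=8, max_duty=255):
    """Rasterize 2D embedding points into per-magnet drive levels (assumed 8x8)."""
    pts = np.asarray(points, dtype=float)
    # Normalize coordinates into [0, 1] so they index grid cells.
    lo, hi = pts.min(axis=0), pts.max(axis=0)
    norm = (pts - lo) / np.maximum(hi - lo, 1e-9)
    cells = np.minimum((norm * grid).astype(int), grid - 1)
    # Count how many synthetic profiles fall in each cell...
    counts = np.zeros((grid, grid))
    for x, y in cells:
        counts[int(y), int(x)] += 1
    # ...and scale the counts to a duty cycle each magnet could be driven at.
    return (counts / counts.max() * max_duty).astype(np.uint8)
```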

Collaborator Statement

It was a great journey. In all honesty, I’m happy that we could finish. From the outset, our goals were ambitious, and there were many moments when it felt like we were running on empty and the finish line was just a mirage. To move forward we had to put out multiple fires at once, and occasionally our prototype literally caught fire, as if to remind us of that. Seeing how calm and elegant the piece looks now almost makes me forget the complexity and struggle of its genesis. That contrast captures the primary technical goal, which was to serve the visual metaphor. We wanted to breathe life into this piece and generate randomness, yet the software components were inherently about normalization. This tension between the natural and the artificial was our central challenge: we attempted to engineer something that feels alive out of inorganic parts.

Collaborating with Mika was a great learning process. We started the work with visual examples. For instance, Mika shared a scene from The Terminator and held up the liquid metal moving through that scene as the gold standard, and that became the starting point for technical research. After a long sequence of Google searches, we arrived at a few promising methods to simulate liquid and control its properties, such as viscosity, and then conducted experiments to validate them. There was a lot of creativity in this process. There were many possible ways to randomize the liquid's movements, and settling on chaos theory and the Navier-Stokes equations as the theoretical foundation came only after a long, circuitous exploration of different ideas. Furthermore, the focus on visual effect was an unusual technical constraint to work with, since performance is the priority in most software projects. Most of the time the results were disappointing, but a small subset succeeded and gradually accumulated into a more sophisticated and cohesive product.
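
For readers curious what "controlling properties such as viscosity" can look like in practice, below is a minimal sketch of the viscous diffusion step in the spirit of Jos Stam's widely used stable-fluids approach, one common way to discretize this term of the Navier-Stokes equations in real time. The grid, viscosity value, and iteration count are arbitrary illustration choices, not the project's actual solver.

```python
# Minimal, illustrative viscous diffusion step (Gauss-Seidel relaxation),
# in the spirit of Stam-style "stable fluids"; not the project's actual code.
import numpy as np

def diffuse(field, viscosity=1e-4, dt=1.0 / 30, iters=20):
    """Diffuse a 2D field, standing in for the Navier-Stokes viscosity term."""
    a = dt * viscosity * field.size  # diffusion rate scaled by grid resolution
    out = field.copy()
    for _ in range(iters):
        # Each pass relaxes every interior cell toward its neighbors' average.
        out[1:-1, 1:-1] = (field[1:-1, 1:-1] + a * (
            out[:-2, 1:-1] + out[2:, 1:-1] +
            out[1:-1, :-2] + out[1:-1, 2:])) / (1 + 4 * a)
    return out

# Higher viscosity values smear the field faster, i.e. a thicker liquid.
```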

In retrospect, there were a few notable technical challenges. First, we strove to use theories as metaphors for their corresponding visual effects. One of the themes we wanted to capture was clustering, visualizing connections that we as humans cannot see, so in visualizing the data set we applied dimensionality reduction techniques such as t-SNE, whose purpose aligned with that theme. Once we had the theoretical foundation, we needed a way to iterate quickly and create a feedback loop with Mika, and the feedback had to be mainly visual for Mika to review. Processing was the perfect platform since it supported live interaction and visual feedback, and we were able to increase our productivity significantly after adopting it. Towards the end of the project, integrating all the different components was a considerable challenge. It was essentially mapping the output of one module into the input of another, which turned out to be a surprisingly creative and philosophical task. Further, since we had to wait for the hardware development, we had to predict the real-life visuals with the software alone. When the hardware was connected, we noticed that the simulation wasn't an accurate representation of the final result due to limits of the communication protocol; in particular, we had to ensure that both the hardware and the software could maintain a certain frame rate. Lastly, the piece was to be displayed for eight hours at a time, so we had to test the program for stability. It kept crashing after about an hour of running, and we later discovered a memory leak. After resolving this issue, the piece was mostly ready for production.
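
As an illustration of the frame-rate constraint mentioned above, here is a minimal sketch of a fixed-rate update loop. The 30 fps target and the step_simulation/send_frame functions are hypothetical stand-ins for the actual Processing-to-hardware pipeline, which is not described in detail here.

```python
# Hedged sketch of pacing software updates so hardware and simulation stay in
# step; the rate and the callables are assumptions for illustration.
import time

TARGET_FPS = 30
FRAME_TIME = 1.0 / TARGET_FPS

def run(step_simulation, send_frame, frames=None):
    """Advance the simulation and push each frame at a fixed rate."""
    count = 0
    while frames is None or count < frames:
        start = time.monotonic()
        grid = step_simulation()   # advance the fluid/cluster simulation one step
        send_frame(grid)           # push drive levels toward the magnet matrix
        # Sleep only for the time left in this frame, so a slow step does not
        # accumulate drift between the software and the hardware.
        leftover = FRAME_TIME - (time.monotonic() - start)
        if leftover > 0:
            time.sleep(leftover)
        count += 1
```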

Partaking in this project made me think more deeply about the purpose of technology. In this project, the main purpose was to enable visual metaphors. Nonetheless, I’m convinced that this way of thinking applies more broadly, beyond this project. Using technology to enable processes and help people is the priority. Technology is there to serve, and the magical experience comes when one doesn’t notice it operating seamlessly in the background.

In summary, I’m grateful that I had the opportunity to be part of this collaborative project. I want to thank Mika and everyone at Tajima Studio, Christy, Greg, J, everyone at Backslash, and Ezer and Johnny from Brooklyn Research for being part of this project. The amazing result was only possible through everyone’s contributions. It was a truly unique educational experience that I will look back on many times.

Collaborator Statement

The Backslash experience is a project that culminates our studies at Cornell. With little to no limitation imposed, we were free to conjure up any idea and create a piece of art that simply used technology. Then came the realization that the hardest part was not solving the problem but defining the problem: what do we want to say the most? After rounds of discussion, we arrived at the idea that the artwork should be an expression of human identities.

In this digital age, it’s no secret how hard it is to avoid exposing one’s identity. You wake up, your location is tracked on your way to work, you eat at restaurants recommended to you during your morning scrolling session, you shop at brands your internet icons shared earlier this week. Technology enables digital profiles, which can bring great convenience and joy to our lives, but which can also be problematic, as we have seen alarming scenarios unfold in front of our eyes.

The ferrofluid installation shares the vision that each of our identities is unique and beautiful, yet ultimately obfuscatory. For the data sources, we chose anonymized data sets that represent a variety of facets of human life. The data was shuffled, clustered, and reconstructed using t-SNE and generative algorithms to create “new” synthetic humans. In the artwork, the “new humans” are manifested by the morphing shapes of the ferrofluid, symbolic of the ever-changing behaviors and experiences through which technology takes part in shaping and defining us.
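
The medium list also mentions contour generation with OpenCV. As a hedged illustration of how morphing outlines might be traced from a simulated density field before being passed to a display or control stage, the sketch below thresholds a field and extracts its contours; the threshold, the input field, and the function name are placeholders, not the installation's actual parameters.

```python
# Illustrative only: trace outlines of "blobs" in a simulated density field.
import cv2
import numpy as np

def field_to_contours(density, threshold=0.5):
    """Threshold a 2D density field in [0, 1] and trace its outer contours."""
    img = (np.clip(density, 0.0, 1.0) * 255).astype(np.uint8)
    _, mask = cv2.threshold(img, int(threshold * 255), 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return contours  # each contour is an (N, 1, 2) array of x/y points
```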