Negotiating the image with pixels and thread

Echo Chamber: Darn (m,b71a,3295,20,4,27,13,31,59-c)
a collaboration between Susan Yelavich and David Young, 2020

I’d been watching the artist David Young’s work with artificial intelligence and machine learning for a while when it occurred to me that the images he and the AI were generating looked very similar to textiles. Some resembled textiles under a microscope; others were not unlike the coarser stitching I was doing.

Four photos of darns by the author, fed into the AI

I have always knit, but have recently taken up darning using vivid embroidery floss to transform practical mends into (almost) randomly patterned repairs. I also study the pervasive nature of textiles in the built environment. There is something about the co-dependency of slender threads in forming structures from garments to rugs to architecture that I find compelling. I find it especially absorbing to watch how the movements of fibers (via the techniques of making) gradually bring those structures into appearance and into other dimensions. 

Given these preoccupations, I asked David whether he also saw some resemblance or relationship to textiles in his work. He had, and he proposed that we collaborate. He would feed digital images of my embroidered darns into an AI program that trained itself to produce an image of a darn, not unlike the way images of faces are fed into AI/machine-learning programs to create facial recognition software. In our case, David had the machine generate new images from the training data (the digital photos of my darns). He then took the machine-generated images, manipulated them with his own custom code, and printed one that we selected together for me to stitch back into an analog state. The image we chose was printed on paper with a black ground, but since paper would be impossible to embroider, David also printed it on a square of sturdy white canvas.

Digitalized darns

While David thought it would be okay to take some license with the image, I wanted to stitch the AI image as carefully as possible to see how close I could come to depicting its virtual state. So I tried to match each stitch to the lines on the canvas, and to match the colors as well. However, I soon realized there was a discrepancy between the pattern on the black background and the pattern on the white ground. So I began a process of negotiating between the two printed images David had given me, sometimes deferring to the colors indicated on the white ground, but more often to those printed on the black, since the original darns were sewn into black and dark green knitted textiles.

Prints on black and white grounds

When I was done stitching, David pointed out that I was not only creating an echo chamber among the various iterations of the darn; I was also using a process that paralleled the AI’s learning curve by negotiating between two images to produce one. And then there was the factor of light: color choices made while working in the evening varied from those made in sunlight during the day. Neither choice was ‘correct’; each was an option. I realized there was another correspondence between the virtual and the analog: like the machine, which could have kept going and produced other variations on the ‘darn,’ I could have kept stitching and refining the sewn surface. The entire notion of “finished” is incorrect in both cases, something I think is quite important because it leaves the process open and recognizes the larger point that things are never finished, given the infinite possibilities for their perception.

Detail, Echo Chamber

Ultimately, for me, the pleasure of the project is in the experience of bringing the work(s) into a state of becoming. Because of that, the title of the project, “Echo Chamber,” should be understood literally. Echoes are always a bit distorted while remaining related to an originary sound. Likewise, I’d made not a copy but a hybrid of the gestures of all three artists: David’s, the machine’s, and mine. Even so, the ‘hybrid’ is not made of three equal parts. The process of integration wasn’t done on a level playing field. The AI draws on what it learned in a very brief time, while David and I draw on life experiences passed down through time and augmented by our own.
