Who is watching? Carnegie Museum exhibit explores the dark side of AI
In 1984, Rockwell sang a song called “Somebody’s Watching Me,” a paranoid anthem built around the refrain, “I always feel like somebody’s watching me.”
Maybe it was just a timely shout-out to George Orwell’s dystopian novel “1984,” in which someone was always watching – or maybe Rockwell was really on to something.
Photography has actually been used since the late 19th century to keep an eye on people and keep them in line, according to the brochure for “Trevor Paglen: Opposing Geometries,” a new exhibit at the Carnegie Museum of Art in the Oakland section of Pittsburgh.
In this exhibit, Paglen examines how images are weaponized against humans and the environment through the use of artificial intelligence. The US-born artist, who divides his time between New York City and Berlin, Germany, is known for his work on the problems of mass surveillance and data collection – and the dangers inherent in them.
“When we first started thinking about this type of exhibition, his was the first name that came to mind,” said Dan Leers, the Carnegie’s curator of photography. “He has been thinking about Big Data longer than anyone.”
When visitors enter the lobby of the museum’s Forbes Avenue entrance, they are greeted by “CLOUD #902,” a site-specific, 16-foot-by-32-foot photographic rendering of a cloud to which an algorithm trained to identify circles has been applied.
The seemingly random placement of the circles “illustrates the rigidity of the way machines see the world,” Leers said. The title of the piece refers both to the cloud as a natural phenomenon and to the nebulous databases to which we entrust our most important information.
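Paglen’s actual software is not described in the brochure, but the idea of a machine hunting for circles in a photograph can be sketched with a classic technique: a Hough-style vote, in which every lit pixel votes for each candidate circle center it could belong to. The toy image, sizes, and function names below are all invented for illustration, and the approach is a deliberately brute-force simplification:

```python
import math

SIZE = 21        # toy "photograph" is SIZE x SIZE pixels
RADIUS = 6       # the only circle radius this rigid machine looks for
CENTER = (10, 10)

# Synthetic image: the set of on-pixels forming a rough circle.
image = {
    (round(CENTER[0] + RADIUS * math.cos(2 * math.pi * t / 100)),
     round(CENTER[1] + RADIUS * math.sin(2 * math.pi * t / 100)))
    for t in range(100)
}

def hough_circles(points, radius, size):
    """Brute-force Hough vote: every on-pixel votes for each candidate
    center that would place it on a circle of the given radius; the
    center with the most votes wins."""
    votes = {}
    for (x, y) in points:
        for cx in range(size):
            for cy in range(size):
                if abs(math.hypot(x - cx, y - cy) - radius) < 0.7:
                    votes[(cx, cy)] = votes.get((cx, cy), 0) + 1
    return max(votes, key=votes.get)

best = hough_circles(image, RADIUS, SIZE)
print("detected center:", best)
```

The rigidity Leers describes is visible in the sketch itself: the detector can only ever answer the question it was built to ask – where is a circle of this radius? – no matter what the image actually shows.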
Landscapes and faces
Upstairs, Gallery 1 hosts a series of Paglen’s photographs of landscapes and human faces – some making their debut at the museum – that also explore how machines interpret the world using algorithms, Leers said.
Landscape photos of the American West draw attention to how the use of AI and automated imaging can accelerate the exploitation of natural resources. The portraits were originally used by private institutes and the United States government in early facial recognition research dating back to the mid-20th century – with or without the subjects’ knowledge.
“It raises questions about how this still happens today,” Leers said. “Faces were needed to train these early computers. These images raise awareness of the way faces are used today, the issue of convenience versus what we sacrifice.”
To access the final installation of the Paglen exhibition, visitors must walk through the galleries to Gallery 16, which is empty except for a pedestal containing a small sculpture titled “Autonomy Cube”.
Courtesy of the Carnegie Museum of Art
A visitor interacts with “Autonomy Cube,” a sculpture that provides a Wi-Fi hotspot, part of “Trevor Paglen: Opposing Geometries,” on view through March 14 at the Carnegie Museum of Art in Pittsburgh.
The sculpture actually functions as a Wi-Fi hotspot through which visitors can connect to the open-source Tor network, where their digital movements cannot be tracked and their data cannot be collected. The plexiglass box surrounding it is “a metaphor for the transparency that the artist seeks to bring to the operations of the machine,” according to the exhibition brochure.
“This is a first for the museum,” Leers said. “It was important to create a secure digital space, to give visitors a space where they are not subject to surveillance.”
The exhibit, which runs until March 14, has been purposefully set up in separate spaces “to take visitors on a tour of the entire museum,” Leers said. It was organized by Leers with Taylor Fisch, project curatorial assistant, as part of the museum’s Hillman Photography Initiative.
Special programming associated with “Opposing Geometries” includes:
• Online Conversation: Trevor Paglen and Dan Leers, 7:30-8:30 p.m. September 17 – A virtual tour of the exhibition.
• Algorithms and Social Spaces Workshop, noon-1:30 p.m. October 14 – This workshop shows how algorithms used in social spaces can exclude or prioritize certain people by giving participants the opportunity to create their own social algorithms and reflect on their impacts.
• AI and Speculative Fiction: Online Workshop, noon-1:30 p.m. October 28 – Attendees will take a crash course in AI basics, then collaborate on a speculative story describing a fair and inspiring future for technology.
• Machines That Learn: Online Workshop, noon-1:30 p.m. November 11 – The session will cover the concepts and functions of machine learning in AI, examining technology that tries to predict what we want based on our previous decisions and preferences.
Participation in these programs is free; for registration information, visit cmoa.org.
A series of podcasts will be released over successive weeks starting in October, with each episode highlighting a different facet of the conversation around artificial intelligence, from biometrics to racial bias to navigating a post-COVID-19 world.