Artificial Intelligence (AI) is a discipline within computer science concerned with creating computer programs that exhibit aspects of intelligence. Research started in the late 1950s and has matured to the point of providing computational foundations for today's web and social media (for example, search engines and data mining). But AI research is about more than practical applications. It has always aspired to tell us something about our own intelligence: how humans are able to see, hear, act, think, learn, and speak.
AI researchers are increasingly using artistic means to give expression to the fascinating ideas and methods coming from their own research, and artists have begun to employ AI technologies to create highly original works. The ARTE@IJCAI exhibition at the Centro Cultural Borges in Buenos Aires shows several influential examples of this trend. It is organised on the occasion of IJCAI, the largest and most respected AI conference in the world, whose 2015 edition takes place in Argentina. This year the interaction of AI and Art is a major theme of the conference.
The exhibition brings together eight works. Some of them are ‘historical’: they played an important role in introducing AI into the arts. Others are shown here for the first time. Works by Jon McCormack (Australia) and Patrick Tresset (France) present robotic drawing machines. These machines do not simply execute drawing commands, as a plotter would, but simulate to some extent the aesthetic decisions that a human artist would make, and explore the role of embodiment in shaping an image. Other works by Karl Sims (US), Leo Nunez (Argentina), and Alexander Mordvintsev, Christopher Olah and Mike Tyka (US) explore the power of human-guided genetic algorithms, cellular automata, and deep learning respectively to create novel visual patterns, and show the working and effect of these computational mechanisms. Alexander Berman and Valencia James (Sweden) use AI to explore new forms of interactivity, presenting a program that re-interprets human dancing movements and translates them into the movements of an avatar. Finally, two renowned artists employ AI techniques to empower their artistic research: Olafur Eliasson (Denmark) explores his engagement with color and human vision through a multi-agent system in which agents invent their own color language, and AnneMarie Maes (Belgium) deepens her artistic studies of collective bee behavior using intelligent signal processing and machine learning.
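To give a flavor of one of the computational mechanisms mentioned above, the sketch below implements an elementary cellular automaton (Rule 110), a textbook example of how very simple local rules can generate surprisingly rich visual patterns. It is a generic illustration of the technique, not the code behind any of the exhibited works.

```python
# Illustrative sketch: an elementary (one-dimensional) cellular automaton.
# Each cell looks at itself and its two neighbours, and the rule number's
# bits encode the next state for each of the 8 possible neighbourhoods.
# This is a generic textbook example, not any artist's actual code.

def step(cells, rule=110):
    """Apply one update of an elementary cellular automaton (wrapping edges)."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(width=64, generations=16, rule=110):
    """Start from a single live cell and return each generation as a text row."""
    cells = [0] * width
    cells[width // 2] = 1
    rows = []
    for _ in range(generations):
        rows.append("".join("#" if c else "." for c in cells))
        cells = step(cells, rule)
    return rows

if __name__ == "__main__":
    for row in run():
        print(row)
```

Printing the rows one below the other produces the characteristic triangular texture of Rule 110; the exhibited works use far richer rules and rendering, but the principle of pattern from simple local interaction is the same.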
ARTE@IJCAI-2015 is on display at the Borges Cultural Center, located in the Galerías Pacífico, Viamonte 525, in downtown Buenos Aires (http://www.ccborges.org.ar). The exhibition is situated in Lounge 22 and opens on Saturday 25th at 10:00 am.
The catalogue of the exhibition is available for download.
The immersive installation The Scaffolded Sound Beehive demonstrates how AI techniques can be used to enrich our experience of the natural world: by augmenting sounds and images with artificially generated structures, and by visualizing the deeper categories that machine learning algorithms can detect in sensor data.
The scaffolded beehive is an immersive multi-media installation which provides viewers with an artistic visual and audio experience of activities in a beehive. The centerpiece of the installation is the top of a Warré beehive, constructed using open-source digital fabrication and mounted on scaffolds. The hive is 2.5 m high so that visitors can put their head inside it and experience a visual and auditory artistic interpretation of hive activity.
An 8-channel sound installation plays continuously inside the hive. It is based on recordings of actual bee and environmental sounds made in the broodnest of an urban beehive installed on the rooftop of the Brussels Urban Bee Laboratory over a complete season. The recordings began on June 21st 2014, the longest day and shortest night of the year; they were processed using sophisticated pattern recognition algorithms and artificial intelligence analysis software, and edited into a 15-minute piece in which swirling electronic sound clusters sonify the rise and fall of swarm activity in the hive.
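As a minimal sketch of how such sonification could work in principle, the code below estimates hive activity from the short-time energy of an audio signal and maps it to the loudness of an overlaid sound layer. The windowing scheme and the linear gain mapping are illustrative assumptions, not the installation's actual processing pipeline.

```python
# Hedged sketch: derive an "activity" control signal from audio and map it
# to gains for an added sound layer. Window size and mapping are assumptions
# made for illustration only.
import math

def short_time_energy(samples, window=1024):
    """Root-mean-square energy per non-overlapping window: a crude activity measure."""
    energies = []
    for start in range(0, len(samples) - window + 1, window):
        frame = samples[start:start + window]
        energies.append(math.sqrt(sum(x * x for x in frame) / window))
    return energies

def activity_to_gain(energies, floor=0.1):
    """Normalize energies to gains in [floor, 1.0] for the overlaid sound clusters."""
    peak = max(energies) or 1.0
    return [floor + (1.0 - floor) * e / peak for e in energies]
```

For example, a signal whose second half is louder than its first yields a rising gain curve, so the overlaid electronic clusters would swell as hive activity increases.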
A video shows 365 days of activity inside a real observation beehive, played back at high speed. The images were recorded with an infrared camera inside the hive and processed using pattern recognition, AI, and computer graphics algorithms, giving a stunning visual experience of a colony in action. A second video shows a graphical rendering of the AI analysis of colony behavior, combining real audio data with measurements of the microclimate inside the hive: temperature, CO2, and humidity.