The project was devised by robotics Ph.D. student Gerry Chen, in collaboration with Juan-Diego Florez, a fellow graduate student; Frank Dellaert, robotics professor in the School of Interactive Computing; Seth Hutchinson, professor and KUKA Chair for Robotics; and Sang-won Leigh, assistant professor in the School of Industrial Design. The team’s peer-reviewed study of the robot system will be published in the International Conference on Robotics and Automation proceedings in June 2022.
How It Works
For a robot to be able to paint in a human style, both the robot and the art must be designed with the other in mind — at least for now. The GTGraffiti system consists of three stages: artwork capture, robot hardware, and planning and control.
First, the team uses motion capture technology to record human artists painting — a strategy that offers insight into the types of motions required to produce spray-painted artwork. For this study, Chen and the team invited two artists to paint the alphabet in a bubble-letter graffiti style. As each artist painted, the team recorded the motion of the artist’s hand across the canvas, as well as the movement of the spray paint can itself. Capturing both hand and spray-can trajectories is crucial for the robot to paint with layering, composition, and motion similar to a human artist’s.
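To make this concrete, the captured strokes could be stored as timestamped samples of hand and can position, plus whether the nozzle is pressed. This is a hypothetical sketch — the data structures and sample rate are assumptions, not the team's actual capture format:

```python
# Hypothetical storage for motion-captured strokes, assuming the mocap
# system reports positions at a fixed sample rate (e.g., 100 Hz).
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class StrokeSample:
    t: float                              # seconds since capture start
    hand_xy: Tuple[float, float]          # hand position on canvas plane (m)
    can_xyz: Tuple[float, float, float]   # spray-can position in 3D (m)
    spraying: bool                        # whether the nozzle is depressed

@dataclass
class Stroke:
    samples: List[StrokeSample] = field(default_factory=list)

    def duration(self) -> float:
        """Elapsed time of the stroke, from first to last sample."""
        return self.samples[-1].t - self.samples[0].t if self.samples else 0.0

# Example: a short stroke sampled at 100 Hz while the artist sweeps right
stroke = Stroke()
for i in range(5):
    t = i / 100.0
    stroke.samples.append(
        StrokeSample(t, (0.1 + 0.05 * t, 0.2), (0.1, 0.2, 0.15), True)
    )
```

Keeping the timestamps allows later stages to analyze each motion for speed and acceleration, as the article describes.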
The team then processed the data to analyze each motion for speed, acceleration, and size, and used that information for the next stage — designing the robot. Taking these data into account, along with the portability and accuracy the artwork requires, they chose a cable-driven robot. Cable-driven robots, like the Skycams used in sports stadiums for aerial camera shots, are notable for scaling to large sizes. The robot runs on a system of cables, motors, and pulleys. The team’s robot is currently mounted on a 9-by-10-foot steel frame, but Chen says it should be possible to mount it directly onto a flat structure of almost any size, such as the side of a building.
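The geometry behind a cable-driven robot is simple to sketch: with a motorized pulley at each corner of the frame, the cable length needed from each pulley determines where the spray can sits. The frame dimensions and pulley layout below are illustrative assumptions, not the team's actual parameters:

```python
import math

# Illustrative inverse kinematics for a planar four-cable robot.
# Frame size approximates the ~9 x 10 ft frame, converted to meters;
# pulley positions at the corners are an assumption for this sketch.
FRAME_W, FRAME_H = 2.7, 3.0
PULLEYS = [(0.0, 0.0), (FRAME_W, 0.0), (0.0, FRAME_H), (FRAME_W, FRAME_H)]

def cable_lengths(x: float, y: float):
    """Cable length required from each corner pulley to hold the
    end effector (the spray can carriage) at position (x, y)."""
    return [math.hypot(x - px, y - py) for px, py in PULLEYS]

# Holding the can at the frame's center: all four cables are equal.
lengths = cable_lengths(FRAME_W / 2, FRAME_H / 2)
```

Spooling the cables in and out in coordination moves the can anywhere inside the frame — which is why the same design scales to a building-sized mount.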
For the third stage, each motion-captured figure is converted into electrical signals. Taken together, these digitized figures form a library of characters that can be programmed in any size, perspective, and combination to produce words for the robot to paint. A human artist chooses shapes from the library and uses them to compose a piece of art. For this study, the team chose to paint the letters “ATL.”
Once the team chooses a sequence and position of characters, they use mathematical equations to generate trajectories for the robot to follow. These algorithmically produced pathways ensure that the robot paints with the correct speed, location, orientation, and perspective. Finally, the pathways are converted into motor commands to be executed.
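The placement and timing steps above can be sketched as follows. The function names, the toy letter path, and the constant-speed assumption are all hypothetical simplifications of the team's actual trajectory equations:

```python
import math

# Sketch: place a stored letter at a chosen size and position, then
# assign timestamps so the robot paints at a target constant speed.
def place_letter(points, scale, offset):
    """Scale and translate a unit-sized letter path (list of (x, y))."""
    ox, oy = offset
    return [(x * scale + ox, y * scale + oy) for x, y in points]

def time_parameterize(points, speed):
    """Attach a timestamp to each waypoint, assuming constant speed
    along the path (speed in meters per second)."""
    t, out = 0.0, [(0.0, points[0])]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        t += math.hypot(x1 - x0, y1 - y0) / speed
        out.append((t, (x1, y1)))
    return out

# Toy unit-size "A"-like path; a real library entry would be far denser.
letter_A = [(0.0, 0.0), (0.5, 1.0), (1.0, 0.0)]
traj = time_parameterize(place_letter(letter_A, 0.5, (0.2, 0.2)), speed=0.25)
```

The timed waypoints would then be handed off to the controller, which converts them into motor commands.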
Because each cable is driven by its own motor, the motors could pull against one another, threatening to rip the robot apart. To prevent this, the central robot controller is programmed to recalculate motor commands 1,000 times per second so that the robot can function safely and reliably. Once assembled, the robot can then paint an artwork in the style of a human graffiti artist.
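A fixed-rate control loop of this kind can be sketched minimally. The proportional law and gain below are assumptions for illustration; the team's real controller also has to balance cable tensions so the motors never fight each other:

```python
# Minimal sketch of a 1 kHz control loop: recompute every motor's
# command each tick from the current cable-length error.
DT = 0.001   # 1,000 updates per second
K_P = 5.0    # illustrative proportional gain (assumption)

def motor_commands(current_lengths, target_lengths):
    """Velocity command for each motor, proportional to length error."""
    return [K_P * (t - c) for c, t in zip(current_lengths, target_lengths)]

def simulate(current, target, steps):
    """Integrate the commanded velocities over `steps` control ticks."""
    for _ in range(steps):
        cmds = motor_commands(current, target)
        current = [c + v * DT for c, v in zip(current, cmds)]
    return current

# One second of control: cable lengths converge toward their targets.
final = simulate([1.0, 1.0, 1.0, 1.0], [1.2, 0.9, 1.1, 1.0], steps=1000)
```

Recomputing all commands together at every tick is what keeps the cables coordinated rather than in conflict.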
Why Art? Why Graffiti?
Some of the most typical industries for robotics applications include manufacturing, biomedicine, automobiles, agriculture, and the military. But the arts, it turns out, can showcase robotics in an especially powerful way.
“The arts, especially painting or dancing, exemplify some of the most complex and nuanced motions humans can make,” Chen said. “So if we want to create robots that can do the highly technical things that humans do, then creating robots that can dance or paint are great goals to shoot for. These are the types of skills that demonstrate the extraordinary capabilities of robots and can also be applied to a variety of other applications.”
On a personal level, Chen is motivated by his hope for people to perceive robots as being helpful to humanity, rather than seeing them as job-stealers or entities that cause feelings of fear, sadness, or doom as often depicted in film.
“Graffiti is an art form that is inherently meant to be seen by the masses,” Chen said. “In that respect, I feel hopeful that we can use graffiti to communicate this idea — that robots working together with humans can make positive contributions to society.”
Future Directions
At present, the team’s plans for the robot center on two main thrusts: preserving and amplifying art. To this end, they are experimenting with reproducing pre-recorded shapes at different scales and testing the robot’s ability to paint larger surfaces. These abilities would enable the robot to paint scaled-up versions of original works in different geographical locations, and on behalf of artists physically unable to engage in onsite spray painting. In theory, an artist could paint an artwork in one part of the world, and a GTGraffiti bot could execute that artwork in another place.
In the future, Chen hopes to use GTGraffiti to capture artists painting graffiti in the wild. With the captured motion data, GTGraffiti would be able to reproduce the artwork were it ever painted over or destroyed.
“The robot is not generating the art itself, but rather working together with the human artist to enable them to achieve more than they could without the robot,” Chen said.
Chen envisions that the robot system will eventually have capabilities that allow for real-time artist-robot interaction. He hopes to develop the technology that could enable an artist standing at the foot of a building to spray paint graffiti in a small space while the cable-driven robot copies the painting with giant strokes on the side of the building, for example.
“We hope that our research can help artists compose artwork that, executed by a superhuman robot, communicates messages more powerfully than any piece they could have physically painted themselves,” said Chen.
GTGraffiti is funded by a National Science Foundation grant that supports research involving human-robot collaboration in artistic endeavors.