2013-03-26

Co.Labs

Printing On Food With OpenFrameworks And Arduinos

Three-dimensional printing promises to let people customize virtually anything. At last, a technology for unabashed ego stroking! But to create your own shoes, furniture, jewelry, or food, you first have to solve a basic technical problem: how to convert a digital image into marks on real-world material.

For a double shot of an ego boost, the builders of Barista Bot—a collaboration between Hypersonic, Rock Paper Robot, and creative coders Jamie Zigelbaum and Kyle McDonald—built a robot that turned webcam images of customers into drawings printed in the foam of a latte. Like many 3-D printing projects, Zigelbaum says, “[it] appeals to the narcissistic side in all of us.” But it appeals to the hacker side, too: You’ll need to master openFrameworks before you start wiring up Arduinos.

Building The Picasso Of Steamed Milk

Barista Bot was hacked together from a 3-D printer controlling a medical syringe pump attached to the end of a large robotic arm powered by four stepper motors. It runs open-source software Kyle McDonald wrote for an earlier art project, “Blind Self Portrait,” which generated contour drawings of people’s faces using computer-vision algorithms.

“Translating a face into something you can draw isn’t easy. A printer does it in a very brute force, systematic way with brightness values. It’ll be like—this pixel is this brightness, that pixel is that brightness, and it’ll print every pixel so you get photographic representation. But what Kyle wanted to do was to make a drawing, not a print, and to make a drawing is hard. To take a bitmap image that you get from your camera and turn that into a line drawing that you might see from somebody writing in ink is a real challenge. Kyle did that in ‘Blind Self Portrait,’ which put out a really nice set of lines that an XY plotter with a pen attached to it could draw onto a piece of paper,” explains Zigelbaum.
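
To make the distinction concrete, here is a minimal sketch of the bitmap-to-line-drawing step, written in openFrameworks with the ofxCv addon (which McDonald also authored). It illustrates the general technique, edge detection followed by contour tracing, rather than the actual “Blind Self Portrait” code; the camera size and Canny thresholds here are arbitrary.

    #include "ofMain.h"
    #include "ofxCv.h"

    class ofApp : public ofBaseApp {
    public:
        ofVideoGrabber cam;
        ofImage gray, edges;
        ofxCv::ContourFinder contourFinder;

        void setup() {
            cam.setup(640, 480);
        }

        void update() {
            cam.update();
            if (cam.isFrameNew()) {
                // Collapse the color bitmap to brightness values...
                ofxCv::convertColor(cam, gray, CV_RGB2GRAY);
                // ...then keep only the edges: the "lines" a person might ink.
                ofxCv::Canny(gray, edges, 70, 140);
                // Trace the edge pixels into ordered paths a plotter can follow.
                contourFinder.findContours(edges);
            }
        }

        void draw() {
            ofBackground(255);
            ofSetColor(0);
            for (int i = 0; i < (int) contourFinder.size(); i++) {
                contourFinder.getPolyline(i).draw();
            }
        }
    };

    int main() {
        ofSetupOpenGL(640, 480, OF_WINDOW);
        ofRunApp(new ofApp());
    }

Each ofPolyline that comes out is the kind of clean, drawable path the quote describes handing off to an XY plotter.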

One of the biggest challenges the team faced was overall software integration: translating the paths generated by McDonald’s software into machine instructions that could produce accurate, repeatable prints on coffee.

Fine-Tuning The Draftsmanship

In “Blind Self Portrait,” McDonald had used a hacked MakerBot controlled by a Python script that interpreted positioning coordinates sent over Open Sound Control. “We thought at first that we could use that software to control our robot, but we couldn’t; we had to simplify that process,” explains Zigelbaum, whose task it was to write the new software for Barista Bot. “Now everything happens inside openFrameworks. [We] take the paths from inside oF and write stepper control code to control the four stepper motors and make sure that the ink is delivered properly.”
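
The team hasn’t published Barista Bot’s source, but the path-to-steps handoff Zigelbaum describes might look something like the sketch below: walk each drawing path, convert its vertices into motor positions, and stream one-byte step commands to the microcontroller over serial. The byte protocol, the scaling factor, and the two-axis simplification (the real arm used four steppers) are all assumptions for illustration.

    #include "ofMain.h"
    #include <cmath>

    // Hypothetical one-byte protocol: 'A'/'a' steps axis X forward/back,
    // 'B'/'b' steps axis Y forward/back. A matching firmware sketch
    // appears below.
    void sendPath(ofSerial& serial, const ofPolyline& path, float stepsPerUnit) {
        long x = 0, y = 0; // current toolhead position, in steps
        for (const auto& p : path.getVertices()) {
            long tx = std::lround(p.x * stepsPerUnit);
            long ty = std::lround(p.y * stepsPerUnit);
            // Interleave X and Y steps so the toolhead approximates a
            // straight line between vertices instead of an L-shaped move.
            while (x != tx || y != ty) {
                if (x != tx) { serial.writeByte(x < tx ? 'A' : 'a'); x += (x < tx) ? 1 : -1; }
                if (y != ty) { serial.writeByte(y < ty ? 'B' : 'b'); y += (y < ty) ? 1 : -1; }
            }
        }
    }

    // Usage, assuming the board enumerates as serial device 0:
    //   ofSerial serial;
    //   serial.setup(0, 115200);
    //   sendPath(serial, contourFinder.getPolyline(0), 10.0f);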

The system architecture they came up with was built entirely in openFrameworks, driving an Arduino that handled the motor controls. “We wanted to avoid multiple scripts calling to the serial ports at the same time, which would clog things up, so we thought it would be best to integrate everything in openFrameworks directly,” says Bill Washabaugh of Hypersonic, who was responsible for the robot design and construction. “We were running four stepper motors, so calling all of them at microsecond speeds via openFrameworks was a trick, making sure we weren’t clipping any step calls, etc. We also spent a long time trying to fix drifts in the image (where the image looked like one of those sketches people do when they don’t look at the page as they draw).”
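
On the microcontroller side, firmware for the hypothetical byte protocol above could be as simple as the following Arduino sketch, extended to four motors. The pin assignments and pulse widths are invented; the relevant detail is the timing: if a step pulse is shorter than the driver can register, that step call gets “clipped” and the drawing drifts.

    const int STEP_PINS[4] = {2, 4, 6, 8};
    const int DIR_PINS[4]  = {3, 5, 7, 9};

    void setup() {
      Serial.begin(115200);
      for (int i = 0; i < 4; i++) {
        pinMode(STEP_PINS[i], OUTPUT);
        pinMode(DIR_PINS[i], OUTPUT);
      }
    }

    // Send one step pulse to motor m. The delays hold the pulse long
    // enough for the driver to register it; shave them too low and
    // steps silently get lost.
    void stepMotor(int m, bool forward) {
      digitalWrite(DIR_PINS[m], forward ? HIGH : LOW);
      digitalWrite(STEP_PINS[m], HIGH);
      delayMicroseconds(10);
      digitalWrite(STEP_PINS[m], LOW);
      delayMicroseconds(10);
    }

    void loop() {
      if (Serial.available() > 0) {
        char c = Serial.read();
        if (c >= 'A' && c <= 'D') stepMotor(c - 'A', true);   // forward
        if (c >= 'a' && c <= 'd') stepMotor(c - 'a', false);  // backward
      }
    }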

Zigelbaum, who typically creates interactive lighting installations and displays, had never written software to drive steppers and control robot arms before and, given the project’s short timeline, had to learn on the fly. “I made some design decisions about how the code should be that I later realized were not good ones and had to rewrite it a couple of times. Then, even after that, after getting all the software right and working really well, the steppers themselves were a little tricky. We realized we had to change the way we were driving them, and that wasn’t in the data sheet; there was no information on it. It was just by trial and error and accident. We had to change the waveforms until they were happy with it.”
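
The article doesn’t say which drive scheme finally made the motors happy, but “changing the waveforms” typically means changing the coil-energizing sequence. For illustration only, here is a generic half-step sequence for a bipolar stepper driven through dual H-bridges; the wiring and timing are assumptions, not Barista Bot’s.

    // Half-stepping alternates single-coil and dual-coil states, doubling
    // resolution and often smoothing resonance compared to full stepping.
    const int COIL_PINS[4] = {10, 11, 12, 13}; // A+, A-, B+, B- (hypothetical wiring)

    const byte HALF_STEP[8][4] = {
      {1, 0, 0, 0},
      {1, 0, 1, 0},
      {0, 0, 1, 0},
      {0, 1, 1, 0},
      {0, 1, 0, 0},
      {0, 1, 0, 1},
      {0, 0, 0, 1},
      {1, 0, 0, 1},
    };

    void setup() {
      for (int i = 0; i < 4; i++) pinMode(COIL_PINS[i], OUTPUT);
    }

    void loop() {
      // Walk the sequence; reversing the order reverses the motor.
      for (int s = 0; s < 8; s++) {
        for (int i = 0; i < 4; i++) digitalWrite(COIL_PINS[i], HALF_STEP[s][i]);
        delayMicroseconds(1500); // step rate: tune for speed vs. torque
      }
    }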

Finding The Right Ink And Canvas

Getting the software right was only part of the challenge. Drawing on a hot, unstable surface of foamy steamed milk, which evaporated pretty much on contact, made it all much more difficult. “We settled on using a syringe pump early on, because we knew they are incredibly accurate in terms of pumping rate. So, we reverse-engineered a medical syringe pump and integrated it into the bot,” explains Washabaugh. “But then the challenge was what ‘ink’ to use, how to get good foamed milk that held bubbles but didn’t bleed with the ‘ink,’ etc. In the end, we had two of our team members spend an entire day with every type of milk, thickener, thinner, ink, etc. Even the barometric pressure will affect the foam bubbles, so these are all challenges in getting the image to look good. Our ‘ink’ was cold-brewed condensed coffee, and our foamed milk was warm foamed half-and-half with a bit of heavy cream.”

While it’s tempting to dismiss the Barista Bot as silly, there’s value in this sort of experimentation. The team’s hacker approach helps introduce new and unusual applications for rapid prototyping, opening up new possibilities for the emerging technology.

“One of the big goals of my work is to inspire people with elements of the unexpected. At the same time, I'm hoping we can inspire people to think of how to use tools in new ways. I hope people can look at challenges around them and think, how can I hack what I have in front of me to fix that? Those challenges may be really practical, or they may be totally goofy (as in this case). Either way, I love the challenge and the creativity it inspires,” says Washabaugh.



