Imagine yourself walking through a quiet Michigan town one evening. You’re near the local ski resort, but it’s the offseason so the slopes are quiet and empty. All of a sudden, you hear the buzz of hundreds of flying lights taking to the sky and, above you, a gigantic 300-foot face appears to be staring at you and singing.
Sound like a bizarre fever dream? Far from it. You’ve just caught a glimpse of the experimental rock band VWLS’ music video for the single “High in Heaven.” Hobbes, a design and animation studio out of Detroit, partnered with Firefly Drone Shows to produce the video, which features 200 flying drones in the shape of a celestial head lip-syncing the music of VWLS musicians Josh Epstein and Louie Louie.
Epstein and Louie used their iPhones and the free Moves by Maxon app to capture their facial movements for the distinctive video. The Hobbes team brought that facial capture data into Cinema 4D to map out the flight paths for Firefly’s custom-built drones.
We reached out to Hobbes’ Nick Forshee to learn more about the project. Here’s what he had to say.
How did the concept of this piece come about?
Forshee: It’s kinda funny. We were asked to do a drone show for an artificial intelligence conference, and they wanted some sort of representation of artificial intelligence. So that almost ended up being the beta version of this drone face. But the Firefly guys were at the conference too, and they came back saying that the AI face was super cool, but it scared all the kids who were there.
I remember thinking, ‘Oh man, if it had that kind of impact, we should do something with a face.’ We tested it out on a Santa Claus face, and tried to make it kid friendly, but it still scared the kids. Our 300-foot-tall face was based on Josh’s likeness, but shifted into more of a cosmic entity over time.
Was this all handled remotely because of COVID-19?
Forshee: We had already started working remotely with Josh and Louie before COVID-19, so they recorded multiple performances and we got the facial tracking data easily by email. We used pose morphs to blend the performances. Firefly set everything up and adhered to social distancing guidelines for all of that.
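Conceptually, a pose morph blends multiple recorded takes by mixing each take’s vertex offsets from a rest pose with per-take weights. Here is a hypothetical Python sketch of that idea; the names and data are illustrative, not Hobbes’ actual Cinema 4D setup:

```python
# Hypothetical sketch of pose-morph blending: each recorded take is a set
# of vertex positions, and a weight per take mixes them into one pose.
# Illustrative only -- not Hobbes' actual pipeline.

def blend_poses(base, takes, weights):
    """Linearly blend morph-target offsets onto a base mesh.

    base:    list of (x, y, z) rest-pose vertices
    takes:   list of takes, each a list of (x, y, z) captured vertices
    weights: one blend weight per take
    """
    blended = []
    for i, (bx, by, bz) in enumerate(base):
        dx = dy = dz = 0.0
        for take, w in zip(takes, weights):
            tx, ty, tz = take[i]
            # Offsets from the rest pose are what a pose morph stores.
            dx += w * (tx - bx)
            dy += w * (ty - by)
            dz += w * (tz - bz)
        blended.append((bx + dx, by + dy, bz + dz))
    return blended

# Mixing two takes 50/50 on a one-vertex "mesh":
base = [(0.0, 0.0, 0.0)]
take_a = [(1.0, 0.0, 0.0)]
take_b = [(0.0, 1.0, 0.0)]
print(blend_poses(base, [take_a, take_b], [0.5, 0.5]))  # [(0.5, 0.5, 0.0)]
```

With the weights animated over time, the same mechanism lets one performance cross-fade into another.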
Describe your process for animating a face in the sky with drones.
Forshee: The drone flight paths were originally animated by hand in Cinema 4D, like the other performances we’ve done. But that can be a pretty laborious process for a three-minute show. Then we found Moves by Maxon on the app store, and it just connected all the dots. We realized we could do a full performance driving the drones with the app.
We brought the mesh into C4D from Moves by Maxon. It comes with UV coordinates, so we just painted a rough idea of what our design would look like. That gave us a sense of how it would translate, and the design actually worked really well even in its rough form.
Once you bring your mesh into C4D, how do you translate that into points?
Forshee: Plotting the points of the 200 drones was probably the most time-consuming part of the project. It was almost like a giant Lite-Brite, except you only had 200 pegs. Once you commit to a number of points, you have to use that count for the entire show, moving from point to point, so you have to choose your lines wisely.
We chose every line in that face to represent X, Y, and Z depth, so it became this dance of pulling drones away from the outline of the face and adding them to another area, or adding them to the mouth. Once we had plotted the whole thing out, we had a few points leftover, and that’s where we put the teeth. I’m really glad we did that.
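The “200-peg Lite-Brite” constraint can be thought of as a fixed point budget distributed across the chosen lines of the face. A hypothetical sketch, assuming drones are allocated to each polyline in proportion to its length (the function names and allocation rule are illustrative, not Hobbes’ actual method):

```python
# Hypothetical sketch: split a fixed drone budget across face polylines
# in proportion to their length. Illustrative only.
import math

DRONE_BUDGET = 200

def polyline_length(pts):
    """Total length of a polyline given as a list of (x, y) or (x, y, z) points."""
    return sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))

def allocate_drones(polylines, budget=DRONE_BUDGET):
    """Return (drones per polyline, leftover drones).

    Leftovers come from rounding down -- a spare pool like the one
    Hobbes ended up spending on the teeth.
    """
    lengths = [polyline_length(p) for p in polylines]
    total = sum(lengths)
    counts = [int(budget * length / total) for length in lengths]
    spare = budget - sum(counts)
    return counts, spare

# A long outline (length 3) and a short mouth line (length 1):
outline = [(0, 0), (3, 0)]
mouth = [(0, 0), (0, 1)]
print(allocate_drones([outline, mouth]))  # ([150, 50], 0)
```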
What came next, color?
Forshee: Yes. It becomes a game of color and the showmanship of color. We had the full color spectrum available, and just about any intensity. We spent almost a month on color for the drone show. We introduced different lighting patterns as we went along because we had three minutes to keep people's attention. Meanwhile, we were really fighting the urge to turn the face into other objects. You know, if we exploded the head right off the bat, what was going to be next?
So how does all of this work move from Cinema 4D to the drones themselves?
Forshee: The 200 points are basically keyframes in X, Y, and Z at every given moment of an entire show. One trick we use is to work at real-world scale. Those are the units we used inside of Cinema, so it translated really well to the drone data. Moves by Maxon gives you a face that’s about the size of a human head. So all we had to do was bake down the final performance into an Alembic file, which has a really nice option for scaling up. We blew it up 300 times and then plotted the drones to it. We didn’t have to manually move anything on the face until the whole head explodes. The rest of the show was all done by our capture.
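Because the capture is at real-world head scale, the scale-up step is just a uniform multiply of every baked keyframe position. A minimal Python sketch of that idea, assuming per-frame lists of target points (illustrative only, not the actual Alembic workflow):

```python
# Hypothetical sketch of the scale-up step: head-sized capture data is
# multiplied up to show scale before drones are assigned to points.
# Illustrative only -- not the actual Alembic workflow.

SCALE = 300.0  # head-sized capture -> roughly 300-foot face

def scale_keyframes(frames, scale=SCALE):
    """frames: per-frame lists of (x, y, z) drone target points."""
    return [[(x * scale, y * scale, z * scale) for (x, y, z) in frame]
            for frame in frames]

# One frame, one point, at head scale:
frames = [[(0.5, 0.25, 0.0)]]
print(scale_keyframes(frames))  # [[(150.0, 75.0, 0.0)]]
```

Working at real-world units throughout is what makes this a single multiply rather than a re-animation pass.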
What was it like to set everything up on location?
Forshee: It’s a lot of manual labor. Just laying all of the drones out in the field takes half an hour. Each drone downloads the entire flight path so it can communicate where it’s positioned on Earth. Then the drones basically try to make certain waypoints, and they are bizarrely capable of being super accurate.
Fascinating. Do you have any plans to experiment with more facial captures?
Forshee: Yeah. We're always experimenting. I mean, as soon as we were done with this, we were like, ‘Okay, we did it. We're done. Next!’ Samantha Griffith, at Hobbes, had this pretty cool idea of using the flip side of Moves by Maxon, which would be a full body capture. We’re looking into doing a full-figure dance choreography with the drones.
We have all these weird outlandish ideas we want to play with.