Feature: "Angels of Amsterdam" volumetric film
Interview with Avinash Changa on the genesis of the production, made with Depthkit.
December 23, 2021
“Angels of Amsterdam” is a volumetric film, featured at the Venice Film Festival 2021. Immersed in a truthful recreation of a 17th century café in Amsterdam, the audience shares key episodes in the lives of Maritgen Jans, Juliana, Elsje Christiaens and Pussy Sweet, four fierce angels who were not getting their fair share of the Golden Age’s gold and took their destiny into their own hands. These stories show the power relations between rich and poor, as well as between men and women, in the early days of our capitalist society.
In this interview, Avinash Changa from WeMakeVR in The Netherlands talks about how this project came to life.
What is “Angels of Amsterdam” about?
Avinash Changa — The concept for Angels came from our co-creator, Anna Abrahams, who is a filmmaker and a curator for the Eye Film Museum here in Amsterdam. And within that context, she was exposed to a lot of different VR pieces.
There's this little booklet, “A Rough Guide to 17th Century Amsterdam”. It was written in the 17th century and listed the best places to gamble, the best places to drink, the best brothels. Anna was inspired to turn that into a film. Being inspired by seeing so many VR pieces as a curator, she ended up like, “Oh, I want to explore this VR medium”.
So she started talking to a lot of people but was not able to find a team that understood her vision. The piece features women from different walks of life and deals with inequality between men and women. It deals with violence and racism against immigrant women. It deals with sexual liberation and prostitution. These themes were very prevalent in the 17th century, and these topics are more relevant today than ever.
When Anna started talking to VR makers, a lot of them interpreted this project more as a sensationalist piece, like “We can show nudity”. And that's not what this project is about.
Now, the reason why personally, and as a maker, I wanted to make this project, was because my mom came to the Netherlands in the 1970s as an immigrant woman, and she dealt with a lot of these issues herself. And as a kid, I saw this first hand. I saw domestic abuse. I saw when she became a single mom, trying to feed a family. She was an accomplished teacher, but due to racism, she was not able to get a job in her profession here.
When I met Anna and I heard the concept and I started thinking about my personal background, that became a reason to say, “Okay yes, let's tell this story”.
What was the collaboration like?
To tell a story like this properly, it'll cost. This kind of budget is not available in the Netherlands, even though we can get a lot of help from the film funds. So you need that personal motivation to make this kind of commitment and investment. Anna was really the initiator. She is really talented in finding the right performers and she knows what to do in terms of getting someone in front of the camera. Telling that story in an immersive medium though is my forte. From the get-go, she was open to a very equal way of collaborating.
As a medium, we don't have the rules from over a hundred years of cinematic filmmaking because a lot of those rules don't really apply or translate directly to the immersive medium. So, telling a story like this is finding that balance between where do you make it interactive? What amount of agency do you give to the user? That's definitely a process where you need to have an equal collaboration with your entire development team – and that's something that you generally don't see when filmmakers say “you're a tech guy, just make my concept”, even though the concept might be flawed when you tell it in an immersive medium.
So there was a true connection between what is possible technically and where you want to go creatively?
The four stories in “Angels of Amsterdam” are true stories. These are real people that actually lived through these hardships. For example, the court reports show their documented personal statements. It's all very real. So for me, it was really important that the user feel that true human connection with these characters.
Instead of being in this fancy technological environment that looks impressive, my goal was to get to an experience where the technology was invisible. I did not want the users to be distracted by the tech, but to be immersed in the world. Therefore, to tell this story properly, it was crucial that the environment was going to be as realistic as we could make it with talented 3D modelers and LiDAR scanning. But if you use traditional humanoid CGI characters inside that photorealistic experience, you get a disconnect.
Even if you look at really high-end video games, you still get at best a scanned human model that is then driven by mocap. And if you're lucky, there's been a bit of facial animation and you've got about 50 blend shapes to deal with, and that still feels quite artificial. And the moment that happens, you don't accept a human as a human and you hit that uncanny valley. You get distracted by technology. We started thinking, how can we tell the story in a way that these people feel real? We had a background in really high-end stereoscopic VR pieces. Back in 2013, we invented one of the first professional stereoscopic VR cameras. So for me it was always important that if you do something with a real scene and with people, it feels real.
When we started this project, which was Q4 2019, doing volumetric was not a very common thing, it was still early days. Anna as a filmmaker wanted a cinematic look, and she had seen some of our stereoscopic work. I told Anna I wanted to do this in a volumetric setup to give the user a better connection to these characters. Anna had seen some volumetric pieces as a curator and she wasn't impressed, but I convinced her to let me do an experiment with a volumetric pipeline.
We started thinking about how to use Depthkit Cinema. How can we set up a different post processing pipeline? What can we do on set in terms of lighting to make it look as real as possible? We shot references of the lighting on location. The piece takes place in a real bar that opened in 1642. We measured what the different light intensities were, the textures and the colors there. And then we set up a test with practical lighting, combined with Depthkit Cinema. When that all came together, that was the turning point where Anna saw that if we tell this story in this way, people will not think about the tech. They will just connect with the characters there.
You're doing something really ambitious, which is a period piece that's based on real people and dealing with some pretty significant themes. Did you draw on any particular creative inspiration and aesthetic references?
We definitely started from a blank slate because this is such a unique project and nothing's been done that looked like this.
We've seen various volumetric pieces but not something that achieves the kind of blend that we wanted. We looked at existing volumetric pieces like Carne y Arena, Cosmos Within Us and Vestiges. We also looked at motion capture. We looked at body scans. We looked at mesh carving. Those things were impressive but not good enough. So, from those references, we knew that if we want to hit that next level, let's do a lot of experimentation. Let's do our homework. Let's try a lot of things that other people haven't tried yet. We started researching, and talking to other makers. Our R&D led me to the conclusion that we have to develop a new pipeline and a new way of shooting to get to where we want to go.
Tell me about your technical pipeline to create this never-been-seen before look.
This whole experience takes place in one environment. Normally we would have 3D models built from photo references. But in this case, since the actual place exists, we tried photogrammetry. Photogrammetry is a nice technology, but in terms of the detail in your mesh, you still end up doing a lot of remodeling and retopologizing. So we got in touch with a company that has high-end LiDAR scanners, and we spent two nights trying to capture every millimeter of this bar. I think the environment on its own probably has about five months of work invested in it to get it to that level. In the actual experience you don't see everything we captured: there's an upper floor, there's an area around a corner and there's a basement. All of that was created but didn't make it into the final cut of the piece.
So then we had this environment and we needed to get these characters in there. One of the first things that we did with the original concept was 360 videos. What we learned from that process – shooting a stereo plate, taking it into Unity, and all of the manual steps involved – made it clear we needed to find a different approach.
We wanted to have six degrees of freedom in this piece, so we looked at volumetric video. But the formats that studios supply are too heavy. To make it lighter, we would have to do a lot of re-encoding and find a way to properly play that out, because there were not many decent integrations into a Unity environment. It was also really expensive to shoot in an Intel capture stage. It just wouldn't fit within the financing models that we have here in the Netherlands.
Then Depthkit Cinema was announced and it seemed like something interesting to explore.
What were some of the technical challenges and discoveries?
A big part of making the characters look real was finding the right lights. And we spent a lot of time trying a lot of different lenses. Do you go for 14 mil or an 8 mil? How do we solve for lens distortion? You want to work with certain high-end codecs to retain as much color information as possible, but not all of those codecs can go into the pipeline, so that was tricky.
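The 14 mm versus 8 mm question above is really a field-of-view tradeoff: shorter focal lengths see more of the scene but exaggerate lens distortion. A minimal sketch of the geometry, assuming a hypothetical Super 35-sized sensor (roughly 24.9 mm wide) purely for illustration – the production's actual sensor and lenses are not specified here:

```python
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Horizontal field of view of an ideal rectilinear lens (ignores distortion)."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Hypothetical sensor width of 24.9 mm (Super 35), for illustration only:
print(round(horizontal_fov_deg(14, 24.9), 1))  # ~83 degrees
print(round(horizontal_fov_deg(8, 24.9), 1))   # ~115 degrees
```

Going from 14 mm to 8 mm buys roughly 30 extra degrees of horizontal coverage on such a sensor, which is why the distortion correction mentioned above becomes harder to solve at the wide end.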
Then the next challenge: how do we get this to run properly in Unity? We played around with using FFmpeg in real time as our decoder versus AVPro, and we ended up finding a workflow and a pipeline that gave us a good result. To get to that next level of realism, it's not just about resolution. Frame rate plays a big role. In my ideal world, we would get to 240 or 320 frames per second. This sounds extreme, but in our playout tests something magical happens. I'm really convinced that if, as a next step, we could do 120 fps depth data and 120 fps video data and play it out, your sense of immersion just shoots up. But yeah, just based on time and cost and how far we could push the tech, that was a bridge too far. The average user is impressed with the piece, but as a maker, if I see these imperfections, that frustrates the hell out of me. At some point you kind of have to accept and move on.
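The frame-rate ambition described above carries a steep bandwidth cost, since uncompressed data rate scales linearly with frames per second. A rough back-of-the-envelope sketch – the 2048×2048 resolution and 8-bit RGB format here are assumptions for illustration, not the project's actual capture settings:

```python
def data_rate_gbps(width: int, height: int, bytes_per_pixel: int, fps: int) -> float:
    """Uncompressed video data rate in gigabits per second."""
    return width * height * bytes_per_pixel * fps * 8 / 1e9

# Hypothetical 2048x2048 color stream at 3 bytes per pixel (8-bit RGB):
print(round(data_rate_gbps(2048, 2048, 3, 30), 1))   # 3.0 Gbit/s
print(round(data_rate_gbps(2048, 2048, 3, 120), 1))  # 12.1 Gbit/s
```

Quadrupling the frame rate quadruples the raw throughput, and a synchronized depth stream adds its own bandwidth on top, which is why real-time decoding became a bottleneck worth fighting over.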
Now another thing that we ended up discovering was that you cannot go through this whole process with run-of-the-mill hardware. So we got in touch with Nvidia and HP, and they loaned us a workstation with an A6000 graphics card and 48 GB of RAM. But then we ended up in a bottleneck with distribution, because if you package this all as an executable, you cannot encapsulate all of these binaries, license-wise. We ended up not getting to that stage where every single user can just easily play it on every piece of hardware. We are still changing a number of things to make it user-friendly enough to put it on Steam.
Where is the project now?
We worked on this from Q4 2019, and it was finished in July 2021. It was quite a journey. This was the first project made in the Netherlands that was selected for the Venice Biennale in competition. Then we brought it to the Netherlands Film Festival. As soon as the festival run is completed, we will make it available on Steam as a downloadable project for consumers. That's something we're already working on: a further optimized version, to make it a bit lighter. Or if anyone comes to Amsterdam and gives us a call, they can see it in our studio.