Ode to Joi
Model: Nathaly Blue
Tools used: Blender, Photoshop
Created: September, 2019
Two years ago I started to learn 3D modelling... and I still can't do that. ;)
In the meantime, I did however manage to create the above image, and I am actually quite proud of the result!
Here's a little behind the scenes.
The backstory
In March 2018, Nathaly Blue and I got the idea to create an homage to Blade Runner 2049: a new version of a particular scene in that movie. Cool! We sketched out what we needed, I figured out the color scheme and the light setup, and we had a great shoot!
Mind you - the "figured out the color scheme" bit is not completely trivial. A 3D/photomontage combination requires planning. What light source are you aiming for? Where does it come from? What shape does it have? What colors? For this particular scene we wanted to create a larger-than-life version of a hologram. And since a hologram emits light (in my universe), we could get away with a lot of the nasty details... I could light it with whatever, as it would then become its own light source in the 3D world!
Cool, easy, what could go wrong? And with that thought in mind, we picked a color scheme that suited the Blade Runner feel the most.
Next up: building the scene. Initially I thought I'd do this in 3D, but after a few attempts I had to concede I wasn't ready for that yet. So, back to Photoshop. That worked. We got this!
Ode to Joi - as posted a year ago.
Then came Blender 2.8
But the original idea still haunted me. This wasn't the result I wanted to get.
So I set out to learn Blender - a free, open-source modelling and rendering program. Free is awesome, but like any new toolset it comes with a learning curve before you can use it. So I had to learn. A lot. In my previous blog post (from 2017...) you can read about my first steps. Back then, my old computer combined with the not-so-realtime Cycles rendering engine prevented me from truly understanding what I was building, and I spent more time waiting than building. That didn't work for me, and I stopped.
Then came Blender 2.8! Wow! Blender 2.8 is a big milestone release with a new UI and a much faster rendering engine, and it's actually... easy! It's amazingly fast and (for me) intuitive to use. Armed with these tools and a sponsored free cyberpunk kit from kitbash3d, I went to work.
In earlier works I post-processed (read: Photoshop) the model on top of a 3D generated scene, but I couldn't easily do that here. The hologram would emit light, and that would cause all sorts of reflections. It needed to be in the scene. As a test, I placed cut-outs of both model images as plane meshes in my 3D scene and set them up as emission shaders. That worked wonders! Reflections suddenly started appearing correct-ish, and I got the first feel that this image was going somewhere! In the end, the only thing left was building the bridge itself. And to be honest, that is the part I could still improve on the most - in its current form it's still a bit... simple.
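For anyone curious, the emission-plane trick boils down to something like the sketch below in Blender 2.8's Python API. I built mine by hand in the node editor rather than with a script, so treat this as a rough illustration only - the file path, object names, and emission strength are placeholders, and the Transparent BSDF mix is just there so the background of the cut-out disappears.

```python
import bpy

# Load the cut-out of the model (a PNG with alpha) -- the path is a placeholder.
img = bpy.data.images.load("//cutouts/model_pose.png")

# A simple plane mesh will carry the cut-out inside the scene.
bpy.ops.mesh.primitive_plane_add(size=2, location=(0.0, 0.0, 5.0))
plane = bpy.context.active_object
plane.name = "Hologram"

# Build a node material: the image color drives an Emission shader, so the
# cut-out actually lights the scene instead of just being pasted into it.
mat = bpy.data.materials.new(name="HologramEmission")
mat.use_nodes = True
mat.blend_method = 'BLEND'  # let Eevee render the transparent background
nodes, links = mat.node_tree.nodes, mat.node_tree.links
nodes.clear()

tex = nodes.new("ShaderNodeTexImage")
tex.image = img

emit = nodes.new("ShaderNodeEmission")
emit.inputs["Strength"].default_value = 4.0  # placeholder value; tune to taste

transparent = nodes.new("ShaderNodeBsdfTransparent")
mix = nodes.new("ShaderNodeMixShader")
out = nodes.new("ShaderNodeOutputMaterial")

# The image alpha decides between transparent (background) and emissive (the model).
links.new(tex.outputs["Color"], emit.inputs["Color"])
links.new(tex.outputs["Alpha"], mix.inputs["Fac"])
links.new(transparent.outputs["BSDF"], mix.inputs[1])
links.new(emit.outputs["Emission"], mix.inputs[2])
links.new(mix.outputs["Shader"], out.inputs["Surface"])

plane.data.materials.append(mat)
```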
During this process, the new Eevee rendering engine really allowed me to work fast. Real-time visualization of results that are almost good enough - wow! To compare, here's an early render in both:
Not bad, right? Both could be useful in a pinch. To compare:
- Left: Eevee. The new engine in Blender 2.8. Render time - practically instantaneous. What you see is what you get. Bonuses: volumetrics and post effects (such as bloom) come basically for free. Problem areas: reflections aren't as nice, and there are bugs. Pay attention to the building on the left: the reflection of the model isn't in there, and there's something really weird going on with the windows.
- Right: Cycles. Render time - ouch. The final render I made took an hour to calculate at 300 samples. That's still perfectly okay if you know this is the final final, but if it isn't, it means a lot of try-out-and-see loops. Longer render times mean more samples, which means less "grain" and fewer dead pixels. It's a trade-off. However, you can't beat Cycles when it comes to good reflections. The windows in the building on the left show what I want them to show, and the reflection on the road is much more accurate.
So what do I use? Both. Eevee is absolutely perfect for seeing where I'm at while building. When it's done, I set Cycles up for a longer render and fire up Photoshop afterwards.
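Switching between the two engines is a single setting, so this back-and-forth costs nothing. Scripted, the workflow would look roughly like this - a sketch for Blender 2.8, where the 300 samples are simply the value I happened to use, not a recommendation:

```python
import bpy

scene = bpy.context.scene

def preview_with_eevee():
    """Fast look-dev pass: near-instant feedback, bloom and volumetrics for cheap."""
    scene.render.engine = 'BLENDER_EEVEE'
    scene.eevee.use_bloom = True              # the "post effects for free" from above
    scene.eevee.use_volumetric_lights = True  # let lights contribute to volumetrics

def final_with_cycles(samples=300):
    """Final pass: slow, but the reflections come out right."""
    scene.render.engine = 'CYCLES'
    scene.cycles.samples = samples            # ~1 hour at 300 samples on my machine
    bpy.ops.render.render(write_still=True)   # save the finished frame to disk
```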
I took the above Cycles result and did a first rough Photoshop pass to see what I'd need to do to get to the actual image. I quickly added some atmosphere and rain to see if the end result looked like what I wanted.
Nope! Not yet. For one thing, the road didn't have splatters of water on it from the fake rain, and second, it's all a bit far away, isn't it? So I changed both in 3D, and then re-did all the Photoshop work. And that's how the final image came to be:
Thank you for reading!
But wait, where are the behind-the-scenes shots?
Okay, here they are!
Here's the Blender 2.8 UI working on the final image. The view on the left shows me roughly what I am doing, though without the fully rendered lights. Blender allows me to quickly grab-and-drag stuff I want to have elsewhere, so this view helps me out a lot. The view on the right gives me a quick overview of how everything is organized spatially.
Here's the final result again, without the Photoshop layers (no rain) and taken from a different perspective. Because the models are already in the scene, I could do rapid recompositions to test out various viewpoints.
And here is the same scene - rendered in Eevee for speed. Here the perspective really starts to break. The whole image only works when it's seen from the perspective I intended it for... anything else makes it look amazingly fake and empty.