Object motion blur in Blender 2.8 EEVEE



This tutorial is aimed at Blender users who are already familiar with the interface and are at least testing the latest 2.8 releases.
At the time of writing, the EEVEE engine for –almost– real-time rendering doesn't really support motion blur. There is a checkbox for it under the render settings, but it only works for camera movement, which makes it mostly useless for professional animation work.
EEVEE has many advantages: even though it's still in beta, it's amazingly fast, it can deliver very effective renders for standard 3D animation, stills and motion graphics, and its real-time feedback makes for an ultra-efficient workflow.
Object motion blur is a much requested feature, so we all hope it'll arrive at one point or another (2.81? 2.82?).
In the meantime we can take advantage of this workaround to get the best of both worlds.

In its simplest form it's very easy:
  1. You render the animation at 10 to 20 times the fps of the final render (meaning 240 to 480 fps for a 24 fps target)
  2. You blend every 10 to 20 frames together to get a nicely cinematic blur at 24 fps
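For example: a 15-second animation at 24 fps is 360 final frames; with a ×20 multiplier you render 360 × 20 = 7200 frames at 480 fps, then average each run of 20 consecutive frames down to one blurred frame.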
To do that efficiently, and fast, the best methods I found involve rendering in Blender and crunching the 480 fps sequence using other software.
The "other software" part can be solved using either After Effects –if you have the software and closed source is not a problem– or, if you don't fear a little command line/terminal/Cygwin/whatever, using a few lines of copy-pastable code and two open source tools: ffmpeg and ImageMagick convert.
I tried to use the video compositor inside Blender, but I couldn't manage to make it work properly. If someone can, please let me know.

Now on to the details

For the animation part we do everything as usual: if the final work is set at, for example, 24 fps and spans 15 seconds, we set that up as usual and animate as we always do. We can make test renders and everything with our usual workflow.
When we are ready for the motion-blurred render we have to make some tweaks.

We have to decide the amount of frame upsampling that we want to use. In my tests 10 to 20 times –rendering 10 to 20 frames for each final frame– is a good compromise between quality and speed. The faster the movement in the animation, the more frames we will need for the blur to look good without banding/strobing artifacts. If we have slower motion then 10 frames may be OK, or even fewer. Let's go with 20 for the example.
1 • Multiply the Frame End and Frame Rate fields by 20. Fortunately all numeric fields in Blender support math, so we can simply append *20 to the "Frame End" and "Frame Rate" fields and Blender will do the math, reducing the margin of error.
To edit the Frame Rate we must first select Custom from the drop-down.
So we end up with both Frame End and Frame Rate multiplied by 20 (in our example: Frame End goes from 360 to 7200, Frame Rate from 24 to 480).

2 • Open the little Time Remapping options and set them to Old = 1 and New = 20. This stretches every original frame into 20 new ones, so at 20× the frame rate the animation still plays at its original real-time speed.

3 • Set the output File Format to OpenEXR, and the output path to an empty folder –to make the second part of the process easier, foolproof and more scriptable–. I also leave the filename empty, just the folder, so that Blender outputs the files in the "0001.exr" format, for easier scripting.
(OpenEXR makes the blurring of the HDR data behave as expected.)
4 • Last but not least: do NOT enable Motion Blur in the Render settings. We will take care of the blur ourselves, so we need nice sharp frames without partial blurs from camera movement.

So with everything tweaked this way, hit Ctrl+F12 and wait patiently for the folder to populate with the image sequence.

Crunchy

Now, with a folder full of 7200 fat and tasty .EXR images taking up a lot of space, we are ready for the second part of the trick.
I said that we have two methods for crunching the sequence down to 24p again: one involving After Effects –easier, but slower and more cumbersome in the long run– and another using command line utilities.

You don't know the power of the dark side...

If you have AE available you just have to:
  1. launch it,
  2. import the EXR sequence into the project,
  3. drop it into a new Comp with the final frame rate settings (24p),
  4. make sure the project color depth is set to 32 bits per channel (float).

Then right-click the image sequence and select Time > Time Stretch...

And set the stretch to 5% (for 20 frames per final frame), 10% (for 10), or simply 100 divided by the same factor you multiplied the frame rate by inside Blender.
Now, to make the final magic work, you have to enable frame blending both for the comp AND the EXR layer, making sure to select the weighted blending (the two-films icon) and not the motion-interpolated version.

Now you are ready to render the motion-blurred version of your animation.
It should render very fast...

But if you happen to not have AE, got kicked out of the CC club because you lost your credit card at a boat party, or just happen to enjoy the command line like crazy (I do), you're in for a treat with the second method.

The command line version

I love the *NIX command line.
I'm not a programmer. I'm not a developer. I have no specialized background other than the arts, and I suck at math; I just work as a graphic artist/designer. But after the first shock I learned to love using the old terminal. It's simple (in its own way). It's powerful –a lot–. It has kept the same interface over the years (decades), and it's amazingly, blazingly, awesomely fast and efficient. You can solve tasks that look very daunting with one line of carefully crafted commands... So, if you don't use it, give it a try. It simply works.
Now, to the point.
We will use two very common utilities: 
  • ffmpeg
  • ImageMagick convert
If you are on Linux they may already be installed.
If you are on Mac/OS X they can be easily installed using Homebrew or MacPorts (I use Homebrew, http://brew.sh).
If you are on Windows I guess they can be installed under Cygwin or something like that. I don't use Windows so I can't help you with that.
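For reference, the usual install commands (assuming Homebrew on macOS or a Debian-style package manager on Linux; package names can vary by distro):

brew install ffmpeg imagemagick
sudo apt-get install ffmpeg imagemagick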

For simplicity I packed all the commands inside a simple-to-use function.
So there will be two lines of code: 
  • the function definition, which you need to execute just once per session (or store in your .profile or .zshrc or whatever you use, for eternal reuse)
  • the call to that function, which processes the video itself.
[disclaimer: as I said earlier, I'm by no means a *NIX genius, barely a user with some scripting skills, so if you are that genius please forgive any mistake I may have made, and feel free to correct me. Just be polite.]
[disclaimer 2: I made this by hand, man pages and Google under OS X and the zsh shell –which I love–. I think all of this should work OK under bash –tested– and/or most Linux distros –not tested–. Can't tell about Cygwin or the newer Linux thingy that comes bundled with recent Windows (WSL).]
So:
5 • Open your terminal.

6 • Go to the folder where you have the EXRs. To do that, type:
cd
and then a space. Then drag and drop the folder containing the EXRs onto the terminal window and hit enter.
The prompt should tell you somehow that you are now inside that folder. If in doubt, type:
pwd
and hit enter. It should print the full path of the folder where you have the EXRs.
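An optional sanity check at this point: type

ls *.exr | wc -l

and hit enter. It should print the total number of rendered frames (7200 in our example).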

7 • Copy/paste this in the terminal window and hit enter:
function fforcemb() {  ls *.exr | xargs -n $1 sh -c 'convert "$0" "$@" -average -gamma 2.0 "$0".png ' ;    myindex=1 ; for file in *.png ; do mv $file frame_$((myindex++)).png ; done ;  ffmpeg -r $2 -i frame_%d.png -pix_fmt yuv420p __output_mb.mp4 ; }

This defines a function named fforcemb (yes, with a double "f"; I use that prefix for all functions involving video processing via ffmpeg).
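For readability, here is the same function split across lines and commented. It's functionally identical (I only quoted the variables for safety); like the one-liner, it assumes filenames without spaces, since ls | xargs splits on whitespace:

function fforcemb() {
  # $1 = frame multiplier, $2 = final fps
  # average every $1 consecutive EXRs into one PNG; inside sh -c, "$0" is the
  # first file of each batch and "$@" holds the rest of that batch
  ls *.exr | xargs -n "$1" sh -c 'convert "$0" "$@" -average -gamma 2.0 "$0".png'
  # rename the averaged PNGs into a clean frame_1.png, frame_2.png... sequence
  myindex=1
  for file in *.png ; do mv "$file" frame_$((myindex++)).png ; done
  # encode the PNG sequence into an mp4 at the final frame rate
  ffmpeg -r "$2" -i frame_%d.png -pix_fmt yuv420p __output_mb.mp4
}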

8 • Now you call it by simply typing:

fforcemb 20 24

and hitting ENTER.
20 is the frame multiplier we used earlier, and 24 is the final fps for the rendered video. If you used other settings you must adjust accordingly: for example, if you rendered a 30 fps project with a *10 multiplier, you must type fforcemb 10 30.

9 • Now you wait until the process is done, usually a couple of minutes, depending on your machine and on the size and duration of the sequence.

10 • When it's done you will have an "__output_mb.mp4" file with the motion-blurred video.
From now on you can call the function directly by typing the fforcemb thingy, without pasting the function definition each time, until you close the terminal or restart the machine.
(If you know what you are doing you can add it to your startup scripts in your .profile or .zshrc files.)

Compression quality

OK. Right. This function makes an awfully compressed mp4 video, unsuitable for professional work. True. But not completely: you have the full-quality version as a lossless PNG sequence, ready to use in the same folder. If you have issues with the PNG gamma you can modify the function above to output an uncompressed MOV or an EXR sequence. If you only need an EXR sequence you can just use something like:
function fforcembEXR() { mkdir __OUTPUT ;  ls *.exr | xargs -n $1 sh -c 'convert "$0" "$@" -average -gamma 2.0 __OUTPUT/"$0".exr ' ;    myindex=1 ; for file in __OUTPUT/*.exr ; do mv "$file" __OUTPUT/frame_$((myindex++)).exr ; done ;  }

This should be used by typing:
fforcembEXR 20 24
and hitting ENTER (only the first number, the multiplier, is actually used here, since no video is encoded).

Note that these things are usually very sensitive to case mismatches, spaces and typos.
It will put the EXR sequence inside a folder called __OUTPUT.
I use underscores at the beginning because Macs like to put those files at the top of listings. I know Linux behaves differently; feel free to modify it to suit your needs. If you use Linux you probably already know what to do about it.
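As an example of the MOV route mentioned above, you could replace the final ffmpeg line inside fforcemb with something like the following (a sketch: the prores_ks encoder depends on your ffmpeg build, and ProRes is near-lossless rather than literally uncompressed, but it's a common production intermediate):

ffmpeg -r "$2" -i frame_%d.png -c:v prores_ks -profile:v 3 __output_mb.mov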

Issues

This approach has one somewhat big flaw right now: the amount of blur you get is about twice what it should be. In technical terms, for people who understand how this works: the shutter angle we are getting is 360°, and ideally we should get 180°, or have it customizable. With my current knowledge of shell scripting in easy-to-write one-line functions, I can tweak that issue at the cost of less flexibility (fixing the frame multiplier). For the shell wizards out there: to do it properly I would need to average only half (or whatever fraction) of the parameters passed via xargs, but still advance the whole batch to the next frame, and I don't know how to do that without hardcoding the variables $0 $1 $2 $3 etc. to isolate 10 frames from each batch of 20.
Like this:
function fforcemb180() {  ls *.exr | xargs -n $1 sh -c 'convert "$0" "$1" "$2" "$3" "$4" "$5" "$6" "$7" "$8" "$9"  -average -gamma 2.0 "$0".png ' ;    myindex=1 ; for file in *.png ; do mv $file frame_$((myindex++)).png ; done ;  ffmpeg -r $2 -i frame_%d.png -pix_fmt yuv420p __output_mb_180_.mp4 ; } 

The usage is exactly the same:
fforcemb180 20 24
where 20 is the multiplier and 24 is the final frame rate. With this you can tweak the shutter angle by editing the multiplier across the whole workflow: where the multiplier is 20, the shutter will be 180°; if you render at *40, the shutter will be 90°; and if you render at *10, the shutter will be 360° again. This is a more accurate version, especially if you plan to composite it with real footage.
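In general, since this hardcoded version always averages the first 10 subframes of each batch: shutter angle = 360° × 10 / multiplier, which is where the numbers above come from.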
This version also wastes half the renders –they are discarded–.
Maybe it can be improved via a Python script that renders only the frames needed and puts empty placeholders in the "blind" frames –the time the shutter is closed–.
If you have any idea to improve any of this please tell me.
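One possible direction, left here as an untested sketch: switching the inner shell to bash and slicing an array keeps the first half of each batch whatever the multiplier is (it assumes bash is installed, and the name fforcemb180v2 is made up for this example):

function fforcemb180v2() {
  # $1 = frame multiplier, $2 = final fps
  # collect each batch of $1 files into a bash array, then average only the
  # first half of it, giving a 180 degree shutter for any multiplier
  ls *.exr | xargs -n "$1" bash -c 'frames=("$0" "$@") ; half=$(( ${#frames[@]} / 2 )) ; convert "${frames[@]:0:half}" -average -gamma 2.0 "${frames[0]}".png'
  myindex=1
  for file in *.png ; do mv "$file" frame_$((myindex++)).png ; done
  ffmpeg -r "$2" -i frame_%d.png -pix_fmt yuv420p __output_mb_180_.mp4
}

The usage would be the same (fforcemb180v2 20 24), and it still wastes the discarded half of the renders.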
It should be noted that the After Effects blending method only gives you the 360° shutter version. 

Final words

So, that's it. It looks more complicated than it is. I'm using this in production right now, and it's as simple as editing the render settings (I may write a Python script/addon for that part), rendering, and typing the fforcemb line in a terminal.
It takes some time to render, but usually a lot less than using Cycles with the high sample counts needed to achieve decent blurs. It has no compatibility issues (lights, shadows, cameras, particles, plugins, deformation, reflections, etc.); everything works because it's a really unoptimized, brute-force approach. Let me know if you find some issue/incompatibility.
I'm sure the devs will develop an amazingly optimized version of EEVEE motion blur for 2.81 or 2.82 that renders this technique obsolete. I hope so :-)
Or at least add vector output :-)

This is my first ever online tutorial of anything, so forgive me again for any mistakes or if I made a mess of an explanation.
I may make more tutorials in the future about the awesomeness of the terminal for everyday graphics work. I always have a command prompt at hand and it's a real lifesaver.

[EDIT: this last image was recombined using the Blender Compositor. Looks promising.]


Comments

boyleo said…
Thank you. This is very helpful.

It might be a good idea to write a shell script containing those functions
so you don't need to copy/paste them every time.
Pablo G said…
Yes, I load the function in my startup script (along with many more useful functions and aliases I've been collecting/crafting over the years), but explaining how to do that was beyond the scope of the tut. And making an executable shell script is also too much for beginners; I don't want to scare shell newbies with chmod +x commands they don't strictly need to make this work. I thought that copy/pasting the function for each session is a good compromise. And more experienced shell users already know how to make a function always available ;-).
Thanks for your feedback!
Pablo G said…
I tried a third technique to blend the subframes using the Blender compositor that works OK, so the whole thing can be done without leaving Blender, but it takes some node setup. I'll try to make a Python script to somehow automate the process, and that can maybe end up wrapped into an addon that does motion blur in one click.
