I’ve just pushed an initial implementation of the autocrop feature. I tried hard to keep it as light as possible, and in the end I decided to make it run via small steps spread over frames.
To better explain how it works, here’s a short video:
https://mega.nz/file/gOkgWQhZ#6-ODLxxMP52PCTN8D3y5eW06ausJvJi2NU3RTBl5oVc
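To give an idea of what “small steps spread over frames” means in practice, here’s a very rough C sketch of the concept (not the shader’s actual code; all names and numbers here are made up): each frame only probes a small budget of pixels and refines a persistent estimate of the border, so no single frame pays the full cost of a border scan.

```c
/* Rough sketch of "small steps spread over frames": each call handles one
 * frame and inspects only a small budget of pixels, refining a persistent
 * estimate of the top border height.  Names and logic are illustrative,
 * not the shader's actual code. */
#include <stdbool.h>
#include <stdio.h>

#define SAMPLES_PER_FRAME 10          /* hypothetical per-frame budget */

/* Stand-in for sampling the frame: pretend the top 20 rows are solid border. */
static bool looks_like_border(int x, int y) { (void)x; return y < 20; }

/* 'border_rows' persists between frames and converges over time. */
static void autocrop_frame_step(int frame, int width, int height, int *border_rows)
{
    for (int i = 0; i < SAMPLES_PER_FRAME; i++) {
        int x = ((frame * SAMPLES_PER_FRAME + i) * 97) % width;  /* crude spread */
        if (!looks_like_border(x, *border_rows))
            return;                   /* hit content: keep the current estimate */
    }
    if (*border_rows < height / 2)
        (*border_rows)++;             /* row still looked solid: grow the crop */
}

int main(void)
{
    int border_rows = 0;
    for (int frame = 0; frame < 120; frame++)        /* ~2 seconds at 60 fps */
        autocrop_frame_step(frame, 320, 240, &border_rows);
    printf("estimated border height: %d rows\n", border_rows);  /* -> 20 */
    return 0;
}
```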
So, there are new parameters (a rough sketch of how they fit together follows the list):
Autocrop maximum amount:
The higher the value, the more of the solid borders around the image will be cropped.
Samples per frame:
Higher values make the shader search more of a single frame for solid areas.
This leads to a more accurate result in less time; however, it also puts more stress on the GPU.
Fortunately, even low/light values like 10 work well if you're OK with
waiting 2-3 seconds for the final crop value to be found.
Scene change threshold:
Once autocrop has found a maximum crop value, it only tries to crop more when the scene changes.
By lowering this value, you tell the shader to try a higher crop more often.
Transition speed:
This modulates the smoothness of the animation between different crop values.
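To make the roles of these parameters more concrete, here’s a hypothetical sketch of how they could steer the per-frame update. The struct and field names (max_crop, scene_change_threshold, transition_speed, ...) are mine, and the exponential smoothing used for the transition is just one plausible way to animate the crop, not necessarily what the shader does:

```c
/* Hypothetical sketch of how the parameters could interact per frame;
 * names and formulas are illustrative, not the shader's actual code. */
#include <math.h>
#include <stdio.h>

typedef struct {
    float max_crop;               /* "Autocrop maximum amount", 0..1          */
    int   samples_per_frame;      /* "Samples per frame" (drives found_crop)  */
    float scene_change_threshold; /* "Scene change threshold"                 */
    float transition_speed;       /* "Transition speed", 0..1 per frame       */
} autocrop_params;

typedef struct {
    float target_crop;            /* best crop found so far (fraction)        */
    float shown_crop;             /* crop actually applied this frame         */
    int   searching;              /* still allowed to try a higher crop?      */
} autocrop_state;

/* scene_diff: how much the frame changed (e.g. an average luma difference);
 * found_crop: the crop fraction suggested by this frame's probes. */
static void autocrop_update(autocrop_state *s, const autocrop_params *p,
                            float scene_diff, float found_crop)
{
    /* A scene change re-opens the search for a larger crop value. */
    if (scene_diff > p->scene_change_threshold)
        s->searching = 1;

    if (s->searching) {
        if (found_crop > s->target_crop)
            s->target_crop = fminf(found_crop, p->max_crop);  /* clamp to max */
        else
            s->searching = 0;     /* settled until the next scene change */
    }

    /* Animate smoothly toward the target (simple exponential smoothing). */
    s->shown_crop += (s->target_crop - s->shown_crop) * p->transition_speed;
}

int main(void)
{
    autocrop_params p = { 0.20f, 10, 0.10f, 0.20f };
    autocrop_state  s = { 0.0f, 0.0f, 1 };
    /* Pretend a few frames report a 15% solid border and no scene change. */
    for (int frame = 0; frame < 5; frame++) {
        autocrop_update(&s, &p, 0.0f, 0.15f);
        printf("frame %d: shown crop = %.3f\n", frame, s.shown_crop);
    }
    return 0;
}
```

The point of the smoothing step is that the displayed crop never jumps: it always eases toward the newest target at a rate set by the transition speed.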
The good news is that even with low values like 10 samples per frame, it reaches almost full precision in a matter of 2-3 seconds.
Also, if you plan to use it with games that have well-defined solid areas (C64, Amiga), even 1-2 samples per frame are enough.
With the default values of 20% autocrop and 10 samples per frame, the performance hit is about 4fps/100 on my Haswell, while the baseline hit is about 1fps/110.
Caveats:
- For performance reasons, the function that decides whether fake integer scanlines are needed does not take autocrop into account.
- As soon as autocrop is activated, you have to wait for a scene change to see it in action.
Also, a big thanks to fishku from Discord for hints on low-discrepancy sequences.
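For context, a low-discrepancy sequence spreads the probe positions evenly over the image instead of letting them cluster the way pseudo-random points do, which is why a handful of samples per frame can still cover the borders well. I’m not showing the exact sequence used in the shader here; as an illustration, this is the additive recurrence based on the plastic constant (the so-called R2 sequence), a common choice for 2D sampling:

```c
/* Illustration only: an R2 low-discrepancy sequence for picking 2D sample
 * positions.  Whether the shader uses this exact sequence is not implied;
 * this just shows what "low discrepancy" looks like. */
#include <stdio.h>

int main(void)
{
    /* g is the plastic constant, the real root of x^3 = x + 1. */
    const double g  = 1.32471795724474602596;
    const double a1 = 1.0 / g;
    const double a2 = 1.0 / (g * g);

    double x = 0.5, y = 0.5;              /* arbitrary starting offset */
    for (int n = 0; n < 10; n++) {
        x += a1; if (x >= 1.0) x -= 1.0;  /* keep coordinates in [0,1) */
        y += a2; if (y >= 1.0) y -= 1.0;
        printf("sample %2d: (%.4f, %.4f)\n", n, x, y);
    }
    return 0;
}
```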
There may be bugs, and the parameters need testing to see how they interact with each other in order to find good defaults.
I’ve just pushed some defaults that look good to me, but your mileage may vary and I haven’t tried many games, so please report back!