Performance regression
I have seen a performance regression with a 10-bit AV1 file I use for testing on my i5-7400. The regression appeared somewhere between 0.71.72 and 0.8.02. In a particular scene I test, 0.71.72 had improved to the point where I had no dropped frames, but from 0.8.02 onwards I'm dropping more frames in that segment than I did even with 0.40. Is there an expected reason for this?