I find this article very confusing. According to the second diagram they hit 60 fps but are still off by 2, which isn't great, and rasterization takes almost a full frame, which I find hard to believe. Yet in the next paragraph they say rasterization only takes 10-20% of a frame. They call it a data-driven decision, but I don't see any data supporting it; even the graphs in the "Benchmarks" section use relative percentages, which tell me very little.
They should have used only ms/ns for every timing/graph, not FPS or relative percentages of whatever that was.
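To make the point concrete, fps is a non-linear unit, so "off by 2 fps" means very different things at different frame rates, while ms/frame is directly comparable. A quick sketch (just the standard frame-budget arithmetic, not numbers from the article):

```rust
// Milliseconds of frame budget per frame at a given fps.
fn ms_per_frame(fps: f64) -> f64 {
    1000.0 / fps
}

fn main() {
    // 60 fps is ~16.67 ms/frame; 30 fps is ~33.33 ms/frame.
    println!("{:.2}", ms_per_frame(60.0)); // prints 16.67
    println!("{:.2}", ms_per_frame(30.0)); // prints 33.33

    // The same "2 fps" drop costs very different amounts of frame time:
    println!("{:.2}", ms_per_frame(58.0) - ms_per_frame(60.0)); // prints 0.57
    println!("{:.2}", ms_per_frame(28.0) - ms_per_frame(30.0)); // prints 2.38
}
```

This is why absolute ms (or ns) timings make regressions comparable across graphs, and relative percentages or fps numbers don't.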
Yes :) all current browser engines have their roots in the Nineties (WebKit is also based on KHTML). That's what WebRender (which grew out of Servo and is slated to be added to Firefox Quantum / Gecko) will change.
I don't quite understand the graphs. If all we did was move rasterization to a separate thread, how does it take any time in the frame at all? Are we instead measuring the cost of synchronization/contention? If rasterization time were measured the same way, shouldn't the absolute quantities be similar, since the rasterization algorithm itself didn't change?
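One plausible answer to the question above: even with the raster work on a worker thread, the main thread still pays per frame for handing off the work and waiting on the result, and that handoff shows up in a frame profile. A minimal sketch of that shape (hypothetical code, not the article's actual architecture; `rasterize_off_thread` and the channel layout are my own invention):

```rust
use std::sync::mpsc;
use std::thread;

// Hypothetical sketch: the raster work runs on a worker thread, but the
// main thread still does a send (handoff) and a blocking recv (sync),
// so some "rasterization" cost remains charged to the frame.
fn rasterize_off_thread(display_list: Vec<u32>) -> usize {
    let (tx, rx) = mpsc::channel::<Vec<u32>>();
    let (done_tx, done_rx) = mpsc::channel::<usize>();

    let worker = thread::spawn(move || {
        // Stand-in for actual rasterization, off the main thread.
        for list in rx {
            done_tx.send(list.len()).unwrap();
        }
    });

    // Main-thread side of the frame: this send + recv pair is the
    // synchronization cost that moving the work does not remove.
    tx.send(display_list).unwrap();
    let rasterized = done_rx.recv().unwrap();

    drop(tx); // close the channel so the worker loop ends
    worker.join().unwrap();
    rasterized
}

fn main() {
    println!("{}", rasterize_off_thread(vec![1, 2, 3, 4])); // prints 4
}
```

If the main thread blocks on the result like this, the frame-time chart would still show a rasterization-shaped cost; a fully pipelined design (not blocking within the same frame) would be needed to make it disappear.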
One interesting thing I realized reading this is that I've never bothered looking at Direct2D, having used Direct3D forever.