That is different from this: "The Dreamscene version that everyone is using right now does NOT have the GPU optimization feature that the final version from Microsoft will have."
I would guess that the FAQ entry refers to fixing bugs, particularly in how it picks the MPEG-2 decoder. That would help if your system was picking up an MPEG-2 decoder that did not use the hardware MPEG-2 decoding your graphics card can do. It still will not make MPEG-2 decoding free, though, as there is only so much a graphics card can offload.
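For what it's worth, here is a rough sketch (my own, not anything taken from DreamScene itself) of how you can list the DirectShow filters registered as accepting MPEG-2 video; whatever shows up here is roughly what the renderer has to choose from when it builds its filter graph:

```cpp
// Sketch: list DirectShow filters that accept MPEG-2 video.
// Build with a Windows SDK; link strmiids.lib and ole32.lib.
#include <dshow.h>
#include <cstdio>

int main()
{
    CoInitialize(NULL);

    IFilterMapper2 *pMapper = NULL;
    if (FAILED(CoCreateInstance(CLSID_FilterMapper2, NULL, CLSCTX_INPROC_SERVER,
                                IID_IFilterMapper2, (void**)&pMapper)))
        return 1;

    // Input type pair: major type + subtype the decoder must accept.
    GUID inTypes[2] = { MEDIATYPE_Video, MEDIASUBTYPE_MPEG2_VIDEO };

    IEnumMoniker *pEnum = NULL;
    pMapper->EnumMatchingFilters(&pEnum, 0, FALSE,
                                 MERIT_DO_NOT_USE + 1,   // skip unusable filters
                                 TRUE, 1, inTypes, NULL, NULL,
                                 FALSE, FALSE, 0, NULL, NULL, NULL);

    IMoniker *pMoniker = NULL;
    while (pEnum && pEnum->Next(1, &pMoniker, NULL) == S_OK)
    {
        IPropertyBag *pBag = NULL;
        if (SUCCEEDED(pMoniker->BindToStorage(NULL, NULL, IID_IPropertyBag,
                                              (void**)&pBag)))
        {
            VARIANT var; VariantInit(&var);
            if (SUCCEEDED(pBag->Read(L"FriendlyName", &var, NULL)))
                wprintf(L"%s\n", var.bstrVal);
            VariantClear(&var);
            pBag->Release();
        }
        pMoniker->Release();
    }

    if (pEnum) pEnum->Release();
    pMapper->Release();
    CoUninitialize();
    return 0;
}
```

If a software-only decoder is sitting in that list with a higher merit than the hardware-accelerated one, it will tend to get picked, which is exactly the situation a bug fix in decoder selection would address.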
I am sure they have done some careful profiling of the code to see if there are areas to be improved, but as I said before, turning on a new feature in the final release is asking for trouble.
Another area that may have been improved is the performance of drawing the selection rectangle on the desktop. This is very slow in current builds, but fixing it will not help general playback performance.
I would be surprised if the CPU usage of rendering the videos dropped by more than 15% if your machine already picked up the right decoders. If you had the wrong decoder then, yes, I could see a bigger drop, say halving the CPU usage.
I may be wrong and the final version will use <10% CPU on my PC, in which case that will be wonderful, but based on tests playing the same videos in WMP on XP (where the decoders have hardware acceleration working), I have my doubts.
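For those comparisons I mostly just watch Task Manager, but something like the sketch below (GetProcessTimes sampled over an interval; the PID and the 5-second window are just example choices) gives a number without eyeballing the graph:

```cpp
// Sketch: estimate a process's CPU usage over a short interval.
// Pass the PID of the player/compositor process on the command line.
#include <windows.h>
#include <cstdio>
#include <cstdlib>

static ULONGLONG toTicks(const FILETIME &ft)   // 100-nanosecond units
{
    ULARGE_INTEGER u;
    u.LowPart  = ft.dwLowDateTime;
    u.HighPart = ft.dwHighDateTime;
    return u.QuadPart;
}

static ULONGLONG cpuTicks(HANDLE hProc)
{
    FILETIME createTime, exitTime, kernelTime, userTime;
    GetProcessTimes(hProc, &createTime, &exitTime, &kernelTime, &userTime);
    return toTicks(kernelTime) + toTicks(userTime);
}

int main(int argc, char *argv[])
{
    if (argc < 2) { printf("usage: cpusample <pid>\n"); return 1; }

    HANDLE hProc = OpenProcess(PROCESS_QUERY_INFORMATION, FALSE, atoi(argv[1]));
    if (!hProc) { printf("cannot open process\n"); return 1; }

    SYSTEM_INFO si;
    GetSystemInfo(&si);                        // for the core count

    const DWORD intervalMs = 5000;
    ULONGLONG before = cpuTicks(hProc);
    Sleep(intervalMs);
    ULONGLONG after = cpuTicks(hProc);

    // CPU time used / wall time elapsed, spread across all cores.
    double percent = 100.0 * (after - before) /
                     (intervalMs * 10000.0 * si.dwNumberOfProcessors);
    printf("~%.1f%% CPU over %lu ms\n", percent, intervalMs);

    CloseHandle(hProc);
    return 0;
}
```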
The following pages show some nice charts of the CPU usage of decoding various formats on an AMD Athlon 64 FX-60 CPU (2x2.60GHz, 2x1MB L2) using the latest graphics cards.
http://www.xbitlabs.com/articles/video/display/video-playback_6.html
http://www.xbitlabs.com/articles/video/display/video-playback_7.html
http://www.xbitlabs.com/articles/video/display/video-playback_8.html
Taking those results and working out the equivalent scores for a low-end CPU shows me that they are within 10% of what I see with DreamScene on my PC. The exception is WMV HD, but that's because my graphics card has no proper WMV HD decoding ability and the cards in the review do.
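To be clear about how I am comparing, it is only a rough scaling of the review's figures by the clock ratio, assuming the decode cost per clock is broadly similar, something along these lines (all the numbers here are placeholders, not my actual machine):

```cpp
// Rough sketch: scale a review CPU-usage figure to a slower machine,
// assuming decode cost scales roughly with clock speed. Crude, but good
// enough for a ballpark comparison. All figures below are example values.
#include <cstdio>

int main()
{
    const double reviewClockGHz = 2.6;   // Athlon 64 FX-60 used in the review
    const double myClockGHz     = 1.8;   // example low-end CPU
    const double reviewCpuPct   = 12.0;  // example figure read off a chart

    double estimated = reviewCpuPct * (reviewClockGHz / myClockGHz);
    printf("estimated usage on the slower CPU: ~%.0f%%\n", estimated);
    return 0;
}
```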