For the past few months my system had been struggling to keep up with both 720p and 1080i broadcasts. I was using Linear deinterlacing and the standard decoder. Not always the highest quality, but playback was smooth. Still, CPU usage was always in the high 90s (percent). Not a lot of room for error.
Recently there were a couple of threads on the mailing list with people reporting much-reduced CPU usage from a new NVIDIA X driver option called 'UseEvents' (new to the 9000-series drivers, I think).
A week or so ago, I added this option and it dropped my CPU usage to the mid 40s. Even after enabling Bob deinterlacing and the ffmpeg decoder, my usage is only around 60-70%. And the best part: the picture looks fantastic. Very smooth, with very few artifacts and almost no tearing.
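For anyone who wants to try it, the option goes in the Device section of xorg.conf. A minimal sketch, assuming the proprietary nvidia driver (the Identifier is just a placeholder; keep whatever your existing Device section has):

    Section "Device"
        Identifier "nvidia0"
        Driver     "nvidia"
        # Let the driver sleep waiting for hardware events instead of
        # busy-polling; this is what cut the CPU usage so dramatically.
        Option     "UseEvents" "True"
    EndSection

You need to restart X for the change to take effect.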
Good stuff.
Now if only the driver would correctly read my TV's EDID so I could output real 1080i; it might look even better.
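I haven't managed it yet, but the workaround people usually suggest is to bypass the EDID entirely and force a 1080i modeline by hand. A rough sketch of what that would look like, assuming an NVIDIA card and a TV that accepts the standard CEA-861 1080i timing (the sync ranges below are placeholders; substitute your TV's real limits from its manual):

    Section "Monitor"
        Identifier  "TV"
        # Placeholder sync ranges; use your TV's actual limits
        HorizSync   30.0 - 70.0
        VertRefresh 50.0 - 61.0
        # Standard CEA-861 1080i timing: 74.25 MHz pixel clock, interlaced
        ModeLine "1920x1080i" 74.250 1920 2008 2052 2200 1080 1084 1094 1125 Interlace +HSync +VSync
    EndSection

    Section "Device"
        Identifier "nvidia0"
        Driver     "nvidia"
        # Ignore the TV's (unhelpful) EDID and trust the modeline above
        Option     "UseEDID" "False"
        Option     "UseEvents" "True"
    EndSection

    Section "Screen"
        Identifier   "Screen0"
        Device       "nvidia0"
        Monitor      "TV"
        DefaultDepth 24
        SubSection "Display"
            Depth 24
            Modes "1920x1080i"
        EndSubSection
    EndSection

No guarantees; the 9000-series drivers are fairly strict about mode validation, so check /var/log/Xorg.0.log afterwards to see whether the mode was actually accepted.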
Tuesday, January 9, 2007
New Home
Previously I was just updating (infrequently as of late) a website hosted by Google Page Creator. This (blogspot) seems like a more appropriate venue, and with the domain feature and all the cool customizations, I decided to make the switch.
While most of the posts here will be about my myth system (and the endless tinkering needed to keep it happy), I will also be posting about anything techy that suits my fancy.