Larry Elie (Ldeliecomcastnet) | Intermediate Member | Post Number: 71 | Registered: 10-2006
Posted on Wednesday, February 04, 2009 - 5:25 pm:
Recently I picked up a Sony 51" HD rear-projection TV cheap. Sure, I would have liked a nice new Samsung 6-color DLP that can sync at 120 Hz, but those are expensive. The Sony, since it has real CRTs, can still work with some of my glasses by using the second DVI output from my nVidia 7950GT card, though of course I'll have flicker since I can't get the refresh rate very high. That's the downside.

The reason I got the TV cheap is that the convergence was way off. It turns out Sony uses two "convergence ICs" to converge its HD TVs. I replaced the chips and everything is great. Think it through: converging a 1080i picture has to be done fairly accurately... BOOM! It hit me. In the old days of 480i, there weren't all that many lines across a big rear projector. That's why you can't do lenticular 3D on a projection TV; there aren't enough lines for the prisms, and you can't converge the thing that well anyway.

But now perhaps it's doable. Lots of lines. Not quite enough to cover every lenticular prism on the screen, but we aren't off by an order of magnitude any more. If Sony could converge that many lines on a retail product, is it beyond reason that they could also converge on a prism-by-prism basis for a real glasses-free lenticular display that size? It would make my little i-Art display look pretty tiny. I know all about the viewer-position issues; yep, that's all still true. Sure, there might be easier ways of doing it today on a really big LCD, but I don't think it's impossible anymore. Thoughts?

Larry Elie
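P.S. Here's a back-of-envelope sketch (Python) of that "order of magnitude" point. The 20 lenticules-per-inch pitch is a number I picked purely for illustration (real sheets vary widely), as is the ~720 usable horizontal pixels for SD; the screen width just falls out of the 51" diagonal at 16:9.

import math

# Width of a 51" diagonal, 16:9 screen.
DIAGONAL_IN = 51.0
ASPECT_W, ASPECT_H = 16.0, 9.0
width_in = DIAGONAL_IN * ASPECT_W / math.hypot(ASPECT_W, ASPECT_H)  # ~44.5"

# Hypothetical lenticular sheet pitch, for illustration only.
LENTICULES_PER_INCH = 20.0

for label, h_pixels in (("SD (~720 px wide)", 720), ("HD (1920 px wide)", 1920)):
    px_per_inch = h_pixels / width_in
    px_per_lenticule = px_per_inch / LENTICULES_PER_INCH
    print(f"{label}: {px_per_inch:.1f} px/in, {px_per_lenticule:.2f} px per lenticule")

On those assumptions, SD puts less than one pixel under each lenticule, while 1920-wide HD puts a little over two there: not enough for a many-view display, but in the right ballpark for two views, which is exactly the "not off by an order of magnitude" gap above.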