You never answered my question, Mr. HD Resolution: what TV do you have that plays 720i? You may call 720p "true 720i," but you are flat-out wrong.
" It is theoretically impossible for a 1920x1080 image to be displayed legitimately on a 1080i monitor, as interlacing technology does not write 1080 horizontal lines per refresh, period."
Here, I did your homework for you. You ARE correct in ONE thing: a 1080i set cannot draw (you said "write"?) all 1080 lines per refresh. For a TV to do that, it MUST be a PROGRESSIVE set, no ifs, ands, or buts about it. Read and learn, newbie: http://forum.videohelp.com/threads/259672-Does-720i-exist
This is all old news and you should not argue what you cannot grasp: http://www.hometheaterhifi.com/volume_10_4/feature-article-hdtv-time-to-buy-part-one-10-2003.html
Excerpt: The 1080i format ("i" is for Interlaced) has a resolution of 1920x1080, or about 2,073,600 pixels per image, and the 720p format ("p" is for Progressive, where all the scan lines are shown sequentially) has 1280x720, or about 921,600 pixels per image.
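To make the arithmetic concrete, here's a quick sketch (mine, not from the linked article) of the pixel counts above and the lines-per-refresh point: an interlaced set draws only one field, every other line, per refresh, while a progressive set draws the full frame.

```python
# Pixels per full image and lines drawn per refresh for the two HD formats.
formats = {
    "1080i": {"width": 1920, "height": 1080, "progressive": False},
    "720p":  {"width": 1280, "height": 720,  "progressive": True},
}

for name, f in formats.items():
    pixels = f["width"] * f["height"]
    # Interlaced: one field (every other line) per refresh.
    # Progressive: the whole frame per refresh.
    lines_per_refresh = f["height"] if f["progressive"] else f["height"] // 2
    print(f"{name}: {pixels:,} pixels per image, "
          f"{lines_per_refresh} lines drawn per refresh")
# → 1080i: 2,073,600 pixels per image, 540 lines drawn per refresh
# → 720p: 921,600 pixels per image, 720 lines drawn per refresh
```

Which is exactly why a 1080i set never puts all 1080 lines on screen in a single refresh, even though the full image is 1920x1080.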
Now, stay out of the big boys' discussion until you learn some manners.
And just for the record, I cut the cable cord (minus broadband, obviously) 4 years ago, have 45 channels OTA, and pay for my movies through Netflix instead of pirating them, as you so readily (and proudly) admit to doing.
This all has to do with the topic, since you brought up "facts" that everyone else here knows are simply false, untrue, and bullshit.
Re: The Cable Cutting Chronicles - Week 3