madness of 1080p

Ecological issues around computing. This is an experimental forum.

Moderators: Ralf Hutter, Lawrence Lee

xen
Friend of SPCR
Posts: 243
Joined: Fri Dec 28, 2007 11:56 am
Location: NH, Netherlands

Post by xen » Sat Jul 05, 2008 6:10 am

Vicotnik wrote:It's like how 128kbps MP3 sounds ok (or at least no different than the source) on a pair of $5 passive computer speakers, but when you hook up the thing to your decent sound system that is no longer the case.
Yeah, I'm thinking of replacing my amplifier and CD player with something better, but I'm worried that my speakers will suddenly begin to sound like crap. I'd have to replace my speakers too. On the other hand, the speakers are the last station in the chain, so perhaps it won't be a problem. The reverse (a crappy system with good speakers) would more easily become a problem, I'd think.

jessekopelman
Posts: 1406
Joined: Tue Feb 13, 2007 7:28 pm
Location: USA

Post by jessekopelman » Sun Jul 06, 2008 4:02 pm

xen wrote:Yeah, I'm thinking of replacing my amplifier and CD player with something better, but I'm worried that my speakers will suddenly begin to sound like crap. I'd have to replace my speakers too. On the other hand, the speakers are the last station in the chain, so perhaps it won't be a problem. The reverse (a crappy system with good speakers) would more easily become a problem, I'd think.
Conventional logic is the opposite -- the end stages are more important than the middle. In other words, source and speakers are both more important than the amp. The reasoning? Well-amplified crap will still sound like crap, and great material distorted by the speakers will no longer sound great. I'd say you only have to spend a lot of money on any component if you plan to play really loud. For most typical scenarios a $200 receiver coupled with a $500 5.1 speaker setup will sound excellent if properly configured. If you've got a 1,000 sqft room you will need better stuff (especially speakers). If you've got a 100 sqft room you can get by with cheaper stuff. Given that speakers can easily last 20 years, I think it is worth spending whatever you can afford on them, more so than on active components (which tend to become incompatible/obsolete even while still working well).

croddie
Posts: 541
Joined: Wed Mar 03, 2004 8:52 pm

Post by croddie » Sun Jul 06, 2008 6:29 pm

jessekopelman wrote:In other words, source and speakers are both more important than amp.
Yes
If you've got a 1,000 sqft room you will need better stuff (especially speakers). If you've got a 100 sqft room you can get by with cheaper stuff.
The size of the room and the quality of the speakers are pretty much unrelated.
If you've got a badly shaped or very small room, save more money to spend on room treatments. That's the only connection between room and money that I can think of.
Unless the room is really huge; then you may have to use professional active high-wattage speakers (which may or may not be high quality, and may or may not be expensive).

xen
Friend of SPCR
Posts: 243
Joined: Fri Dec 28, 2007 11:56 am
Location: NH, Netherlands

Post by xen » Mon Jul 07, 2008 1:05 am

Hm okay. Right now the weakest link in my system is really my computer. The rest of my audio equipment is all second-hand or junk. I was thinking of replacing my amp with an almost exact equivalent, that is, another analog stereo amp, the Marantz PM4001 OSE that's on sale now in the local high-end store for about €275, together with its associated CD player for €150. I thought about just buying it since I don't feel like educating myself that much on this topic. What I gain is quality, a remote control, and a fix for the fact that my current amp now and then shuts off one of the speakers. I'm guessing it would still sound better with my current speakers, until I get around to replacing them as well. If I need digital I'll just add an external DAC; expensive, yes, but I like component building ;). I just consider it a nice investment for the future.

But my line of thinking was: run bad stuff through excellent speakers and you'll hear every detail of how bad it really is. My current speakers are quite warm; they tend to create a bit of a fuzzy spatial sound with little detail.

whiic
Posts: 575
Joined: Wed Sep 06, 2006 11:48 pm
Location: Finland

Post by whiic » Sun Sep 14, 2008 3:56 am

xen: "I myself have been watching low-quality anime encodes for years, and I must say I have been stunned now and then to see the real (DVD) thing. I think DVD has great quality. I know that some (many?) people use their DVD/harddisk recorder to record tv shows and movies in less than maximum quality because, well, they think the minor improvement in quality that can be had is not worth the harddisk space."

Given the poor compression of the codecs used by DVD/HDD recorders, and given that the capacity of an HDD recorder can't really be expanded, this might be a viable alternative.

(Sure, some recorders can have their HDD swapped; Topfields are good for this. Unfortunately most of them use PATA HDDs, which are difficult to obtain in sizes bigger than 500 GB, while SATA HDDs come as big as 1500 GB... and growing.)

But for downloading from the net, higher-res AVC/AAC (or AVC/Vorbis) encodes are usually the same size as lower-res DivX/MP3 encodes. AVC (H.264) and AAC simply compress better, with less perceived loss of quality than the old codecs, allowing either a smaller filesize at the same resolution or a higher resolution at the same filesize. The cost: even if you make a small encode that doesn't look any better than a legacy DivX encode, AVC will require more CPU power to decode.
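To make the "same filesize" point concrete: filesize is set by bitrate and duration, not by resolution, so a 720p AVC encode and an SD DivX encode at the same bitrate come out the same size; AVC just spends those bits more efficiently. A back-of-the-envelope sketch (the bitrates below are made-up illustrative numbers, not measurements of any real release):

[code]
# Filesize depends only on bitrate x duration, never on resolution.
# The bitrates here are illustrative guesses, not real measurements.

def filesize_mib(video_kbps, audio_kbps, minutes):
    """Approximate stream size in MiB (ignoring container overhead)."""
    total_bits = (video_kbps + audio_kbps) * 1000 * minutes * 60
    return total_bits / 8 / 2**20

# A 24-minute episode at the same total bitrate:
legacy = filesize_mib(900, 128, 24)  # hypothetical DivX/MP3 encode at 640x480
modern = filesize_mib(900, 128, 24)  # hypothetical AVC/AAC encode at 1280x720
print(f"{legacy:.0f} MiB vs {modern:.0f} MiB")  # identical sizes; the AVC encode
# simply uses the same bits more efficiently, so 720p still looks good
[/code]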

xen: "Everybody seems to be busy getting their systems ready for Full HD video. I think it is madness. The industry naturally pushes this development because they need the world to dispose of the current generation of audio/video/computing hardware, so that they can then sell the next generation to us. Another generation of devices made obsolete."

Half HD (my name for "HD Ready", since HD Ready will never mysteriously transform into real HD) doesn't require any investment. My "old" 1280x1024 19" TFT display can show Half HD at its native resolution. In fact my 19" is probably the best device in existence for displaying that material. Why? It's not even widescreen.

Reason: there are no widescreen 720p displays or televisions. There's only 1366x768 (TV, 16:9), 1440x900 (monitor, 16:10), 1680x1050 (monitor, 16:10)... and all of them upscale from 720p. With monitors it would be upscaling of video footage only, since all GPUs support the 16:10 aspect ratios.

Most GPUs also support certain 16:9 resolutions for a DVI->HDMI connection to TV sets. Unfortunately they typically support the resolutions in the HDTV specifications (720 and 1080 lines), not the resolution most typically used in such TVs (768 lines), so not only do you get upscaled video, but if you extend your desktop to the secondary display you get your whole f***ing desktop upscaled! It's a major pain in the arse looking at everything on your "HD Ready" TV in blurred form... anything in native resolution looks better, even if it's a lower resolution.

To get around this problem, I did what seemed reasonable: I bought a Full HD set instead of a Half HD one. Why? The Half HD spec says 1280x720 or better; Half HD manufacturers deliver 1366x768. The Full HD spec says 1920x1080 or better; Full HD manufacturers deliver 1920x1080 and not a single pixel extra. Thus, I get native resolution even though my GPU doesn't support the (de jure) non-standard 1366x768 resolution.

"HD Ready" is the biggest fail of HDTV. Sure, it sells better than Full HD but that's not because of it's qualities... it's because of marketing.

My display arsenal:
19" 1280x1024
42" 1920x1080
26" 1920x1200
...and all are capable of displaying HDTV (720p or 1080p, depending on display) in native resolution. Letterboxed, on both 19" and 26"... though letterboxing of 26" is only 120 pixels.
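To spell out where those letterbox figures come from: the bars are simply the leftover rows split top and bottom when 16:9 video is shown unscaled. A quick Python sketch, using the panel and video resolutions listed above:

[code]
# Letterbox bars when showing 16:9 HD video 1:1 on the displays listed above.
def letterbox(panel_h, video_h):
    """Return (total unused rows, height of each top/bottom bar)."""
    spare_rows = panel_h - video_h
    return spare_rows, spare_rows // 2

print(letterbox(1024, 720))   # 19" showing 720p  -> (304, 152): two 152 px bars
print(letterbox(1200, 1080))  # 26" showing 1080p -> (120, 60):  two 60 px bars
print(letterbox(1080, 1080))  # 42" showing 1080p -> (0, 0):     fills the panel
[/code]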

Modernizing my computer equipment also has something to do with the motherboard of my previous computer developing capacitor problems. Even after I replaced the leaking and bulging ones, it's still only about 90% stable. It can play HDTV content through its AGP card, though 1080 is only supported interlaced, and the GPU drivers halve the hardware video overlay resolution in each direction: within the overlay area, whether full screen or not, every 2x2 block of pixels is treated as one, so the effective maximum is 960x540 when fullscreened. Without hardware acceleration that GPU does support 1080i even for video, but a Prescott-based single-core Celeron at 2.66 GHz, or even overclocked to 3.00 GHz, can't handle 720p video in a CPU-heavy AVC encode. With hardware acceleration it can handle it with minor stuttering. 1080p is out of the question with or without hardware acceleration.

But I upgraded to quad-core computing. More than ten times the calculating power with no increase in CPU power consumption (Prescotts were extremely bad on power efficiency). I can watch 1080p in even the most CPU-heavy AVC encodes using only a small fraction of the CPU.

AZBrandon: "It looks like I’d have to spend at least $40 to get a card that will properly run 1366x768 and 1920x1080 out the DVI."

Since you already have the CPU power to decode it and the TV to display it, that $40 is going to be the best $40 you'll ever spend upgrading your equipment. I really, really recommend it. Running stuff in native 1080p on a Full HD secondary display is bliss. I especially recommend getting the GPU since you have a 1080p-capable TV (not one of those odd-resolution 768p TVs).

My attempt at upgrading my Prescott system to Full HD wasn't that successful, due to drivers or the card's hardware acceleration capability, but that was an old AGP card (GeForce 6200) I had bought before my 1080p TV for purposes other than HDTV. The cheap ~$40 GPUs of today (such as the ATI 2400 Pro I use in my quad-core system) are capable of 1080p hardware-accelerated video overlay. (Otherwise my 2400 Pro is a big fail: 3D calculating power is next to none, and on top of that I get artefacts in half of the games I'd like to play. Some games just crash spectacularly, and I've even had corrupted ASCII characters on my desktop from time to time. I guess all of that started when I upgraded the GPU BIOS... except for the 3D calculating power, which has always sucked.)

whiic
Posts: 575
Joined: Wed Sep 06, 2006 11:48 pm
Location: Finland

Post by whiic » Sun Sep 14, 2008 5:28 am

autoboy: "The $200 HDMI cables are ruining HD for most as I find it comical when I see people with their $60 upscaling DVD player and their $70 HDMI cable. Those poor misguided people..."

xen: "Good to know that the €20 euro 10m DVI-HDMI cable I have in mind will do just fine."

Even a 5 to 10 eur cable will do fine. At least my cable from Biltema works perfectly, and it was dirt cheap. It's digital data, so it either works perfectly or it doesn't work at all. If it doesn't work at all, it's defective and you can send that crap back to the seller.

It's quite different with analogue equipment, where you can't expect perfection at 1/10th the price, but with digital the premium cable and the el-cheapo should both work. Digital is bliss for cheap people.

AZBrandon: "Yeah I think I paid $16 for my 6' DVI to HDMI cable. Granted, it's video only, no audio signal, but one place online wanted $90 for the same cable"

HDMI can carry sound but I don't know if DVI can. Maybe, maybe not. I haven't investigated it.

Nevertheless, typical graphics cards DON'T have sound card functionality integrated... so a sound-carrying DVI->HDMI adapter cable would be a waste of money... IF it's even possible to carry sound through DVI.

With future GPUs (hopefully) coming with native HDMI, I hope there'll be sound card functionality implemented on them as well. At least 2.0 stereo, as most TVs are capable of at least that. Especially for laptops, I hope HDMI becomes the de facto standard for external displays instead of the VGA output they have had so far.

While VGA output does work with 1920x1200 computer monitors, it usually doesn't work with 1920x1080 TVs! The limitation is at the TV's end: it can't accept high-resolution analogue signals. Also, needing a separate cable to carry audio to the TV set means that streaming HD video from a laptop to a TV takes a while extra to set up. An HDMI audio connection would not only reduce the number of cables but also eliminate noise picked up on the way to the TV set.

scdr: "The thing that I find most disturbing about the whole HDTV conversion (here in the USA) is that it is being forced upon us."

Actually, I envy you. In Finland we also had a forced transition to digital TV. The difference is that you were upgraded from horrible-quality NTSC (480-line) analogue to good-quality (720-line) digital, while in Finland we went from decent-quality PAL (576-line) analogue to decent-quality (576-line) digital. Viewers gained little to nothing in the conversion... only bandwidth in aerial transmissions was saved. And that is being used to send more channels... which offer SMS chat services and games. A total waste.

The only plus of digital transmission is that image quality doesn't deteriorate as easily in bad weather. There's no visible noise in the picture... until the signal gets so bad that the picture disappears completely (an analogue picture would be painful to look at but still viewable!). So the only plus is also a minus. We got nothing from the transition. And we aren't even expecting 720-line aerial transmissions for the next few years... maybe not for 10.

Finland FAIL.

tehcrazybob: "analog broadcasts require many times the bandwidth of digital ones, and there are few things in this world worth more money and radio frequency spectra. The bandwidth currently occupied by TV signals will be free for some other purpose."

Yes, that's true. And how is that freed bandwidth used? To send the same still image at 30 or 60 frames per second so people can play SMS games and guess words in a puzzle. That's a worse waste of bandwidth than analogue transmission ever was.

In Finland we could have waited just a few more years for digital TV to mature. We would have had cheaper HD televisions with better integrated digital converters: DVB-T2 instead of DVB-T. We could have had 720p compressed with AVC(?) instead of using the same amount of bandwidth to send low-resolution crap with inefficient compression.

whiic
Posts: 575
Joined: Wed Sep 06, 2006 11:48 pm
Location: Finland

Post by whiic » Sun Sep 14, 2008 6:07 am

As for anime fansubs and rips, they come from the following sources, which usually also determine how they get encoded:
VHS or LaserDisc -> worst quality (very old TV shows not re-released in DVD, old OVAs, etc.)
SDTV -> decent quality (TV series)
DVD -> slightly superior to SDTV (new OVAs/movies)
HDTV -> 720p (or downscale) (new TV series)
BluRay -> 720p or 1080p (new OVA/movie)

VHS rips are a thing of the past. It's easy to stay away from them, as no one bothers to rip or distribute them anymore. The only exceptions are real rarities that aren't available in any other format... old yaoi and lolicon titles come to mind.

One thing to remember: old and bad DVDrips may be inferior to new VHS rips. Certain Aarinfantasy yaoi releases have had some impressive restoration done to the VHS footage.

Resolution usually depends on the source footage. Usually. Especially in the past there were downscales, as da Innanets were slow. Today I've seen some upscales... which is a waste: HDTV 720p stretched to 1080 is probably worse than plain 720p, and even at its best it's not worth the extra filesize. The codec (and how it's used) plays a big part in quality. Old DivX3 DVDrips are nowhere near the quality of DivX5/XviD or AVC releases from SDTV footage... not to mention AVC-encoded DVDrips.

I didn't jump from DVD resolution straight to 1080p. I watched 720p HDTV rips on my 1280x1024 monitor first. Even after I bought the 42" 1080p Full HD TV I didn't start watching 1080p video, because my CPU lacked the power to decode it. And even after upgrading to quad-core computing a year ago, I still haven't seen much 1080p material because there aren't that many BluRay rips. I've seen only a few short video clips in 1080p (incl. one AMV), and the only movie I've watched in full resolution has been Byousoku 5 Centimeter, which naturally looked outstanding. Not that I have anything bad to say about the story, directing, or artwork... Makoto Shinkai is a genius.

whiic
Posts: 575
Joined: Wed Sep 06, 2006 11:48 pm
Location: Finland

Post by whiic » Sun Sep 14, 2008 7:18 am

xen: "Pretty amazing though that something that looks perfect on a small monitor would look so bad on a large one. Perhaps that's also a drawback of LCD techonology as compared to CRT."

Vicotnik: "Yeah, the LCDs are too exact. With CRT and its built in "blur filter" we were fine. With a LCD you'll have to squint to get the desired effect.

It's like how 128kbps MP3 sounds ok (or at least no different than the source) on a pair of $5 passive computer speakers, but when you hook up the thing to your decent sound system that is no longer the case."


I don't think that LCDs looking worse for certain video (or still images) is mainly because they're too "good" for the material being viewed. I think LCDs have fundamental problems displaying certain images (moving or still) accurately.

- LCDs can't display black because of backlight bleed. It's dark gray.
- LCDs have poor colour reproduction. It's like a 16-bit monitor... but dithering may be used to fake a 24- or 32-bit colourspace.

These two drawbacks, inherent to the technology used, combine into a problem: especially when viewing dark scenes, smooth gradients look jagged. This problem is made much worse if:
- source footage has macroblocking due to compression
- source footage is upscaled.

So upscaling not only makes the picture blurry, it will make the macroblocking and jagged gradients more apparent.
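On the colour-depth point above: a limited-bit-depth panel quantises each channel to fewer levels, which is exactly what turns a smooth gradient into visible steps, and dithering trades those steps for fine spatial noise that the eye averages out. A toy sketch of the idea (the 6-bit depth and the 2x2 Bayer matrix are illustrative assumptions, not a description of any particular panel):

[code]
# Toy illustration: quantising 8-bit values to fewer levels causes banding;
# ordered (Bayer) dithering hides the bands as fine noise.
# The 6-bit depth and 2x2 threshold matrix are illustrative assumptions.

BAYER_2x2 = [[0, 2],
             [3, 1]]  # threshold pattern, values 0..3

def quantise(value, bits=6):
    """Truncate an 8-bit value to `bits` of precision (plain banding)."""
    step = 1 << (8 - bits)  # 4 adjacent 8-bit values collapse into one output level
    return (value // step) * step

def dither(value, x, y, bits=6):
    """Ordered dithering: add a position-dependent offset before truncating."""
    step = 1 << (8 - bits)
    offset = (BAYER_2x2[y % 2][x % 2] + 0.5) * step / 4
    return quantise(min(255, int(value + offset)), bits)

# A flat area whose true value (2) lies between two representable levels (0 and 4):
flat = [[2] * 4 for _ in range(2)]
print([[quantise(v) for v in row] for row in flat])  # all 0 -> a visible band
print([[dither(v, x, y) for x, v in enumerate(row)] for y, row in enumerate(flat)])
# -> a checkerboard of 0s and 4s that averages back to ~2 at viewing distance
[/code]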

If, on the other hand, you watch THORA's encode of Byousoku 5 Centimeter, there's no upscaling, and the compression settings have been chosen so that the sacrifice in video quality is minimized. Even though THORA releases 1080p rips, they still prioritize smooth colour gradients over resolution. Almost ironic... but they are the best; that is, best at almost everything except keeping the filesize small (and they also offer multilanguage audio): 2 to 3 gigabytes per movie, even when encoded with the most CPU-heavy codec.

It's still quite an achievement to compress BluRay movies to fit a single DVD without downscaling, noticeable macroblocking, or a reduction of colourspace. It makes one forget the drawbacks of TFT panels completely. Much more enjoyable than watching on a CRT-based monitor or TV.

lm
Friend of SPCR
Posts: 1251
Joined: Wed Dec 17, 2003 6:14 am
Location: Finland

Post by lm » Sun Sep 14, 2008 9:44 am

whiic wrote:HDMI can carry sound but I don't know if DVI can. Maybe, maybe not. I haven't investigated it.
HDMI carries sound, but DVI does not.

http://en.wikipedia.org/wiki/HDMI
http://en.wikipedia.org/wiki/Digital_Visual_Interface
whiic wrote: Nevertheless, typical graphics cards DON'T have sound card functionality integrated... so a sound carrying DVI->HDMI adapter cable would be a waste of money... IF it's even possible to carry sound through DVI.
That is just plain wrong!
http://ati.amd.com/products/hdseries.html wrote: ATI Avivo HD technology found in the ATI Radeon HD 2000 Series enables stunning big-screen entertainment with advanced hardware HD video processing (UVD*) and HDMI connectivity that includes built-in 5.1 surround audio.
ATI has had audio chips in ALL of their GPUs since the 2000 series.

Remember that both HDMI and DVI use digital signals; it's not as if there were a separate wire for video and another for audio. If the GPU only has a physical DVI-out port but supports HDMI, then it must detect whether it's connected to a device with a DVI input or an HDMI input and decide whether it can also send audio to the device. It's not really about the cable or the ports, but about the capabilities of the input device.
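For the curious, that detection works by reading the display's EDID: an HDMI sink advertises itself with a vendor-specific data block (IEEE OUI 00-0C-03) in the CEA-861 extension of its EDID, and the source only enables audio and other HDMI features if it finds that block. A rough sketch of the check in Python, assuming you already have a raw EDID dump as bytes (the sysfs path in the comment is only a hypothetical example of where you might get one on Linux):

[code]
# Sketch: decide whether a sink is an HDMI device (audio-capable) or plain DVI
# by looking for the HDMI vendor-specific data block in the CEA-861 extension
# of its EDID. Assumes `edid` already holds a raw EDID dump as bytes.

HDMI_OUI = (0x03, 0x0C, 0x00)  # IEEE OUI 00-0C-03, stored little-endian

def sink_is_hdmi(edid: bytes) -> bool:
    extensions = edid[126]                        # number of 128-byte extension blocks
    for i in range(1, extensions + 1):
        block = edid[i * 128:(i + 1) * 128]
        if len(block) < 128 or block[0] != 0x02:  # 0x02 = CEA-861 extension tag
            continue
        dtd_start = block[2]                      # data blocks occupy bytes 4..dtd_start-1
        pos = 4
        while pos < dtd_start:
            tag = block[pos] >> 5                 # top 3 bits: data block type
            length = block[pos] & 0x1F            # bottom 5 bits: payload length
            if tag == 0x03 and length >= 3:       # vendor-specific data block
                if tuple(block[pos + 1:pos + 4]) == HDMI_OUI:
                    return True
            pos += 1 + length
    return False

# Hypothetical usage on Linux:
# edid = open("/sys/class/drm/card0-HDMI-A-1/edid", "rb").read()
# print("HDMI sink (audio possible)" if sink_is_hdmi(edid) else "DVI-only sink")
[/code]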
whiic wrote: With future GPUs (hopefully) coming with native HDMI, I hope there'll be sound card functionality implemented on them as well. At least 2.0 stereo as most TV are capable for at least that. Especially for laptops, I hope HDMI would become defacto standard for external displays instead of that VGA output they have had so far.
It's already been quite some time since the first 2000-series ATI card was released, with built-in audio over HDMI, like I already said. They can do 5.1 on the 2000 and 3000 series, and 7.1 on the 4000 series.

It's a given that the VGA port will be replaced: all current display technologies accept digital input natively and the computer produces digital output natively, so using an analog transport means two extra conversions of the signal. The circuitry for doing this costs money. Soon GPUs won't have this circuitry at all, because it isn't needed. There can still be niche-market GPUs that include it for the minority who need it.

However, it probably shouldn't be HDMI that wins, because the HDMI license is not free: every device that implements it must pay royalties, which also makes devices more expensive for the consumer.

The better format is DisplayPort, which is free to implement.
http://en.wikipedia.org/wiki/DisplayPort
It has all the capabilities of HDMI. There are already GPUs with DisplayPort out. Support for it was already in ATI's 3000 series and is in the 4000 series as well, but sadly GPU manufacturers don't enable it on all models, because it needs a physically different port and you can't just use an adapter like with DVI->HDMI.
There are also monitors with DisplayPort input, though not many yet. Mainly this year's 30-inchers.

whiic wrote: While VGA output does work with 1920x1200 computer monitors, it usually doesn't work with 1920x1080 TVs! The limitation is within TV's end and it can't receive high-res analogue signals. Also, with the need for a cable to transfer audio to TV set, playing HD video on a table TV streaming from laptop computer takes a while extra to set up. HMDI audio connection would not only reduce number of cables but also eliminate effect of noise when transmitting the signal to TV set.
Indeed, for mainstream PC -> TV usage the analog connections will disappear soon, since all of that is already available just the way you wanted.

Alas, none of this is available on Nvidia.

Disclaimer: My current card is Nvidia. However I don't need those features myself, as my audio goes from my PC to my standalone stereo amp and from there to my speakers.

whiic
Posts: 575
Joined: Wed Sep 06, 2006 11:48 pm
Location: Finland

Post by whiic » Sun Sep 14, 2008 12:49 pm

Hmmm... I see "ATI HD audio rear output" in Device Manager, and it has only 2 channels when I open it in Volume Control, not 5.1 as you say. That could mean they are rear left and right, but then where the hell are my front left and right, and where the hell is my subwoofer output? And if I play a video that certainly has 5.1 audio, why don't I hear even the rear channels from my TV speakers when I set the system default to ATI's sound and verify that MPC uses the system default for audio? Where does my sound disappear to? (TV volume was at max too.)

IsaacKuo
Posts: 1705
Joined: Fri Jan 23, 2004 7:50 am
Location: Baton Rouge, Louisiana

Post by IsaacKuo » Sun Sep 14, 2008 4:19 pm

whiic wrote:My "old" 1280x1024 19" TFT display can show Half HD at its native resolution. In fact my 19" is probably the best device in existence for displaying that material. Why? It's not even widescreen.

Reason: there are no widescreen 720p displays or televisions. There's only 1366x768 (TV, 16:9), 1440x900 (monitor, 16:10), 1680x1050 (monitor, 16:10)... and all of them upscale from 720p.
This is not true. There are true 1280x720 resolution televisions and projectors. Sadly, 1366x768 is a more common resolution for HDTV sets... however, true 1280x720 HDTV sets also exist.
Most GPUs also support certain 16:9 resolutions for a DVI->HDMI connection to TV sets.
All GPUs support more or less all resolutions up to their maximum limits. However, the stock Windows XP driver typically does not directly expose all of them. You can use a third-party video card tweaker like Powerstrip to add custom resolutions (or you can use a different OS, which makes the Windows XP driver limitations moot).

That said, 1366x768 is just a stupid resolution in any case.
My display arsenal:
19" 1280x1024
42" 1920x1080
26" 1920x1200
...and all are capable of displaying HDTV (720p or 1080p, depending on display) in native resolution. Letterboxed, on both 19" and 26"... though letterboxing of 26" is only 120 pixels.
I'd argue that your 42" HDTV is better for playing 1280x720p videos than your 19" monitor. If you play a 1280x720p video at 1:1 scale, it will only fill up a fraction of the screen but it will STILL be bigger than the 1:1 image on your 19" screen.
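To put rough numbers behind that (screen sizes and resolutions are the ones quoted in this thread; the arithmetic is just pixel pitch = diagonal / pixel diagonal):

[code]
# Physical width of a 1280x720 video shown 1:1 (no scaling) on each display.
# Sizes/resolutions are those quoted in this thread.
from math import hypot

def width_of_720p_at_1to1(diag_inches, res_w, res_h, video_w=1280):
    pitch = diag_inches / hypot(res_w, res_h)  # inches per pixel
    return video_w * pitch                     # physical width of the video, in inches

print(f'19" 1280x1024: {width_of_720p_at_1to1(19, 1280, 1024):.1f}" wide')  # ~14.8"
print(f'42" 1920x1080: {width_of_720p_at_1to1(42, 1920, 1080):.1f}" wide')  # ~24.4"
# An unscaled 720p window on the 42" TV is physically much larger than the same
# window filling the 19" monitor edge to edge, even though it covers only part
# of the TV's screen.
[/code]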

Wibla
Friend of SPCR
Posts: 779
Joined: Sun Jun 03, 2007 12:03 am
Location: Norway

Post by Wibla » Sun Sep 14, 2008 5:28 pm

1366x768 isn't an issue with a PC as the source, if the TV can handle 1:1 pixel mapping.

Native 720p projectors are everywhere; I have one myself.

The difference between PDTV, DVDrip, HDTVrip and 720p is clear, even at 5 meters' distance from a 92" screen.

1080p has its merits, but most people don't need it.

The difference between DVD and 720p on good TVs and projectors... well, there's no competition: 720p wins hands down. The image quality is superior.

whiic: The 5.1 audio on the ATI cards is S/PDIF passthrough; newer cards are out (I think) with 7.1 LPCM. Personally I'm not a big fan (yet), as this tends to mess up audio devices in WinXP.
