Everybody seems to be busy getting their systems ready for Full HD video. I think it is madness. The industry naturally pushes this development because it needs the world to dispose of the current generation of audio/video/computing hardware, so that it can then sell us the next generation. Another generation of devices made obsolete. And for what? It seems to give meaning to the lives of the majority of people... err, consumers. Without it, their lives are empty. Now they have something to be enthusiastic about, something to occupy their minds and their pastime.
I was in a computer store today. A customer was talking to a salesman, arguing that 1080p was definitely better than 1080i, and that it made a huge difference. The salesman - someone who has probably seen it all - didn't quite agree; he thought 720p was more than enough. For the customer, it is probably something to fill his meaningless life with.
Disposal of the current generation of hardware just means that huge quantities of materials are going to the landfills and - worse - the incinerators. Little of this material can be recycled. Another huge amount of materials needs to be extracted from the earth, mixed with poisonous chemicals, and sent into our homes. Keep it coming. Already a third of the world's natural resources has been used up in the last century and is now sitting idly in the dumping grounds. No way we'll be able to recycle that.
And for what? My 800MHz Duron is still plenty fast for playing DVDs and lower-quality encodes. I upgraded to 1460MHz only because of the higher-quality encodes, and because Linux, of all things, was beginning to drag me down. I read somewhere that "nowadays CPU speed is more important than ever." No, it isn't. It is less important than ever, because you can still do all mundane tasks on a 6-year-old system. If people would software-engineer for that, the current generation of hardware would last for decades. Did you know that a 300MHz G3 Mac is still fast enough to display OS X Exposé smoothly and draw nice window shadows, provided it has enough memory? Try that with Linux Compiz. Efficiency is an art, but the Linux crowd seems to think that resources are better spent designing for current/next-gen hardware. Developer-wise, yes. Earth-wise, no. Instead of providing a counterforce, the caring Linux user is worse off than the caring Windows XP user. This is partly because human resources get taxed while material resources in general do not. Our tax system is geared towards maximum material throughput with minimum human labour.
Maybe you're thinking that your Dual Core Full HD system will last you for decades. I'm sure the industry will find a solution for that. At least systems are beginning to become a bit more power-efficient.
I'm planning to stick with my Socket A platform for a while. When higher-quality encodes that I can't handle start hitting the torrent sites, I'll just transcode them. I don't need to buy into the Blu-ray hype either, because I can trust good ol' pirates to provide me with another distribution channel. The only issues are when IDE hard disks will no longer be available, and when Windows XP will become legacy. SATA controllers are a nightmare. I'm beginning to run short of expansion slots, too.
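For what it's worth, transcoding a too-heavy encode down to something an old box can handle is a one-liner with a reasonably recent ffmpeg built with x264 support. This is just a sketch; the filenames are placeholders, and the resolution and CRF value are assumptions you'd tune to taste:

```shell
# Downscale a 1080p encode to 576p so an old CPU can decode it in real time.
# scale=-2:576 keeps the aspect ratio (width rounded to an even number);
# -crf 23 trades some quality for a much lighter stream; audio is copied as-is.
ffmpeg -i input-1080p.mkv \
    -vf scale=-2:576 \
    -c:v libx264 -preset slow -crf 23 \
    -c:a copy \
    output-576p.mkv
```

The slow preset makes the transcode itself take longer, but that's a one-time cost paid by the machine doing the encoding, not by the old machine doing the playback.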


It seems video consumption has become the focal point of our existence. How sad. There used to be a time when our prime identity was that of the farmer, or the carpenter, or the painter, or the judge, or the scientist, but these days we seem to have become "consumers". Am I the only one here who senses how devoid of meaning and purpose and essence that is?