Unfortunately the MPC-HC decoder only supports "bitstream mode" at this stage, which means that only the most recent graphics cards are supported:
* nVidia series 8(9)xxx for H.264 only
* ATI Radeon HD series for H.264 and VC-1 decoding
"Motion compensation" mode might be added in the future to increase compatibility with older graphics cards, but I cannot promise anything. MPEG-2 and WMV acceleration are not supported either.
DXVA is quite picky, so if you want to use it you have to respect these rules:
* Windows XP users, select Overlay Mixer, VMR7, VMR9 or VMR9 renderless
* Vista users, select EVR or EVR custom renderer
The MPC-HC Video decoder must be connected directly to the renderer. That means no intermediate filters such as DirectVobSub or ffdshow can be inserted between the decoder and the video renderer.
Internal subtitles can work with the MPC Video decoder in DXVA mode, but the rules are even more restrictive:
* Windows XP users, select VMR9 renderless
* Vista users, select EVR custom renderer
* In "Options / Playback", tick the checkbox "Auto-load subtitles"
Radeon HD 4670: A perfect balance?
http://tibrium.neuf.fr/DXVASupport.html
I would warn buyers against any GPU that can't stay under ~75C under load. Temps much above that are pathetic, both with regard to the longevity of the card itself and the heat it puts off into the rest of your case.
If AMD/NVidia can't design adequate cooling for a given card, I will wait until an aftermarket cooler is available and purchase that when I purchase the card. I am also not afraid of downclocking a card to ensure it does not run beyond my determined target temps.
It is rather ridiculous though, that as dies shrink they are still running as hot or hotter than previous generation cards. Frustrating is probably the best word to describe that.
Again with the temperature paranoia...
yacoub wrote: I would warn buyers against any GPU that can't stay under ~75C under load. Temps much above that are pathetic, both in regards to the longevity of the card itself and for the heat it's putting off into the rest of your case.
If AMD/NVidia can't design adequate cooling for a given card, I will wait until an aftermarket cooler is available and purchase that when I purchase the card. I am also not afraid of downclocking a card to ensure it does not run beyond my determined target temps.
It is rather ridiculous though, that as dies shrink they are still running as hot or hotter than previous generation cards. Frustrating is probably the best word to describe that.
75C is not that much for a GPU and most cards would be fine for a very long time at that temp. Also, temperature has little to do with the heat released into the case. If you think about it, I'm sure you will realize that a better cooler doesn't affect the power consumption (= heat output) of the card.
Die shrinks often bring the temperature down, but not always, since a smaller die is harder to cool (less contact area with the heatsink); power leakage can also be a factor.
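The point about smaller dies being harder to cool comes down to power density, not total watts; here is a quick sketch with made-up round numbers (not real die specs for any card):

```python
# Illustrative only: why a die shrink can run hotter even at lower power.
# The heatsink interface sees heat *flux* (W per mm^2), so less die area
# can outweigh a modest drop in total power draw.

def heat_flux(power_w, die_mm2):
    """Power density in W/mm^2 at the die/heatsink interface."""
    return power_w / die_mm2

old = heat_flux(60.0, 200.0)   # hypothetical older, larger die
new = heat_flux(50.0, 120.0)   # hypothetical shrink: less power, much less area

print(f"old: {old:.2f} W/mm^2, new: {new:.2f} W/mm^2")
# The shrink draws less total power yet has the higher power density,
# so with the same cooler it can settle at a higher temperature.
```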
Indeed, my 6600GT went strong for 2+ years despite temps in the 90C range (possibly higher at times, I can't remember exactly), and it may well still be going fine a year later for the person I sold it to. High temperatures may be an issue for case cooling and other components, but ultimately, if you're worried about the longevity of the card itself, just get a card with a decent (e.g. 3-year*) warranty. If the card dies, get it replaced. Here in NZ I suspect that even without a decent warranty, if the card dies within about 3 years because of high temperatures that were that high by design, you'd still be entitled to have it repaired or replaced. It may be somewhat difficult to convince your retailer or the supplier to do so, but the law is on your side, so you could probably get it done eventually, even if it takes a case in the small claims court.
Vicotnik wrote: Again with the temperature paranoia...
yacoub wrote: I would warn buyers against any GPU that can't stay under ~75C under load. Temps much above that are pathetic, both in regards to the longevity of the card itself and for the heat it's putting off into the rest of your case.
Ultimately, if a vendor sells a card that runs at high temperatures, it's their responsibility to ensure the components can withstand them. If they can't, then screw them; they have to replace the card when it dies within an unreasonably short time frame.
*3 years because let's face it, 3 year old GPUs particularly of the mid range aren't worth much.
I would agree 1GB is probably mostly useless, but there are definitely going to be a few cases where it'll help IMHO. For example, Photoshop CS4 can use GPU acceleration, and from what I understand the number of concurrent images that can use GPU acceleration depends strongly on GPU RAM. I haven't seen any good discussion of precisely how this varies (it'll likely depend on image size, of course) or what is actually stored in the GPU RAM (is each open image stored? each cache?), but from a quick read-through of Adobe's KB http://kb.adobe.com/selfservice/viewCon ... d=kb404898 http://kb.adobe.com/selfservice/viewCon ... d=kb405745 I suspect having 1 GB could help a fair bit if you regularly work with multiple large images (or even, I presume, one very large image). Obviously you need sufficient system memory to match, and having a faster GPU would help too, but I strongly suspect just having more GPU RAM can help (at the very least you need enough that all images can be GPU accelerated). It is possible that sufficient system memory and an OS that supports GPU memory virtualisation can help make up for insufficient memory with multiple images (unless you're literally working on both simultaneously), but it's not going to make up for it if you have one single very large image (e.g. 10k x 10k at 48 bit). With GPGPU getting more popular there are probably going to be more cases like this IMHO (along with cases where even 128 MB is probably enough).
loimlo wrote: CPU can't access the memory that resides at display card. 1GB on 4670 is a marketing gimmick.
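To put numbers on the "one very large image" case, a rough sketch, assuming (hypothetically) that the GPU holds one uncompressed copy of each accelerated image; Photoshop's real caching surely differs:

```python
# Back-of-the-envelope VRAM estimate for holding an image on the GPU.

def image_vram_bytes(width, height, bits_per_pixel):
    """Uncompressed size of one image copy in bytes."""
    return width * height * bits_per_pixel // 8

# The 10k x 10k, 48-bit example from above:
big = image_vram_bytes(10_000, 10_000, 48)
print(f"{big / 2**20:.0f} MiB")  # ~572 MiB, already near a 512 MB card's limit
```

On this crude estimate, a single such image nearly fills a 512 MB card before any caches or working buffers, which is where the 1 GB variant could plausibly matter.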
Notice: Powercolor returned another 4670 to me, and unfortunately I found the same gentle clicking noise as on the previous one. Using Radeon Bios Editor, I modified the fan speed in Powercolor's BIOS from 38% to 20% at idle, which is barely noticeable in my P182 now. All in all, it's a very quiet but not silent fan-cooled 4670. It doesn't bother me at all, but someone more sensitive to noise character may need to get a passively cooled 4670.
Last edited by Nil Einne on Thu Mar 12, 2009 8:01 am, edited 1 time in total.
Semiconductors do increase resistance with temperature, so it will increase power consumption, but it might not be a large factor.
Vicotnik wrote: Also, temperature has little to do with the heat released into the case. If you think about it I'm sure you will realize that a better cooler doesn't affect the power consumption (=heat output) of the card.
Sapphire 4670 Ultimate (passive)
Hey all,
Has anyone had any luck using any card in the slot above the Sapphire 4670 Ultimate (from a standard tower case viewpoint)? From the images I've seen http://www.cluboverclocker.com/reviews/ ... /page1.asp http://www.geeks3d.com/?tag=passive-vga-cooler it looks like a slimline card might work. How about the XFX 4650 passive http://www.techpowerup.com/87392/XFX_Gi ... oling.html ? This is unlike the HIS 46xx cards, for example, where I don't think you'd fit anything.
I don't believe there are any passive 46xx cards which are single slot, but is anyone aware of any others where you could potentially fit a card in the slot taken up? I think the PowerColor is similar to the HIS but on the other side, i.e. still no chance of a card in the slot taken up http://www.fudzilla.com/index.php?optio ... &Itemid=34 and the Club3D is either like the HIS or the PowerColor.
Actually, the resistivity of semiconductors decreases with increasing temperature. At least according to Wikipedia - http://en.wikipedia.org/wiki/Resistivity
QuietOC wrote: Semiconductors do increase resistance with temperature so it will increase power consumption, but it might not be a large factor.
I'm sure the effect is close to negligible for this application however.
But leakage current increases with temperature, and switching speed decreases (which is why improved cooling helps overclocking so much), both of which increase power consumption.
Vicotnik wrote: Actually, the resistivity of semiconductors decreases with increasing temperature. At least according to Wikipedia - http://en.wikipedia.org/wiki/Resistivity
I'm sure the effect is close to negligible for this application however.
You can easily test this for yourself - measure the power consumption of your PC, then stop the fan on the graphics card. Observe what happens to the power consumption as the temperature rises (you will note that it increases).
I have no fan to stop. I'm sure you're right, but I think the effect is marginal. Perhaps someone with fan cooling on their graphics card could do a little testing?
Mr Evil wrote: But leakage current increases with temperature, and switching speed decreases (which is why improved cooling helps overclocking so much), both of which increase power consumption.
You can easily test this for yourself - measure the power consumption of your PC, then stop the fan on the graphics card. Observe what happens to the power consumption as the temperature rises (you will note that it increases).
I don't have a fan on my graphics card either, but this is an experiment which I have seen done many times before, so Google can save us by finding a nice graph showing the full load power consumption of a GTX280 increase by 10% as temperature increases from 60C to 90C.
Vicotnik wrote: I have no fan to stop. I'm sure you're right, but I think the effect is marginal. Perhaps someone with fan cooling on their graphics card could do a little testing?
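That ~10% figure can be reproduced with a toy model in which dynamic power is temperature-independent and subthreshold leakage grows roughly exponentially with temperature. The constants below are fitted to the anecdote in this thread, not taken from any datasheet:

```python
# Toy model of GPU power vs temperature: constant dynamic power plus a
# leakage term assumed to double every `t_double` degrees above 60C.

def total_power(temp_c, p_dynamic=200.0, p_leak_60=20.0, t_double=30.0):
    """Total board power (W) under the assumed leakage-doubling model."""
    leak = p_leak_60 * 2 ** ((temp_c - 60.0) / t_double)
    return p_dynamic + leak

p60 = total_power(60.0)  # 220 W
p90 = total_power(90.0)  # 240 W
print(f"{(p90 / p60 - 1) * 100:.1f}% more power at 90C than at 60C")
```

The same model also explains the fan-stop experiment described below it: as the card heats up, leakage rises, which adds heat, which raises leakage further until a new (higher) equilibrium is reached.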
I am talking about the component, not the material. Yes, semiconductor transistors use the material both to conduct and not to conduct. The fact is that increased temperature increases power consumption, which is effectively increased resistance of the component.
Vicotnik wrote: Actually, the resistivity of semiconductors decreases with increasing temperature. At least according to Wikipedia - http://en.wikipedia.org/wiki/Resistivity.
QuietOC wrote: Semiconductors do increase resistance with temperature so it will increase power consumption, but it might not be a large factor.
Damn, that's a lot of leakage. But is the "problem" limited to some graphics cards or are all modern cards affected? For example the P4 Prescott had a lot of leakage while some other CPUs have less.
Mr Evil wrote: I don't have a fan on my graphics card either, but this is an experiment which I have seen done many times before, so Google can save us by finding a nice graph showing the full load power consumption of a GTX280 increase by 10% as temperature increases from 60C to 90C.
I will have to do some testing with my 4670.
-
- Patron of SPCR
- Posts: 857
- Joined: Fri Dec 27, 2002 1:49 pm
- Location: Somerset, WI - USA
- Contact:
This is a very long thread, but if you look back, many of us have tried just this. The consensus I've found is that no retail card supports any voltage lower than the default used in the power-saving state; undervolting any further causes no change in actual power draw, such as that seen on the AMD reference card that SPCR reviewed.
Tom. wrote: With RBE (Radeon BIOS Editor) it's possible to "undervolt" it at the power saving state. Anyone tried that?
After some (ahem) testing for a few hours of playing games on my SPCR-approved Anitec SilenT3 with the DDR4 Radeon HD4670, idle temps are 45C on average and 75C max under load with the fan set to auto.
Unlike my HD4870 when it ramps up, it doesn't sound like a leaf blower but instead sounds quite modest still. Fan speed idles at 33% and goes to 66% on load.
Got a Biostar 4670 in today.
BIOS indicates no undervoltage, though at least it underclocks by default.
165/250, 300/500, 750/1000.
No fan monitoring, but not too noisy at idle. Temp is 35. Will have the S2 on it when I do the rebuild in a week or so anyway.
Circuit board is missing *lots of stuff*, so I doubt it has any "frivolous" power management circuitry.
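For a rough idea of what that 165 MHz idle clock buys even without undervolting: CMOS dynamic power scales as roughly f x V^2, so at a fixed voltage the saving is linear in frequency. A sketch (leakage and board losses ignored; the 0.9 V figure is just an example, not a measured value):

```python
# Relative dynamic power under the standard P ~ f * V^2 CMOS scaling.

def dynamic_power_ratio(f_idle, f_load, v_idle=1.0, v_load=1.0):
    """Idle/load dynamic power ratio; frequencies in MHz, voltages in V."""
    return (f_idle * v_idle ** 2) / (f_load * v_load ** 2)

# 165 MHz idle vs 750 MHz load, same voltage (no undervolt available):
print(f"{dynamic_power_ratio(165, 750):.0%}")  # 22% of load dynamic power

# If a hypothetical 0.9 V idle undervolt were possible, savings compound:
print(f"{dynamic_power_ratio(165, 750, v_idle=0.9):.0%}")  # ~18%
```

This is why the thread keeps chasing idle undervolting: the quadratic voltage term would cut idle power further than the clock drop alone.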
DXVA seems to be fixed now with the 9.3 drivers.
edit: In response to therealjoeblow below, this is under WinXP SP3 32bit with the system in my signature.
edit again: And it's not perfect. I use MPC-HC v1.2.908.0 32-bit, and DXVA only works for most of my movies. With some movies it doesn't work, and then I use MPC "non-HC" and CoreAVC instead. Kind of annoying to have to use two different players... So DXVA support is far from perfect for me; whether it's because of the drivers, the movies themselves (all are x264 in MKV containers) or something else, I have no idea.
Last edited by Vicotnik on Mon Apr 06, 2009 3:42 pm, edited 2 times in total.
-
- Posts: 3142
- Joined: Mon Feb 26, 2007 9:20 am
- Location: Missing in Finnish wilderness, howling to moon with wolf brethren and walking with brother bears
- Contact:
*sighs* Just when I ordered an AC Accelero S2, they informed me that that heatsink is unavailable from importers, stock sellers and the manufacturer... So I have to go overkill and put an Accelero S1 Rev 2 on my Asus HD 4670. But... at least I know I can easily run it passive with that...
Edit:
Too bad the HR-03 doesn't work with the HD 4670 due to the fan connector. However, Zalman's VNF100 passive cooler seems to work well despite the fan connector. HIS's iSilence 4 version of the HD 4670 looks very promising.
2nd edit:
Due to a freak accident, the Asus HD 4670 + Accelero S1 Rev 2 combo does not fit on the MSI K9A2GM V3. Since MSI's PCI-E slot seems to be lower than Asus' and Gigabyte's, the heatpipes on the S1 hit the first PCI slot connector directly. This is because the HD 4670's core is not in the center of the card but closer to the edge; it's a couple of mm too shallow.
So I ordered Zalman VNF100-HP and see if I can fit that. If not, then I'll RMA HD 4670 and take HD 4830 instead or something... Zalman will be third cooler I'll be trying.
The TR HR-03 did not fit due to the fan connector. The AC S1 did not fit due to the combined problem of the motherboard, the core placement and the S1's heatpipes: a real Murphy effect. I do hope Zalman's passive solution will save me...
-
- Posts: 353
- Joined: Sat Oct 18, 2008 6:30 am
- Location: Moldova, exUSSR
Vicotnik, hi. You're right! I just installed the new 9.3 driver (having an old K-Lite Codec Pack in the system), and got about 1-2% CPU load on 1920x1080 H.264! Now we have true DXVA/VC-1 support on the HD4670!
Unlike MPC-HC, VLC isn't capable of it yet.
But, anyway, this is great news!
///
[UPDATED!]
P.S. If someone is interested in a small Catalyst package here is one:
http://radeon.ru/downloads/drivers/n2o/n2o_9.6_cp.exe
N2O Driver Pack 9.6 + Control Panel, WinXP, 8.31MB
http://radeon.ru/downloads/drivers/n2o/n2o_9.6_nocp.exe
N2O Driver Pack 9.6 w/o CP, WinXP, 7.27MB
Last edited by Ksanderash on Fri Jun 19, 2009 2:53 pm, edited 3 times in total.
-
- Posts: 3
- Joined: Mon Apr 06, 2009 12:36 pm
- Location: canada
You know, I get so frustrated by the "I've got it working no problem with Cat v-X" types, because hardly anyone ever posts enough information to really figure out what is and what isn't working!
ryboto wrote: I have tried Catalyst drivers 8.11, 8.12, and 9.2. I've got a little program called "DXVA Checker"... I think that's the name; anyway, it checks the DXVA compatibility of your hardware. With the 9.2s on my Vista system with an HD3870 I have no issues with DXVA, but on the XP machine with the HD4650 none of those drivers makes DXVA work. I've reinstalled drivers too, cleaned them from the system.
incorrect wrote: yes it should work fine..
ryboto wrote: It supposedly does, but I can't make my system mimic what others have achieved. I'm trying to use GBPVR in Windows XP with DXVA, and some users on the GBPVR forums use XP and have made it work. I can't even get it to work with MPC-HC... so I might just upgrade the system to Vista..
what have you attempted? multiple driver versions, multiple players, a reinstall or two, disabling subtitles, multiple different hd files (some will bork acceleration).. there has to be something different.
I have only tried using MPC-HC, since it's the only free program I know of that supports so many options as well as DXVA.
As for different video files, I know what you mean; some of them don't allow DXVA to work on my desktop system either. Either way, I tried a bunch of different files and none of them showed DXVA engaged, but I wouldn't expect them to if the checker didn't detect DXVA hardware.
The only thing I don't use is CoreAVC. Otherwise I've got the Haali splitter installed, and AC3Filter. Guess I'll try the Core codec and see if it makes a difference.
Strange. My X2 is limited to 2GHz with RMClock and I have no problems with 1080p x264. I use MPC, CoreAVC, Haali Media Splitter, AC3Filter.
There are no less than 5 *DIFFERENT* versions of Windows currently and commonly in use: WinXP 32 and 64 bit; Vista 32 and 64 bit, and the Win7beta. THE CATALYST DRIVERS ALL WORK/DO NOT WORK DIFFERENTLY FOR EACH OF THESE VERSIONS!! There is NO universally "good" working driver version. People really need to tell you what version of Windows they are using when they tell you what they have working or not, otherwise you'll spend weeks trying futilely to get something working because someone else said they have that version working well.
Everyone who reports that DXVA is working 100% with the latest (Cat 9.3 at the time of this writing) is using some version of Windows other than XP32! DXVA is *completely* broken in Catalyst from 8.6 to 8.11: it will cause bluescreens shortly after activating (for DVDs as well as x264), if you ever do manage to get it to activate. I've heard that some have got it working in XP32 with 8.12, and then again it's broken to varying degrees in 9.1-9.3.
I never have, though. I have a Gigabyte GA-MA78GM-S2HP motherboard with the onboard 780G chipset (HD 3200), and the last version where I could get DXVA to work for H.264 and DVD decoding without bluescreens is Cat 8.5. 8.6 to 8.11 bluescreen; 8.12 to 9.3 won't activate at all.
And like you, I'm having problems getting smooth playback of 1080p material. It doesn't really drop frames, but there's a bit of a stutter or jerk every 20-30 seconds, which is very irritating. This is with an Athlon 64 X2 5600+, so it's definitely not a CPU problem, and it's on a brand new clean install of WinXP32 SP3 with only the drivers, PowerDVD9, FFDShow and ZoomPlayer6 installed, so it's not likely a problem with too many mismatched pieces of software and drivers.
And don't even get me started with VMR9 and tearing. It's absolutely useless in XP32 with any of the drivers!
I have two other computers in the house (an Asus MB with the 690 chipset and integrated Radeon X1250, and an old ASRock MB with a single-core Athlon 2600 and a Radeon 2600XT AGP), and both of them play back the same 1080p clips perfectly smoothly without ever a blip. Those two machines have been put through hell and back over the years with driver and software installs and updates, but with no issues at all.
All I can think of is that ATI is moving too quickly to support the latest OS versions, and leaving the tried and true XP32 in the dust with their newer products. At this point, I am terribly disappointed with the performance and problems with a brand new system that should be leaps and bounds above what I already have with 2 year old technologies!
I'd love to get just one version of a driver that has everything working the way it's marketed. Enough of my rant.
If you get it figured out (how to play 1080p perfectly smooth) let us know please!
Cheers
The REAL Joe
therealjoeblow
I use this ripped driver (about 37 MB installed) with an nLited WinXP Pro SP2 and have NO PROBLEM with my 1080p movies (various: MKV, AVI, MOV) being played on a 19" monitor with MPC-HC. CPU load is 2-4%, i.e. almost at idle level. The rig is in the signature; nothing special as you see, but I use a 4670, not a 780G GPU.
http://radeon.ru/downloads/drivers/n2o/n2o_9.4_cp.exe
N2O Driver Pack 9.4 + Control Panel, WinXP, 8.43MB
http://radeon.ru/downloads/drivers/n2o/n2o_9.4_nocp.exe
N2O Driver Pack 9.4 w/o CP, WinXP, 7.38MB
Last edited by Ksanderash on Fri Apr 10, 2009 2:31 am, edited 1 time in total.
Well, after much more reading and troubleshooting, I solved the problem with the judder I was experiencing with 1080p video.
The problem actually occurred only in files with DTS audio, and after watching more and more of it, a more exact description is that exactly every 6 seconds there was a frame jump. You could time it when watching the credits of a movie scroll, but it only really shows in the movie when there's consistent camera movement, so it was difficult to pin down as happening on a repeatable pattern.
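One way to make sense of that 6-second period (my own arithmetic, not a diagnosis of any particular filter): a visible jump every 6 s at ~24 fps means the renderer is absorbing one whole frame of timing error per period:

```python
# Clock mismatch implied by one dropped/repeated frame every `jump_period_s`.

def implied_drift_pct(fps, jump_period_s):
    """Percent audio/video timing mismatch implied by one-frame jumps."""
    frame_s = 1.0 / fps
    return frame_s / jump_period_s * 100.0

drift = implied_drift_pct(23.976, 6.0)
print(f"{drift:.2f}% implied clock mismatch")  # ~0.70%
```

A mismatch near 0.7% is far larger than real oscillator drift, which is consistent with the poster's conclusion that the fault was in the software passthrough path rather than the hardware clocks.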
In the end, the problem is that FFDShow seems to cause this judder when passing through DTS over S/PDIF. It does not appear to do it for AC3, nor if DTS is being decoded and played on the sound card; only in S/PDIF passthrough mode. I tried half a dozen versions of FFDShow, from the most current daily build back to the latest stable Beta-6 release, and they all do this.
The solution was to disable FFDshow for DTS (and AC3 since I'm at it anyway) and use AC3Filter v-1.51a instead. It passes the DTS perfectly via S/PDIF without causing the judder.
Now if only ATI would fix the missing features for deinterlacing, pulldown, etc in the drivers, I'd finally be happy!
Cheers,
The REAL Joe
Gigabyte 512MB does adjust GPU voltage (checked with a multimeter). Reference design. There is no "mysterious chip" nor "magic chip" on the board.
Sapphire 512MB, the same reference design, does adjust the 2-pin fan.
The 2-pin fan is always on. I connected a 4-pin fan: no RPM monitoring, no adjustment. The PWM pin goes to a missing transistor. Perhaps I should solder one... I do not see where the RPM pin goes.
Ksanderash wrote: ... I also found that FB signal goes to a chip that is missing on my card...
You probably did a BIOS flash in a way not recommended by the manufacturer. I did once...
Ksanderash wrote: ...Missing physical MAC, 00-00-00-00...
Klusu
Concerning the memory: it seems that the DDR3 type doesn't benefit so much from downclocking, in my experience.
I've built a simple temperature-sensing circuit (look inside almost any PSU for an example) to automatically control the fan noise, so there is no need to mess up the card's PCB with soldering
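That kind of circuit amounts to a fan curve; a software sketch of the same idea (the breakpoints, 40C at 20% duty up to 80C at 100%, are arbitrary examples, not the actual circuit's values):

```python
# Minimal fan curve: map GPU temperature to a PWM duty cycle, with a
# floor so the fan never stops and a ceiling at 100%.

def fan_duty(temp_c, t_low=40.0, t_high=80.0, duty_min=20.0, duty_max=100.0):
    """Return fan duty (%) for a temperature, linear between breakpoints."""
    if temp_c <= t_low:
        return duty_min
    if temp_c >= t_high:
        return duty_max
    frac = (temp_c - t_low) / (t_high - t_low)
    return duty_min + frac * (duty_max - duty_min)

for t in (35, 45, 60, 75, 90):
    print(t, "C ->", round(fan_duty(t)), "% duty")
```

The 20% floor mirrors the BIOS fan-speed edit mentioned earlier in the thread, where idle duty was dropped from 38% to 20% to make the card inaudible.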
...
Yep, there is such an option in AFUWIN, but the "missing physical MAC" problem appeared due to my own bad head: I was trying to install the Realtek 8168 driver (it installed perfectly, but no MAC) in spite of the RTL8102 sitting on the mobo PCB... Silly me, I know...
As low as 0.9V? It doesn't matter whether the "magic chip" is there or not. We just need a 4670 that is really cool at idle (remember SPCR's 3W?). And this is possible only when 0.9V is delivered to the GPU core.
Klusu wrote: Gigabyte 512MB does adjust GPU voltage (checked with multimeter)
Last edited by Ksanderash on Fri May 08, 2009 7:45 am, edited 1 time in total.