Experiences with a Gigabyte 6600GT Silentpipe
Hi Guys, I'm currently using a Gigabyte 6800 Silentpipe fully unlocked.
I'd like to point out the reason why your temperatures are lower when the card is overclocked!
There is a kind of fail-safe built into the BIOS that automatically drops the core speed to around 250MHz (if memory serves) when it hits a certain temperature while overclocked past the default clock speed.
If you overclock your cards to 350MHz and then run 3DMark05, you will see that your score actually DECREASES compared to a score obtained at 325MHz. You can use RivaTuner to log the current clock speed of your card, and you will see that when the card is overclocked and you run a 3D app, the clock speed drops significantly.
The only way to stop this is to alter the fail-safe temperature in the BIOS, but then you SERIOUSLY run the risk of FRYING your card.
I hope this sheds some light on the matter.
Matt
Edit: My card idles at about 53°C and never goes past 60°C. The heatsink does do its job; you just need to make sure you have some airflow over the card to remove the heat build-up.
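The fallback behaviour described above shows up clearly if you log the core clock over time. Here's a minimal sketch of spotting it in such a log — the two-column "seconds MHz" format and the function name are my own assumptions, loosely modelled on what RivaTuner's hardware monitoring can export:

```python
# Sketch: flag thermal-throttle events in a clock-speed log.
# Assumes a hypothetical two-column text log ("seconds MHz"),
# similar to what RivaTuner's monitoring log might be exported as.

def find_throttle_events(lines, requested_mhz, tolerance=0.9):
    """Return (time, clock) samples where the core ran well below
    the requested clock, i.e. the BIOS fail-safe likely kicked in."""
    events = []
    for line in lines:
        t, mhz = line.split()
        t, mhz = float(t), float(mhz)
        if mhz < requested_mhz * tolerance:
            events.append((t, mhz))
    return events

# Example log: overclocked to 350 MHz, fail-safe drops it to ~250 mid-run.
log = ["0 350", "10 350", "20 254", "30 252", "40 350"]
print(find_throttle_events(log, requested_mhz=350))
```

If the events cluster during 3D load and disappear at stock clocks, that's the fail-safe at work rather than a driver bug.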
Thanks for the details, schmidtl. I had some interesting WoW graphics problems myself. I run a 6600GT with a homemade quiet cooler, and whenever I overclock too high or turn the fan down too low, I sometimes get random triangles with one vertex in outer space somewhere, but no crashes. Maybe I'm seeing memory errors and not GPU errors?
I used to get a system lockup very rarely (maybe once every 12 gaming-hours), but that was with the card in an older AthlonXP machine. When I upgraded to an Athlon64-based machine the crashes stopped. The PSU was the same in both machines.
Also, when I first got the card, I had occasional glitches in almost all my games. It really looked like a heat problem, but the problems were fixed when I switched to older drivers, strangely enough.
Sometimes, what looks like a graphics hardware problem is really an indication that something else has gone wrong. But it sounds like the solution to your problem was more straightforward.
This thread is quite interesting. I'm keeping the stock cooler on for now, since I paid so much money for it; I still like this card.
With respect to the thermal compound, I can see (without taking the heatsink apart) some yellow gum between the core and the heatsink. However, it seems that they are not making good contact, and the core is also rotated 45 degrees. Note: this is all from just 'looking' at the card side-on. I'll get round to taking the heatsink off and putting AS5 on in a few months' time.
SometimesWarrior wrote:Thanks for the details, schmidtl. I had some interesting WoW graphics problems myself. I run a 6600GT with a homemade quiet cooler, and whenever I overclock too high or turn the fan down too low, I sometimes get random triangles with one vertex in outer space somewhere, but no crashes. Maybe I'm seeing memory errors and not GPU errors?

IIRC, geometry errors are usually caused by memory.
Gigabyte N66T128VP pictures:
http://www.cloudworld.net/diary/index.p ... 2005-07-31
Mind you, mine is the AGP version.
You can see on my mobo it's a very tight fit: the CPU heatsink, a Zalman 7700AlCu, is touching the heatsink on the back of the card. I actually had to adjust the position of several fins on the Zalman to make it fit. The bright side is that the CPU fan blows directly onto the card's heatsink, and the case fan helps extract the hot air.
Temperatures are good: idle 50~55°C (two Nexus 120mm fans at 10 or 6V), load under WoW 71~74°C (both Nexus at 10V). CPU temp increased a few degrees as well.
Thanks to its 1.6ns DDR3 RAM running at 1.12GHz, its 3DMark05 score is about 20% faster than a 6600GT running 900MHz stock RAM, like the Leadtek.
Very satisfied so far.
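The 1.6ns figure translates directly into a rated clock: the real clock is 1/(access time), and DDR transfers twice per cycle, so the "effective" MHz figure is double that. A quick arithmetic sketch (the function name is mine, just for illustration):

```python
def rated_effective_mhz(access_time_ns):
    """Effective DDR clock (MHz) that RAM rated at `access_time_ns`
    can do: real clock = 1000/t_ns MHz, doubled for DDR transfers."""
    real_clock_mhz = 1000.0 / access_time_ns
    return 2 * real_clock_mhz

print(rated_effective_mhz(1.6))  # 1250.0 -> running 1.12 GHz is within spec
print(rated_effective_mhz(2.2))  # ~909 -> roughly the 900 MHz stock RAM above
```

So the 1.6ns chips still have headroom at 1.12GHz, while typical ~2.2ns chips top out around the stock 900MHz.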
I ran into my first problems with the 6600GT today.
I downloaded the Age of Empires III demo and proceeded to test out the eye candy: 1280x1024, full details, shaders, etc.
After about 20 minutes, green blocks started popping up on my monitor, so I alt-tabbed out to check my GPU temps, and it was at 92 degrees Celsius. I've been playing various games on this card since I got it, and there has never been artifacting (I tested the overclock with 3DMark, rthdribl, etc.).
And so my question: Is the artifacting related to heat? Or to an unstable overclock? I've got it at 570MHz / 1.18GHz (stock: 500 / 1.12).
timmytimmytimmy wrote:I ran into my first problems with the 6600GT today.

I had a similar experience a few years ago with an ATI 9700 Pro. I started seeing green dots on the screen. It gradually got worse and then the card just died. The problem was heat.
Last edited by Dirty-Harry on Fri Sep 09, 2005 7:28 am, edited 1 time in total.
Try running a baseline with the current settings to confirm the artifacts, then lower the speed/performance settings by, say, 20% and run it again. If there are no artifacts, you've sort of answered your own question.
It could be heat, it could be an unstable overclock, or it could be that the higher heat is causing the overclock to become unstable, and you just need to lower temps to get that extra tweaked performance.
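The "lower it and retest" idea generalises to a simple bisection between a known-good clock and a known-bad one. A sketch, where `passes_artifact_test` is a hypothetical stand-in for an actual 3DMark/rthdribl run at that clock:

```python
def find_stable_clock(low_mhz, high_mhz, passes_artifact_test, step=5):
    """Bisect between a known-stable clock (low_mhz) and a known-bad
    one (high_mhz), returning the highest clock (to within `step` MHz)
    that passes the artifact test."""
    while high_mhz - low_mhz > step:
        mid = (low_mhz + high_mhz) // 2
        if passes_artifact_test(mid):
            low_mhz = mid    # mid is stable; search higher
        else:
            high_mhz = mid   # mid artifacts; search lower
    return low_mhz

# Example: pretend the card artifacts above 543 MHz (stock 500, OC 570).
print(find_stable_clock(500, 570, lambda mhz: mhz <= 543))  # 543
```

In practice each probe is a lengthy stress run, so a coarse step (5-10MHz) keeps the number of runs manageable; repeating the search at a lower temperature tells you how much of the instability is heat.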
Interesting thread.
I've got a PowerColor 6600GT AGP, which has fairly average specs compared to other brands, but I picked it up very cheap on eBay. I replaced the noisy fan and heatsink with the Zalman VF700-Cu. The only problem is that the stock heatsink was an all-in-one, covering the GPU, RAM and AGP chip. The Zalman only covers the GPU, leaving the AGP chip exposed. To remedy this I placed two stick-on heatsinks on the chip and stuck two more on the PCB on the opposite side. The card runs fine, even overclocked, but there are no temp probes on this card (yeah, cheap, I know), so I have no idea how hot it's getting.
I've tried some rudimentary heat tests: running S&M/3DMark2005 for 30 minutes, then quickly taking the side off the case and carefully touching the heatsinks. They really don't seem to be that hot, but at the moment I have no way of knowing for sure. I might try to pick up one of those IR temp guns mentioned before. I guess I'm running it a bit dangerously considering I can't monitor temps, but like I said, so far there have been no lock-ups or visual artifacts (even after 8 hours of BF2).
Guess I'll wait and see if I fry the new card.
Well, I've added a fan to my 'fanless' 6600GT. It drops load temps by around 15 degrees Celsius (90 → 75).
As you can see, I've wired it up to a switch on the PCI panel. I only turn it on for gaming, and leave it on for a few minutes afterwards to cool down the card. The fan itself isn't too noisy; I can hear a low whoosh from it, but since I'm gaming, it's drowned out by the sound from my speakers.
If you saw my post above, I added a fan to the bottom of the case. It was too noisy, and in true SPCR fashion I ditched it altogether. But now summer is starting to kick in, with temps in the low 30s (outside; maybe a little cooler inside my room). Motherboard temps are reading 45 degrees, CPU at 60 degrees; the whole computer is struggling to keep itself cool. The Phantom is still going strong, but it is quite hot.
As for the Gigabyte graphics card (AGP), it was reaching temps of around 105-110 degrees during 20 minutes of gaming. So I gathered up the guts to void my warranty and peek around inside the card.
I read the X1800 and 6800GT Gigabyte SilentPipe threads, and my heatpipe seems much simpler. The two spring-loaded screws, circled in the picture below, just unscrew. The AGP-to-PCIe chip (at least I think that's what it is) has a separate heatsink held in with push-pins; just leave it there.
Once you remove the top heatsink, you can see the contact that the copper slab makes with the heatsink - not that much.
The next picture shows the heatpipes and the copper slab, just sitting there on the core. Slide it off; no screws or backplates need to be removed.
Looking at the copper slab, you can see the Gigabyte stock thermal gum (yellow). I couldn't remove the heatpipes from the copper heatsink, and couldn't clean off the yellow gum. Clean the core, apply some AS5 to the core.
I also applied AS5 to the heatsink, where the copper slab touches the other heatsink, hopefully that helps.
Putting it all back together is a little difficult, but doable. As for temps, a slight difference, but I've yet to see a dramatic improvement; maybe that will come as the AS5 'burns in'. I'll keep you guys posted with regard to temps.
One other note: this AGP 6600GT heatpipe/heatsink design seems much simpler than what others have posted (6800GT, etc.).
I've had a Gigabyte 6800 'SilentPipe' for about a year now and it's been running faultlessly. It uses the same heatpipe mechanism as the 6600GT in the original post.
Temps without a fan were about 70/90 (idle/load). Using a Zalman fan bracket and a Nexus 80mm blowing across the width of the card dropped temps to about 50/70. All that was with the card overclocked from 325 to 390 (core) and memory from 700MHz to 880MHz. The extra vertex shader was unlocked using RivaTuner to give 12 pipes, 6 shaders. The extra quad pipe was unlockable to give 16 pipes, but it resulted in graphical artefacting so I left it disabled.
However, I sold the card recently, and when the new buyer received it he said it was hitting temperatures of 100°C and crashing in 3D games. After some debate about what was causing it, I offered to take it back and give him a refund. I honestly couldn't understand why it wasn't stable when it had worked fine in my PC. Intrigued, I plugged it back into my PC and the immediate temperature reading was 113°C! After 20 minutes of WinXP doing nothing (other than showing the desktop), the PC rebooted by itself. As I knew the rest of the PC was fine, it wasn't difficult to conclude that the core was somehow overheating. 3D games locked up after a couple of minutes. So I decided to take the card apart (thanks to Aleksi for the tips, BTW):
- the heatsink simply hasn't been assembled with care
- there is minimal (or no) thermal compound in the areas where metal meets metal
- the heatpipes aren't flat (making contact poor)
- the thermal adhesive on the core seems to have degraded and turned into a hard ceramic, which may insulate rather than conduct heat
- the screws holding the heatsink onto the board are of very poor quality (I sheared off one of the screw heads simply by turning it with a screwdriver)
- different thermal compounds seem to have been used for the core and the rest of the heatsink
- worst of all, the entire bottom heatsink appears to be held on entirely by friction (!)
The worst thing about the design is that the bottom heatsink and its heatpipes are attached to the core purely by friction. The bottom heatsink, together with the heatpipes that end in a heatspreader, basically forms a U shape on its side (think of a U turned 90 degrees to the left), so that it slides onto the card/top heatsink and grips the core a bit like a clamp.
The problem with that design is that it only has to be knocked slightly (or heftily) for the contact with the core to become very poor, giving extremely high temperatures. Also, if the U isn't bent inwards enough, it's not going to grip the core tightly enough. Alternatively, if the person assembling it was a bit slapdash, that would also result in poor core contact. I can see that this design makes assembly easy, but it also results in inconsistent product quality: some cards will be fine, others poor (in terms of temps). Anyone who gets a 'Friday afternoon' card would be in big trouble. I am amazed that it isn't screwed in.
Anyway, I removed most (though not all) of the previous thermal goop, applied liberal amounts of Arctic Silver all over the place, and reassembled it, notwithstanding the broken screw. I also bent the U-shaped bottom heatsink and pipes into more of an inward U so that it makes better contact with the core.
Plugged it back into my PC and voila! Temps are about 60°C idle and 70-75°C load (2hrs of 3DMark).
My suggestion to anyone experiencing high temps would be to disassemble the card, remount it properly with a high-quality thermal compound, and also bend the bottom heatsink's heatpipes in slightly. A black mark for Gigabyte for having such a slapdash heatsink design!
(Incidentally, the heatpipes of this card will touch the black graphics card fan mount on the Antec P180, and you will get sizzled plastic.)
Hi David,
Good to hear you got it back together and working without too much hassle. I really agree with what you wrote about the design and workmanship.
When I bought my card, the passive cards (the few that existed and were available in Europe) were a few tens of euros more expensive than the normal versions with a fan. I understand that people have also had "good" cards, but what's the point in paying extra for a passive solution when you have to take it apart and re-seat and re-goop it to get it working properly? That's voiding the warranty right there.
I currently have a Zalman 80D sitting on it. Even after playing with the original heatsink I couldn't get it to cool the card properly.
My next card will not be a passive Gigabyte...
As much as there are problems with Gigabyte's fanless graphics cards, I still commend Gigabyte for developing them in the first place; as far as I am concerned, they were among the first to develop such passive cooling solutions for mainstream nVIDIA graphics cards. While they are problematic for some of us, it's a start, and continued R&D will hopefully fix the problems and lead to better cooling solutions, which can only be good for end users and customers.
So, kudos to Gigabyte.
Silent-Pipe II
Thanks for this thread; I've been looking for some info on the SilentPipe cards by Gigabyte.
Does anyone know how the Silent-Pipe II performs? It is used in http://www.gigabyte.com.tw/VGA/Products ... 256DE.htm#. Also, can these Silent-Pipe solutions be used in any case, or must the card be oriented in a specific direction for the heatpipes to function?
Reminds me of my Gigabyte Radeon 8500 Pro: nice design, with a fan (a tiny, loud one) that failed within 2 months. I attached an 8cm case fan to it and it was fine for the next 7 months.
Then the screen was getting corrupted, though with no crashes (yet).
It became worse, but I couldn't put my finger on the problem.
Finally, after it failed seriously, I discovered that two capacitors right next to the heatsink had blown. Instead of sending it to Gigabyte for RMA (I tried, which was a hassle all along too: return emails that didn't work, etc.), I decided to buy replacements for a few dollars and solder them in myself.
Result: the card still works today. Mission accomplished.
Back then, I did some research on it, and there were a lot of others with the same or similar issues.
I had email contact with a Gigabyte spokesman, and I told him what my experience was: bad paste, badly applied paste, a horrible fan, and bad construction (running the fan's wires straight through the fins of a heatsink is not a good concept).
It was obvious that the tiny, loud, fragile fan couldn't cool the heatsink enough. In fact, the heatsink became so hot that it was "melting" the fan over time. Failure wasn't an 'if'; it was a matter of time.
The Gigabyte spokesman of course did not agree; it must have been a badly cooled case. Their products were thoroughly tested and designed in their professional labs.
Yeah, right.
I had a big tower (AOpen H800) with 6 case fans installed in it. Sure, must be my bad air (I live in a relatively cool country, the Netherlands).
Seeing all the issues here with newer stuff from Gigabyte, I don't think their lab has improved much.
A lot of bling-bling, and sadly much less quality.
That's what I think of Gigabyte, and until I see proof otherwise, I'm sticking with that.
Must the RAM be cooled too?
My plan with a refurbed Gigabyte GV-N68T256DH (that's an AGP version of the 6800GT) is to take off the awkward (though snazzy) OEM cooler, slap a water block on that hot GPU and... well, I hope to ignore the RAM. Please don't begrudge me my post to MaximumPC, where I've hung around for 6-7 years!
They won't have the same perspective that I see here: you guys are already up to speed!
Thread posted here:
http://tinyurl.com/9aa6x
I stated it badly a few days ago; the following is a full copy of my question, just 1 hour old now (thanks a lot, SPCR!):
===========================
The Actual Question is 2/3 the way down if you don't feel patient.
My happy news is there's a Gigabyte GV-N68T256DH disassembled on the table before me. THis 6800GT 256MB ...AGP! a vanishing breed! So it goes for >$400 ... but I got a refurb !! This seemed to me, at the time, a coup: no less, at total $230. But I think there'll be more of these (for reasons I speculate abt on post #5). She's a beauty.
The gimmicky heatpipe ("SilentPipe, v.I" of which version Two "II" vents outside the box. This has a version One, as did Gigabyte's Radeon (X800?) boards. The cooler images are clear at
http://tinyurl.com/9wpt4
http://tinyurl.com/blg9a
IMO, the contrivance is a clever design but poorly executed: sloppy fabrication left the thermal paste glopped on and dried out in crusty heaps, even to the point of leaving gaps between chip and sinks. Small errors like these are rapidly catastrophic in a hot-running device like this. Lucky for us opportunists, the Gigabyte folks left in the cautious configuration of early failure and shutdown. These cards won't smoke: they fail silently. Indignant dilettantes will feel (justifiably) that splurging $400+ at retail should mean no troubles, and so they are anxious not to void the warranty...
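To see why a dried-out paste gap is so deadly, a rough conduction estimate helps: the resistance of a flat interface layer is R = t / (k * A), and air conducts heat roughly 150 times worse than decent paste. The sketch below uses purely illustrative numbers (die size, layer thickness, paste conductivity, GPU wattage are all my assumptions, not measured values for this card):

```python
# Back-of-envelope: thermal resistance of the chip-to-sink interface,
# R = t / (k * A), comparing a paste-filled layer to a dried-out air gap.
# All numbers are illustrative assumptions, not measured values.

def interface_resistance(thickness_m, conductivity_w_mk, area_m2):
    """Conductive thermal resistance (K/W) of a flat interface layer."""
    return thickness_m / (conductivity_w_mk * area_m2)

die_area = 0.012 * 0.012   # ~12 mm x 12 mm GPU die (assumed)
gap = 0.0001               # 0.1 mm interface layer (assumed)

r_paste = interface_resistance(gap, 4.0, die_area)    # decent paste, ~4 W/m.K
r_air = interface_resistance(gap, 0.026, die_area)    # air-filled gap

gpu_watts = 40             # rough GPU core dissipation, W (assumed)
print(f"paste: {gpu_watts * r_paste:.1f} K rise across the interface")
print(f"air gap: {gpu_watts * r_air:.0f} K rise across the interface")
```

With these assumed numbers, the paste layer costs only a few degrees, while a same-size air gap adds hundreds of degrees of interface rise, which is exactly the kind of runaway that trips an early thermal shutdown.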
They RMA'd these crusty gems. Retailers can easily fix 'em but must concede that they're now refurbs! Caveat emptor! But there is no risk, if I have this right.
Guys who are DIYers, voiding all warranties in reach.. here's their reward for having a brain, confidence, and a bit of dexterity.
Or so I tell myself.
So I got it cheap. I'll play with the heat pipes elsewhere (I never owned one before) and simply attach a water block to the 6800GT GPU.
My Question:
Just how necessary is RAM chip cooling?
This complicated OEM cooler tries to apply a heatsink surface to the GPU AND also to all 8 RAM chips. These are in 4 pairs (256MB total), and the pairs are arranged in a curve on the PCB.
RAM sinks are an option, but I'd rather not use them (even though I do have some Arctic Silver epoxy and zinc-copper sinks). Still: it's a pain to glue them on neatly, they're heavy, and the torque pulls the chips away from the board. Plus, I've never heard anyone convincingly argue the need for VGA RAM sinks, and I've never seen data to support their use.
But then, I've been out of the loop on OCing VGA boards for a year.
Please advise me: are the Gigabyte engineers (and other OEMs) seriously right to heatsink these RAM chips, or is it mostly just for show?
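One way to sanity-check the RAM question is a lumped thermal model: junction temperature = ambient + power x theta-JA. The per-chip wattage, case ambient, and theta-JA values below are generic BGA ballparks I've assumed for illustration, not figures from any memory datasheet, so treat the output as a rough sketch only:

```python
# Rough check: does a graphics-card RAM chip need a heatsink?
# Lumped model: T_junction = T_ambient + P * theta_ja.
# theta_ja values below are generic BGA ballparks (assumptions),
# not figures from any memory vendor's datasheet.

def junction_temp(ambient_c, power_w, theta_ja_k_per_w):
    """Steady-state junction temperature for a lumped thermal model."""
    return ambient_c + power_w * theta_ja_k_per_w

ambient = 45.0     # air inside a warm case, deg C (assumed)
chip_power = 1.5   # per-chip GDDR dissipation, W (assumed)

t_bare = junction_temp(ambient, chip_power, 40.0)   # bare BGA, still air
t_sink = junction_temp(ambient, chip_power, 25.0)   # small sink + some airflow

print(f"bare chip: ~{t_bare:.0f} C, with sink: ~{t_sink:.0f} C")
```

Under these assumptions a bare chip sits uncomfortably close to typical memory temperature limits while a sinked one has some margin, which would suggest the OEM sinks aren't purely for show; but the real answer depends on the actual chips' datasheet ratings and the airflow over the card.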
What a shame it'd be to smoke this board after all this bragging!
Thanks for all input, Maggs
[oops: edited to correct the incorrect link at the top; corrected now:
http://tinyurl.com/9aa6x Sorry - Maggs]
Last edited by Maggot on Sun Mar 12, 2006 11:10 pm, edited 2 times in total.