~El~Jefe~ wrote:oc'ing stresses the system and also might not work with whatever ram the person wants to get.
Yeah, I was just trying to say that neither is "built" for any particular frequency. They are either able to run at a particular frequency at a particular voltage or they're not. The fact is that AMD could have sold pretty much all 3000+ Venice CPUs as 3500+ or 3800+ versions. The only reason they don't is that they need cheaper alternatives for the masses.
Eh. I mean, it's an option, just never one I would do. I'd always work another week, wait that week out, and bam, I'd have enough for the higher model that's built for that speed, without messing with multipliers.
shrugs
Best Bang For the Buck? AMD CPU
Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee
-
- Friend of SPCR
- Posts: 2887
- Joined: Mon Feb 28, 2005 4:21 pm
- Location: New York City zzzz
- Contact:
The chip may be, but the RAM and boards are not. If you could just raise the chip's clock without having to fuss with other things, then you could get a 3000+ and bring it to its high limits. Otherwise, it's not normally a stable system.
I am not sure that all chips would have been destined to be 3500+s. I mean, in the past, most chips couldn't do whatever you wanted. Bartons, for example, could be raised quite a bit, but each had its limits. So what you're saying is that all 3000+s can go to 2.3+ GHz?
-
- Posts: 968
- Joined: Fri Jan 07, 2005 7:13 pm
- Location: Bristol, UK
- Contact:
Not Hotter
~El~Jefe~, what you said appears to be incorrect. You said, "however, it does get a good deal hotter/more wattage usage than single core, so really, venice is the coldest solution for tweaking or non tweaking setups".
There are comparison graphs here: http://heh.pl/&Sz
"As advertised, both X2 models deliver power consumption comparable to their single-core predecessors, at least according to these system-level numbers".
Edit: I found the original link at Slashdot and decided to post it as well as some of the article comments are worth reading. http://heh.pl/&SA
Here's Anand's article with charts too: http://heh.pl/&SC
-
- Posts: 968
- Joined: Fri Jan 07, 2005 7:13 pm
- Location: Bristol, UK
- Contact:
~El~Jefe~ wrote:the chip may be but the ram and boards are not. if you could just raise the chip's clock without having to fuss with other things, then you can get a 3000 and bring it to high limits. otherwise, its not a stable system normally.
I was talking about the CPU (as I thought you were too). But, yeah, of course you'd need a motherboard that can cope with the overclock. The memory can be run on a divider, so that's not much of a problem.
~El~Jefe~ wrote:I am not sure that all chips would have been destined to be 3500's. I mean, in the past, most chips couldnt do whatever you wanted, like Barton's, they could be raised quite a bit but each had limits. so what youre saying is that all 3000's can go to 2.3+ ghz?
Well, just have a look around. Many Venices can even reach 2.5-2.6GHz on stock voltage. I'm not saying that all can hit 3500+ frequencies, but most can (I've never seen a Venice fail at 2.2GHz). The reason most chips can run at higher frequencies without any voltage increase is simply AMD's quite mature process technology.
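The multiplier/divider arithmetic behind this can be sketched roughly. This is a back-of-the-envelope model: the 9x multiplier is the stock value for a 3000+ Venice, and the integer-divisor rule is the usual Athlon 64 memory-controller behaviour; the exact figures are illustrative, not from this thread.

```python
# Back-of-the-envelope Athlon 64 clock math (illustrative numbers).
import math

def a64_clocks(htt_mhz, multiplier, mem_setting_mhz):
    """Return (core_clock, memory_clock) in MHz.
    The on-die memory controller derives the memory clock by dividing
    the core clock by an integer, chosen so memory stays at or below
    the selected DDR setting at stock 200MHz HTT."""
    core = htt_mhz * multiplier
    divisor = math.ceil(multiplier * 200 / mem_setting_mhz)
    return core, core / divisor

# A 3000+ Venice (9x) at stock runs RAM 1:1:
print(a64_clocks(200, 9, 200))   # (1800, 200.0)

# Same chip at 250MHz HTT with the DDR333 (166) divider: the core hits
# 2250MHz while the RAM stays near its rated speed (~205MHz here).
print(a64_clocks(250, 9, 166))
```

This is why the divider makes RAM "not much of a problem": the core clock scales with HTT while the memory clock is pulled back under its rating.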
StarfishChris wrote:Incorrect? It's perfectly logical. Two cores > one core, so it will dissipate more heat and require a bigger heatsink or faster fan to cool it to the same temperature.
It may be logical, but not necessarily true. Did you read the articles?
"Clock for clock, the Athlon 64 X2 will consume less power than a 130nm Athlon 64, and less than 20% more power than a 90nm Athlon 64" (from Anand's article). Less than 20% more is not very much, IMO.
-
- Friend of SPCR
- Posts: 2887
- Joined: Mon Feb 28, 2005 4:21 pm
- Location: New York City zzzz
- Contact:
wait wait wait wait...
the 754 2.0GHz Clawhammer I have now does not use as much as a dual core.
Per CPU, well, of course, that dual core is more efficient for its power usage.
And for the power... well, mine uses a massive amount more heat and electricity than a dual core.
But it still shows that it sucks up a considerable draw compared to single-core options. Just not a Prescott type of sucking (and those do SUCK).
I'd say go for the 3200+, even if you say you won't overclock... OC for me means more performance for less money, not the races being run on other forums, pumping 1.9V into a processor for 200MHz extra.
I've had a 3000+ for a month now; a lucky week, week 17, and I got it to 2500MHz 1:1 with Corsair PC3200 RAM [280 FSB] and no additional volts.
The temp difference between the two speeds is 3-4 degrees, using an XP-90 and a slow-spinning 120mm fan.
My point is, the AMD64 runs very cool, and if you get lucky enough you'll get higher speeds if you really want them. For an OC like mine, a "push until you're happy with it" OC, I'd say the 3000+ is pretty OK. The 3200+ has a 10x multiplier, so maybe you won't push your RAM like mine and can afford to go lower on the FSB... though you'd have to do some research to find out which A64 3200+ weeks are being sold out there... guess my week 17 3000+ was pure luck.
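For what it's worth, the quoted numbers line up. The figures below are the poster's own; the 9x multiplier is an assumption (the stock value for a 3000+ Venice).

```python
# Sanity-checking the quoted result: 280MHz HTT, 1:1 memory, 3000+ Venice.
htt = 280
multiplier = 9                          # assumed stock 3000+ multiplier
core = htt * multiplier                 # 2520MHz, i.e. the ~2500MHz reported
ddr_effective = htt * 2                 # 1:1 means DDR560 out of PC3200 sticks
ram_oc_pct = (ddr_effective - 400) / 400 * 100   # a 40% RAM overclock
print(core, ddr_effective, ram_oc_pct)
```

Running PC3200 sticks 40% past their DDR400 rating at stock volts is why this counts as a "lucky week" chip and RAM combination.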
-
- Friend of SPCR
- Posts: 2887
- Joined: Mon Feb 28, 2005 4:21 pm
- Location: New York City zzzz
- Contact:
Mikael wrote:Aren't these two quotes basically contradicting each other?
~El~Jefe~ wrote:the 754 2.0ghz clawhammer i have now does not use as much as a dual core.
~El~Jefe~ wrote:and for the power... well mine uses a massive amount more heat and electricity than a dual core.
Not sure I'm following here...
I mean power for power, a 754 chip doesn't do squat against a dual core. Power consumption versus computing power: the dual core uses less energy for its computing power, is what I'm saying.
Also, in general, having tested this, my 754 CPU does NOT use the watts of power at idle that people are suggesting. What the hell AnandTech is saying about 2.0 Clawhammers, I am not sure.
The X2 3800+, though, is the real processor answer now in this forum...
It beats all suggestions hands down. Best dollar-for-dollar value, besides those who can manage to OC a single core up 30%, I guess.
-
- Posts: 26
- Joined: Mon Aug 01, 2005 11:47 am
~El~Jefe~: "2.2GHz minimum" is bullshit
You know when Hz counted for x86? Back when you could still calculate how many milliseconds a screen copy or sprite blit would take (640x200 at 2 bits per pixel for CGA takes so many MOVs, so many clock ticks per MOV, etc.). These were chips that simply couldn't move enough data to display a smoothly updating screen (and still do anything useful with that data in the meantime). As soon as they left that latency problem behind, it became more important to know how much a chip could actually calculate.
The only place where Hz is still somewhat relevant is when you are comparing the exact same internal chip layout (stepping). Otherwise you might as well compare blue cheese with apples. Please don't compare Intel and AMD chips on clock speed.
But there are troubles even with comparing the same stepping in terms of clock speed. Modern CPUs are much faster at moving data than the interconnecting bus on the motherboard. This means they are often starved for data. Compare it with building a plant that uses so many resources you can't run it all the time. Now go screaming "faster! faster! faster!" at the engineers who control the 'clockspeed' of the building, and see if it helps.
If someone isn't aiming to buy some expensive gaming-utopia machine, I'd just recommend buying the system with the highest memory bus speeds available around the 'sweet spot' of GHz/EUR (or $) for the chosen CPU brand/type.
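The CGA arithmetic hinted at above can actually be worked out. The screen geometry is from the post; the cycle cost per MOV and the 4.77MHz clock are rough assumptions for an original 8088, so treat the result as order-of-magnitude only.

```python
# Working out the CGA example: one full-screen copy on an early-80s machine.
pixels = 640 * 200
bytes_total = pixels * 2 // 8        # 2 bits per pixel -> 32,000 bytes
words = bytes_total // 2             # 16,000 16-bit MOVs
cycles_per_movsw = 17                # rough REP MOVSW cost per word (assumed)
cpu_hz = 4_770_000                   # 4.77MHz 8088 (assumed)
ms = words * cycles_per_movsw / cpu_hz * 1000
print(round(ms))                     # ~57ms -> barely 17 full copies/second
```

That is the latency wall the post describes: the chip spends a whole frame's worth of time just moving pixels, with nothing left over to compute them.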
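A 'sweet spot' pick like the one described reduces to a one-liner once you have a price list. Every price and model pairing below is made up for illustration; plug in current numbers.

```python
# Picking the GHz-per-euro sweet spot from a (made-up) price list.
prices = {          # model: (GHz, EUR) -- illustrative figures only
    "3000+": (1.8, 120),
    "3200+": (2.0, 150),
    "3500+": (2.2, 210),
    "4000+": (2.4, 350),
}
best = max(prices, key=lambda m: prices[m][0] / prices[m][1])
print(best)   # the cheap 3000+ wins on raw GHz/EUR with these numbers
```

The point of the exercise: price rises much faster than clock speed near the top of a product line, so the sweet spot almost always sits low in the range.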
-
- Friend of SPCR
- Posts: 2887
- Joined: Mon Feb 28, 2005 4:21 pm
- Location: New York City zzzz
- Contact:
Having 10-15% faster frames at advanced settings now means that in a year, his machine will be around 20% below what is standard, which means he can most likely still play at medium settings.
It's a rule of thumb that always wins out. Not exact numbers, but the concept does.
My friend's 2.8 Intel: if it were a 3.4, it would run his games at a decent framerate at higher quality settings, but it doesn't.
Also, for 100 more dollars or LESS he can future-proof his machine.
If people don't have enough, I tell them to wait. I mean, if that 100 dollars never comes to you, you shouldn't be blowing 700-800 dollars on a new system anyway.
Let's Stay On Topic
~El~Jefe~
Btw, you assumed I am a guy. I'm not a guy. I'm a middle-aged female into building her own rigs.
I like multi-tasking a lot on my current Intel w/HT so at this point the lowest X2, at 3800+, looks best. I am going to wait it out a bit and see if the prices start to drop in a few weeks.
-
- Posts: 68
- Joined: Thu Jan 22, 2004 12:35 pm
- Location: Norway
-
- Friend of SPCR
- Posts: 2887
- Joined: Mon Feb 28, 2005 4:21 pm
- Location: New York City zzzz
- Contact:
Yeah, the best combo right now is:
an ASRock M1695 board, 2x1GB sticks of RAM, something like a 6600GT and a 200GB SATA II Samsung, with a dual-core 3800+.
That will probably get you farther on the best sweet spot of spending than anything else I can see.
Low heat and power requirements, the most processing power, overclockable to be slightly future-proof, and enough RAM to actually use a 64-bit OS properly if you add 2 more gigs later (buying 512s, I think, is short-sighted in a big way this fall).
That's my recommendation.
The only other possibility would be a 6800GT if you get a good price and a good card (6600GTs are basically all the same in the 170-dollar price range; the 6800s seemed to vary a bit in speed).