Originally posted by IMY
Intel is better in the long run because there is less stress on the CPU when it comes to overclocking, whereas AMD heats up pretty fast. I had a 2500k air cooled at 5GHz and water cooled at that speed. As for the extra $40, I would pay that knowing my CPU will not overheat as much as AMD's. It's a common heating issue with most AMD products, i.e. CPUs, GPUs, etc. AMD is a flop in the UK. I would recommend the 2500k/2600k CPU though, it can be easily overclocked. Use the right RAM and GPU and you will end up with a beast system at an affordable price. If you want that little bit extra, then go for water cooling. My temps when I had my PC built were at 20°C. I would recommend water cooling to everyone. When I said 98% benchmark, I meant to say that, generally speaking, Intel CPUs would outperform AMD CPUs. I don't know what the value is like in the USA for AMD CPUs, but in the UK AMD CPUs drop in value extremely fast compared to Intel.
---------- Post added at 10:21 PM ---------- Previous post was at 10:14 PM ----------
AMD will overclock better?? No way! AMDs are known to have heating issues even at stock speeds. Intel CPUs are much, much easier and safer to overclock than AMD.
I can tell you have never even looked into owning an AMD product in your life. First of all, AMD CPUs don't "heat up pretty fast." Actually, Intel chips heat up much faster than AMD chips. Most AMD chips don't even hit 60 degrees under full load. They use a different type of silicon that makes the chips run cooler overall; the trade-off is that they cannot withstand temps above 70 degrees, but they'll never hit those if you are running something like an H100.
I don't know if you've seen the 3770k temps. Intel cut corners and saved cash by using very poor thermal paste under the IHS. Not only that, but the 22nm manufacturing process and the very tightly packed transistors caused even more problems for heat dissipation. And I'm not sure why countries have anything to do with this; the chips you buy in the UK are the same as the ones in Canada or the USA.
I highly doubt your machine only sees 20-degree temps when rocking 5GHz. The voltage for that chip would be at least 1.4V. My 2600k with an H80 isn't even hitting those temps at 4.3GHz on idle. And you are still missing the point: the 8350 is still the best chip out there for the money. Sure, the 3770k almost always beats it, but it's an extra $100+ for not that huge an increase in performance. I know the Intel chip is better, but Intel has always been a little more on the expensive side of things.
And I don't know what you mean by "drops in value extremely quickly." Do you mean resale, or do you mean the retail prices drop quickly? Regardless, every PC part loses value.
And finally, AMD chips are much more overclockable in general. The 8350 can go from 4GHz to 5GHz on almost all chips. They can withstand more voltage, and they appeal to people who use exotic cooling because they can hit ridiculous clock speeds under liquid nitrogen (like 7-8GHz).
Intel has a voltage/heat barrier. Every Intel chip has one, although on some it sits at much higher clock speeds. There is very little consistency with Intel chips, meaning the overclock you can hit on any given chip is hit or miss. Some people get lucky and get one that hits 5GHz, while others can't even get close to 4.5GHz. Intel chips also hit a voltage wall. For example, I'm at 4.5GHz using 1.36V; to get 4.6GHz I would need to increase my voltage all the way to 1.45V. These are problems that a lot of Intel chips had, especially during the Sandy Bridge days.
AMD chips don't have the above problems nearly as much, and are far more consistent when it comes to overclocking.
And the GPU overheating on the AMD side of things is just bogus. GPUs are made to withstand higher temps for prolonged periods of time. Anyone who leaves the fans on auto while doing hardcore gaming is going to hit high temps. Download MSI Afterburner and increase the fan speed, and your problem is solved. If I leave my 580's fans on auto while gaming, it will hit 80 degrees; if I turn the fans up, it won't even touch 70. It boggles my mind how people can leave the fans on auto and then complain about their GPU temps.