I've been buying AMD for -- holy shit -- 25 years now, and have never once regretted it. I don't consider myself a fanboi; I just (a) prefer having the best performance-per-dollar rather than best performance outright, and (b) like rooting for the underdog.
But if Intel keeps fucking up like this, I might have to switch on grounds of (b)!
(Realistically I'd be more likely to switch to ARM or even RISCV, though. Even if Intel became an underdog, my memory of their anti-competitive and anti-consumer bad behavior remains long.)
Same here. I hate Intel so much, I won't even work there, despite it being my current industry and having been headhunted by their recruiter. It was so satisfying to tell them to go pound sand.
Sorry but after the amazing Athlon x2, the core and core 2 (then i series) lines fuckin wrecked AMD for YEARS. Ryzen took the belt back but AMD was absolutely wrecked through the core and i series.
Source: computer building company and also history
tl;dr: AMD sucked ass for value and performance between Core 2 and Ryzen, then became amazing again once Ryzen was released.
I've had nothing but issues with some computers, laptops, etc... once I discovered the common factor was Intel, I haven't had a single problem with any of my devices since. AMD all the way for CPUs.
Genuinely, I've also been an AMD buyer since I started building 12 years ago. I started out as a fan boy but mellowed out over the years. I know the old FX were garbage but it's what I started on, and I genuinely enjoy the 4 gens of Intel since ivy bridge, but between the affordability and being able to upgrade without changing the motherboard every generation, I've just been using Ryzen all these years.
ARM is very well primed to take a lot of the server market from Intel. Amazon is already deeply committed to making its Graviton ARM CPU its main CPU, and Amazon alone owns the lion's share of the server market.
For consumers, ARM adoption is fully reliant on the respective operating systems and compatibility getting ironed out.
RISC-V isn't there yet, but it's moving in the right direction. A completely open architecture is something many of us have wanted for ages. It's worth keeping an eye on.
If there were decent homelab ARM CPUs, I'd be all over that. But everything is either memory limited (e.g. max 8GB) or datacenter grade (so $$$$). I want something like a Snapdragon with 4x SATA, 2x m.2, 2+ USB-C, and support for 16GB+ RAM in a mini-ITX form factor. Give it to me for $200-400, and I'll buy it if it can beat my current NAS in power efficiency (not hard, it's a Ryzen 1700).
I have a 13 series chip, it had some reproducible crashing issues that so far have subsided by downclocking it. It is in the window they've shared for the oxidation issue. At this point there's no reliable way of knowing to what degree I'm affected, by what type of issue, whether I should wait for the upcoming patch or reach out to see if they'll replace it.
I am not happy about it.
Obviously next time I'd go AMD, just on principle, but this isn't the 90s anymore. I could do a drop-in replacement to another Intel chip, but switching platforms is a very expensive move these days. This isn't just a bad CPU issue; it could mean having to swap out two multi-hundred-dollar components, on what should have been a solidly future-proof setup for at least five or six years.
I’m angry on your behalf. If you have to downclock the part so that it works, then you’ve been scammed. It’s fraud to sell a part as a higher performing part when it can’t deliver that performance.
So here's the thing about that: the real performance I lose is... not negligible, but somewhere between 0 and 10% in most scenarios, and I went pretty hard keeping the power limits low. Once I set it up this way, realizing just how much power and heat I'm saving for the last few drops of performance made me angrier than having to do this. The dumb performance race with all the built-in overclocking has led to these insanely power hungry parts that are super sensitive to small defects and require super aggressive cooling solutions.
I would have been fine with a part rated for 150W instead of 250 that worked fine with an air cooler. I could have chosen whether to push it. But instead here we are, with extremely expensive motherboards massaging those electrons into a firehose automatically and turning my computer into a space heater for the sake of bragging about shaving half a millisecond per frame on CounterStrike. It's absurd.
None of which changes the fact that I got sold a bum part, that Intel is fairly obviously trying to weasel out of the clearly needed recall and warranty extension, and that I'm suddenly on the hook for close to a grand in superfluous hardware next time I want to upgrade, because my futureproof parts are apparently made of rust and happy thoughts.
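For anyone who wants to do the same kind of power capping without BIOS fiddling: on Linux this can be done through the kernel's powercap interface. A minimal sketch follows — the sysfs path is the standard intel-rapl location, but whether constraint_0 really maps to the sustained (PL1) limit on your board is an assumption, so check the matching `*_name` file first, and note that motherboard firmware can override RAPL limits anyway.

```python
from pathlib import Path

# Sketch: cap the sustained package power (PL1) through Linux's
# powercap interface. The path below is the usual intel-rapl location,
# but treat the constraint mapping as an assumption: verify it via the
# *_name file, and remember board firmware can override RAPL limits.
RAPL = Path("/sys/class/powercap/intel-rapl:0")

def watts_to_microwatts(watts: float) -> int:
    # powercap expresses limits in microwatts
    return int(watts * 1_000_000)

def set_long_term_limit(watts: float) -> None:
    name = (RAPL / "constraint_0_name").read_text().strip()
    if name != "long_term":
        raise RuntimeError(f"constraint_0 is {name!r}, not the PL1 limit")
    (RAPL / "constraint_0_power_limit_uw").write_text(
        str(watts_to_microwatts(watts))
    )

# Usage (as root): set_long_term_limit(150.0)  # run a 250W-rated part at 150W
print(watts_to_microwatts(150))  # 150000000
```

Changes made this way don't persist across reboots, which actually makes it a low-risk way to test how much performance a lower cap really costs before committing it in the BIOS.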
I have a 13 series chip, it had some reproducible crashing issues that so far have subsided by downclocking it.
From the article:
the company confirmed a patch is coming in mid-August that should address the “root cause” of exposure to elevated voltage. But if your 13th or 14th Gen Intel Core processor is already crashing, that patch apparently won’t fix it.
Citing unnamed sources, Tom’s Hardware reports that any degradation of the processor is irreversible, and an Intel spokesperson did not deny that when we asked.
If your CPU is already crashing then that's it, game over. The upcoming patch cannot fix it. You've got to figure out if you can do a warranty replacement or continue to live with workarounds like you're doing now.
Their retail boxed CPUs usually have a 3(?) year warranty so for a 13th gen CPU you may be midway or at the tail end of that warranty period. If it's OEM, etc. it could be a 1 year warranty aka Intel isn't doing anything about it unless a class action suit forces them :/
The whole situation sucks and honestly seems a bit crazy that Intel hasn't already issued a recall or dealt with this earlier.
If you're in the UK (and I expect the EU too), and it's due to oxidation, I imagine you can get it replaced even on an expired warranty: it's a defect that was known to either you or Intel before the warranty expired, and a manufacturing defect rather than damage from use, so Intel is pretty much cornered, having sold you faulty shit.
The article is... not wrong, but it's oversimplifying. There seem to be multiple faults at play here: some chips would continue to degrade, others would never recover some performance threshold but may be protected from further damage, and others may be fixed outright. Yes, degradation of the chip may be irreversible if it's due to the oxidation problem or to damage already caused by the incorrect voltages, but presumably in some cases the chip would continue to run stable and not degrade further with the microcode fixes.
But yes, agreed, the situation sucks and Intel should be out there disclosing a range of affected chips by at least the confirmed physical defect and allowing a streamlined recall of affected devices, not saying "start an RMA process and we'll look into it".
They do say that you can contact Intel customer support if you have an affected CPU, and that they're replacing CPUs that have been actually damaged. I don't know -- and Intel may not know -- what information or proof you need, but my guess is that it's good odds that you can get a replacement CPU. So there probably is some level of recourse.
Now, obviously that's still a bad situation. You're out the time that you didn't have a stable system, out the effort you put into diagnosing it, maybe have losses from system downtime (like, I took an out-of-state trip expecting to be able to access my system remotely and had it hang due to the CPU damage at one point), maybe out data you lost from corruption, maybe out money you spent trying to fix the problem (like, on other parts).
But I'd guess that specifically for the CPU, if it's clearly damaged, you have good odds of being able to at least get a non-damaged replacement CPU at some point without needing to buy it. It may not perform as well as the generation had initially been benchmarked at. But it should be stable.
"Clearly damaged" is an interesting problem. The CPU would crash 100% of the time on the default settings for the motherboard, but if you remember, they issued a patch already.
I patched. And guess what, with the new Intel Defaults it doesn't crash anymore. But it suddenly runs very hot instead. Like, weird hot. On a liquid cooling system it's thermal throttling when before it wouldn't come even close. Won't crash, though.
So is it human error? Did I incorrectly mount my cooling? I'd say probably not, considering it ran cool enough pre-patch until it became unstable and it runs cool enough now with a manual downclock. But is that enough for Intel to issue a replacement if the system isn't unstable? More importantly, do I want to have that fight with them now or to wait and see if their upcoming patch, which allegedly will fix whatever incorrect voltage requests the CPU is making, fixes the overheating issue? Because I work on this thing, I can't just chuck it in a box, send it to Intel and wait. I need to be up and running immediately.
So yeah, it sucks either way, but it would suck a lot less if Intel was willing to flag a range of CPUs as being eligible for a recall.
As I see it right now, the order of operations is to wait for the upcoming patch, retest the default settings after the patch and if the behavior seems incorrect contact Intel for a replacement. I just wish they would make it clearer what that process is going to be and who is eligible for one.
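For the "retest after the patch" step, one quick way to smoke-test stability is to load every core with work that verifies its own results, since this instability reportedly often shows up as silent data corruption or decompression errors rather than a clean crash. Here's a minimal Python sketch of that idea — a quick sanity check only, not a substitute for proper tools like Prime95, y-cruncher, or OCCT:

```python
import os
import random
import zlib
from concurrent.futures import ProcessPoolExecutor

# Minimal stability smoke test: hammer all cores with compress/
# decompress round-trips and check the data survives intact.
def round_trip(seed: int, iterations: int = 200) -> bool:
    data = random.Random(seed).randbytes(256 * 1024)  # per-worker payload
    for _ in range(iterations):
        if zlib.decompress(zlib.compress(data)) != data:
            return False  # corruption under load -> unstable
    return True

def smoke_test() -> bool:
    workers = os.cpu_count() or 4
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return all(pool.map(round_trip, range(workers)))

# Usage: print("stable" if smoke_test() else "UNSTABLE: corruption detected")
print(round_trip(0, iterations=3))  # single-worker sanity run: True
```

If this ever returns False on stock settings, that's exactly the kind of reproducible evidence worth attaching to an RMA request.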
Went 13th -> 14th very early in both launch cycles because of chronic crashing. After swapping the mobo, RAM and SSDs, I finally swapped to AMD, and my build from late 2022 is FINALLY stable. Wendell's video was the catalyst to jump ship. I thought I was going crazy, but yeah... it was Intel.
Whoa, that's even worse. It's not just the uncertainty of knowing whether Intel will replace your hardware or the cost of jumping ship next time. Intel straight up owes you money. That sucks.
When did you buy it? Depending on the credit card you have, they will sometimes extend on any manufacturer warranty by a year or two. Might be worth checking.
switching platforms is a very expensive move these days.
It's just a motherboard and a cpu. Everything else is cross compatible, likely even your cpu cooler. If you just buy another intel chip... it's just gonna oxidize again.
Personally I'd wait for the next release to drop in a month... or until your system crashes aren't bearable / it's worth making the change. I just don't see the cost as prohibitive; it's about on par with all the alternatives. Plus you could sell your old motherboard for something.
I'm not really that knowledgeable about AM5 mobos (still on AM4), but you should be able to get something perfectly sensible for 100 bucks. Are you going to get as much IO and bells and whistles? No, but most people don't need that stuff, and you don't have to spend a lot of money to get a good VRM or traces to the DIMM slots.
Then, possibly bad news: Intel 13th gen supports DDR4, so if you're on DDR4 now, you might also need new RAM to move to AM5.
I mean, happy for you, but in the real world an extra 200 dollars on a 400 dollar part is a huge price spike.
Never mind that, be happy for me, I actually went for a higher spec than that when I got this PC because I figured I'd get at least one CPU upgrade out of this motherboard, since it was early days of DDR5 and it seemed like I'd be able to both buy faster RAM and a faster CPU to keep my device up to date. So yeah, it was more expensive than that.
And hey, caveat emptor, futureproofing is a risky, expensive game on PCs. I was ready for a new technology to make me upgrade anyway, if we suddenly figured out endless storage or instant RAM or whatever. Doesn't mean it isn't crappy to suddenly make upgrading my CPU almost twice as expensive because Intel sucks at their one job.
Intel is about to have a lot of lawsuits on their hands if this delay deny deflect strategy doesn't work out for them. This problem has been going on for over a year and the details Intel lets slip just keep getting worse and worse. The more customers that realize they're getting defective CPUs, the more outcry there'll be for a recall. Intel is going to be in a lot of trouble if they wait until regulators force them to have a recall.
Big moment of truth is next month when they have earnings and we see what the performance impact from dropping voltages will be. Hopefully it'll just be 5% and no more CPUs die. I can't imagine investors will be happy about the cost, though.
I want to say gamers rise up, but honestly gamers calling their member of Congress every day and asking what they’re doing about this fraud would be way more effective. Congress is in a Big Tech regulating mood right now
A few years ago now I was thinking that it was about time for me to upgrade my desktop (with a case that dates back to 2000 or so, I guess they call them "sleepers" these days?) because some of my usual computer things were taking too long.
And I realized that Intel was selling the 12th generation of the Core at that point, which meant the next one would be a 13th generation, and I dunno, I'm not superstitious, but I figured if anything went wrong I'd feel pretty darn silly. So I pulled the trigger and got a 12th gen Core processor and motherboard and a few other bits.
I recently built myself a computer, and went with AMD's 3d cache chips and boy am I glad. I think I went 12th Gen for my brothers computer but it was mid range which hasn't had these issues to my knowledge.
12th gen isn't affected. The problem affects only the 13th and 14th gen Intel chips. If your brother has 12th gen -- and you might want to confirm that -- he's okay.
For the high-end thing, initially it was speculated that it was just the high-end chips in these generations, but it's definitely the case that chips other than just the high-end ones have been recorded failing. It may be that the problem is worse with the high-end CPUs, but it's known to not be restricted to them at this point.
The bar they list in the article here is 13th and 14th gen Intel desktop CPUs over 65W TDP.
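That bar can be turned into a rough self-check. The sketch below is a heuristic only — the gen digits and the mobile-suffix exclusions are my own assumptions for illustration, not an official Intel list, and a plain SKU match says nothing about whether an individual chip is actually degraded:

```python
import re

# Rough heuristic for the article's criterion: 13th/14th gen Intel Core
# *desktop* CPUs over 65W TDP. The regex and suffix rules below are
# assumptions for illustration, not an official Intel list.
def possibly_affected(brand_string: str) -> bool:
    m = re.search(r"i[3579]-(1[34])\d{3}([A-Z]*)", brand_string)
    if not m:
        return False  # not a 13th/14th gen Core part at all
    suffix = m.group(2)
    # H/HX/U/P suffixes are mobile parts -> out of scope per Intel's
    # statement; K/KF/KS and plain 65W+ desktop SKUs are the reported ones.
    return not any(s in suffix for s in ("H", "U", "P"))

print(possibly_affected("Intel(R) Core(TM) i9-13900K"))  # True: 13th gen desktop
print(possibly_affected("Intel(R) Core(TM) i7-12700K"))  # False: 12th gen
```

On Linux the brand string comes from the "model name" line of /proc/cpuinfo; on Windows, from the Processor entry in System Information.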
In my case I upgraded from threadripper 1950x to a 14900k and the machine died after four months. Went back to threadripper 7960x like I should have. My 14th gen cpu still posts, but haven't thrown any load at it yet. I'm hoping it can still be a streaming box...
I switched to AMD with the Ryzen 3000 series and can't see myself going to Intel for at least 2 or 3 more upgrades (like 10 years for me), and that's only if they are competitive again in that amount of time.
1 DOA CPU that the physical store I went to purchase it at didn't have any more of so I got a cheaper Intel CPU they DID have. Tbh that might have been the store dropping it or storing it improperly, they weren't a very competent electronics store.
And a Sapphire GPU that only worked with 1 very specific driver version that wasn't even on their website anymore when I tried to install it for some reason. I eventually got it working after hours of hunting and fiddling, which was repeated when I gave the PC away to a friend's little brother and they wiped it without checking the driver versions I left behind like I told them.
Recently built my wife a new AMD based system because grudges have to end eventually and I think I couldn't have picked a better time tbh
Is there really still such a market for Intel CPUs?
I do not understand that
AMD's Zen is so much better and has been the superior technology for almost a decade now.
Intel is in the precarious position of being the largest surviving American owned semiconductor manufacturer, with the competition either existing abroad (TSMC, Samsung, ASML) or as a partner/subsidiary of a foreign national firm (NVidia simply procures its chips from TSMC, GlobalFoundries was bought up by the UAE sovereign wealth fund, etc).
Consequently, whenever the US dumps a giant bucket of money into the domestic semiconductor industry, Intel is there to clean up whether or not their technology actually works.
Small correction: only surviving that makes desktop/server class chips. Companies like Texas Instruments and Microchip still have US foundries for microcontrollers.
The argument was that while AMD is better on paper in most things, Intel would give you rock solid stability. That argument has now taken an Iowa-class broadside to the face.
I don't watch LTT anymore, but a few years back they had a video where they were really pushing the limits of PCIe lanes on an Epyc chip by stuffing it full of NVMe drives and running them with software RAID (which Epyc's sick number of cores should be able to handle). Long story short, they ran into a bunch of problems. After talking to Wendell of Level1Techs, he mentioned that sometimes AMD just doesn't work the way it seems it should based on paper specs. Intel usually does. (Might be getting a few details wrong about this, but the general gist should be right.)
This argument was almost the only thing stopping AMD from taking over the server market. The other thing was whether AMD could manufacture enough chips in a short enough time. The server market is huge; Intel had $16B revenue in "Data Center and AI" in 2023, while AMD's total revenue was $23B. Now manufacturing ramp-up might be all that's stopping AMD from owning it.
Intel chips have been working better than AMD in my Linux server. The AMDs kept causing server crashes due to C-state nonsense that no amount of BIOS tweaking would fix. AMD is great for performance and efficiency (and cost/value) in my gaming PC, but it was wreaking havoc with my server, which I need to be reliably functional without power restarts.
The new AMD generation kinda tossed all the good out the window. Now they are the more expensive option, and even with this Intel fuckup they are likely still going to be the go-to for people that have more sense than money.
Funny that the good old zen 3 stuff is still swinging above its weight class.
AMD keeps some older generations in production as their budget options - and since they've had excellent CPUs for multiple generations now, you get pretty good computers out of that too. Even better: with some planning around the chipset lifecycle, you'll be able to upgrade to another CPU later.
AMD has established by now that they deliver what they promise - and intel couldn't compete with them for a few generations over pretty much the complete product line - so they can afford now to have the bleeding edge hardware at higher prices. It's still far away from what intel was charging when they were dominant 10 years ago - and if you need that performance for work well worth the money. For most private systems I'd always recommend getting last gen, though.
Intel’s iGPU is still by far the best option for applications such as media transcoding. It’s a shame that AMD haven’t focussed more on this, but it's understandable; it’s relatively niche.
Why does that graph show Epyc (server) and Threadripper (workstation) processors in the upper right corner, but not the equivalent Xeons? If you take those away, it would paint a different picture.
Also, a price/performance graph does not say much about which is the superior technology. Intel has been struggling to keep up with AMD technologically the past years, and has been upping power targets and thermal limits to do so ... which is one of the reasons why we are here points at headline.
I do hope they get their act together, because an AMD monopoly would be just as bad as an Intel monopoly. We need the competition, and a healthy x86 market, lest proprietary ARM-based computers take over the market (Apple M-chips, Snapdragon laptops, ...).
Links a list where the three top spots substantiate the claim, followed by a comparatively large 8% drop.
To add a bit of nuance: There are definitely exceptions to the claim. But if I had to make a blanket statement, it would absolutely be in favor of AMD.
On what workloads? AMD is king for most games, and for less price. It's also king for heavily multicore workloads, but not on the same CPU as for games.
In other words, they don't have a CPU that is king for both at the same time. That's the one thing Intel was good at, provided you could cool the damn thing.
AMD processors have literally always been a better value and have rarely been surpassed by much for long. The only problem they ever had was that back in the day they overheated easily. But I will never ever buy an Intel processor on purpose, especially after this.
The only problem they ever had was back in the day they overheated easily.
That's not true. It was just last year that some of the Ryzen 7000 models were burning themselves out from the insides at default settings (within AMD specs) due to excessive SoC voltage. They fixed it through new specs and working with board manufacturers to issue new BIOS, and I think they eventually gave in to pressure to cover the damaged units. I guess we'll see if Intel ends up doing the same.
I generally agree with your sentiment, though. :)
I just wish both brands would chill. Pushing the hardware so hard for such slim gains is wasting power and costing customers.
That’s not true. It was just last year that some of the Ryzen 7000 models were burning themselves
I think he was referring to "back-in-the-day" when Athlons, unlike the competing Pentium 3 and 4 CPUs of the day, didn't have any thermal protections and would literally go up in smoke if you ran them without cooling.
Problem is that it's getting extremely hard to get more single-threaded performance out of a chip, and this is one of the few ways to do so. And a lot of software is not going to be rewritten to use multiple cores. In some cases, it's fundamentally impossible to parallelize a particular algorithm.
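The "fundamentally impossible to parallelize" point can be made concrete with a toy illustration (my own example, not from the thread): in an iterated hash, step N needs step N-1's output, so no number of cores shortens the chain — only faster single-thread execution does.

```python
import hashlib

# Inherently serial: each step consumes the previous step's output, so
# the chain length is a hard floor that extra cores can't touch.
def hash_chain(seed: bytes, steps: int) -> bytes:
    h = seed
    for _ in range(steps):
        h = hashlib.sha256(h).digest()  # step N depends on step N-1
    return h

# By contrast, hashing many independent inputs is embarrassingly
# parallel: every item could go to a different core.
def hash_many(items: list) -> list:
    return [hashlib.sha256(x).digest() for x in items]

print(hash_chain(b"seed", 3).hex()[:16])  # deterministic, but serial by nature
```

Chains like this (proof-of-work, chained block ciphers, many simulations) are exactly the workloads that keep single-threaded clocks, and hence these aggressive voltage curves, commercially relevant.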
Yeah. I just meant AMD cpus used to easily overheat if your cooling system had an issue. My ryzen 7 3700x has been freaking awesome though. Feels more solid than any PC I've built. And it's fast AF. I think I saved over $150 when comparing to a similarly rated Intel CPU. And the motherboards generally seem cheaper for AMD too. I would feel ripped off with Intel even without the crashing issues
I've been on team AMD for over 20 years now but that's not true.
The Core Duo and the first couple of i-series CPUs were better than what AMD was offering, and stayed that way for a decade.
The Athlons were much better than the Pentium 3 and P4, and the Ryzens are better than the current i series, but the Phenoms weren't.
Don't get me wrong, I like my Phenom II X4, but it objectively wasn't as good as Intel's offerings back in the day.
My i5-4690 and i7-4770 machines remain competitive to this day, even with spectre patches in place. I saw no reason to 'upgrade' to 6/7/8th gen CPUs.
I'm looking for a new desktop now, but for the costs involved I might just end up parting together a HP Z6 G4 with server surplus cpu/ram. The costs of going to 11th+ desktop Intel don't seem worth it.
I'm going to look at the more recent AMD offerings, but I'm not sure they'll compete with surplus server kit.
The only problem they ever had was back in the day they overheated easily.
Very easily.
In college (early aughts), I worked as tech support for fellow students. Several times I had to take the case cover off, point a desktop fan into the case, and tell the kid he needed to get thermal paste and a better cooler (services we didn't offer).
Also, as others have said, AMD CPUs have not always been superior to Intel in performance or even value (though AMDs have almost always been cheaper). It's been a back-and-forth race for much of their history.
Yeah. I never said they were always better in performance. But I have never had an issue other than the heat problem which all but one time was fully my fault. And I don't need a processor to perform 3% better on random tasks... which was the kind of benchmark results I would typically find when comparing similar AMD/intel processors (also in some categories amd did win). I saved probably a couple grand avoiding Intel. And as another user said, I prefer to support the underdog. The company making a great product for a lot less money. Again I say: fuck Intel.
It kinda has, with Fermi, lol. The GTX 480 was... something.
Same reason too. They pushed the voltage too hard, to the point of stupidity.
Nvidia does not compete in this market though, as much as they'd like to. They do not make x86 CPUs, and frankly Intel is hard to displace since they have their own fab capacity. AMD can't take the market themselves because there simply isn't enough TSMC/Samsung to go around.
So like, did Intel lay off or gut its QA teams, similar to what Microsoft did with Windows? Remember when stability was key and everything else was secondary? Pepperidge Farm remembers.
My problem is really in how they handled the situation once they knew that there was a problem, not even the initial manufacturing defect.
Yes, okay. They didn't know exactly the problem, didn't know exactly the scope, and didn't have a fix. Fine. I get that that is a really hard problem to solve.
But they knew that there was a problem.
Putting out a list of known-affected processors and a list of known-possibly-affected processors at the earliest date would have at least let their customers do what is possible to mitigate the situation. And I personally think that they shouldn't have been selling more of the potentially-affected processors until they'd figured out the problem sufficient to ensure that people who bought new ones wouldn't be affected.
And I think that, at first opportunity, they should have advised customers as to what Intel planned to do, at least within the limits of certainty (e.g. if Intel can confirm that the problem is due to an Intel manufacturing or design problem, then Intel will issue a replacement to consumers who can send in affected CPUs) and what customers should do (save purchase documentation or physical CPUs).
Those are things that Intel could certainly have done but didn't. This is the first statement they've made with some of that kind of information.
It might have meant that an Intel customer holds off on an upgrade to a potentially-problematic processor. Maybe those customers would have been fine taking the risk or just waiting for Intel to figure out the issue, issue an update, and make sure that they used updated systems with the affected processors. But they would have at least been going into this with their eyes open, and been able to mitigate some of the impact.
Like, I think that in general, the expectation should be that a manufacturer who has sold a product with a defect should put out what information they can to help customers mitigate the impact, even if that information is incomplete, at the soonest opportunity. And I generally don't think that a manufacturer should sell a product with known severe defects (of the "it might likely destroy itself in a couple months" variety).
I think that one should be able to expect a manufacturer to do so even today. If there are reasons they're not willing to (e.g. concerns about any statement affecting their position in potential class-action suits), I'd like regulators to restructure the rules to eliminate that misincentive. Maybe it could be a stick, like "if you don't issue information dealing with known product defects of severity X within N days, you are exposed to strict liability". Or a carrot, like "any information in public statements provided to consumers with the intent of mitigating harm caused by a defective product may not be introduced as evidence in class action lawsuits over the issue".
But I want manufacturers of defective products to act, not just sit there clammed up, even if they haven't figured out the full extent of the problem. They are almost certainly in a better position to figure out the problem and issue mitigating information than their customers individually are. In this case, Intel just silently sat there for a very long time while a lot of their customers tried to figure out the scope of what was going wrong, and often spent a lot of money trying to address the problem themselves, when more information from Intel probably would have spared them some of those costs.
I wouldn't conclude that from an Intel employee even if they did claim it, because they (Intel) have already lied multiple times in this affair.
But they didn't even do that; they just said desktop processors are affected, which doesn't say mobile ones are not.
Many companies have already reported that their telemetry records many crashes with the exact same symptoms and software on their laptops while AMD still isn't affected.
Could also be the fucking GPU if it's doing that, apparently.
Had some sag on my GPU after years and didn't really notice. Tried troubleshooting and was about to go mad until someone on Reddit, in a comment from a year ago, said to try reseating the GPU and then bracing it.
I think a lot of things can cause that. Unfortunately it's difficult to diagnose hardware issues for certain without just having a bunch of spare cpus, spare mobos, spare ram, etc lying around and a lot of time on your hands to keep swapping out parts until you find a swap that fixes it. Especially when it's an issue that happens occasionally so you have to keep using your computer without issue for long enough until you think it's likely that the problem is fixed.
Also not guaranteed to be a hardware issue but probably is. I've sometimes had similar issues that were a combination of the kernel not working well with a specific piece of hardware I use.
The other day, when this news hit for the first time, I bought two ITM Put options on INTC. Then, I waited three days and sold them for 200% profit. Then, I used the profit to invest in the SOXX etf. Feels good to finally get some profit from INTC’s incompetence.
I have an Intel Core i9-14900K 3.2 GHz 24-core LGA 1700 processor purchased in March. Are there any guesses yet for the window of potentially affected CPUs?
I predict that they will indeed end up eventually doing a recall despite what they told The Verge, but only after their reputation is even more irreversibly damaged than it already has been due to this issue.
Any real-world comparison. Gaming frame rate, video encoding... The 13700 beats the 7900X while being more energy efficient and costing less.
That's even giving AMD a handicap in the comparison, since the 7700X is supposed to be the direct competitor to the 13700.
I say all this as a longggg time AMD CPU customer. I had planned on buying their CPU before multiple different sources of comparison steered me away this time.
Ok, so maybe you are missing the part where the 13th and 14th gens are destroying themselves. No one really cares if you use AMD or whatnot; this little issue is Intel's, and it makes any performance, power use, or cost comparison moot, as the CPU's ability to not hurt itself in its confusion will now always be in question.
Also I don't think CPU speeds have been a large bottleneck in the last few years, why both AMD and Intel keep pushing is just silly.
Yeah, that does suck. But I was replying specifically to the person saying Intel hasn't been relevant for years because of a supposed performance dominance from AMD. That part just isn't true.