Intel and Nvidia deny the rumor that they are colluding against AMD

If you’ve browsed the AMD subreddit, the Linus Tech Tips forums, or elsewhere in recent months, you may have come across a conspiracy theory that Intel and Nvidia struck a secret deal to keep the more expensive GPUs out of AMD Ryzen 4000 series laptops. If you look at the list of AMD laptops released last year, you might believe it. The Asus ROG Zephyrus G14, Lenovo Legion 5, and others all came with an AMD processor, but nothing higher than an RTX 2060. Conspiracy theories are tempting, but this one appears to be nothing more than a product of the Intel/AMD/Nvidia wars. It doesn’t help that unsubstantiated claims from blogs and news sites keep repeating the same story. All it takes is a bit of digging to see that there’s no juicy scandal here – just the complicated web of how CPUs and GPUs work together.


In April 2020, Frank Azor, AMD’s Chief Architect of Gaming Solutions and Marketing, responded to a Twitter user’s question about the lack of high-end GPUs in AMD laptops by saying, “You need to ask your favorite OEMs and PC builders.” That was around the time the conspiracy theory started to take shape, but Azor was right. Laptop configurations are determined by OEMs, not chip makers. And those configurations are usually cost-driven, but they also need to make sense: an underpowered CPU paired with an overpowered GPU is not a good combination, and that’s the sort of trap something like the Ryzen 9 4900HS, or lower, falls into.

Azor even sat down with The Full Nerd in May 2020 to address the issue again, specifically speaking to OEMs’ confidence in Ryzen processors. “I think Ryzen 4000 has exceeded everyone’s expectations, but for the most part everyone was with us. That’s why it was difficult to imagine a world where we were the fastest mobile processor,” Azor said. “I think when you plan your notebook portfolio as an OEM, and you haven’t come to that realization yet – and remember, all this planning for these notebooks was done last year – you’ve been leaning a little bit with AMD.”

In essence, OEMs just didn’t have the confidence that AMD had a blazingly fast mobile processor. So why would they pair a high-end GPU with a processor they assumed would be inferior? The middle ground, the “meat of the market” as Azor put it, was laptops running RTX 2060s and below. But even with this reasonable explanation, the rumor mill kept spinning, scouring the processor’s specifications for answers.

Gizmodo contacted Intel and Nvidia about these rumors, and both companies vehemently denied them. A spokesman for Nvidia told Gizmodo: “The claim is not true. OEMs decide on their system configurations, selecting the GPU and then the CPU to pair with it. We support both Intel and AMD CPUs across our product stack.”

An Intel spokesman echoed the same sentiment: “These allegations are false and there is no agreement. Intel is committed to conducting business with uncompromising integrity and professionalism.”

Nvidia’s and Intel’s firm denials certainly suggest this theory holds little to no water, but I don’t think you even need their denials to see that the theory is bunk. The fact is, the Ryzen 4000 series was never going to be a strong contender for high-end mobile gaming.


There are three aspects of AMD’s Ryzen 4000 series that likely factored into OEMs’ decisions not to pair it with a high-end graphics card: PCIe limitations, CPU cache, and the most obvious one, single-core performance.

Gaming relies more on single-core performance than multi-core, and Intel typically has the better single-core performance. This is true both historically and when comparing Intel’s 10th generation to AMD’s Ryzen 4000 series. Heck, the 10th-gen Core i9-10900K’s gaming benchmarks are even on par with AMD’s newer Ryzen 9 5950X when both are paired with an RTX 3080.

In our previous laptop tests, AMD’s Ryzen 9 4900HS in Asus’ ROG Zephyrus G14 had weaker single-core performance than the Intel Core i7-10875H in MSI’s Creator 15. The Core i7-10875H is not at the top of Intel’s 10th-gen mobile line, but the Ryzen 9 4900HS is at the top of AMD’s. With nearly the same GPU (RTX 2060 Max-Q in the G14, RTX 2060 in the Creator 15), the Intel system still averaged 1-3 fps higher (1080p, ultra settings). Pairing a more powerful GPU with the Ryzen 9 4900HS would most likely have bottlenecked some games on its single-core performance.

That leads to less-than-great performance compared to Intel’s offerings, especially when combined with the Ryzen 4000 series’ skimpy L3 cache: only 8MB, half of what Intel’s chip has. With a smaller L3, more requests miss the cache and have to go out to main memory, so the average time it takes to access data would be longer than on the Intel mobile processor.
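To put that reasoning into a rough formula (an illustrative sketch with made-up latencies and miss rates, not measured figures for these chips), average memory access time climbs as the cache miss rate goes up:

```python
# Illustrative only: average memory access time (AMAT) = hit time + miss rate * miss penalty.
# The latencies and miss rates below are hypothetical placeholders, not measurements
# of the Ryzen 9 4900HS or Core i7-10875H.
L3_HIT_NS = 10.0        # assumed L3 hit latency
DRAM_PENALTY_NS = 80.0  # assumed extra cost of falling through to main memory

def amat(l3_miss_rate: float) -> float:
    return L3_HIT_NS + l3_miss_rate * DRAM_PENALTY_NS

# A 16MB cache misses less often than an 8MB one on the same workload.
print(amat(0.30))  # e.g. larger cache, 30% miss rate -> 34.0 ns on average
print(amat(0.45))  # e.g. smaller cache, 45% miss rate -> 46.0 ns on average
```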

The Ryzen 4000 series’ PCIe limitations may also have contributed to OEMs’ reluctance, though that idea is shakier. It originated from a blog post on igor’sLAB, which explained that, because Ryzen 4000 CPUs reserve only eight PCIe 3.0 lanes for a discrete GPU, they could create a bottleneck when paired with anything higher than an RTX 2060. Each PCIe device needs a certain number of lanes to run at full capacity, and high-end Nvidia and AMD GPUs are built for 16. Since Intel’s 10th-gen processors support 16 lanes, they were a better fit for the RTX 2070 and higher GPUs in last year’s gaming laptop lineup.
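For a rough sense of what halving the lanes means in raw bandwidth (a back-of-the-envelope sketch, not figures from the igor’sLAB post), PCIe 3.0 moves roughly 0.985GB/s per lane in each direction:

```python
# Back-of-the-envelope PCIe 3.0 bandwidth: 8 GT/s per lane with 128b/130b
# encoding works out to roughly 0.985 GB/s per lane, per direction.
PER_LANE_GBPS = 8 * (128 / 130) / 8  # ~0.985 GB/s

for lanes in (8, 16):
    print(f"PCIe 3.0 x{lanes}: ~{lanes * PER_LANE_GBPS:.1f} GB/s each way")
# x8 -> ~7.9 GB/s, x16 -> ~15.8 GB/s. Whether that lower ceiling actually
# bottlenecks a given GPU depends on the workload, which is what the testing
# below tries to gauge.
```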

However, many people on Reddit and other online forums have pointed out that the performance drop from pairing a Ryzen 4000 CPU with an RTX 2070 or higher would be very small, if noticeable at all, so to them the explanation didn’t make sense. (More fuel for the conspiracy theory.) I had to test this myself to see whether there really is a drop in performance going from 16 lanes to 8.

Running my own tests, I found that 16 lanes do indeed offer better performance with higher-end GPUs, but that the difference can be pretty negligible. Granted, I used a much more powerful processor than the Ryzen 9 4900HS, so it was capable enough to handle an RTX 2060 and above no matter how many PCIe lanes were available.

My test PC was configured with: an Intel Core i9-10900K, Asus ROG Maximus XII Extreme, 16GB (8GB x 2) G.Skill Trident Z Royal DDR4-3600 DRAM, Samsung 970 Evo 500GB M.2 PCIe SSD, a Seasonic 1000W PSU and a Corsair H150i Pro RGB 360mm AIO for cooling.

Gaming performance hardly changed when I switched the PCIe configuration from 16 lanes to 8, but the difference was noticeable in synthetic benchmarks. Comparing an RTX 2060 to an RTX 2070 Super (the closest GPU I had to an RTX 2070), I ran benchmarks in Geekbench 5, 3DMark, PCMark 10, Shadow of the Tomb Raider, and Metro Exodus, some of which are part of our usual test suite.

Frame rates increased by up to 4 fps, with the most notable difference in Shadow of the Tomb Raider at 1080p. This supports what many have said: gaming performance isn’t substantially affected by halving the number of PCIe lanes to the GPU until you get to something as powerful as an RTX 2080 Ti.

The synthetic benchmark scores didn’t change much from 8 lanes to 16 with the RTX 2060, but the difference was more pronounced with the RTX 2070 Super, suggesting there is a measurable gap that could matter in other applications. The RTX 2070 Super’s Geekbench score increased by about 3,000 points when all 16 lanes were available to the GPU. Time Spy delivered results in line with the gaming benchmarks, and, strangely enough, the RTX 2060 saw a bigger boost in the PCMark test than the 2070 Super did.

Synthetic benchmarks aren’t a measure of real-world performance, of course, and PCIe bandwidth isn’t the main thing that will slow down your system. But since many reviewers use these metrics to paint a picture of a laptop or desktop, any Ryzen 4000 series processor paired with anything higher than an RTX 2060 would have scored lower than expected. For high-end GPUs that are marketed on performance, every extra point and every extra frame matters, especially when so many OEMs are vying for a spot on your desk.

This suggests that, yes, OEMs will prefer the “better” CPU, even if it is only marginally better. The lack of Ryzen 4000 series processors paired with high-end Nvidia graphics may partly be due to OEMs underestimating how many consumers were actually interested in those configurations last year, but it’s more likely down to the 4000 series’ smaller L3 cache and slower single-core speeds. Sure, an RTX 2070 or above can run fine on PCIe x8, but if the CPU doesn’t have the juice to keep up with the GPU, none of that matters.


There’s one last point that debunks this theory: if Intel and Nvidia were conspiring to shut out AMD, why are more OEMs wholeheartedly embracing the AMD/Nvidia combo this time around? Many of their AMD Ryzen 5000 series-powered laptops come with an RTX 3070 or 3080, and the latest AMD Ryzen mobile processors have 20MB of combined L3 and L2 cache and support up to 24 PCIe Gen4 lanes (16 for a discrete GPU) – just what’s needed to pair well with something higher than a midrange card.

Companies do regularly engage in all sorts of shady activities that boost their bottom lines while hurting consumers and influencing the choices we make every time we walk into a Best Buy with money in our wallets. But no, Intel and Nvidia probably aren’t responsible for the slow adoption of AMD CPUs by OEMs. AMD has spent the past few years rebuilding its reputation and creating processors that truly compete with Intel in the mobile space and can keep up with the powerful GPUs Nvidia makes for laptops.

The Ryzen 4000 series was very good, but not quite ready to compete in the areas that matter most to gamers and gaming laptop OEMs. The Ryzen 5000 series, if OEM adoption is any indication, is going to be a whole different beast, and it will likely show up in the kinds of high-end gaming laptops the 4000 series never made it into. Nvidia and Intel have nothing to do with it.
