AMD vs Nvidia: Which GPU is best for you?

The GPU is arguably the most important component in a gaming PC, and your choices start with this simple question: should you choose AMD or Nvidia? Though Nvidia has historically dominated the graphics scene, today the two vendors are perhaps more equally matched than ever before. Still, there are key differences between AMD and Nvidia GPUs which you should consider before making a decision.

Performance and Pricing

Unlike the prior RX 5000 series, the new RX 6000 series makes a serious attempt to dethrone Nvidia at the very top. Here’s a rough tier list:

GPU Tier List      AMD                    Nvidia
Bragging Rights    RX 6900 XT             RTX 3090
Top End            RX 6800 XT, RX 6800    RTX 3080
Upper High End     RX 6700 XT             RTX 3070, RTX 3060 Ti
Lower High End     RX 5700 XT, RX 5700    RTX 3060

It should be noted that this list is based on benchmarks conducted at 1440p with ray tracing, DLSS, and related settings disabled. The resolution is relevant because the margin between RX 6000 and RTX 3000 GPUs depends on it: as the resolution increases, Nvidia gains ground on AMD. I focused on 1440p since it sits right between 1080p and 4K. If you want to min-max framerates, AMD tends to do better at lower resolutions and Nvidia at higher ones. That said, there's no reason you can't use a 6800 XT at 4K or a 3080 at 1080p, since the difference is only a handful of frames either way.

Unlike previous generations, AMD is not positioning itself as a budget alternative; consequently, the RX 6000 series is the most expensive generation of AMD GPUs in a long while. AMD still represents better bang for buck, but only barely, so value alone shouldn't be the deciding factor between Nvidia and AMD.

Nvidia provides better options in the "Lower High End" tier thanks to the introduction of the RTX 3060. The RX 5000 series is certainly fine, but it lacks major features the RX 6000 series introduced. Neither vendor, however, offers anything below $300 in the current generation, which means you have to spend hundreds of dollars for the newest GPUs. If you want to spend less money for less performance, you have to buy an older GPU.

Some other tidbits: AMD tends to offer a little more memory throughout the stack, though it's not yet clear whether this will actually matter. Nvidia's power consumption is usually much higher than AMD's, and since performance is similar, Nvidia's efficiency is also lower, sometimes by a wide margin.

Ray Tracing and Upscaling Technology

Ray tracing is the next big thing in gaming, and Nvidia was the first to support it starting in 2018 under the RTX brand. AMD finally introduced ray tracing with the RX 6000 series, but the performance just isn’t as good as Nvidia’s. Both AMD and Nvidia GPUs lose frames when enabling ray tracing, but RX 6000 GPUs lose far more. This isn’t necessarily a critical loss for AMD, however, since ray tracing is supported in very few games and sometimes doesn’t improve visual quality very much. It’s difficult to tell how fast the industry will adopt ray tracing, so I can’t say whether or not AMD’s poorer performance really will matter.

Upscaling technology is the other big development. Alongside ray tracing support, Nvidia introduced DLSS (deep learning super sampling), which renders the game at a lower resolution and then uses machine learning to upscale the image so it looks close to native quality while running faster. However, just like ray tracing, DLSS is not present in many games because it requires a high level of support from developers and Nvidia. In the future we should expect more games to utilize DLSS, since the popular Unreal and Unity engines support it natively. It appears that developers will soon be able to support DLSS easily, but we will have to wait and see.

AMD, meanwhile, doesn't really have a DLSS-like technology of its own. It has been teasing FidelityFX Super Resolution, but there are basically no details about its visual quality or performance. All AMD has confirmed is that it will arrive before the year ends and that it doesn't use machine learning; that second point hardly needed confirming, since RX 6000 GPUs don't have machine learning hardware. AMD does have Radeon Image Sharpening (RIS), but that is a much more primitive feature by comparison.

Software Features

Another distinct advantage Nvidia has over AMD is NVENC. I'll spare you the technical details; all you need to know is that Nvidia GPUs usually record game footage at a higher quality than AMD GPUs. You aren't required to use Nvidia's first-party Shadowplay software to use NVENC, either: third-party programs like OBS (one of the most popular recording and streaming applications) support the feature.

AMD, however, does have two unique features which Nvidia has never tried to emulate: Radeon Chill and Radeon Boost. Radeon Chill was introduced back in 2016, when AMD GPUs were very power inefficient; it lowers performance (and by extension, power consumption) when you're not moving much, so if you're really concerned about power efficiency, it might be a worthwhile feature. Radeon Boost, introduced in 2019, is more broadly applicable: it dynamically lowers the resolution when you move in order to increase the framerate, and since you're moving around, you probably won't notice the resolution decreasing. Sadly, neither feature works in every game. Chill apparently works in "most games" except Windows Store games, and Radeon Boost only works in a dozen or so titles.

G-Sync and FreeSync

I should also mention G-Sync and FreeSync. These are anti-screen-tearing technologies, and when they were introduced, they were fairly simple first-party standards. Nowadays, there are multiple versions of each. G-Sync comes in two forms: the proprietary version using an FPGA module (which increases monitor costs significantly) and a version based on Adaptive-Sync, another anti-screen-tearing technology which is vendor agnostic. FreeSync is easily the worst offender for version sprawl: there is regular FreeSync, FreeSync 2 (which is seemingly discontinued), FreeSync Premium, and FreeSync Premium Pro. Generally speaking, the different tiers reflect the quality of the monitor rather than the technology itself; they're like the HDR labels monitors have been shipping with recently. Both technologies do more or less the same thing.

Finally, the race is also even when it comes to recording game footage. Both vendors offer GPU-based recording software: ReLive for AMD, Shadowplay for Nvidia. AMD once had no Shadowplay alternative at all, but ReLive was introduced some years ago. Aside from NVENC, Nvidia no longer has a significant advantage over AMD for recording games.

Driver Suite

This section focuses not on driver stability or performance, but on the driver suite's UI and features. Both Nvidia and AMD of course offer first-party software so that you can customize your experience, and many of the features above can only be enabled or customized through these suites.

AMD's driver suite requires no login of any kind and has a plethora of settings. Most of the graphics settings don't appear to be very useful, however; the only ones I would adjust are tessellation and Enhanced Sync (a software mitigation for screen tearing). There are also some color and display related settings I have never seen any reason to use, except for FreeSync and Virtual Super Resolution. On the other hand, there are tons of useful settings, like those for ReLive, Radeon Boost, Radeon Chill, and Wattman, a built-in overclocking tool which I personally really like, though not everyone feels the same way. Though AMD's suite could use a trim, overall it's quite useful.

When it comes to UI design, AMD’s driver suite looks good and performs well. It has actually been a primary goal of AMD’s since 2015 to deliver modern looking driver suites; you might even think AMD has gone a little overboard. Browsing from menu to menu and from option to option is fast and happens at a high framerate. There are a few themes to choose from and they all look pretty nice.

Nvidia, on the other hand, has gone a very different direction. When you install Nvidia drivers, you get two choices: install just the driver suite or install the driver suite with GeForce Experience. I’ll get to GeForce Experience in a moment, but first, the driver suite. It is atrocious. It really looks like it hasn’t been redesigned since the XP days. When going from one menu to another, or changing a setting, there’s usually a good second or two of lag. How could there be lag when this UI is so barebones?

But perhaps the worst thing about the basic driver suite is that you don't get all of the features and settings. To use features like Shadowplay and Ansel, you need GeForce Experience, which is more of a media-oriented extension of Nvidia's driver suite. It requires you to log in with an account, which is pretty annoying, even if it's only once. But if you don't want Shadowplay, Ansel, or the other media features, GeForce Experience is almost entirely useless. GeForce Experience also doesn't offer the same customization options as the base UI, because it's not a replacement for it.

AMD is the clear winner when it comes to driver usability, though this probably won’t matter to most people, just those who like to tinker around with their GPU.

So, which should you go with?

It’s been about 15 years since the PS3 and the Xbox 360 faced off, and I can’t help but draw parallels between that debate and this one. It’s really hard to choose between AMD and Nvidia because they’re so close. Neither one holds a significant advantage over the other. However, Nvidia has a significant head start on ray tracing and upscaling technology, which will be important in the future. But let’s consider one thing: ray tracing and DLSS are still uncommon features, and we’re already into the second generation of ray tracing GPUs. By the time ray tracing and DLSS are truly industry standard features, we might already have RTX 4000 and RX 7000 GPUs. In a couple of years, perhaps AMD will catch up to Nvidia or even beat them.

Whether you go with AMD or Nvidia, you’re getting a similar experience overall. If you have very particular needs, then you might find that one is really much better than the other. Most people, however, won’t be able to tell the difference.

Sources:

  1. "AMD Radeon RX 6900 XT Review", TechSpot.
  2. "AMD Radeon RX 6900 XT Review – The Biggest Big Navi", TechPowerUp.
  3. "AMD Radeon RX 6700 XT Review", TechSpot.
  4. "AMD Radeon RX 6700 XT Review", TechPowerUp.

TN vs IPS vs VA: Which Panel Type is the Best for Gaming?

Visuals are crucial to the gaming experience, and today there are hundreds if not thousands of gaming monitors on the market. They differ not only in resolution, physical size, and refresh rate, but also in more fundamental ways. When it comes to gaming monitors, manufacturers tend to use one of three distinct panel technologies: twisted nematic (TN), in-plane switching (IPS), and vertical alignment (VA). Each one has its strengths and weaknesses for gaming.

TN

TN monitors have been around for some time, and consequently they are usually the cheapest monitors on the market. Though this technology is old, it is legendary for amazingly low pixel response times, which is the amount of time required for a pixel to update its color, usually measured in milliseconds. At one point, only TN panels were able to deliver response times lower than 1 ms and refresh rates at 240 Hz. Although TN is no longer the leader in refresh rate, it is still a leader in response times.

To put that in perspective, at 60 FPS the average time between frames is 16.6 ms (repeating, of course). At 120 FPS it's 8.3 ms, and at 240 FPS it's just over 4 ms. At 1000 FPS, a new frame arrives every 1 ms. Sub-1 ms response times are technically overkill when TN panels can only refresh the display every 4 ms or so (that's the 240Hz refresh rate), but they significantly reduce or completely eliminate ghosting. For those unaware, ghosting is a visual artifact which occurs when pixels fail to update quickly enough, causing old colors to stick around like a ghost; the result is a smearing effect as old and new frames blend together.
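
To make the arithmetic above concrete, here is a minimal sketch of my own (purely illustrative, not from any panel vendor) that computes the frame-time budget at a given framerate and the rough framerate ceiling a panel can keep up with for a given pixel response time:

```python
# Illustrative only: frame-time budgets and a rough response-time ceiling.

def frame_time_ms(fps: float) -> float:
    """Average time between frames, in milliseconds."""
    return 1000.0 / fps

def max_clean_fps(response_time_ms: float) -> float:
    """Rough framerate above which pixels can't finish changing before the next frame arrives."""
    return 1000.0 / response_time_ms

for fps in (60, 120, 240, 1000):
    print(f"{fps:>4} FPS -> {frame_time_ms(fps):5.2f} ms per frame")

print(f"A 1 ms panel keeps up with roughly {max_clean_fps(1):.0f} FPS")
print(f"A 4 ms panel keeps up with roughly {max_clean_fps(4):.0f} FPS")
```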

However, TN's fatal flaws are color and viewing angles. The best TN panels can only cover about 100% of the sRGB color gamut, which is mediocre these days. TN does make up some ground in darker scenes, since black is the absence of color and a narrow gamut matters less there. Viewing angles are also bad; most TN panels will look horrible unless viewed more or less head-on.

The best TN gaming monitor will have a refresh rate of 240Hz, sub 1 ms response times, very poor colors, and very narrow viewing angles, and it’ll typically be one of the cheaper options on the market.

IPS

IPS is a much more recent technology, and it is almost the polar opposite of TN. Whereas TN is best known for response times, IPS has become synonymous with good colors and wide viewing angles, making it one of the most widely used technologies for mobile devices like phones and for commercially oriented displays (for instance, those used for advertisements in a mall).

IPS displays can reach the color gamut of DCI-P3, which is a bit overkill for gamers but ideal for professionals; more consumer oriented displays can easily reach sRGB’s gamut and beyond. Viewing angles are so wide that IPS displays look fine from almost any angle. IPS succeeds where TN fails.

However, IPS also has the exact opposite weaknesses of TN: poorer pixel response times and poorer detail in darker scenes. Top-end IPS displays have managed to match TN's 1 ms pixel response times, but other IPS displays can have much higher response times. For example, the display ASUS used for its last-generation G14 (a bona fide gaming laptop) had roughly a 33 ms response time, which could result in smearing starting at just 30 FPS. IPS panels also struggle to distinguish slightly different levels of darkness, so detail is lost in darker scenes.

IPS has recently overtaken TN in one key area: refresh rate. For a long time TN was the only technology that could reach 240Hz, but last year companies like Acer finally launched 240Hz IPS monitors. Before those monitors were even released, Nvidia and ASUS had already announced a 360Hz IPS display for launch by the end of the year. IPS, not TN, is now the best technology for refresh rates.

The best IPS based gaming monitor will have a refresh rate of 360Hz, sub 1 ms response times, good or great colors, and wide viewing angles. However, it will also command a high price.

VA

You might not have heard of VA, because it is a fairly niche panel technology. The technology itself is about as old as TN but it hasn’t enjoyed the same levels of popularity. For gaming, it is often considered a middle ground between TN and IPS; monitors using VA tend to be competent enough in response time, color gamut, and refresh rate.

VA panels cap out at around 90% of the DCI-P3 gamut, but this is adequate for gaming as it's well above sRGB. Furthermore, details in darker scenes are better preserved than on IPS. Viewing angles are decent and better than TN, though not quite as good as IPS; more extreme angles can cause a noticeable loss in saturation. Some VA monitors reach sub-1 ms response times, but just like IPS, there are models which do not perform nearly as well. Unfortunately, just like TN, VA is stuck at 240Hz.

The best VA gaming monitor will have a refresh rate of 240Hz, sub 1 ms response times, decent or good colors, and wide viewing angles. However, not too many VA monitors are on the market, so your choices are more limited compared to TN and IPS.

Sources:

  1. “How to Choose Between TN, VA, and IPS Panels for the Games You Play”, BenQ.
  2. “The Evolution of VA Panels”, AOC.
  3. “IPS”, LG.
  4. “IPS monitors have finally reached the coveted 1ms response time”, PCGamer.
  5. “NVIDIA unveils a 360Hz ‘world’s fastest’ esports display”, Engadget.

AMD vs Intel: Which is Best for Gaming?

This is a question which will not die until desktop gaming dies. It’s not a bad question by any means, but the typical answers are lacking and might even be bad advice. The consensus today is that AMD is the best for gaming, and before that it was Intel for many, many years. Now, to answer this question, I could just take all the Ryzen 5000 reviews and average out the results, but there’s a clear problem with that: the reviews aren’t consistent. They do consistently agree that Ryzen 5000 CPUs are in the lead, but they can’t agree by how much. Yet most reviewers were pretty consistent on RX 6000 vs. RTX 3000 around the same time, so what gives? Solving this conundrum is key to understanding which gaming CPU is best. Yes, I am going to get very technical in this article, so if you just want to know which one is better then feel free to skip to the end.

Why are gaming CPUs good at gaming?

Firstly, what even makes a CPU good at gaming? AMD's last generation of Ryzen CPUs, the 3000 series, was well ahead of Intel in almost everything, with the very notable exception of gaming. Games are, by nature, highly dependent on low latency, and they tend not to use many resources. A long 4K video taking about an hour to render is pretty good, but gamers want a brand-new frame at least every 33 milliseconds (which translates to 30 FPS), and most PC gamers want a new frame every 16 ms (60 FPS). There is very little time to create a single frame, but thankfully games don't ask very much of the CPU; to get the job done, the CPU only needs to handle a small amount of data, do a little math on it, and communicate the result to the rest of the system.

Ryzen 3000, despite its beefy cores, suffered in gaming because of one of its key design features: the core complex (CCX). The CCX is one of the building blocks of modern AMD CPUs, and from 2017 to 2020 it was a group of 4 cores (or 2 cores in some very low-end CPUs). A CCX contains resources like cache which would normally be shared across the entire CPU. To add more cores, AMD has to add more CCXs, which works very well for tasks that demand large amounts of resources. However, the CCX has a fatal flaw: when a core in one CCX needs to communicate with a core in another CCX, latency is high. Non-shared cache was still better than having less cache (because asking the RAM for something takes forever), but shared cache would have been so much better.

This is the key reason why AMD struggled to catch up to Intel in gaming despite introducing new instructions, new features, bigger cores, and more cores. Though Intel did not change the design of its cores for six years, its CPUs stayed ahead because they had very low latency from core to core and from core to RAM, and they had sufficient resources. Any AMD CPU with more than 4 cores had to use multiple CCXs, which severely impacted gaming performance whenever more than 4 cores were needed. While Intel's old Skylake cores could not match AMD's modern Zen 2 cores in most applications, if latency was at all important, Intel could stay ahead or at least remain competitive.
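
If you want a feel for that cross-CCX penalty on your own machine, here is a rough, Linux-only sketch of my own (not AMD's or any reviewer's methodology): it pins two processes to chosen cores and times a message ping-pong between them. The core IDs are hypothetical, and pipe plus scheduling overhead dominates the absolute numbers, so only the relative difference between a same-CCX pair and a cross-CCX pair is meaningful.

```python
# Rough core-to-core latency probe (Linux only). Pipe and scheduling overhead
# dominate, so compare the two results to each other, not to published figures.
import os
import time
from multiprocessing import Pipe, Process

def pong(conn, core):
    os.sched_setaffinity(0, {core})      # pin this worker process to one core
    while True:
        msg = conn.recv()
        if msg is None:
            break
        conn.send(msg)                   # bounce the token straight back

def measure(core_a, core_b, rounds=20000):
    parent, child = Pipe()
    worker = Process(target=pong, args=(child, core_b))
    worker.start()
    os.sched_setaffinity(0, {core_a})    # pin the main process too
    start = time.perf_counter()
    for _ in range(rounds):
        parent.send(1)
        parent.recv()
    elapsed = time.perf_counter() - start
    parent.send(None)
    worker.join()
    return elapsed / rounds * 1e6        # microseconds per round trip

if __name__ == "__main__":
    # Hypothetical numbering: cores 0 and 1 share a CCX, cores 0 and 4 do not.
    print(f"same CCX : {measure(0, 1):.1f} us per round trip")
    print(f"cross CCX: {measure(0, 4):.1f} us per round trip")
```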

In 2020, AMD turned the tables with Zen 3. It enlarged the CCX to 8 cores, and consequently all Ryzen 5000 CPUs with 8 cores or fewer have just one CCX and a single block of cache. Now AMD is on top in gaming thanks to the combination of low latency and a massive amount of cache. Meanwhile, Intel's new Rocket Lake-based 11th gen actually lost ground in some games compared to the previous 10th gen, because the new cores increased latency significantly.

Why are the benchmarks inconsistent?

A good gaming CPU has low latency and the right amount of resources for certain tasks. So why is the benchmarking data for this all over the place? Well, here's the other piece of the puzzle: increasing the framerate doesn't demand much, if any, additional resources from the CPU. You can test this yourself in any game of your choice. Open up MSI Afterburner or even Task Manager and monitor CPU usage and power consumption, then set a framerate cap using your driver suite or third-party software (MSI Afterburner also comes with RivaTuner, which can limit the framerate). If you steadily raise the cap, CPU usage doesn't go up very much and neither does power consumption. This is very much unlike GPUs, which have to increase resource and power consumption to achieve higher framerates, assuming there's still headroom for more frames without decreasing visual quality settings. If GPU Alpha is twice as fast as GPU Beta at "ultra quality", it'll probably be about twice as fast at "high quality". But CPU Gamma and CPU Zeta might deliver the same performance at both presets, or they might be equal at "ultra quality" and suddenly show a gap at "high quality".
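
As a minimal sketch of that experiment (assuming the third-party psutil package is installed; it reports utilization only, and power draw still needs a tool like MSI Afterburner or HWiNFO), you could log overall CPU usage once per second while raising or lowering the in-game framerate cap:

```python
# Log overall CPU utilization once per second; run it while you change the
# framerate cap in-game and watch whether the numbers actually move.
import time
import psutil

def log_cpu_usage(duration_s: float = 60.0, interval_s: float = 1.0):
    print("elapsed_s,cpu_percent")
    start = time.time()
    while time.time() - start < duration_s:
        usage = psutil.cpu_percent(interval=interval_s)  # averaged over the interval
        print(f"{time.time() - start:6.1f},{usage:5.1f}")

if __name__ == "__main__":
    log_cpu_usage()
```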

CPU gaming benchmarks are complicated by the fact that there is a limit to how many frames a CPU can prepare in each game. You can't just keep tweaking settings to decrease the load on the CPU when there are few if any such settings, and even if there were, reading and executing game code always takes some time, whether because the code is imperfect or because there's no hardware acceleration for it. The basic rule of CPU benchmarking is that as the framerate on the fastest CPU increases, the gap between it and slower CPUs also increases. Say CPU Gamma can do 240 FPS in a certain title while CPU Zeta can only do 200. Both can do 200 FPS, so if the GPU isn't fast enough to output more than 200 FPS, the two CPUs will appear to be equal. But if the GPU is fast enough for 240 FPS, then suddenly there's a noticeable gap between CPU Gamma and CPU Zeta, and this is exactly what is happening with the Ryzen 5000 reviews. Most of the reviews which find little difference between AMD and Intel are only getting 200 FPS or less on the fastest CPU, while the reviews which find a big difference are seeing the fastest CPU reach at least 500 FPS.
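
Here is a toy model of that effect, using purely hypothetical FPS figures, to show how a GPU-side limit hides the gap between two CPUs:

```python
# The framerate you observe is capped by whichever component is slower.
def observed_fps(cpu_limit: float, gpu_limit: float) -> float:
    return min(cpu_limit, gpu_limit)

cpu_gamma, cpu_zeta = 240, 200   # hypothetical: what each CPU could feed the GPU

for gpu_limit in (144, 200, 300):
    gap = observed_fps(cpu_gamma, gpu_limit) - observed_fps(cpu_zeta, gpu_limit)
    print(f"GPU limited to {gpu_limit} FPS -> gap between the two CPUs: {gap} FPS")
```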

In the most absolute sense, AMD is the clear winner for gaming. Even at its worst, Ryzen 5000 is tying Intel CPUs, and at its best it can be faster by 30% or more and approaches 1000 FPS in titles like Strange Brigade. I’ve linked to Anandtech’s review here because it gives such good insight into CPU performance and how it varies depending on test conditions. But here’s another question: does it matter? Gaming monitors only go up to 360Hz, and it’s unlikely that anyone but an esports professional can notice the difference between a frame coming every single ms vs. just every three ms. If you just want 120 FPS or even just 60, then plenty of recent CPUs can do just fine.

Pricing

If performance is pretty much fine across the board, then we need to find a different angle, and pricing is a much more interesting discussion for comparing AMD and Intel. In today's market, when supply is good, CPU pricing for prior generations is pretty good. I would regard Ryzen 1000, 2000, and 3000 as generally the best value (especially on the used market, where Intel CPUs retain their value far too well), but Intel 8th, 9th, and 10th gen can be worthwhile if you want a consistently higher framerate; previous Ryzen CPUs can struggle in certain games, and only Ryzen 3000 is truly competitive with Intel most of the time. That said, all of these CPUs should reliably manage 60 FPS gaming, but only if they have 6 cores. 4-core CPUs can perform well, but newer games are starting to demand resources that few 4-core CPUs can muster. Even a cheap 6-core CPU is a good investment.

Platform

The platform differences between the two are also really important. I would again consider AMD to be the better choice here, because AMD offers more flexibility, and the single most important factor is the Ryzen upgrade path. A budget PC gamer might decide to go with a Ryzen 5 2600X and a B450 motherboard, and down the line they can upgrade to a 2700X or, with a BIOS update, any Ryzen 3000 or 5000 CPU. Intel's upgrade path is more limited: 8th and 9th gen CPUs are confined to 300-series boards and are architecturally identical to each other, while 10th gen sits alongside 11th gen on 400- and 500-series boards, and 11th gen isn't much of an upgrade over 10th. This, not pricing, is the most important reason to go with AMD, in my opinion.

Other things

Finally, we can’t forget about things that some users might care about even if they’re not related to gaming. Intel CPUs perform well enough but AMD is significantly ahead in several types of applications; thankfully, these are usually hobby or work oriented applications for things like rendering, so casual users won’t need to worry. Intel CPUs also consume quite a bit more power than Ryzen 3000 and 5000 CPUs and consequently require better power supplies and cooling. If you want to build a small form factor PC (or if you just want less heat in general), AMD is a better bet.

Verdict: AMD and Intel are both fine

Generally speaking, I think older AMD Ryzen CPUs are the most ideal for the budget-conscious gamer, but it’s not like buying Intel is a waste of money. In this market, Intel might be your only option depending on what the supply looks like, and if you’re just gaming, then it’s honestly not a bad choice. Really, choosing between AMD and Intel is like choosing between different kinds of pizza; they’re both pizza at the end of the day and they’ll taste pretty much the same. Buy a Ryzen 5000 CPU if you need every single frame possible. Buy Intel or Ryzen 3000 if you want more than 120 FPS reliably. Buy Ryzen 1000 or 2000 if you’re comfortable with 60-120 FPS. Finally, make sure whatever CPU you buy has 6 cores. That’s really all you need to worry about, and if you follow these simple guidelines, you should have little issue gaming the way you want to.
