Hi, this is Wayne again with a topic “Spiderman Remastered — Testing on an i5-6500 + RX 6600 8GB — 1080p Benchmark — Will It Play?”.
Hello and welcome to Tech Deals. If ever there was a game that demonstrates the difference between "will run" and "will run really well," this is it: Spider-Man Remastered, the 2022 remaster of the 2018 game of the same name, at two different detail settings, at 1080p, played on an eight-year-old CPU and a two-year-old graphics card. Do you have something from 2015 or 2016? Do you want to drop in a $230 RX 6600 and get an awesome, amazing gameplay experience in new AAA games? Well, you probably need a computer upgrade, because this is not going to do it. But it will run well enough that (a) you can do it and make it work, (b) you can think it runs really well when it really doesn't, and (c) I'm going to crush all your hopes and dreams here in a minute and show you what an i7-8700K does, because holy smokes, the difference in the frame time graph is amazing. You're looking at High detail right now; Low detail will be up in just a minute. And if you look at the upper left-hand corner, this is very important: the CPU is taxed to capacity. It's nearly at 100%, with four cores and four threads, no hyper-threading on this i5. Basically, all four cores are in full use. The graphics card is bored.
Not only is the graphics card not at 100% usage, its clock speed is throttling down as well. One of the energy-conservation features of AMD's cards is that they don't just sit at a lower utilization percentage; they will actually lower their voltage and clock speeds to save on electricity. Take a look at the power consumption here: we're pulling between 30 and 50 watts of power most of the time from our RX 6600.
This is a graphics card with an 8-pin PCI Express power connector, and yet it is basically wasted here. This computer is unable to make effective use of this graphics card. Now, we are playing at 1080p High detail. If you crank it to 1440p Ultra, to be sure, the graphics card utilization will go up, but it will become a worse experience, because the frame time graph will get worse. The overall game performance will get worse, because there's more data to work with at 1440p Ultra, and the i5-6500 is already fully loaded.
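The diagnosis being made here, CPU pinned near 100% while the GPU loafs along at partial usage and low power, can be sketched as a simple rule of thumb. The function and thresholds below are my own illustration of that logic, not something from any real monitoring tool:

```python
def classify_bottleneck(cpu_util, gpu_util, threshold=95.0):
    """Rough heuristic: whichever component sits pinned near 100% is the limiter.

    cpu_util / gpu_util are utilization percentages (0-100) sampled while the
    game runs. The 95% threshold is illustrative; real overlays average many
    samples before drawing a conclusion.
    """
    if cpu_util >= threshold and gpu_util < threshold:
        return "CPU-bound"   # the i5-6500 case: CPU maxed out, RX 6600 bored
    if gpu_util >= threshold and cpu_util < threshold:
        return "GPU-bound"   # raising resolution/detail pushes you this way
    if cpu_util >= threshold and gpu_util >= threshold:
        return "balanced"
    return "neither saturated (frame cap, engine limit, etc.)"

# The scenario in the video: four cores near 100%, GPU idling around 45%.
print(classify_bottleneck(99.0, 45.0))  # -> CPU-bound
```

Cranking to 1440p Ultra raises `gpu_util`, which is why utilization goes up, but it does nothing to relieve the saturated CPU, which is why the experience gets worse rather than better.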
Really, you need a new CPU. But I have no doubt that there are people out there who say, "Well, I've got this CPU. It loaded up, it worked. I don't understand, what are you talking about? My frame rate may not be 60 frames per second, but it's playable. I'm getting in the 40s to 50s.
My 1% lows are in the 30s. I can control the game. Heck, this is better than a PlayStation 4. What you talking about, Willis?" This is what we're talking about.
This is an i7-8700K at 1080p High detail, and look at that frame time graph. It is smooth as butter. Now, I can hear you all already saying, "But it's not the same video card." It doesn't matter; neither graphics card is being fully utilized. The limitation here is the CPU, not the graphics card. We could stick an RX 6600 in here and we'd be getting close to, if not quite, the performance of the RTX 3090 at 1080p. But even if it was running a bit slower, the frame time graph would still run smooth, because it is your CPU, not your GPU, that shapes that frame time graph. Moving right along to Low detail on the RX 6600 and i5-6500, you'll notice that the frame rate doubles. We are now getting well over 60 frames per second; you'd be getting about 60 if you were at Medium detail or tweaked and customized it a bit. Currently it's 74 to 75 frames per second average, with a 1% low of 39.
This, mind you, is still much slower than the bottom benchmark. Frame time graphs, people, frame time graphs. This is what controls the perceived smoothness of a game, and while, yes, the frame rate is better on Low detail, the frame time graph is all over the place, and it will always be all over the place, because you do not have enough cores and threads. One question some of you may ask in the comment section: what about an i3-13100? That's four cores and eight threads, and that's Raptor Lake, so it would run this much better.
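The average FPS and the 1% low quoted throughout the video both come from per-frame times. As a minimal sketch of how benchmark overlays typically derive them (the function name and the "average of the slowest 1% of frames" definition are my assumptions; some tools use the 99th-percentile frame time instead):

```python
def fps_stats(frame_times_ms):
    """Compute average FPS and the 1% low from a list of per-frame times (ms).

    The 1% low here is the average frame rate over the slowest 1% of frames,
    which is why a spiky frame time graph drags it down even when the
    overall average still looks healthy.
    """
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, len(worst) // 100)                 # slowest 1%, at least 1 frame
    low_1pct_fps = 1000.0 * n / sum(worst[:n])
    return avg_fps, low_1pct_fps

# A mostly steady 13.4 ms cadence with occasional 25 ms stutters: the average
# barely moves (about 74 FPS), but the 1% low drops to 40 FPS.
avg, low = fps_stats([13.4] * 990 + [25.0] * 10)
```

This is exactly the pattern in the Low-detail run: a 74-75 FPS average can coexist with a 1% low in the 30s when the frame time graph is all over the place.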
Wouldn't it? Yes, it would, because it has eight threads versus the four threads we're looking at here, and it has much higher per-core performance: higher IPC, clock speed, etc. So yes, it would run better. Now, I do not think any of you should be buying i3s in 2023 to play AAA games. I think that is incredibly short-sighted and foolish.
The i5-13600K is tremendous value for the money. It can be purchased for under $300, and with 14 cores and 20 threads, it is the new deal in terms of price-to-performance in the PC gaming space. Interestingly enough, even with an RTX 3090 Ti, the i5-13600K is still the bottleneck at 1080p High detail; notice the 3090 Ti is not at 100% usage. Really, all these fancy new graphics cards are kind of wasted if you're playing at 1080p. But if you've got a 1440p high-refresh-rate monitor, a 4K monitor, or an ultrawide 1440p monitor, then yes, they're wonderful; otherwise, they're a complete waste of money. The RX 6700 XT, for $360, is by far the deal if you're looking for something in the mid-range space with premium 1080p performance, just not on an i5-6500.
Thank you all so much for watching. Let me know what you thought of this, including the inclusion of two more modern CPU benchmarks in this video. I do believe that's a first in this type of video series. I rarely, rarely do that, but I can do more of it if you guys enjoyed it.
Now, I do not have an RX 6600 installed on a more modern CPU like that in this game to show you, or of course I would have shown you that. An RTX 3090 Ti is, of course, not comparable whatsoever, but I do think it's a fair comparison, considering that neither graphics card was being fully utilized, so you're seeing the true CPU performance and not what the graphics card can do. Thanks so much for watching. I will see all of you next time.