
Content

Link for email users: https://www.youtube.com/watch?v=5OyyC-SE-Fs

Hi all,

This is our second "Patrons Ask GN" video! As a reminder, this operates on something of an honor system. It is unlisted and exclusive content for Patreon supporters. You have been a major help for us over the past year, and we hope that this helps offer something in return.

You can find it above. Timestamps below:

00:17 - TigerOne: "Most titles like Battlefield don't have benchmarks in multiplayer because of the inability to reproduce results consistently. If we were to limit the variance on the network (say, netcat'ing the packets on to the interface or something), would there be any other obstacle to producing such a "benchmark as a service"?"
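The reproducibility problem TigerOne describes can be quantified with run-to-run variance. Below is a minimal sketch (with made-up FPS numbers, purely illustrative) computing the coefficient of variation across repeated runs; a multiplayer pass with high CoV can't serve as a reliable benchmark even if the network is pinned down:

```python
from statistics import mean, stdev

def coefficient_of_variation(fps_runs):
    """Run-to-run spread as a fraction of the mean average FPS."""
    return stdev(fps_runs) / mean(fps_runs)

# Hypothetical numbers: a scripted single-player pass vs. a live 64-player server.
singleplayer = [121.4, 120.9, 121.7, 121.1, 120.8]
multiplayer = [118.2, 97.6, 131.0, 104.5, 122.3]

print(f"single-player CoV: {coefficient_of_variation(singleplayer):.1%}")
print(f"multiplayer CoV:   {coefficient_of_variation(multiplayer):.1%}")
```

With numbers like these, the scripted pass lands well under 1% variance while the live-server pass is over 10%, which is why multiplayer results generally aren't publishable as benchmarks.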

03:32 - ZDG: "Is this a Steve sighting? Ouch about the shoulder."

04:48 - Red Mage Cecil: "GN Staff @buildzoid specifically - With these new AMD APUs out and the iGPU being (from the looks of things) overclocking friendly, and considering people will be putting these things into budget B350 boards... what could we be looking at in terms of stress/temps to these un-heatsinked 4+2 VRMs if people want to push these iGPUs?"

07:27 - Notwist: "Hi @Steve Burke, with GPU Boost 3.0, overclocking and "winning" the GPU lottery seems to be less variable between cards. But is there some good method to determine whether somebody "lost"? How would somebody determine whether their card is clearly in the sub-par performance bracket? Or are the margins between a good and bad card so narrow in real world usage that this whole winning/losing concept is only for users chasing high 3DMark scores?"
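One rough way to answer Notwist's "did I lose?" question is to compare a card's sustained boost clock against a sample of clocks reported by other owners of the same model. The sketch below is a hypothetical illustration (the clock figures are invented, and the z-score thresholds are arbitrary cutoffs, not anything GN has endorsed):

```python
from statistics import mean, stdev

# Hypothetical user-reported sustained boost clocks (MHz) for one card model.
reported = [1923, 1949, 1936, 1911, 1962, 1887, 1974, 1936, 1949, 1923]

def lottery_verdict(my_clock_mhz, sample):
    """Place one card against a reported-clock sample via a simple z-score."""
    z = (my_clock_mhz - mean(sample)) / stdev(sample)
    if z < -2:
        return "clearly below typical (lost the lottery)"
    if z > 2:
        return "clearly above typical (won the lottery)"
    return "within normal variation"

print(lottery_verdict(1835, reported))  # far below the sample mean
print(lottery_verdict(1936, reported))  # right at the sample mean
```

The caveat baked into the question still applies: if the spread between "good" and "bad" samples is only a few dozen MHz, the real-world FPS difference is usually within noise.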

09:05 - NULLOBANDITO: "@Steve Burke #askgn-questions Do you think the release of the new GTX 20 series cards will have any impact on miners, or rather is there a chance that the prices are low enough at launch that people will actually be able to buy new graphics cards?"

10:14 - "@Steve Burke #askgn-questions From my understanding, 6th/7th/8th gen Intel CPUs have 16x PCI-E 3.0 lanes, then a DMI 3.0 interface to the chipset (roughly equal to 4x PCI-E 3.0), whereas Ryzen CPUs have 24x PCI-E 3.0 lanes (16x shared between 2 slots, 4x for M.2 and 4x for the chipset). Are there any cases on the Intel platform where you can saturate the DMI interface and cause a performance drop that wouldn't be seen on Ryzen due to the more dedicated storage lanes?"
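The arithmetic behind this question is easy to sketch. DMI 3.0 is roughly equivalent to a x4 PCI-E 3.0 link, about 3.94 GB/s after 128b/130b encoding overhead, and every chipset-attached device shares it. The device throughput figures below are assumed round numbers, not measurements:

```python
# Back-of-the-envelope check on DMI 3.0 saturation (assumed figures, not measured).
PCIE3_LANE_GBPS = 8 * (128 / 130) / 8  # 8 GT/s, 128b/130b encoding: ~0.985 GB/s per lane
DMI3_GBPS = 4 * PCIE3_LANE_GBPS        # DMI 3.0 is roughly a x4 PCIe 3.0 link: ~3.94 GB/s

# Hypothetical devices hanging off the Intel chipset (all share the DMI link):
nvme_ssd_read = 3.2   # GB/s, a fast PCIe 3.0 x4 NVMe drive
sata_ssd_read = 0.55  # GB/s
ten_gbe_nic = 1.25    # GB/s

total = nvme_ssd_read + sata_ssd_read + ten_gbe_nic
print(f"DMI 3.0 budget: {DMI3_GBPS:.2f} GB/s, concurrent demand: {total:.2f} GB/s")
print("saturated" if total > DMI3_GBPS else "headroom")
```

So yes, in principle a chipset-attached NVMe drive plus other chipset I/O running concurrently can exceed the DMI budget, whereas on Ryzen the CPU-attached M.2 lanes keep NVMe traffic off the chipset link. In practice, sustained concurrent loads like this are rare on desktops.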


Files

Patrons Ask GN 2: GTX 20 Series Impact on Mining

Our second episode of "Patrons Ask GN" - thanks again for your ongoing support! Timestamps are listed above. Host: Steve Burke | Video: Andrew Coleman

Comments

Anonymous

#askgn-question Maybe source a GPU from a miner that's been used for a while, for degradation testing.

Anonymous

While it may sound interesting to test, Steve has mentioned a fair number of times that components like GPUs are good until they break. The test would be largely redundant unless you're taking the card apart and going beyond the level of detail the channel focuses on; that would be interesting, but maybe not for everyone.

Anonymous

Fair point

Anonymous

I want to put together a top-end AM4 build. I have a CPU and GPU, but really good DDR4 memory is scarce. Are the fabs being converted to DDR5, and are AM4 users getting left behind?