
Content

Hey everyone, it's Rich here with your update on what's happening this week at Digital Foundry in the wake of this morning's #content meeting. A full house this week with myself, John, Tom, Oliver, Alex and Will in attendance! Here's what's cooking:

* Rich (i.e. me) is deep in GPU benchmarking and reviewing territory. Beyond the numbers, I'm interested in the advantages of DLSS 3 on midrange PCs, bearing in mind how CPU-limited we are in many games these days.

* John is finishing up work on his Zelda: Tears of the Kingdom piece, produced in collaboration with MVG! There'll be a deeper look at portable performance (which wasn't possible in the review period), Logan vs Mariko comparisons, plus how overclocking improves the outlook.

* Alex is working on a Company of Heroes tech retrospective this week, just in time for the new game's launch on consoles. This is a sponsored video produced in association with Sega - which I believe is the first such #content we've done this year. We try to make these pieces with the same level of care, attention and passion as our other videos - and we turn down a lot of sponsorship offers along the way. This one should be great!

* Oliver is finishing up an analysis of emulation on Xbox Series consoles and stands ready to follow up on whatever happens at the PlayStation showcase on Wednesday.

* Tom is going to check out Lord of the Rings: Gollum to see if there are any potential stories cooking there.

Beyond that, I discussed a couple of ideas that came to me at 3am this morning. Firstly, what if frame-rate as a metric didn't exist? How would we measure performance? How would this affect the perception of relative GPU performance? Presumably, rather than expressing performance as frames completed within any given second, we would be looking at frame-time instead. This would totally transform bar charts, that's for sure, and I'd really like to include frame-time in milliseconds, at least as a new option, in the Eurogamer benchmark viewer.
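
To illustrate what that switch would mean for the numbers themselves, here's a minimal sketch - the frame-times are made up, and this is just the underlying maths rather than anything from our capture tools:

```python
# Frame-rate and frame-time are two views of the same data:
# frame_time_ms = 1000 / fps, and fps = 1000 / frame_time_ms.
frame_times_ms = [16.7, 16.7, 16.7, 50.0]  # illustrative capture: three smooth frames, one big hitch

avg_frame_time = sum(frame_times_ms) / len(frame_times_ms)          # ~25.0 ms
time_weighted_fps = 1000.0 / avg_frame_time                         # ~40 fps
naive_fps_mean = sum(1000.0 / t for t in frame_times_ms) / len(frame_times_ms)  # ~49.9 fps

print(f"average frame-time: {avg_frame_time:.1f} ms")
print(f"time-weighted fps:  {time_weighted_fps:.1f}")
print(f"mean of per-frame fps: {naive_fps_mean:.1f} (flatters how smooth this run felt)")
```

The point being that a chart built on frame-times weights heavy frames the way they're actually experienced, whereas averaging per-frame fps readings smooths them away.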

Secondly, assuming we can figure out die costs accurately, map what a 320-360mm² 7nm part would translate to in the 5nm space and factor in the implications of moving from x86 to ARM, what would an Nvidia-based Xbox or PlayStation console look like? Could it run Cyberpunk 2077 RT Overdrive? How would DLSS 2 and DLSS 3 change the picture?
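
The first step of that exercise would be simple area scaling, something like the sketch below - every factor in it is a placeholder I've made up for illustration, since logic, SRAM and analogue blocks all shrink very differently between nodes:

```python
# Back-of-envelope mapping of a 7nm console-class SoC to a hypothetical 5nm design.
# All figures below are illustrative placeholders, not real foundry data.
die_area_7nm_mm2 = 340.0      # mid-point of the 320-360mm2 range mentioned above
logic_fraction = 0.6          # assumed share of the die that is logic
sram_io_fraction = 0.4        # assumed share that is SRAM/IO/analogue
logic_shrink = 0.6            # assumed area factor for logic on the newer node
sram_io_shrink = 0.85         # assumed area factor for SRAM/IO (shrinks far less)

die_area_5nm_mm2 = die_area_7nm_mm2 * (logic_fraction * logic_shrink +
                                       sram_io_fraction * sram_io_shrink)
print(f"hypothetical 5nm-class die: {die_area_5nm_mm2:.0f} mm2")  # ~238 mm2 with these placeholders
```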

Just a couple of feverish nighttime thoughts. Feel free to discuss them with me on Discord but for now, I hope you all have a great week!


Comments

Anonymous

Turok 2008 retro play wen??

Anonymous

Daikatana retro play WEN?!

MittenFacedLass

No plans to cover Lego 2K Drive?

Anonymous

Hey guys! I'm dying to know what you all think of this recent press release from Nvidia regarding the VRAM design on the Ada Lovelace series (linked below). They're saying that because the L2 cache of the new 4060 cards is 16 times greater than that of their Ampere counterparts, it will reduce memory bus traffic by 50%. They claim that this will allow the GPU to use its memory bandwidth twice as efficiently, which means an Ada card with 288 GB/sec bandwidth would perform similarly to an Ampere card with 554 GB/sec bandwidth. Do you all think these claims hold any water? I'd be very curious to see an actual head-to-head against Ampere once the 4060 cards are released. Thanks and I hope your week is off to a great start! https://www.techpowerup.com/308795/nvidia-explains-geforce-rtx-40-series-vram-functionality#comments
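
For reference, the arithmetic behind that claim is easy to restate from the quoted figures - a rough sketch that simply takes Nvidia's 50% traffic-reduction number at face value rather than treating it as a measured result:

```python
# Nvidia's claim, restated: if the larger L2 halves memory-bus traffic,
# each GB/s of physical bandwidth does roughly twice the work.
ada_bandwidth = 288.0          # GB/s, quoted Ada figure
ampere_bandwidth = 554.0       # GB/s, quoted Ampere comparison point
claimed_traffic_reduction = 0.5

effective_ada = ada_bandwidth / (1.0 - claimed_traffic_reduction)
print(f"effective Ada bandwidth: {effective_ada:.0f} GB/s vs {ampere_bandwidth:.0f} GB/s on Ampere")
# -> 576 GB/s, roughly in line with the 554 GB/s comparison point - but only if
#    the 50% reduction actually holds across real workloads, which is the bit worth testing.
```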

The Knight Who Says Ni!

So happy to see another collaboration with MVG, my two Patreon subs working together again.

Eric Hurst

I’ve found people tend to notice input latency before they notice the visual difference of frame rate. After all, frame rate is really about input latency first, and it just happens to have a visual effect as well. I think it’ll be important to start measuring end-to-end input latency, especially since things like cloud gaming and VR will only be getting more popular, and the consequences frame rate has will be felt there more substantially (arguably).

VeryProfessionalDodo

I find the idea of a world without frame-rate metrics quite interesting. I think frame-rates are often either misconstrued or misunderstood, because they're not at all what's most important. I would say that frame-time health is the critical factor. Frame-rates can often be misleading, since a game can be perfectly fine at 60, but with some frame-time spikes it could drop to 58. Even on a VRR monitor it could "feel" horrible, but the frame-rate counter would not tell the whole story. Someone who's not well versed in what frame-times are and only barely understands the concept of frame-rate (and I have no doubt that's the majority in the console space, and a good chunk of the PC community) will think "that's weird, the frame rate is 58, so this should feel great, but it doesn't". It would also be interesting to think of frame-time health relative to the performance target. A game with a dogged lock to 30fps (with proper frame pacing) might feel a whole lot better than one varying between 45 and 60fps.
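
To put some made-up numbers on that (purely illustrative one-second captures, not real measurements):

```python
# Two invented one-second captures with near-identical average frame-rates
# but very different frame-time health.
smooth_ms = [16.7] * 58                    # close to a locked 60fps
spiky_ms = [14.0] * 56 + [100.0, 116.0]    # mostly fast frames plus two big hitches

def summarise(frame_times_ms):
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    return avg_fps, max(frame_times_ms)

for name, capture in (("smooth", smooth_ms), ("spiky", spiky_ms)):
    avg_fps, worst_ms = summarise(capture)
    print(f"{name}: {avg_fps:.0f} fps average, worst frame {worst_ms:.0f} ms")
# Both read as roughly 58-60fps on a counter, but the 100ms+ frames in the
# second capture are exactly the kind of spike that makes it "feel" horrible.
```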

VeryProfessionalDodo

I also think it's important for content creators like DF to explain to the "ordinary Joe", in simple-to-understand terms, how performance "feels". They might see the frame rate counter going from 50-60fps, but if they're not looking at the frame-time counter, they might not be aware that it will feel worse than what that metric shows. Maybe you could switch around the priority in the video, and make frame-times matter more visually than the frame-rate?