Games Analysis |OT| Time To Argue About Pixels And Frames!

Thanks! So does this mean that if a game can only muster up say 25 fps on average, it’ll get smoothed out to 30?

It’s meant to prevent screen tearing and motion judder, so you get a smoother and cleaner experience. There are many ways it can be put in place – some on the set side and some on the GPU driver side. I’m a little fuzzy on whether it requires specific support from the devices, like the consoles, so I was trying to find something specific from RTINGS reviews, but nothing yet.

I did find this blurb from AMD talking about it. In your example of 25 fps, I’m not sure if it would double up the frames to run at 50 Hz or if it would only double up some of them to hit 30 Hz since that’s within range.

What is Low Framerate Compensation?

Low framerate compensation (LFC) allows FreeSync technology to work when the framerate falls below the minimum refresh rate of the display. When the framerate drops below the minimum refresh rate of the display, frames are duplicated and displayed multiple times so that they can sync to a refresh rate that is within the display’s refresh rate range. For example, a display with a 60 – 144Hz refresh rate would be able to sync the frames of a game running at 40 FPS by doubling them so that the display could sync and run at 80 Hz. A display with LFC effectively results in the removal of the minimum refresh rate boundary. All displays in the FreeSync Premium and FreeSync Premium Pro tier are certified to meet mandatory LFC requirements.
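
Going off that description, here’s a minimal sketch of how the multiplier might be chosen – the function name and the logic are my own illustration of AMD’s wording, not anything official:

```python
def lfc_refresh(fps, vrr_min=60):
    """Illustrative only: duplicate frames until the effective refresh
    rate lands inside the display's VRR window."""
    if fps >= vrr_min:
        return fps, 1            # already in range, no duplication needed
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier, multiplier

# AMD's example: 40 fps on a 60-144 Hz panel -> frames doubled, panel runs at 80 Hz
print(lfc_refresh(40))   # (80, 2)
# The 25 fps case from above -> tripled to 75 Hz on this particular window
print(lfc_refresh(25))   # (75, 3)
```

By that logic 25 fps would get tripled rather than doubled on a panel whose window starts at 60 Hz, but whether a specific TV actually behaves that way is the part I couldn’t confirm.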


Oh wow, this is pretty complicated, but it sounds pretty cool! Thanks for taking the time to look into it; I appreciate that, the link, and the explanation :smiley:

Edit: in case someone else is wondering, this Reddit post seems to have a good deal of information.

The public release for The Coalition’s UE5 tech demo has been delayed to Monday. :confused:

Just to add to what BRIT posted. LG C1 OLED Review (OLED48C1PUB, OLED55C1PUB, OLED65C1PUB, OLED77C1PUB, OLED83C1PUA) - RTINGS.com

"Update 06/29/2021: The LG CX has been retested, and low framerate compensation is now working properly. The LG CX and LG C1 are both nearly tear-free below 20Hz when connected to an HDMI 2.1 source.

The LG C1 supports FreeSync and HDMI Forum VRR and is NVIDIA-certified as G-SYNC-compatible, and we didn’t experience any issues. To enable VRR, turn on Game Optimizer and make sure VRR and G-Sync is toggled for G-SYNC and HDMI Forum and AMD FreeSync Premium is ‘On’ for FreeSync. Like the LG CX OLED, the VRR range is extended to a minimum of 20Hz when using an HDMI 2.1 source. With HDMI 2.0, it begins at 40Hz."


@Stefrah

About the One X memory bus: it has a typical memory bus layout – all segments are addressable and offer the full speed. It’s a 384-bit memory bus with 12 GB of memory total, of which 3 GB is used by the OS and 9 GB is accessible to games.

It can be thought of as 12 channels of 32 bits each. Here is a rough picture to help you visualize how the full speed is available to games.

[image: rough diagram of the 12 memory chips, with the OS portion shaded and the game portion unshaded]


Yes, let’s move this discussion over here.

Memory bandwidth on the One X is simple: one memory pool, and all memory chips have the same connection. If a game allocates 4 GB of memory distributed over all the memory chips, it can access it at 326 GB/s.
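
For reference, the 326 GB/s figure falls straight out of the bus math, assuming the commonly quoted 6.8 Gbps GDDR5 speed for the One X:

```python
# Back-of-the-envelope One X bandwidth (6.8 Gbps per pin is the commonly
# quoted GDDR5 data rate for this console, so treat it as an assumption).
bus_width_bits = 384                 # 12 channels x 32 bits
data_rate_gbps = 6.8                 # effective transfer rate per pin
bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
print(bandwidth_gb_s)                # 326.4 GB/s, whichever 4 GB the game touches
```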


Not true, depending on the TV.

LG TVs already activated VRR down to 20 fps using low framerate compensation (a 20 fps signal triggers the screen to update at 40 Hz and each frame is shown twice), but a recent update added multipliers beyond 2, which makes the VRR range cover pretty much any framerate (for example, a 10 fps signal triggers the screen to update at 40 Hz and show each frame 4 times).

RTINGS has tested this new update; essentially, LG TVs are now judder/tear free no matter what you throw at them, even below 20 fps.
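
A quick sanity check on the pacing side of that (the 40 Hz target is taken from the post above; the arithmetic is just illustrative):

```python
# With LFC every source frame gets the same on-screen time, which is why
# there's no judder even at very low framerates.
for fps, multiplier in [(20, 2), (10, 4)]:
    refresh_hz = fps * multiplier
    hold_ms = multiplier * (1000 / refresh_hz)
    print(f"{fps} fps -> {refresh_hz} Hz, each frame held {hold_ms:.0f} ms")
# 20 fps -> 40 Hz, each frame held 50 ms
# 10 fps -> 40 Hz, each frame held 100 ms
```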


No, VRR doesn’t smooth the framerate like that. Though a game ranging from 25 to 30 fps would indeed feel like a constant 30.

The reason 25-29 fps feels so bad on a non-VRR TV isn’t that the framerate is that much lower than 30, but that it doesn’t divide evenly into the screen’s 60 Hz refresh rate, so you either get tearing or an inconsistent amount of time each frame is displayed; neither is a particularly good outcome.

VRR solves this by letting all 25 frames be shown properly paced, so it still looks very smooth.
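
Here’s a rough way to see the difference in numbers – a simplified vsync model, not any particular TV’s behaviour:

```python
import math

# 25 fps on a fixed 60 Hz display with vsync: each frame waits for the next
# refresh, so hold times bounce between 2 and 3 refreshes (33.3 ms vs 50 ms).
refresh_hz, fps = 60, 25
present = [math.ceil(n * refresh_hz / fps) for n in range(6)]   # refresh index each frame lands on
holds_fixed = [(b - a) * 1000 / refresh_hz for a, b in zip(present, present[1:])]
print([round(h, 1) for h in holds_fixed])    # [50.0, 33.3, 50.0, 33.3, 33.3] -> judder

# The same 25 fps on a VRR display: every frame is held for exactly 1/25 s.
print([1000 / fps] * 5)                      # [40.0, 40.0, 40.0, 40.0, 40.0]
```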


That calculation shows the full bandwidth of the 12 GB of memory. The game doesn’t get to use all of that, though, right?

Edit:

To be clearer, the distinction here is between bandwidth and speed.

The bandwidth of a pool of memory is speed × the amount of RAM.

So the bandwidth you spec for the whole pool of memory (12 GB) is not the bandwidth of part of that memory (such as the game allowance, 9 GB in the case of the One X).

The game gets the full 326 GB/s. The non-shaded portion of those memory chips is what the game can use: the full 12 lanes at 32 bits each, at full speed.

The OS uses 1/4 of each memory chip across all 12 lanes at 32 bits at full speed. Games use 3/4 of each memory chip across all 12 lanes at 32 bits at full speed.

The OS is using the same bus, right?

The shaded portion is OS memory; it runs 12 lanes × 32 bits at full speed for 326 GB/s. The non-shaded portion is game memory; it runs 12 lanes × 32 bits at full speed for 326 GB/s.

It’s the full 326 GB/s.
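
Here’s a tiny sketch of what’s being described – the 1 GB-per-chip figure and the even split are my own simplification of the diagram, with the 3 GB / 9 GB numbers from the earlier post:

```python
# Twelve 1 GB chips on a 384-bit bus. The OS slice and the game slice each
# take a portion of *every* chip, so both span all 12 x 32-bit channels.
chips = 12
per_chip_gb = 1.0
os_share, game_share = 0.25, 0.75          # 3 GB OS / 9 GB games

print(chips * per_chip_gb * os_share)      # 3.0 GB for the OS
print(chips * per_chip_gb * game_share)    # 9.0 GB for games
# Either partition, when accessed, is striped across all 12 channels,
# so peak bandwidth is the same ~326 GB/s for the OS and for the game.
```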

I haven’t forgotten that before launch there were a few “insider” reports of expensive HBM2 memory being used in the PS5. But ultimately it ended up with lower bandwidth than the Series X.

Of course. If a game or an app accesses one of its assets that is partitioned over all 12 memory chips, it gets the corresponding bandwidth, which is 326 GB/s. Nothing more, nothing less.

If I understand you correctly, you’re saying that since the OS and game allocations are spread across all 12 channels, the game can theoretically access the entire bandwidth? But the OS is always going to be taking from that bandwidth. It’s the same bus. What am I missing?

It’s beyond theoretical. It’s practical. It’s what happens.


It’s what would happen if the OS were turned off, right? But the OS is always using that same bus, taking potential bandwidth away from the game?

There are two different things.

Another running app or operating system process will always take resources away (memory, bandwidth, processing power), yes. But why is your operating system hogging so much bandwidth – what is this thing doing??

The other point is the different memory bus layout between the Xbox One X and the Xbox Series consoles. This has nothing to do with OS processes; it’s entirely at the hardware level. If a game accesses memory that sits on the slower chips, it gets less bandwidth.
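
For contrast, here’s roughly how the Series X layout works out – the 14 Gbps GDDR6 speed and the 6 × 2 GB + 4 × 1 GB chip mix are the commonly quoted figures, so treat the exact numbers as my assumption:

```python
# Series X: ten GDDR6 chips on a 320-bit bus, but six chips are 2 GB and
# four are 1 GB. The "fast" 10 GB is striped across all ten chips; the
# remaining 6 GB lives only on the six larger chips.
data_rate_gbps = 14
print(10 * 32 / 8 * data_rate_gbps)   # 560.0 GB/s for the 10 GB GPU-optimal pool
print(6 * 32 / 8 * data_rate_gbps)    # 336.0 GB/s for the 6 GB standard pool
```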


It uses the same bus, but the OS does not always operate. The details of CPU/GPU reservations are beyond what needs to be discussed for memory bandwidth.

For completeness, here are the Xbox One generation resource reservations (taken from my post at B3D):

Around early 2014, Microsoft changed the System Reservation from a time-sliced 10% of the GPU to 2%. The initial 10% was broken down into 2% for Voice and 8% for Video, so the change gives developers use of 98% of the GPU.

Around late 2014, Microsoft changed the System Reservation from 100% of CPU Core #7 to only 20% to 50% of it, giving developers use of the remaining 80% to 50% of Core #7.
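
Putting those reservations into rough numbers (just restating the percentages above as fractions available to games):

```python
# GPU: the system reservation dropped from a time-sliced 10% to 2% in early 2014.
print(1.0 - 0.10, 1.0 - 0.02)            # 0.9 0.98 of the GPU for games

# CPU Core #7: from fully reserved to an OS share of 20-50% in late 2014,
# leaving games 50-80% of that core.
print([1.0 - r for r in (0.50, 0.20)])   # [0.5, 0.8]
```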