Display FrameRate second value

phil.sv Posts: 29 Player
If I enable "Display FrameRate" in the video settings I see TWO VALUES: the frame rate and another value in ms.
Nowhere can I find what this second value is, nor have I found a tutorial about the AAPG settings.
What is it, and what is a decent value for the second one?


  • frankof Posts: 1,018 Moderator
    The second value is the frame display time: how long each frame is displayed on your screen, in milliseconds (ms).
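A quick sketch of that relationship (Python here purely for illustration; this is not anything AAPG exposes): frame display time is just 1000 ms divided by the frame rate.

```python
# Illustrative only: average frame display time (ms) from a frame rate (fps).
def frame_time_ms(fps: float) -> float:
    # One second is 1000 ms, shared across `fps` frames.
    return 1000.0 / fps

print(round(frame_time_ms(60), 2))  # 16.67 -> frame time at a steady 60 fps
print(round(frame_time_ms(40), 2))  # 25.0  -> ~25 ms per frame at 40 fps
```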
  • phil.sv Posts: 29 Player
    edited August 10
    Ty for the answer but... are you sure?

    My monitor is an old 60 Hz one, not a new FreeSync one.

    Since the old graphics card never reaches 60 FPS, in theory the second value should depend on the first.

    Instead I can have 40 FPS with 20 ms, and 40 FPS with 40 ms... O.o
  • Dct.F|Levente Posts: 550 Beta Tester
    Instead I can have 40 FPS with 20 ms and 40 FPS with 40 ms.... O.o

    How? Maybe for a moment, but I have a hard time believing that you'd see that when your framerate is relatively stable.

    What I don't know is how the 2nd value is calculated. It might be the time from the last frame to the current frame. It might be the average time between frames for the last X frames. It might be the maximum time between frames in the last X frames.

    But why is this so important?

    Theory and reality are not that different. In theory.
  • frankof Posts: 1,018 Moderator
    FPS is frames per second, and a second is a long time in this context, so it's just a coarse pointer to give you an indication of what's going on.
    It won't tell you why it feels "laggy", as it can't express a stall in frame rendering as a numeric value.

    Frame display time is much better for that, as it will almost instantly tell you if a frame is displayed for considerably longer than the frames before and after it (aka a freeze frame or stutter).
    Unreal Engine uses a value smoothed over 10 frames for readability, as it would only be a blur trying to read values that change in less than 5 ms.

    BTW, there is a direct correlation between fps and display time; they can't give two different results.

    1 s = 1000 ms.
    100 fps means 100 frames displayed in one second, so each frame is displayed for 10 ms on average.
    So at a stable 100 fps the second value would read 10 ms; that's an ideal situation that will never happen IRL.
    Instead you have a bunch of frames displayed at various times whose average amounts to 100 fps:
    one frame is displayed for 5 ms, the next for 15 ms, and the average is 10 ms.
    With such a short series you won't notice a difference, but over a longer period you could have a couple of severely slower frames, like 30 ms or more, and the average could still be 100 fps.
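A small sketch of that last point (Python, with made-up numbers chosen for illustration): one 30 ms stutter frame hidden among 99 fast frames barely moves the per-second average, but it jumps out in a frame-time readout, even a 10-frame smoothed one.

```python
# Illustrative only: how an fps average can hide a single stutter frame.
# 99 frames at 9.8 ms plus one 30 ms spike still averages out to ~100 fps.
frame_times_ms = [9.8] * 99 + [30.0]  # one slow "stutter" frame at the end

avg_ms = sum(frame_times_ms) / len(frame_times_ms)
avg_fps = 1000.0 / avg_ms
print(f"avg frame time: {avg_ms:.1f} ms -> {avg_fps:.0f} fps")  # looks fine
print(f"worst frame:    {max(frame_times_ms):.1f} ms")          # the stutter

# A 10-frame smoothed frame-time display still reacts once the slow
# frame enters the window:
window = frame_times_ms[-10:]
print(f"last-10 avg:    {sum(window) / len(window):.1f} ms")
```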