author     Nikola Pavlica <pavlica.nikola@gmail.com>  2020-01-08 13:13:42 +0100
committer  Gerd Hoffmann <kraxel@redhat.com>          2020-01-14 07:26:36 +0100
commit     c4c00922cc948bb5e879bfae60764eba1f8745f3 (patch)
tree       537394e289e96b1bb151422d552d61eca8a871a7 /docs/specs
parent     c388f408b538b38db83999fd20b17020d5fdf68f (diff)
download   focaccia-qemu-c4c00922cc948bb5e879bfae60764eba1f8745f3.tar.gz
           focaccia-qemu-c4c00922cc948bb5e879bfae60764eba1f8745f3.zip
display/gtk: get proper refresh rate
Because some VMs in QEMU can get GPU virtualization (using technologies
such as iGVT-g, as mentioned previously), they can produce video output
with a higher refresh rate than what the GTK display is actually
showing. (For example: playing a video game inside a Windows VM at
60 Hz while the output stays locked at 33 Hz because of defaults set in
include/ui/console.h.)

QEMU does have internal machinery for determining frame times, defined
in ui/console.c: the code reads a variable called update_interval and
uses it for its timing calculations. This variable, however, is never
set anywhere in ui/gtk.c, so ui/console.c just falls back to
GUI_REFRESH_INTERVAL_DEFAULT, which is 30.

update_interval represents the number of milliseconds per display
refresh, so with the default we get 1000 / 30 = 33.33... Hz.
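
For illustration, a minimal, self-contained sketch of that arithmetic
(the constant's value matches include/ui/console.h; the helper itself
is hypothetical, not QEMU code):

    #include <stdio.h>

    /* Default GUI refresh interval in milliseconds, as in
     * include/ui/console.h. */
    #define GUI_REFRESH_INTERVAL_DEFAULT 30

    /* Hypothetical helper: milliseconds per refresh -> Hz. */
    static double interval_to_hz(unsigned interval_ms)
    {
        return 1000.0 / interval_ms;
    }

    int main(void)
    {
        /* Prints "33.33 Hz" -- the cap described above. */
        printf("%.2f Hz\n", interval_to_hz(GUI_REFRESH_INTERVAL_DEFAULT));
        return 0;
    }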

This creates the problem mentioned above. What this patch does is check
the display refresh rate reported by GTK itself (which we can take as a
safe value) and convert it back into a number of milliseconds per
display refresh.
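
A minimal sketch of that approach, assuming GTK 3.22+, where
gdk_monitor_get_refresh_rate() reports the monitor's rate in
milli-Hertz (e.g. 60000 for 60 Hz). The helper names and the fallback
plumbing here are illustrative, not the exact patch:

    #include <gtk/gtk.h>

    /* Refresh rate (in milli-Hertz) of the monitor the widget's
     * window is on; 0 if GTK cannot tell. */
    static gint refresh_rate_millihz(GtkWidget *widget)
    {
        GdkDisplay *dpy = gtk_widget_get_display(widget);
        GdkWindow *win = gtk_widget_get_window(widget);

        if (win) {
            GdkMonitor *monitor = gdk_display_get_monitor_at_window(dpy, win);
            return gdk_monitor_get_refresh_rate(monitor);
        }
        return 0;
    }

    /* Convert back to milliseconds per refresh, keeping the 30 ms
     * default when no rate is reported. */
    static guint update_interval_ms(GtkWidget *widget)
    {
        gint millihz = refresh_rate_millihz(widget);

        /* 1000 ms/s * 1000 mHz/Hz / millihz = ms per refresh,
         * e.g. 1000000 / 60000 = 16 ms on a 60 Hz monitor. */
        return millihz ? 1000 * 1000 / millihz : 30;
    }

On a 60 Hz monitor this yields a 16 ms update interval instead of the
fixed 30 ms, letting the GTK display keep up with the guest's output.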

Signed-off-by: Nikola Pavlica <pavlica.nikola@gmail.com>
Reviewed-by: Philippe Mathieu-Daudé <philmd@redhat.com>
Message-id: 20200108121342.29597-1-pavlica.nikola@gmail.com

[ kraxel: style tweak: add blank line between vars and code ]

Signed-off-by: Gerd Hoffmann <kraxel@redhat.com>