I am comparing the result of the glGetIntegerv function with GL_MAX_SAMPLES on two different Windows installations running in virtual machines on a Mac using Parallels 17. Both VMs have the same virtual hardware, i.e. the same graphics card, "Parallels Display Adapter (WDDM)", with the same up-to-date driver version. I compared all the relevant parameters using the DirectX Diagnostic Tool, compared all the relevant graphics settings in the Parallels VM configurations, and reinstalled Parallels Tools. The OpenGL extension library used is GLEW version 1.11.0.

The only difference between the two VMs is the OS edition: one is Windows 11 Home 64-bit (10.0, build 22000) and the other is Windows 11 Pro Insider Preview 64-bit (10.0, build 23451).

The problem: when querying GL_MAX_SAMPLES with glGetIntegerv (reference), the Home edition returns 1 while the Pro edition returns 4.

Can anyone suggest anything I missed comparing? I wish I could look inside the implementation of glGetIntegerv to understand where else to search for differences, but as I understand it this function is implemented by the driver and there is no open source for it?

Thanks!
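For reference, this is roughly how I query the value. The sketch below uses GLFW to create the context purely for illustration (the window/context-creation details are my assumption, not necessarily what the real application does); it also prints GL_VERSION and GL_RENDERER, since GL_MAX_SAMPLES is driver-reported and it seems worth confirming that both VMs actually expose the same GL version and renderer string:

```cpp
// Minimal probe: create a hidden window/context, then query GL_MAX_SAMPLES.
// Assumes GLEW and GLFW are installed; requires a live OpenGL driver to run.
#include <GL/glew.h>
#include <GLFW/glfw3.h>
#include <cstdio>

int main() {
    if (!glfwInit()) return 1;
    // An invisible window is enough to make a GL context current.
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);
    GLFWwindow* win = glfwCreateWindow(64, 64, "probe", nullptr, nullptr);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);
    if (glewInit() != GLEW_OK) { glfwTerminate(); return 1; }

    GLint maxSamples = 0;
    glGetIntegerv(GL_MAX_SAMPLES, &maxSamples);

    // Comparing these strings across the two VMs may expose a difference
    // that DxDiag does not show (GL version, renderer, vendor).
    std::printf("GL_VENDOR      = %s\n", glGetString(GL_VENDOR));
    std::printf("GL_RENDERER    = %s\n", glGetString(GL_RENDERER));
    std::printf("GL_VERSION     = %s\n", glGetString(GL_VERSION));
    std::printf("GL_MAX_SAMPLES = %d\n", maxSamples);

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```

Running this on both VMs and diffing the output would at least show whether the two drivers report the same OpenGL version and renderer before the GL_MAX_SAMPLES values diverge.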