Kubernetes vGPU: required drivers

Hi, I’m trying to share an L40S among multiple VMs in a Kubernetes cluster running on a physical Linux server. The VMs are run via KubeVirt. I would like to use vGPU to guarantee memory isolation (in the sense of a reserved framebuffer per VM; I know MIG is not supported), but the documentation is not very clear about which drivers to install, and at which layer, to make this work.

My understanding is that, through the GPU Operator, I can install the vGPU Manager and the vGPU Device Manager on the worker node’s host OS to create the virtualized resources, plus the KubeVirt device plugin to expose them to the cluster. Is that correct?

Then, once a VM is created and running, are there any other drivers or packages that need to be installed inside its guest OS? I don’t mean the NVIDIA virtual GPU software editions (vWS, vPC, etc.); I just want access to the vGPU resource from inside the virtual machine so I can use it for whatever workload. Is the vGPU for Compute guest driver the correct add-on for this purpose?
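For reference, here is a rough sketch of the configuration I have in mind, pieced together from the GPU Operator and KubeVirt docs. The Helm values, the mdev name selector, and the resource name (e.g. NVIDIA_L40S-12Q) are just my guesses, so please correct anything that’s off:

```yaml
# 1. GPU Operator installed with sandbox (VM) workloads enabled, so that the
#    vGPU Manager, vGPU Device Manager and KubeVirt device plugin get deployed
#    on the worker node host OS (registry/version values are placeholders):
#
#    helm install gpu-operator nvidia/gpu-operator -n gpu-operator \
#      --set sandboxWorkloads.enabled=true \
#      --set vgpuManager.repository=<my-private-registry> \
#      --set vgpuManager.image=vgpu-manager \
#      --set vgpuManager.version=<vgpu-manager-driver-version>

# 2. KubeVirt CR permitting the mediated device; the vGPU resource is
#    advertised by the NVIDIA device plugin (externalResourceProvider: true).
#    The L40S profile names below are examples only.
apiVersion: kubevirt.io/v1
kind: KubeVirt
metadata:
  name: kubevirt
  namespace: kubevirt
spec:
  configuration:
    permittedHostDevices:
      mediatedDevices:
        - mdevNameSelector: "NVIDIA L40S-12Q"
          resourceName: "nvidia.com/NVIDIA_L40S-12Q"
          externalResourceProvider: true
---
# 3. VM requesting one vGPU profile through the usual devices.gpus entry.
apiVersion: kubevirt.io/v1
kind: VirtualMachine
metadata:
  name: vgpu-test-vm
spec:
  running: true
  template:
    spec:
      domain:
        cpu:
          cores: 2
        memory:
          guest: 8Gi
        devices:
          disks:
            - name: rootdisk
              disk:
                bus: virtio
          gpus:
            - name: vgpu1
              deviceName: nvidia.com/NVIDIA_L40S-12Q
      volumes:
        - name: rootdisk
          containerDisk:
            image: <guest-os-containerdisk-image>  # placeholder guest image
```

The part I’m unsure about is what has to go inside that guest image (which driver package, if any) so the vGPU is actually usable from within the VM.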

Thank you for your attention