I have just spent the past 5 hours fiddling with Ubuntu 10.04, ATI X1300 FOSS OpenGL drivers and VMware Player 3.1.0 build 261024.
All I wanted was to launch a Windows XP SP3 VM, run the damn dxdiag and smile like an idiot at the spinning cube test. It was not possible.
The errors “Required extension GL_EXT_framebuffer_object is missing” and “Required extension GL_EXT_texture_compression_s3tc is missing” in vmware.log mean exactly what they say: before it will enable 3D acceleration, VMware needs both of those OpenGL extensions exposed by the host drivers.
Ubuntu Lucid Lynx ships Mesa3D (or libgl1-mesa-glx, if you prefer the package name) 7.7.1, which is supposed to expose both extensions, but I was not that lucky.
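To see which of the two extensions your driver actually exposes, you can grep glxinfo’s output. A quick check, assuming the mesa-utils package (which provides glxinfo) is installed and an X session is running:

```shell
# List which of the two extensions VMware wants are actually exposed.
# glxinfo comes from mesa-utils: sudo apt-get install mesa-utils
glxinfo | grep -oE 'GL_EXT_(framebuffer_object|texture_compression_s3tc)' | sort -u
```

If only one line (or nothing) comes back, you are in the same boat I was.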
A Google search turned up a fix for the second extension: install driconf (sudo apt-get install driconf), launch it from System/Preferences/3D Acceleration and, under the “Image Quality” tab, force “S3TC Texture Compression …” to “yes”.
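Under the hood driconf just writes the setting to ~/.drirc, so you can also create the file by hand. A sketch, assuming the classic r300 driver on screen 0 (adjust the driver name to whatever your setup reports):

```xml
<!-- ~/.drirc : force-enable S3TC in Mesa -->
<driconf>
  <device screen="0" driver="r300">
    <application name="all">
      <option name="force_s3tc_enable" value="true" />
    </application>
  </device>
</driconf>
```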
In order to enable/expose GL_EXT_framebuffer_object, I tried installing libgl1-mesa-swx11 and was happy to see the extension appear in glxinfo’s output. Unfortunately I did not realise that this is a software renderer, so I lost DRI and, with it, the ability to enable S3 Texture Compression.
So I decided to try the new Gallium3D 0.4 r300g driver. After fiddling with PPAs, installing a ton of updates and rebooting (re-enabling kernel mode setting in GRUB), I had GL_EXT_framebuffer_object back, but I could not use driconf anymore (hell yeah).
According to Gallium’s support forum, at the moment it is not possible to enable S3TC at will; it may become possible in the coming months (there are some patent issues), but for now you just can’t.
Then I remembered why I had disabled kernel mode setting in the first place…
The combination of a Radeon X1300, an Intel 3945abg and kernel mode setting is bad: it leads to frequent wireless disconnections (probably because of an IRQ conflict, although that has never been confirmed).
Luckily there is a workaround: set the kernel parameter “radeon.modeset=0” in GRUB’s configuration and the disconnections are gone for good. Together with GL_EXT_framebuffer_object, yeah!!!
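For the record, on Lucid that means editing /etc/default/grub and regenerating the config. A sketch; keep whatever other options you already have on that line:

```
# /etc/default/grub
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash radeon.modeset=0"
```

Then run sudo update-grub and reboot.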
Anyway, I rolled back to the original Mesa 7.7.1 drivers, keeping kernel mode setting enabled and using driconf to enable S3 Texture Compression.
After 5 hours I had everything VMware needed to run with 3D acceleration turned on.
I was happy for 5 or 6 minutes tops, just long enough to realise that dxdiag was crashing (even during the DirectDraw tests), with VMware Player simply exiting in a POOF. I tried limiting the virtual SVGA memory and disabling its off-screen memory, without luck: the bastard still crashes.
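For reference, I believe those two tweaks correspond to .vmx options like the following (the vramSize value here is just an example, in bytes; adjust to taste):

```
# .vmx tweaks I tried (vramSize value is an example, in bytes)
svga.vramSize = "67108864"
svga.noOffscreen = "TRUE"
```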
Funny thing is that I can smile at the spinning cube test on my Mac thanks to its craptastic Intel GMA950.
Maybe I will be able to do it on Linux in a couple of years, if and when Gallium3D and VMware Player show some improvements.
Some more info about fixing DirectDraw.
If the full-screen DirectDraw test is failing for you (with or without 3D acceleration enabled), either with VMware crashing or with dxdiag throwing the following:
DirectDraw test results: Failure at step 20 (Colorfill to back buffer): HRESULT = 0x887601c2 (error code)
then insert this line into the virtual machine’s .vmx definition file:
svga.noOffscreen = "TRUE"
It will slow things down, but your VM/application should no longer crash when handling DirectDraw.
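If you would rather do it from a terminal, something like this works (the .vmx path is a placeholder; power the VM off first, or VMware may overwrite the edit):

```shell
# Append the workaround to the VM's .vmx file if it is not already there.
# "WindowsXP.vmx" is an example path; point it at your own VM.
VMX="WindowsXP.vmx"
touch "$VMX"    # harmless if the file already exists
grep -q '^svga\.noOffscreen' "$VMX" || echo 'svga.noOffscreen = "TRUE"' >> "$VMX"
```

The grep guard makes the command safe to run more than once.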
If you are wondering, I am running:
– Ubuntu 10.04 with kernel 2.6.32-24-generic
– ATI Radeon X1300 with Mesa 3D 7.7.1 (libgl1-mesa-glx 7.7.1-1ubuntu3)
– X 2:1.7.6-2ubuntu7.3
As for VMware crashing with Direct3D, it could be a DRI-related bug that should be fixed in kernel 2.6.34-something. I will post an update as soon as that kernel gets pushed into the repository.
No joy with Fedora 13 (Kernel 184.108.40.206-147.2.4.fc13.i686, X 1.8.2-3.fc13, Mesa 3D 7.8.1)
No joy with Ubuntu 10.10 alpha 3