OpenGL, GLX and Linux
Linux is often criticized by the Windows community at large for lacking an infrastructure suitable for games. As proof, critics point to the general design of X11. While Windows and X both provide the primary features needed to manipulate graphical data, X can also display that data on multiple devices. Windows boasts of its tightly coupled relationship with the OS, while X is a user-space application. At first glance, it would appear that getting the same performance from X as you get from Windows would take some serious re-engineering of X, if not the creation of a tightly coupled, kernel-level graphical subsystem for Linux.
Linux 3D Options
In general, there are two types of 3D acceleration under Linux: Glide and OpenGL. In many cases, OpenGL is used synonymously with Mesa, a free, Open Source implementation of OpenGL. Traditionally, in order to get the largest performance gain from Linux 3D, you needed a 3dfx-based graphics card using Glide; these cards are usually from 3dfx's Voodoo line of accelerators. Another, more recent option is GLX. GLX is what makes X and OpenGL work together, since OpenGL itself does not handle presentation devices. The problem is that GLX goes through X, and therefore, as mentioned earlier, you might not get the performance you want.
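To make GLX's role concrete, here is a minimal sketch of how an application uses the GLX API to bridge OpenGL and X: it connects to the X server, confirms the GLX extension is present, picks a visual, and creates a rendering context. This is illustrative only; error paths are trimmed and the build command is an assumption about your system.

```c
/* Minimal GLX sketch: query the extension and create an OpenGL context.
 * Assumed build command: gcc glx_check.c -o glx_check -lGL -lX11 */
#include <stdio.h>
#include <GL/glx.h>
#include <X11/Xlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);   /* connect to the X server */
    if (!dpy) {
        fprintf(stderr, "cannot open X display\n");
        return 1;
    }

    int error_base, event_base;
    if (!glXQueryExtension(dpy, &error_base, &event_base)) {
        fprintf(stderr, "X server has no GLX extension\n");
        return 1;
    }

    /* Ask X for a double-buffered RGBA visual with a depth buffer. */
    int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, GLX_DEPTH_SIZE, 16, None };
    XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
    if (!vi) {
        fprintf(stderr, "no suitable GLX visual\n");
        return 1;
    }

    /* Requesting direct rendering (the final True) is where an accelerated
     * driver such as the G200 GLX module pays off; without one, rendering
     * falls back to indirect commands through the X protocol. */
    GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);
    printf("direct rendering: %s\n", glXIsDirect(dpy, ctx) ? "yes" : "no");

    glXDestroyContext(dpy, ctx);
    XCloseDisplay(dpy);
    return 0;
}
```

If `direct rendering` comes back `no`, the driver module is not being used and you are on the slow, X-protocol path.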
Utah-GLX
With the introduction of GLX into the Open Source community, Steve Parker began work on GLX in 1998. Simon Pogarcic of SuSE and later Terrence Ripperda helped forge the project for Linux.
Author's Update: Stephen Crowley began the work on the G200 GLX soon after Matrox released their specs. The other mentioned individuals came on board after that. Thanks to Matthew Pavlovich for the correction!
Today, this project supports the Matrox G200/G400 as well as the Riva TNT, ATI RagePro, S3 ViRGE and Intel 810 chipsets. John Carmack, of Doom/Quake/id Software fame, began helping the project in April of 1999 by making some benchmark notes about G200 performance under Linux using q3test. Though the project is not specific to Quake, John's presence definitely helped identify weaknesses in the driver with regard to the very popular first-person shooter.
Matrox and Other 3D HW Company Involvement
A huge improvement to the G200 GLX module was achieved by implementing DMA support. This was made possible by having the register documentation from Matrox for the G200 based cards. Though Matrox has been criticized for not giving the GLX team much information, this one piece produced some of the best performance improvements for the project. It may not seem like much, but even a little bit of information in the hands of Open Source developers goes a long, long way. The Matrox driver is considered to be the fastest and most mature driver to date.
Nvidia, unfortunately, chose a more closed-door approach with regard to Linux. They determined that they would support Linux, but they wanted more control over the source. They have released their source under the XFree86-style license, but the work is primarily Nvidia's to move forward now. Some have accused Nvidia of Open Source abuse for leveraging the existing GLX code base and then moving it forward on their own. My opinion is that Nvidia's work under Linux will suffer because they chose to take more ownership of the project. Nvidia also implies that their goal is to create a single OpenGL implementation for both Windows and Linux. With their recent announcement of working with SGI on a "true" OpenGL port, this may signify the demise of what little GPL software (the Mesa part) they currently deploy. Time will tell if the Nvidia drivers will ever come up to speed with the drivers developed purely in the Open Source arena.
Newcomers S3 and ATI have, from what I understand, been very good about sharing information. This could make their drivers even better than the current Matrox GLX drivers. Something to keep a watch on.
Using GLX
In order to use GLX at its maximum efficiency, you will need to download and compile it from source. Matrox does put recent snapshots of GLX binaries on its website, but they haven't always had all of the features compiled in (though this may be changing).
The primary source of all information with regards to installing GLX on your Linux box is at:
http://utah-glx.sourceforge.net/
There you will find details about how to get GLX implemented. The general instructions are:
Get sources for glx and Mesa 3.2 (Mesa development tree)
Compile.
Get sources for agpgart.
Compile.
Install libraries, agpgart kernel module and glx.so module.
Point to glx.so module in XF86Config file.
Tune options in /etc/X11/glx.conf.
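The steps above can be sketched roughly as follows. Note that the directory names, make targets, and configure flags here are illustrative assumptions, not exact commands; the Utah-GLX page is the authoritative source for fetching and building the current tree.

```shell
# Hedged sketch of the GLX build steps; adjust paths to your source tree.

# 1-2. Fetch the glx and Mesa 3.2 (development) sources per the
#      utah-glx.sourceforge.net instructions, then build Mesa first
#      so glx can link against it.
cd mesa && make && cd ..
cd glx && ./configure && make && cd ..

# 3-4. Build and load the AGP support kernel module.
cd agpgart && make && insmod ./agpgart.o && cd ..

# 5.   Install the libraries and the glx.so server module
#      (destination directories vary by distribution).
make -C glx install

# 6.   Tell the X server to load the module, e.g. in /etc/X11/XF86Config:
#        Section "Module"
#            Load "glx.so"
#        EndSection

# 7.   Finally, tune driver options in /etc/X11/glx.conf and restart X.
```

After restarting X, the server log should show the glx.so module being loaded; if it does not, double-check the Module section and the install path of glx.so.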
Benchmark Equipment
Celeron 300A overclocked to 450MHz
128M DIMM
Matrox G200 8M SDRAM
Abit BH6 motherboard
NOTE: I could not get 2x AGP to work with this version of agpgart. Under Windows 95, 2x AGP is not implemented anyway.
WOW! The difference in results is almost negligible. This is very impressive coming from a driver that many consider to be only about half-way there. At 21.9 fps, Q3Demo is very playable using first-generation 3D hardware.
Again, the performance difference is barely there!
But what about under stress? How will Linux hold up under heavier 3D requirements?
It appears that Linux not only held its own against Windows, but actually beat it in DEMO2!
What about just raw cranking FPS... that is, how does it perform under little stress?
Look out, Windows! It looks like the current GLX for the G200 has you beat all over on raw speed. The margins are significant here, too!
Conclusion
Is Linux ready for 3D gaming? It sure looks like it. Even with less than stellar hardware, Linux performance was at or even beyond that of Windows 95. Image quality is better as well; quite frankly, there are problems with the OpenGL drivers for the G200 under Windows.
In Windows 95 (top), notice the abnormally bright textures. They seem to glow in the dark. Also, the light from the flame is too bright.
No problems under Linux (bottom).