The G200 is a 2D, 3D, and video accelerator chip for personal computers designed by Matrox. It was released in 1998.
History[edit]
Matrox had been known for years as a significant player in the high-end 2D graphics accelerator market. Cards they produced were excellent Windows accelerators, and some of the later cards such as Millennium and Mystique excelled at MS-DOS as well. Matrox stepped forward in 1994 with their Impression Plus, one of the first 3D accelerator boards, but that card could only accelerate a very limited feature set (no texture mapping) and was primarily targeted at CAD applications.
Matrox, seeing the slow but steady growth of interest in 3D graphics on PCs driven by NVIDIA, Rendition, and ATI's new cards, began experimenting with 3D acceleration more aggressively and produced the Mystique. Mystique was their most feature-rich 3D accelerator in 1997 but still lacked key features, including bilinear filtering. Then, in early 1998, Matrox teamed up with PowerVR to produce an add-in 3D board called Matrox m3D using the PowerVR PCX2 chipset. This board was one of the very few times Matrox outsourced its graphics processor, and it was certainly a stop-gap measure to hold out until the G200 project was ready.
Overview[edit]
With the G200, Matrox aimed to combine its past products' competent 2D and video acceleration with a full-featured 3D accelerator. The G200 chip was used on several boards, most notably the Millennium G200 and Mystique G200. The Millennium G200 received the newer SGRAM memory and a faster RAMDAC, while the Mystique G200 was cheaper, equipped with slower SDRAM memory, but gained a TV-out port. Most G200 boards shipped standard with 8 MB RAM and were expandable to 16 MB with an add-on module. The cards also had ports for special add-on boards, such as the Rainbow Runner, which could add various functionality.
G200 was Matrox's first fully AGP-compliant graphics processor. While the earlier Millennium II had been adapted to AGP, it did not support the full AGP feature set. G200 takes advantage of DIME (Direct Memory Execute) to speed texture transfers to and from main system RAM. This allows G200 to use system RAM as texture storage if the card's local RAM is of insufficient size for the task at hand. G200 was one of the first cards to support this feature[citation needed].
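The placement policy DIME enables can be sketched as a simple allocation model. The following is an illustrative sketch only, not Matrox driver code; the class, names, and sizes are invented for the example:

```python
# Illustrative model of DIME-style texture placement (hypothetical names and
# sizes, not actual Matrox driver logic): a texture goes into local video RAM
# when it fits, otherwise into AGP-mapped system RAM the GPU can read directly.

class TexturePool:
    def __init__(self, local_ram_bytes):
        self.local_free = local_ram_bytes
        self.placements = {}

    def allocate(self, name, size_bytes):
        if size_bytes <= self.local_free:
            self.local_free -= size_bytes
            self.placements[name] = "local"   # fast on-card SGRAM/SDRAM
        else:
            self.placements[name] = "agp"     # slower system RAM, but no failure
        return self.placements[name]

pool = TexturePool(local_ram_bytes=8 * 1024 * 1024)   # an 8 MB G200 board
print(pool.allocate("level_walls", 6 * 1024 * 1024))  # fits locally -> "local"
print(pool.allocate("level_sky", 4 * 1024 * 1024))    # overflows -> "agp"
```

Without an AGP-style fallback, the second allocation would simply fail or force textures to be swapped over the much slower PCI bus, which is the problem DIME was designed to avoid.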
The chip is a 128-bit core containing dual 64-bit buses in what Matrox calls a 'DualBus' organization. Each bus is unidirectional and is designed to speed data transfer to and from the functional units within the chip. By doubling the internal data path with two separate buses instead of a single wider bus, Matrox reduced data-transfer latencies and improved overall bus efficiency.[1] The memory interface was 64-bit.
G200 supported full 32-bit color depth rendering, which substantially improved image quality by eliminating the dithering artifacts caused by the then-more-typical 16-bit color depth. Matrox called this technology Vibrant Color Quality (VCQ). The chip also supported features such as trilinear mip-map filtering and anti-aliasing (though the latter was rarely used). The G200 could render 3D at all resolutions supported in 2D. Architecturally, the 3D pipeline was laid out as a single pixel pipeline with a single texture management unit. The core contained a RISC processor called the 'WARP core' that implemented the triangle setup engine in microcode.
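The image-quality gap between 16-bit and 32-bit rendering comes down to quantization. A minimal sketch (illustrative only, not Matrox code) shows how the 5-bit red/blue channels of a typical 16-bit RGB565 framebuffer collapse a smooth gradient into coarse steps, the banding that dithering then tries to mask:

```python
def quantize_channel(value, bits):
    """Round an 8-bit channel value (0-255) down to a lower bit depth,
    then expand it back to 8 bits for comparison."""
    levels = (1 << bits) - 1
    q = round(value / 255 * levels)
    return round(q / levels * 255)

# A smooth 256-step gray ramp survives intact at 8 bits per channel
# (32-bit color), but a 5-bit channel keeps only 32 distinct steps.
ramp_32bit = [quantize_channel(v, 8) for v in range(256)]
ramp_16bit = [quantize_channel(v, 5) for v in range(256)]
print(len(set(ramp_32bit)), len(set(ramp_16bit)))  # prints: 256 32
```

Those 32 visible steps are why 16-bit renderers dithered, and why the G200's 32-bit VCQ rendering looked noticeably cleaner in gradients such as skies and fog.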
G200 was Matrox's first graphics processor to require added cooling in the form of a heatsink.
Performance[edit]
With regard to 2D, the G200 was excellent in speed and delivered Matrox's renowned analog signal quality. The G200 bested the older Millennium II in almost every area except extremely high resolutions. In 3D, it scored similarly to, but generally behind, a single Voodoo2 in Direct3D, and was slower than NVIDIA's Riva TNT and S3's Savage 3D. However, it was not far behind and was certainly competitive.[2][3] G200's 3D image quality was considered among the best due to its support of 32-bit color depth (assuming driver bugs weren't a problem).
G200's biggest problem was its OpenGL support. Throughout most of its life, the G200 had to get by in popular games such as Quake II with a slow OpenGL-to-Direct3D wrapper driver, a layer that translated OpenGL calls to run on the Direct3D driver. This hurt G200's performance dramatically in these games and caused considerable controversy over Matrox's continuing delays and promises.[4] In fact, it would not be until well into the life of G200's successor, the G400, that the OpenGL driver would finally be mature and fast.
Early drivers had some problems with Direct3D as well. In Unreal, for example, there were problems with distortions on the ground textures caused by a bug with the board's subpixel accuracy function. There were also some problems with mip-mapping causing flickering in textures. As drivers matured these problems disappeared.

Today[edit]
The Matrox G200 series, especially the G200e, remains a popular choice among server motherboard manufacturers, such as Dell for its PowerEdge series, due to its robustness, low power consumption, and the limited feature set needed for basic VGA display.[5]
G200A & G250[edit]
Around 1999, Matrox introduced a newer version of the G200, called G200A. This chip used a newer 250 nm manufacturing process instead of the G200's original 350 nm. This allowed Matrox to build more graphics processors per wafer and to reduce the chip's heat output; G200A boards shipped without even a heatsink. Some G200A boards were sold as the G250, clocked slightly higher than the normal G200 and offered only to OEMs, with Hewlett-Packard perhaps being the only buyer.[6][7]
Models[edit]
| Board Name | Core Type | Process | Core (MHz) | Memory (MHz) | Pipe Config | T&L? | Memory Interface | Notes |
|---|---|---|---|---|---|---|---|---|
| Millennium G200 | Eclipse | 350 nm | 84–90 | 112–120 | 1×1 | No | 64-bit | SGRAM. 'SD' model uses SDRAM. 'LE' max 8 MB SDRAM. 250 MHz RAMDAC. AGP/PCI |
| Mystique G200 | Eclipse | 350 nm | 84 | 112 | 1×1 | No | 64-bit | SDRAM. 230 MHz RAMDAC. TV out. AGP |
| Marvel G200 | Eclipse | 350 nm | 84 | 112 | 1×1 | No | 64-bit | SDRAM. 230 MHz RAMDAC. TV in & out. Breakout box for extra I/O. AGP/PCI |
| G200 MMS | Eclipse | 350 nm | | | 1×1 | No | 64-bit | Quad-GPU graphics card for 4-monitor support. Some have TV input. PCI |
| Millennium G200A | Calao | 250 nm | 84 | 112 | 1×1 | No | 64-bit | Die-shrunk G200. 'LE' max 8 MB SDRAM. 250 MHz RAMDAC. No heatsink. Power consumption 4 W. AGP/PCI |
| Millennium G250 | Calao | 250 nm | 96 | 128 | 1×1 | No | 64-bit | Overclocked G200A. OEM-only |
References[edit]
- ^ Lal Shimpi, Anand. "Matrox Millennium G200", AnandTech, 10 August 1998.
- ^ Pabst, Thomas. "New 3D Chips - Banshee, G200, RIVA TNT And Savage3D" (Forsaken benchmark results), Tom's Hardware, 18 August 1998.
- ^ "Matrox G200 - First Preview", iXBT.
- ^ "ICD driver G200", Hardware Upgrade.
- ^ "Dell Matrox Graphics driver", Dell.
- ^ "G200 core", MURC, 5 July 2000.
- ^ "G250?", MURC, 11 August 2000.
- Notes
- Bruno, Pasquale. "ICD OpenGL Driver for Matrox G200", Hardware Upgrade, 12 December 1998.
- Mazur, Grzegorz. "MatroX Files" (pins files for clocks), accessed 21 August 2007.
- Lal Shimpi, Anand. "Matrox Millennium G200", AnandTech, 10 August 1998.
- "Matrox G200 - First Preview", iXBT.
- Pabst, Thomas. "New 3D Chips - Banshee, G200, RIVA TNT And Savage3D", Tom's Hardware, 18 August 1998.
- "HP Matrox G250 Installation Guide (and Technical Specifications)", Hewlett-Packard Company, 23 June 2000.
