Competition claims victims. The IT sector is fiercely competitive, and over the years several companies have fallen by the wayside. Let's take a look at some of the big graphics card makers we lost along the way.
We will not cover every brand: OEMs and companies that only made 2D accelerators are left out. Our focus is on 3D, starting with names like Matrox, once very active in the sector.
We will also talk about Rendition, S3 and the beloved 3dfx. Some might notice the absence of PowerVR (remember the Kyro?), but Imagination Technologies (formerly VideoLogic) is still active and designs GPUs for the mobile sector. It could, in principle, return to designing GPUs for dedicated video cards, so it does not technically belong among the producers felled by ruthless competition.
The first Matrox video cards were introduced at the end of the '70s, but they were rather simple solutions. The first Matrox product capable of handling 3D graphics was the Impression, which had to be paired with the Millennium 2D accelerator. Performance was modest because the Impression was aimed at CAD rather than gaming. As a result, the Matrox Impression did not make a good impression on enthusiasts.
Matrox Mystique
In 1997 the Matrox Mystique arrived, a solution that combined 2D and 3D acceleration on a single board. This reduced costs, since buyers needed one card instead of two. The Mystique was better suited to gaming thanks to features such as texture mapping. Its 64-bit graphics chip had a single pixel pipeline with one TMU.
The GPU lacked several important capabilities, such as mipmapping, bilinear filtering and transparency support. The Mystique shipped with 2 to 8 MB of SGRAM, and the memory on 2 MB cards could be upgraded with add-on modules. Despite decent performance, the image quality offered by the Mystique was not the best. A later card, the Millennium II, relied on WRAM, which allowed it to outperform its predecessor.
Matrox G200 & G250
Matrox's second 2D/3D graphics accelerator was much more successful; it offered several new features, including full 32-bit color support, mipmapping, trilinear mipmap filtering and anti-aliasing. The G200, released in 1998, had an internal 128-bit design built on two unidirectional 64-bit buses. The memory interface was 64 bits wide and supported between 8 and 16 MB of RAM. Although the G200 still used a single pixel pipeline with one TMU, Matrox's enhancements allowed the board to outperform the old Mystique by a wide margin.
Because the company adopted 32-bit color at a time when the competition was still stuck at 16 bits, the G200's image quality was superior and free of artifacts such as dithering. The first G200 chips were produced at 350 nanometers, but Matrox switched to a 250-nanometer process in 1999, which allowed for higher frequencies. The variants based on the 250 nm chips were sold as the G200A or G250; the G250 models ran at more aggressive frequencies and therefore performed better.
The G400 arrived in 1999 and was essentially a G200 graphics chip with twice the resources. Instead of a 128-bit architecture built on two 64-bit buses, the G400 used a 256-bit design with two 128-bit unidirectional links. Matrox also added a second pixel pipeline with its own texture unit. The memory interface widened to 128 bits and supported twice the RAM (16 to 32 MB). Frequencies were much higher, and the chip was DirectX 6 compatible. It was also one of the first cards to offer simultaneous video output to two screens.
Unfortunately, the G400 suffered from rather serious driver problems from its introduction, which limited its performance in early reviews. Over time the company resolved these issues with a stable OpenGL ICD and improved DirectX compatibility.
Matrox produced the G400 at 250 nanometers but later switched to 180 nanometers. The new chips were used in the G450 cards, which were less expensive than the earlier G400s. Matrox did not raise the G450's frequencies, however, so the impact was rather limited. A later version, the G550, added a few more features.
The final Matrox graphics card aimed at gaming enthusiasts was the Parhelia-512, introduced in 2002. Its design was rather ambitious: four pixel pipelines, each with four TMUs, backed by four vertex shaders. The GPU was fed by a 256-bit ring bus connected to 128 or 256 MB of DDR memory.
Matrox claimed that the ring bus topology allowed the 256-bit memory interface to behave as if it were 512 bits wide. The chip usually ran between 200 and 275 MHz and was fully DirectX 8.1 compliant, with partial support for DirectX 9.
The Parhelia-512 far outperformed the older Matrox cards, but it was slower than its ATI and Nvidia rivals. It was also quite expensive, so it could not win much market share.
Matrox made a less expensive version known as the Parhelia-LX; it had half the resources of the Parhelia-512 and was not particularly competitive. The Parhelia-512 chip was moved to a 90-nanometer process in 2007 and placed on a low-cost board with DDR2 memory, but again it made little impact. After the Parhelia-512, Matrox left the enthusiast gaming graphics card market for good. The company is still around today, but it focuses on more specialized applications.