An expensive Nvidia GPU in an online portal catches the interest of a gamer. Reddit sets him straight: the card can do a lot, just nothing for him.
What graphics card is this about? In the subreddit “pcmasterrace”, user Squawk_1200 asks for advice: a graphics card named “ThinkSystem NVIDIA L40” confuses him. Its list price according to the data sheet is $30,000, but it is now on sale for “only” about $6,000.
The explanation follows quickly on Reddit: it is a server GPU based on the AD102 chip (via Nvidia), so from a hardware perspective it is essentially an RTX 4090 Ti, that is, a slightly better version of Nvidia’s current flagship. The chips used in consumer GPUs usually do not exhaust the full potential of a manufacturer’s technology generation; server chips come much closer and are correspondingly more expensive.
What is an Nvidia L40 with an AD102 chip for? It sits alongside several others in bulky server chassis to perform huge numbers of calculations in parallel, for tasks such as training the neural networks behind language models like ChatGPT.
For this, it relies on the special architecture of Nvidia GPUs, with their CUDA and Tensor cores for AI computations. We have explained more about this special type of hardware, and why it is also relevant for gamers, in this article.
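Why does this kind of workload suit a GPU so well? The core operation behind neural networks is matrix multiplication, and its output rows can be computed independently of one another. The following is a purely illustrative Python sketch (the function `matmul_rows` is our own toy example, not anything from Nvidia’s CUDA stack) that makes this independence visible by farming each row out to a thread pool:

```python
from concurrent.futures import ThreadPoolExecutor

def matmul_rows(a, b):
    """Multiply matrices a and b by computing each output row independently.
    The rows have no data dependency on each other -- the same property
    that lets a GPU spread the work across thousands of cores."""
    cols = list(zip(*b))  # transpose b to pair rows of a with columns of b
    def one_row(row):
        return [sum(x * y for x, y in zip(row, col)) for col in cols]
    with ThreadPoolExecutor() as pool:
        # each worker handles one output row; order of results is preserved
        return list(pool.map(one_row, a))

a = [[1, 2], [3, 4]]
b = [[5, 6], [7, 8]]
print(matmul_rows(a, b))  # [[19, 22], [43, 50]]
```

Python threads will not actually speed this up, of course; the point is only that the work splits cleanly into independent pieces, which is exactly what thousands of CUDA cores exploit.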
Powerful yet useless for gamers
Is it faster than an RTX 4090 at that price? Whether the above-linked graphics card, with its still hefty, albeit reduced, price of $6,000 is faster than a conventional top GPU (around $2,000) is initially a secondary concern. First, another question arises:
Can an interested layperson even use it at home? According to experts on Reddit and our expertise, there are three potential stumbling blocks:
- Does the GPU have outputs for monitors? This is not a given with a server GPU: it does not need to render an image, only to “compute” data. In this case, however, several connections are indeed provided.
- Are there publicly available drivers for the software installation of the card? Probably yes.
- Do you have a case with a potent cooling setup? This is necessary since the GPU is only passively cooled. The heat must be dissipated externally. Server farms have specially designed, wind tunnel-like channels to effectively blow away the heat.
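On the driver question, one practical way to see whether an installed card is recognized at all is Nvidia’s `nvidia-smi` command-line tool, which can list GPUs as CSV via `nvidia-smi --query-gpu=name,memory.total --format=csv,noheader`. A minimal sketch of parsing that output follows; the helper `parse_smi_csv` is our own hypothetical function, and we demonstrate it on an assumed sample string so the snippet runs even on machines without an Nvidia driver:

```python
def parse_smi_csv(text):
    """Parse the CSV output of
    `nvidia-smi --query-gpu=name,memory.total --format=csv,noheader`,
    which prints one GPU per line as 'name, memory'."""
    gpus = []
    for line in text.strip().splitlines():
        name, memory = (field.strip() for field in line.split(","))
        gpus.append({"name": name, "memory": memory})
    return gpus

# Assumed sample output for illustration; real values depend on the card.
sample = "NVIDIA L40, 46068 MiB\n"
print(parse_smi_csv(sample))
```

If the card shows up here, the driver sees it; whether a game then runs well on it is a separate question entirely.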
Is it now likely faster? Perhaps, but that would depend heavily on the drivers and the game: the drivers are optimized not for gaming but for other applications, and even in the best case the gain would hardly be worth the roughly $4,000 premium over an RTX 4090 for the card alone.
Nvidia graphics cards are among the most widely used accelerators worldwide for training modern AIs such as chatbots and other neural networks. In the article above, you can learn more about this and what consequences it could have in the near future. And we have also followed an interesting development in politics: Artificial intelligence shakes up politics: An entrepreneur wanted to reinvent democracy and failed