Someone experimented with a processor from 1997 and showed that 128 MB of RAM is enough to run AI

The development team at EXO Labs has managed to run an AI language model on a Windows 98 PC with a Pentium II processor and just 128 MB of RAM.

What exactly was tested? The development team at EXO Labs undertook an unusual experiment: they wanted to find out whether a modern AI language model could run on a PC that is more than 27 years old, equipped with an Intel Pentium II, 128 MB of RAM, and Windows 98.

The model used is based on LLaMA 2 but was drastically scaled down to just 260,000 parameters so it could cope with this extremely limited hardware. Even so, the system reportedly achieved a processing rate of 39.31 tokens per second, according to the developers – a clear indicator of how efficient AI models can become.

The complete technical process and the motivation behind the experiment can be read in detail on the EXO Labs blog.

The Technical Foundation

How was it realized? According to EXO Labs, the implementation was anything but simple. Modern compilers simply do not run on the old architecture, so the team resorted to the outdated Borland C++ 5.02 compiler – one of the few tools still capable of running on Windows 98.

A compiler is, in simple terms, a program that translates source code into a form that the computer can execute directly.

Screenshot (Source: X)

According to the developers and Tom's Hardware, the retro PC was purchased on eBay for around 119 pounds (approximately 140 euros). Since USB support is limited under Windows 98, the developers used a PS/2 keyboard and mouse and transferred the necessary files from a MacBook to the old computer via FTP.

The goal was to run a functional AI model completely offline and locally – without relying on, for example, a cloud-based service.

By the way: in the late 1990s, a simple USB-to-PS/2 adapter cable helped drive an important transition – the end of the era of the classic ball mouse.

Why is this attempt a significant step for AI development? According to EXO Labs, the goal was not to make AI usable on old hardware as such, but to demonstrate how efficient current models can become through clever architecture.

We want more efforts to be made to run AI models on older hardware. There is still much development work to be done – from optimizing memory usage to exploring new architectures that can work efficiently on limited hardware.

Developers from EXO Labs via exolabs.net

The so-called BitNet architecture was used for this purpose. It employs ternary weights (−1, 0, 1) instead of the usual 16- or 32-bit floating-point values, which, according to JVTech, can massively reduce memory requirements. The hope is that this will eventually also shrink the storage footprint and the currently extreme costs of widespread AI use.

If projects like that of EXO Labs show that even old computers with few resources can suffice, broader access to AI technology is within reach – even for schools, health centers, or small businesses in regions with limited resources. Some nations are already ahead: In one of the largest countries in the world, AI is set to become an integral part of education.

Source(s): Cover image via Pixabay, EXO Labs blog