Researchers in the USA have developed a chip that could make powerful devices significantly faster and smaller. This concerns, among other things, graphics cards, processors, and AI projects.
Our title image (the graphics chip of an AMD graphics card) is for illustration purposes only.
Researchers at the University of Southern California (USC) say they have developed a new chip and memory technology that could improve both AI projects and consumer hardware in the future (via techxplore.com).
Instead of relying solely on silicon, currently the most important building block for computer chips, the university combines new materials with traditional silicon technology. This is expected to increase performance while simultaneously shrinking chip size.
Chips become smaller, more powerful, and more energy-efficient
Joshua Yang, a professor of electrical engineering and computer science at USC, explains that the newly developed memory chip offers the highest information density per device (11 bits) among all known memory technologies, which could play a crucial role in giving a wide range of devices a significant performance boost in the future.
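To put the 11-bit figure in perspective: a cell that holds 11 bits must reliably distinguish 2^11 = 2048 physical states, and far fewer cells are needed for the same payload than with conventional single-bit cells. A minimal sketch of that arithmetic (the function names and example sizes are our own illustration):

```python
import math

def states_needed(bits_per_cell: int) -> int:
    """Distinct physical states a cell must reliably hold to store N bits."""
    return 2 ** bits_per_cell

def cells_needed(total_bytes: int, bits_per_cell: int) -> int:
    """Memory cells required to store a given payload."""
    return math.ceil(total_bytes * 8 / bits_per_cell)

# A conventional cell stores 1 bit (2 states); the reported device stores 11.
print(states_needed(1))           # 2
print(states_needed(11))          # 2048

# Storing 1 MiB of data:
print(cells_needed(1 << 20, 1))   # 8388608 single-bit cells
print(cells_needed(1 << 20, 11))  # 762601 eleven-bit cells
```

The density gain comes at a cost not shown here: the more states a cell must distinguish, the smaller the analog margin between them, which is why multi-bit cells are harder to read and write reliably.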
How does it work exactly? The presented technology uses the positions of atoms to represent information, whereas current chips use the number of electrons, i.e. electric charge, for their calculations.
The positions of the atoms provide a compact and stable way to store more information in an analog rather than a digital manner. Additionally, the information can be processed where it is stored, instead of first being sent to a dedicated processor.
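The idea of processing data where it is stored can be illustrated with the analog crossbar scheme commonly associated with memristive memories (our illustration, not necessarily USC's exact design): each cell stores a value as a conductance, an input is applied as a voltage, and by Ohm's and Kirchhoff's laws the column currents directly form a matrix-vector product inside the memory array:

```python
# Illustrative sketch of analog in-memory computing (not USC's actual design):
# each cell stores a weight as a conductance G; applying input voltages V to
# the rows yields column currents I_j = sum_i G[i][j] * V[i], i.e. a
# matrix-vector product computed without moving the data to a processor.

def crossbar_matvec(conductances, voltages):
    """Column currents of an ideal crossbar array."""
    cols = len(conductances[0])
    return [sum(conductances[i][j] * voltages[i]
                for i in range(len(voltages)))
            for j in range(cols)]

G = [[0.1, 0.2],   # weights stored as cell conductances (siemens)
     [0.3, 0.4]]
V = [1.0, 0.5]     # inputs applied as row voltages (volts)

print(crossbar_matvec(G, V))  # column currents, approximately [0.25, 0.4] A
```

Because the multiply-accumulate happens in the physics of the array itself, the data never crosses a memory bus, which is exactly the transfer step that dominates energy use in conventional architectures.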
By switching from electrons to atoms, the chips also become smaller. Yang adds that with this new method, more computing capacity fits in a smaller space.
Moreover, this method could offer “many more layers of storage to increase information density”.
Another point the researchers mention is energy savings. When you turn devices like smartphones or smartwatches off and on again, reloading data into memory costs power. With the new technology, data would be stored persistently, which could save power and resources on your smartphone's next boot.
AI applications: from the cloud to the chip
AI projects in particular are mentioned as further applications, since these are currently computed almost exclusively in the cloud. With the new chip, however, performance could increase to the point where anyone could run something like ChatGPT on the chip of their smartwatch, without needing large servers with hundreds of graphics cards.
Currently, powerful graphics cards are growing ever larger and taking up a lot of space in gaming systems, and this could eventually change. Nvidia's new small graphics card, by contrast, is only partially suitable for gamers:
After months, Nvidia presents a new tiny graphics card – but you probably don’t want to buy it