The superrich are buying graphics cards – news Urix

At Norway’s largest university campus, the basement is humming. The Idun supercomputer is one of Norway’s smartest and most powerful, so powerful that its waste heat warms the entire Gløshaugen campus in Trondheim. It is being used to find out what we should use artificial intelligence for. It is not alone in that. Computing power has become Silicon Valley’s new gold, and a battle for data capacity has begun. But to understand this battle, we need to understand what AI really is, and what it can do.

– Can’t come up with anything new

Most of us have heard of or used ChatGPT, Google’s Gemini, or other language models. They generate text. OpenAI, the company behind ChatGPT, has seen extraordinary growth in the past year: in the two months after launch, over one hundred million users flocked to the service, according to The Verge.

– Those models are really good, even in Nynorsk, says Gunnar Tufte, professor of computer science at NTNU.

Professor of Informatics at NTNU, Gunnar Tufte, does not like commercial forces vacuuming up the market for graphics cards. Photo: Lars Os / news

He is impressed that the big companies, with OpenAI at the forefront, have developed a tool most of us can use in everyday life. But it is important to remember that these are only tools, and that none of the language models can “invent something new”.

– Simply explained, the models calculate the most likely answer. If you enter a question, the model looks statistically at the words and fills in the next logical word based on what it has learned. But the meaning of the text, or of the answer you want, it does not know. They are not that smart – yet.

To understand what is going on under the hood of these models, we need to explain the brain behind them.

The brain behind

In all computers, in laptops as well as mobile phones and tablets, there is a so-called CPU. It is good at solving all kinds of tasks, everything from running a word processor like Word to calculating mathematics. But the CPU is not an expert in every task.
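Tufte’s description of “filling in the next logical word” can be illustrated with a toy sketch. This is not how ChatGPT works internally (it uses neural networks, not raw word counts), but a minimal bigram model shows the same principle: count which word most often follows each word, then always pick the statistically most likely continuation. The corpus and words here are invented for illustration.

```python
from collections import Counter, defaultdict

# Tiny invented corpus; real models train on billions of words.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows each word.
follows = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1

def most_likely_next(word):
    """Return the most frequent continuation, or None if the word is unseen.
    This is the 'most likely answer' idea from the article, nothing more."""
    counts = follows.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(most_likely_next("the"))  # prints "cat": it follows "the" 2 of 4 times
```

As the professor points out, the model has no idea what “cat” means; it only knows that the word is statistically probable in that position.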
An AI model trained on language, for example, mainly needs to do arithmetic on words, and a great deal of it. A GPU performs many simple calculations in parallel, which is why graphics cards, so-called GPUs, are today the best hardware for language models such as ChatGPT. And those cards are now in demand. A single card can cost several hundred thousand kroner.

The company Nvidia was established in the USA in 1993 to produce graphics cards for the gaming industry. At that point, games were starting to move towards more cinematic and realistic sequences, and computers needed the computing power to display high-resolution animations. At first the 3D models were simple and blocky, but development has accelerated rapidly, and today games are almost lifelike in their graphics. Much of the credit goes to Nvidia’s graphics processors, the “magic” graphics cards. Silicon Valley has found them to be today’s best solution for training AI models.

The largest technology companies – Meta, Amazon, Tesla, Apple, Microsoft, and Alphabet (Google) – are now vying for the most powerful machines on the market. And Nvidia knows it. That is why the company has developed special graphics cards that ordinary consumers, gamers, video editors and animators can only dream of.

Nvidia CEO Jensen Huang can laugh all the way to the bank. The company’s stock has risen 1,923 percent in five years thanks to AI developments. Photo: I-HWA CHENG / AFP

But the battle for graphics cards could come at the expense of research environments, NTNU fears. It may also mean that ordinary consumer graphics cards are de-prioritised.

– We were lucky to get hold of some completely new nodes recently, says Tufte, showing us part of the supercomputer Idun.

A node is a computer with a CPU and several GPUs, built to stand in a rack cabinet, where it buzzes and runs around the clock. Down in the basement room at NTNU it is about as noisy as at a concert, and the air heats to just under 35 degrees. Inside this rack there are graphics cards worth millions of kroner.
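The “arithmetic on words” the professor mentions can be sketched concretely. In a language model, each word becomes a list of numbers, and training boils down to vast quantities of identical multiply-and-add operations. The sketch below, with invented numbers, shows the serial version a single CPU core would execute; the point of a GPU is that it can run thousands of these independent calculations at the same time.

```python
# Invented example: each word is represented as a small vector of numbers.
word_vectors = [
    [0.1, 0.5, 0.2],
    [0.3, 0.3, 0.4],
    [0.9, 0.1, 0.0],
]
weights = [0.2, 0.4, 0.4]

def score(vec, w):
    # One multiply-add per dimension. A GPU would perform this for
    # all word vectors simultaneously; a CPU core does them one by one.
    return sum(v * wi for v, wi in zip(vec, w))

# Serial loop, CPU-style: each score is computed after the previous one.
scores = [score(v, weights) for v in word_vectors]
```

Because each score is independent of the others, the work parallelises perfectly, and that is exactly the shape of workload graphics cards were built for.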
NTNU would like to have many more. Photo: Lars Os / news

These cabinets hold several of the graphics cards everyone is now looking for. OpenAI is one of the companies hunting for the resources. According to the Wall Street Journal, the company’s CEO, Sam Altman, is seeking a staggering seven thousand billion dollars to further develop a new AI model that can generate video.

– It is of course frustrating for us that these multinational companies with enormous resources are buying up the hardware. What do they really want to make money from? I doubt that Musk and Zuckerberg want to spend their billions on weather models; there won’t be much profit in that, says Tufte.

And when so much money is pumped into development, investors expect a return on it. According to The Verge, however, the companies have not yet been able to deliver the expected results.

– Galloping investment in the technology sector is good for development, but there is a real danger that those holding the wallet lose faith. Then AI development can crash, says Tufte.

Energy consumption could explode

NTNU’s supercomputers, including Idun, use well over one megawatt for heavy calculations (by comparison, an LED light bulb uses around 5 watts). Tufte is not alone in thinking about the power consumption of today’s computers and server parks.

In Norway alone, we have at least 18 server parks: large halls, often located in basements or inside mountains, that contain millions of computers. They use an enormous amount of energy, almost 5 TWh. As AI models develop, even more power will be required, and more computing power means higher energy consumption. Google’s planned data centre in Skien has applied to use around 5 percent of all electricity in Norway.

– We don’t know how much of Google’s centre will be used for artificial intelligence, but given today’s developments, I have no doubt that they are betting on it.
The researcher compares the energy use with that of the human brain.

– Imagine if computers could do the same as our brains, on just as little energy. It would be revolutionary, says Tufte enthusiastically.

Our brain needs around 30 watts to function. On that, we can carry out practical tasks, compose music and engage in physical activity. And we can learn from a single example, not the millions these AI models require.

– Unfortunately, it will take a long time before we get there, but we have to solve the energy needs of artificial intelligence, says Tufte.

Professor Gunnar Tufte of Informatics at NTNU is closely following the development. Photo: Lars Os / news
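The power figures quoted in the article make for a striking back-of-the-envelope comparison. Using the numbers given (a supercomputer at roughly one megawatt, an LED bulb at 5 watts, a brain at about 30 watts), a quick calculation shows the gap Tufte is describing:

```python
# Figures as quoted in the article; rough orders of magnitude, not measurements.
supercomputer_w = 1_000_000  # "well over one megawatt"
led_bulb_w = 5               # a typical LED light bulb
brain_w = 30                 # approximate power draw of a human brain

bulbs_equivalent = supercomputer_w // led_bulb_w   # 200,000 LED bulbs
brains_equivalent = supercomputer_w // brain_w     # over 33,000 human brains
```

In other words, one such machine draws as much power as hundreds of thousands of light bulbs, or tens of thousands of brains, which is why the researcher calls brain-level efficiency revolutionary.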


