@HPC Podcast Archives - OrionX.net - HPC News Bytes – 20241007
Episode Date: October 7, 2024 - IDC report on datacenter energy use and growth - NTT Research on the state of optical computing - AI experts weigh in on the future impact of AI
Transcript
Welcome to HPC News Bytes, a weekly show about important news in the world of supercomputing,
AI, and other advanced technologies.
Hi, everyone.
Welcome to HPC News Bytes.
I'm Doug Black of Inside HPC, and with me is Shaheen Khan of OrionX.net.
Let's start with electricity usage, the topic at the top of the IT agenda, thanks to
AI and its thirst for energy-hungry GPUs and other HPC-class gear. Market research firm IDC
released a study of the space with eye-popping stats. They see electricity as the largest
component of operating expenses for data centers, and it's growing.
Energy accounts for 46% of total spending at enterprise data centers, and 60% at service providers, presumably because service providers are better at reducing costs elsewhere.
Not surprisingly, IDC sees AI workloads claiming a growing share of the electricity used. It estimates AI data center capacity will grow at a compound annual growth rate (CAGR) of 40.5% through 2027,
with the corresponding energy consumption growing at a 44.7% CAGR.
We can expect a lot of innovation in this area,
now that costs are high and energy efficiency is easily translated to cost savings. IDC said
global data center electricity consumption has a five-year CAGR of 19.5%. That's about 2.5x over
five years, while the AI portion is 44.7% CAGR, as you mentioned, and that's about 6.5x over five
years. For comparison, Moore's law is 41% CAGR, doubling
every two years, so this is higher than that. IDC also notes a few other parameters, like the
rising costs of electricity, compliance, geopolitics, climate change, and so on, and says electricity prices
are likely to continue to increase. So that means pressure on three fronts: one, more efficient
technologies that improve power usage effectiveness (PUE), the ratio of total facility power to IT equipment power, which exascale systems are doing,
as you see in the Green500 list; two, better utilization of the systems you already have,
where supercomputing centers also lead; and three, pressure to make money from AI. Those pressures ripple
through the entire supply chain, unless it's a nation state aiming to build a strategic capability
and not too concerned with efficiency or economics until after they achieve a leadership position.
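For a quick sanity check on the growth figures above, here is a minimal back-of-the-envelope sketch in Python. The growth rates come from the IDC figures cited in the discussion; the facility and IT power numbers used to illustrate PUE are hypothetical, chosen only to show the ratio.

```python
# Back-of-the-envelope check of the compound growth rates discussed above.
# A CAGR of r for n years multiplies the starting value by (1 + r) ** n.

def growth_multiplier(cagr: float, years: float) -> float:
    """Total growth factor after `years` at a compound annual growth rate `cagr`."""
    return (1.0 + cagr) ** years

# IDC figures cited in the episode.
print(growth_multiplier(0.195, 5))   # ~2.4x  -> "about 2.5x over five years"
print(growth_multiplier(0.447, 5))   # ~6.3x  -> "about 6.5x over five years"

# Moore's law expressed as ~41% CAGR: doubling roughly every two years.
print(growth_multiplier(0.41, 2))    # ~2.0x

# PUE (power usage effectiveness) = total facility power / IT equipment power.
# Hypothetical numbers, just to illustrate the ratio; 1.0 would be a perfect facility.
total_facility_kw = 1200.0
it_equipment_kw = 1000.0
print(total_facility_kw / it_equipment_kw)   # PUE = 1.2
```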
Optical computing is a recurring topic that continues to make steady progress. An article in Forbes by the
CEO of NTT Research in Japan makes the case that optical is getting closer to reducing energy usage
for selected computational kernels. It can develop into the next big breakthrough in computing,
reminiscent of the incubation that GPUs went through, and has direct relevance
to quantum computing. For use cases, the article lists matrix-vector multiplication, improving
signal-to-noise ratios in fast radio communications such as 5G, and optical processing and presentation
of image data with increased privacy. NTT also made news in late
August. They announced the first all-photonics network between Taiwan and Japan that connects
with 100 gigabits per second bandwidth and takes 34 milliseconds for a round trip. The article is
a good snapshot of the state of the industry. It's good to see progress and small niches develop. In
general, photonics can benefit from the buying power of the massive global telecommunications
industry. If it can take materials or equipment that telcos already use and apply them in a new way,
it gets a big cost break with access to low-cost, high-volume components. Photonics also
lets you use the positive attributes of wave physics, which also
link it to quantum computing. It provides so-called spatial parallelism, the ability to mix signals with
or without blending them, and so on. But it presents challenges too. For example, photons are best
in point-to-point, straight-line connections. It's hard to switch or amplify them, and their speed,
like anything else, is a function of the medium. So photons don't go any faster in silicon than
electrons do. The article mentions key materials technologies like thin-film lithium niobate (TFLN),
which has been in the works for years and is rapidly getting commercialized.
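As an aside on the point that photon speed depends on the medium, here is a minimal sketch of fiber propagation delay in Python. The roughly 3,000 km route length and the refractive index of about 1.47 are assumptions for illustration, not figures from the article.

```python
# Rough estimate of round-trip light propagation time over a long fiber route,
# illustrating that latency is set by the speed of light in the medium.

C_VACUUM_KM_PER_S = 299_792.458   # speed of light in vacuum, km/s
FIBER_INDEX = 1.47                # typical refractive index of silica fiber (assumption)

def fiber_rtt_ms(route_km: float, index: float = FIBER_INDEX) -> float:
    """Round-trip propagation delay in milliseconds over `route_km` of fiber."""
    speed_km_per_s = C_VACUUM_KM_PER_S / index   # ~204,000 km/s in fiber
    return 2.0 * route_km / speed_km_per_s * 1000.0

# A hypothetical ~3,000 km route between Taiwan and Japan (assumed, not from the article).
print(round(fiber_rtt_ms(3000), 1))   # ~29 ms
```

Under those assumptions, propagation delay alone is already close to the 34 milliseconds NTT quoted; the remainder would come from the actual route geometry and any switching or conversion along the way.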
There was a good article in the Wall Street Journal recently in which AI experts were
asked for their visions of AI in 2030.
Insights included the need to prepare for a future in which AI follows the pattern
of other major emerging technologies, where we tend to overestimate the effect of a technology
in the short run and underestimate it in
the long run. As for AI replacing human workers, one of the experts said creative workers, writers,
and programmers will be among the most impacted, and that companies successful with AI will
methodically apply a task-based approach to its implementation, recognizing that tasks are the
fundamental units of an organization.
AI is a big topic in society now, so it is interesting to get a perspective from experts
and compare it with non-experts. In broad-brush terms, the difference seems to be that
experts see AI as here to stay, getting better all the time, and accessing more data from more
sources, while non-experts seem to see it as a
bubble, a fad, or the end of the world. You also need to go a layer or two deeper to see
fresh insight as the top-line perspectives become commonly accepted and better understood. So themes
like AI everywhere or increasing integration into companies are less interesting than complexities of data privacy,
access to new sources of data, or what new policies are needed. One topic that I consider
to be a distraction is so-called artificial general intelligence (AGI), otherwise known as
superintelligence (SI). For the next decade or so, AI is about what it can do versus what it is. And the handwriting is on the wall that it can do a lot, will impact society, and needs
policies sooner rather than later.
All right, that's it for this episode.
Thank you all for being with us. Shaheen Khan of OrionX and Doug Black of Inside HPC host the show. Every episode is featured on InsideHPC.com
and posted on OrionX.net. Thank you for listening.