@HPC Podcast Archives - OrionX.net - HPC News Bytes – 20250310
Episode Date: March 10, 2025 - TSMC $100B investment in Arizona Factories - EuroHPC Project DARE using RISC-V - Julich Hybrid Supercomputer with D-Wave - Quantum Computing Stock Price Volatility - Chinese Quantum Computer Betters Google's Willow - Supercomputing Asia 2025 (SCA25) held this week in Singapore [audio mp3="https://orionx.net/wp-content/uploads/2025/03/HPCNB_20250310.mp3"][/audio] The post HPC News Bytes – 20250310 appeared first on OrionX.net.
Transcript
Welcome to HPC News Bytes, a weekly show about important news in the world of supercomputing,
AI, and other advanced technologies.
Hi, everyone.
Welcome to HPC News Bytes.
I'm Doug Black of InsideHPC, and with me is Shaheen Khan of OrionX.net.
Further proof, not that any more was needed, that HPC and AI technologies are critical factors
on today's geopolitical chessboard was seen in abundance last week.
Topping the list is the news that TSMC will invest an additional $100 billion
in three new advanced chip fabs, along with packaging and R&D facilities in Arizona. The news broke against the
backdrop of President Trump in his State of the Union address calling for Congress to change and
curtail the CHIPS Act, which could deliver a major blow to Intel's effort to expand its foundry
business. In addition, the EU, through its EuroHPC Joint Undertaking, is committing 240 million euros to a project dedicated to building HPC technologies
that will make Europe self-sufficient in supercomputing.
Europe has been on this path for several years.
This is a new stage for that strategy.
Now, Shaheen, while 240 million euros is a lot of money by any standard,
is it snarky to say that this is less than 3% of TSMC's
investment in the US, and less than half the cost of the first exascale system,
Frontier at Oak Ridge National Lab, which cost $600 million?
Well, there's been such an avalanche of money pouring in over the past months and years that
one has to wonder whether it is all actually spendable. I mean, going shopping is fun,
but spending big dollars is not so easy. And many of these commitments are spread over a number of years, and
some of them actually don't pan out. Nevertheless, it is really quite impressive. So it's hard
to tell which of these will produce the kind of results that can change the geopolitical
pecking order. We talked a lot about that in the episode last week with Dr. Handel Jones.
Europe looks very much recalibrated and in the mood to invest. The project, DARE, stands
for Digital Autonomy with RISC-V in Europe, so the name says it all. Initially, it will
focus on one CPU, a vector processing accelerator, and an AI processing accelerator, all based
on the RISC-V architecture.
It is supposed to build actual systems and also leverage existing efforts like the European
Processor Initiative and efforts to advance quantum computing and especially hybrid systems.
Palo Alto-based AI services company Turing, which reports 500 employees and 900 clients and dates to 2018,
has issued a set of benchmarks designed to measure progress
toward artificial general intelligence.
The tech site The Information says the benchmarks,
quote, put the focus on realistic real-world problems,
challenging the industry's current reliance
on more academic measures.
In all, Turing has issued a suite of five benchmarks
that include, for starters,
intern-to-expert coding tasks,
ranging from tasks a junior developer could handle
to problems requiring superhuman-level code generation
and architecture design skills.
We regularly cover Top500 benchmarks:
HPL, HPCG, HPL-MxP for mixed precision,
the Green500, and related benchmarks like the IO500.
In episode 91 of the full-format, in-depth @HPCpodcast,
we also discussed AI benchmarks with special guest David Kanter of MLCommons.
The MLPerf benchmark for AI training is well known, and MLCommons
provides a suite of benchmarks and datasets for other aspects of AI, such as
inference, storage, and safety. When it comes to artificial general
intelligence, it's more complicated. There's already the ARC-AGI test, which
aims to measure how well AI can pick up a new skill, comparing AI results with equivalent human output,
accounting for humans' prior knowledge
and limiting what knowledge the AI can use, so the test enforces reasoning.
It was introduced in 2019 by François Chollet,
the creator of Keras, an open source deep learning library.
We have to mention MATH-500, which
consists of 500 pretty hard high-school-level math problems
that require complex, multi-step reasoning to solve.
There are really many others focused on specific skills,
like multilingual, academic subjects, coding,
or specific workflows.
The new Turing benchmarks are a good addition
and include, besides the coding
one that you mentioned, tests for data science pipelines, math, multimodal reasoning where
text, image, and video can be combined (I guess we used to call that multimedia), and
vertical benchmarks for financial services and such. Note that these benchmarks include
relevant datasets, which are important for training.
There was news from the quantum sector this past week. For one, stock in quantum systems
maker D-Wave jumped 64% on news that the Jülich Supercomputing Centre in Germany
is acquiring D-Wave's Advantage system, which reportedly will be integrated into Jülich's
JUPITER supercomputer, Europe's first exascale-class system, now being installed.
The jump in the stock price is, of course, a classic characteristic of an early-stage industry
with startups that are long on promise and shorter on actual revenue,
although the quantum industry surpassed the billion-dollar mark last year.
Those who invest in pure-play quantum stocks know that share price volatility is to be expected at this stage in the game.
Quantum tech companies that manage to go public have seen volatility in their stock price as news of this or that development jolts investors.
As you said, the whole field is in its infancy and changing fast, and nobody has a grip on even a small sub-segment of the market. As you might recall, some quantum computing companies
went public through a SPAC, or special purpose acquisition company,
also known as a blank-check company.
In that process, an initial public offering
creates a publicly traded company to go acquire or merge
with a private company.
There was also news that the University of Science
and Technology of China has bettered Google's Willow
quantum processing unit, which was unveiled late last year.
It ran the same random circuit sampling benchmark
a thousand trillion times faster
than the fastest classical HPC system.
And that is a million times faster than Google's results.
Just like Google's Willow,
the superconducting Zuchongzhi 3.0 system
uses a 105-qubit QPU.
And just like Google, they published their results
in a well-respected scientific journal,
in this case, Physical Review Letters.
And just like Google, the benchmark they ran
has no practical value,
I guess, besides showing that it actually runs.
All right, before we close,
the annual Supercomputing Asia Conference
is being held this week
at the beautiful Marina Bay Sands in Singapore
with a theme of empowering AI science and innovation.
It includes a lot of very good talks and research reports.
So if you are lucky enough to be there,
please share updates and pictures,
and I hope to make it there next time.
All right, that's it for this episode. Thank you all for being with us.
HPC News Bytes is a production of OrionX in association with InsideHPC.
Shaheen Khan and Doug Black host the show. Every episode is featured on InsideHPC.com
and posted on OrionX.net. Thank you for listening.