There’s been a lot of talk about artificial intelligence’s current phase: infrastructure building. That has obvious benefits for chipmakers such as Nvidia and hyperscalers like Amazon, Microsoft and Alphabet. Hyperscalers provide the massive cloud computing power needed for AI applications, and analysts have predicted growing demand for data centers, which house the vast computing capacity that AI workloads require.
But the next bottleneck in AI infrastructure, and one to invest in, is networking, according to tech analysts.
Networking, in ordinary tech terms, refers to a network of devices that can transmit and share information over physical or wireless connections. In AI, however, the requirements are higher: large language models and other AI applications demand very high bandwidth and low latency.
“While Nvidia and its graphics processing units get most of the headlines for generative artificial intelligence, we see networking as a critical companion in the hardware that undergirds models and applications such as ChatGPT,” Morningstar analysts said in a June 2024 report.
“The focus largely up until now has been on the [graphics processing units], the actual AI chips. These are the most important part of this puzzle, of course. But networking is where we see the next bottleneck playing out,” Clare Pleydell-Bouverie, portfolio manager at Liontrust Asset Management, told CNBC Pro Talks in May.
That’s because the “large-scale systems” coming to market, such as Nvidia’s rack-scale systems, require “vastly more” infrastructure content such as networking, she said.
Morningstar’s equity analyst for technology, William D. Kerwin, and technology equity strategist, Brian Colello, said they believe the need for fast networking in generative AI will directly translate to “strong, long-term growth for well-positioned networking vendors.”
An increase in investment in generative AI model training and inference will drive AI networking spending growth of 34% over the next five years, the firm said. That translates to $34 billion in spending in 2028, up from Morningstar’s 2023 estimate of $8 billion.
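For context, the two dollar figures line up with the growth rate if the 34% is read as a compound annual rate rather than a cumulative one. A minimal back-of-the-envelope sketch follows, assuming annual compounding from the roughly $8 billion 2023 base (an assumption the article does not spell out).

```python
# Back-of-the-envelope check of Morningstar's figures, assuming (not stated
# outright in the article) that the 34% growth is a compound annual rate.
base_2023 = 8.0   # estimated AI networking spending in 2023, in $ billions
growth = 0.34     # assumed compound annual growth rate
years = 5         # 2023 through 2028

spend_2028 = base_2023 * (1 + growth) ** years
print(f"Implied 2028 spending: ${spend_2028:.1f} billion")  # roughly $34.6 billion
```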
“Networking creates a performance bottleneck for generative AI model development,” Morningstar’s analysts said. “Well-positioned networking firms are a great second derivative play to invest in generative AI,” they added. “The majority of generative AI spending will go toward GPUs, but networking is critical infrastructure to enable GPU performance.”
Stocks to play the trend
Marvell Technology is Morningstar’s top pick to play the generative AI networking trend, with the firm saying the stock is currently “attractively undervalued” and gives investors an “immediate opportunity” to tap rising generative AI networking investment.
Other key winners in this networking trend are Arista Networks, Nvidia and Broadcom, Morningstar said. However, its analysts believe that the generative AI opportunity is “largely priced” in for these three stocks, as their share prices have already gone through “robust” appreciation.
“However, patient investors can wait for a pullback, as the long-term fundamental opportunity is strong,” said Morningstar.
It added that it’s bullish on the adoption of Ethernet, a type of networking standard, in generative AI networks. Arista would be the primary beneficiary of the transition to Ethernet, according to Morningstar. The technology currently in common use is InfiniBand.
“There are very few players that are able to really step up to provide this infrastructure,” Pleydell-Bouverie added, referring to the networking infrastructure. She named Meta and Broadcom as stocks to play the trend.
Broadcom is the “leader” in networking chips, and it’s set to benefit as Ethernet emerges, she added. “Ethernet networking is emerging to be the sort of de facto standard for scaling these AI workloads. And Broadcom has got the best in class sort of chips that power this Ethernet network,” Pleydell-Bouverie said.