WEBVTT

00:00:00.000 --> 00:00:03.288
Is the AI bubble popping? You might have
been watching your stocks today

00:00:03.299 --> 00:00:07.068
and wondering that. Yesterday, the stock
market saw one of its worst days

00:00:07.079 --> 00:00:09.788
since April, in part
because investors are questioning

00:00:09.799 --> 00:00:14.728
whether the AI rally is really all that
sustainable. According to The Wall Street

00:00:14.739 --> 00:00:17.208
Journal, the AI boom is looking more
and more fragile,

00:00:17.219 --> 00:00:22.528
with analysts warning
that AI spending is now outpacing AI revenue.

00:00:22.539 --> 00:00:26.608
Joining us now,
Shark Tank host and CEO of the Herjavec Group,

00:00:26.619 --> 00:00:27.588
Robert Herjavec.

00:00:27.599 --> 00:00:28.548
 What a treat.

00:00:28.559 --> 00:00:32.428
Thank you for having me.
Thanks for being here with us and joining my show.

00:00:32.439 --> 00:00:36.208
 So morbid, the AI bubble bursting.
 Well, it was a bad day yesterday.

00:00:36.219 --> 00:00:38.228
 What do you think?
 It was a bad day.

00:00:38.239 --> 00:00:43.888
I think the one part I disagree with is
that the AI spending is outpacing AI revenue.

00:00:43.899 --> 00:00:44.868
Why do you disagree with it?

00:00:44.879 --> 00:00:48.328
Because it's been
outpacing it for years,

00:00:48.339 --> 00:00:50.608
and I think it will be for many years to come.

00:00:50.619 --> 00:00:52.568
 I think companies are taking two approaches.

00:00:52.579 --> 00:00:57.908
Some are building their own data centers,
like Oracle with a huge $50 billion investment,

00:00:57.919 --> 00:01:00.528
and others are partnering like Microsoft.

00:01:00.539 --> 00:01:05.008
What sort
of deliverables do they need to meet to

00:01:05.019 --> 00:01:07.568
prove all the AI doubters wrong?

00:01:07.579 --> 00:01:10.528
I mean, right now there's a lot of talk about AI,
there's a lot of promise

00:01:10.539 --> 00:01:12.868
about replacing the workforce
and all this efficiency.

00:01:12.879 --> 00:01:15.188
We haven't seen that come to life.

00:01:15.199 --> 00:01:20.628
And I ask that because one of the big AI guys,
Anthropic's Dario Amodei,

00:01:20.639 --> 00:01:23.528
said that in three months,
20% of the workforce is going to be

00:01:23.539 --> 00:01:24.008
laid off. And

00:01:24.019 --> 00:01:27.028
And that was
like six months, maybe a year ago now. But

00:01:27.039 --> 00:01:30.748
But Katie, it's happening.
Look at the recent Amazon layoffs.

00:01:30.759 --> 00:01:33.168
And I think you're going to see much more
of that. It's

00:01:33.179 --> 00:01:36.708
going to be hard to correlate direct
revenue to AI

00:01:36.719 --> 00:01:39.888
because it's going to be efficiency
at the first step.

00:01:39.899 --> 00:01:44.368
Today, with agentic AI,
you have predictability in functions

00:01:44.379 --> 00:01:45.728
that you couldn't do before.

00:01:45.739 --> 00:01:47.628
But let's step back from that.

00:01:47.639 --> 00:01:50.788
Where are we going to see
the biggest impact? Look what happened yesterday

00:01:50.799 --> 00:01:57.488
with the Anthropic cyberattack. So a Chinese
group was able to manipulate, I'm going

00:01:57.499 --> 00:02:02.748
to use the word, Anthropic, using agentic
AI to do a cyberattack. So instead

00:02:02.759 --> 00:02:06.268
of a bunch of hackers trying to get
into a system, they had AI do it.

00:02:06.279 --> 00:02:09.628
It was basically a couple of guys.
And the thing

00:02:09.639 --> 00:02:15.048
with AI is it doesn't think. It doesn't
know good, it doesn't know bad. They were telling it they

00:02:15.059 --> 00:02:19.588
were doing a good attack to test
the perimeter defense. It has no morality.

00:02:19.599 --> 00:02:22.728
It has no morality,
and we don't want it to have any morality.

00:02:22.739 --> 00:02:27.748
So, you know, I started a cyber company
20 years ago. As you said, it was called the Herjavec

00:02:27.759 --> 00:02:32.448
Group. It's now called Cyderes,
Cyber Defense and Response, one

00:02:32.459 --> 00:02:36.728
of the biggest cyber companies
in the world. And I used to say we work

00:02:36.739 --> 00:02:41.048
in the age of the internet, internet time.
And then COVID happened, and I was

00:02:41.059 --> 00:02:47.748
like, it's COVID time. Now,
with agentic AI, things are moving so quickly

00:02:47.759 --> 00:02:52.408
that I don't think we're prepared
for it yet. If we're not prepared

00:02:52.419 --> 00:02:55.708
for it yet, what's going to happen? I mean,
we see the stock market that's, you know,

00:02:55.719 --> 00:02:59.528
teetering a little bit. There's concerns
about it. What if the stock market

00:02:59.539 --> 00:03:02.868
bursts? That's one question. And then secondly,
if AI does deliver

00:03:02.879 --> 00:03:09.309
on all of these promises, how are we
as a society better off, especially

00:03:09.320 --> 00:03:14.349
if there's layoffs and people can't get
into the workforce? Well, it's

00:03:14.360 --> 00:03:19.849
a two-edged sword. I'm a huge believer
in AI, partly because I see it

00:03:19.860 --> 00:03:22.529
as the next industrial revolution. I'm
a huge skeptic

00:03:22.540 --> 00:03:31.769
of it. So tell me why I shouldn't be so nervous.
If you and I were sitting here in the 1900s, we would be talking about blue-collar factory workers under horrible

00:03:31.780 --> 00:03:35.669
conditions, making no money,
being replaced by automation.

00:03:35.680 --> 00:03:37.029
Right?

00:03:37.040 --> 00:03:40.249
Today, we're talking
about AI that's going to free a lot

00:03:40.260 --> 00:03:43.369
of people in order to do high value tasks.

00:03:43.380 --> 00:03:48.589
Now, if you're a white collar worker today
and you're doing middle level management,

00:03:48.600 --> 00:03:49.849
I got news for you.

00:03:49.860 --> 00:03:53.789
 You're in trouble.
 So what will you do in the future?

00:03:53.800 --> 00:03:55.969
You have to upskill.

00:03:55.980 --> 00:03:58.889
And I think that's a responsibility
of a lot of enterprises

00:03:58.900 --> 00:04:03.289
and government, to give that education to people.
But look at the world we're living

00:04:03.300 --> 00:04:06.409
in right now. You could argue
that all the politics

00:04:06.420 --> 00:04:09.749
that we're experiencing,
the anger that's out there, is

00:04:09.760 --> 00:04:12.609
in part due to globalization, NAFTA.

00:04:12.620 --> 00:04:18.989
This outsourcing of unskilled labor overseas,
the factory workers,

00:04:19.000 --> 00:04:24.389
left a whole generation
or a whole demographic of people jobless

00:04:24.400 --> 00:04:27.589
and angry and feeling
like they're not contributing to society

00:04:27.600 --> 00:04:29.029
or not valued any longer.

00:04:29.040 --> 00:04:32.769
If we do that again,
aren't we just setting ourselves up

00:04:32.780 --> 00:04:36.669
for another revolution
in our politics where there is a

00:04:36.680 --> 00:04:40.269
whole new class of very angry people feeling
like they're left behind?

00:04:40.280 --> 00:04:47.169
Every major shift creates victims
and every major shift creates opportunities.

00:04:47.180 --> 00:04:52.849
And I think what I would say to people is
you have to take responsibility to be

00:04:52.860 --> 00:04:53.989
on the right side of that.

00:04:54.000 --> 00:04:56.329
It's not that there isn't opportunity.

00:04:56.340 --> 00:05:01.069
I think that the anger is people feel
there isn't an opportunity for them,

00:05:01.080 --> 00:05:04.569
that the American dream isn't alive for them.

00:05:04.580 --> 00:05:09.369
And I think part of that is that the gap
between great wealth and most people

00:05:09.380 --> 00:05:10.829
is just widening.

00:05:10.840 --> 00:05:17.289
But I will tell you, AI has the ability,
you can start a business today using AI,

00:05:17.300 --> 00:05:22.089
and you have access to computing power
to things you couldn't do in the past.

00:05:22.100 --> 00:05:27.649
 So if you know how to use it,
 this is an amazing opportunity.

00:05:27.660 --> 00:05:31.209
You know, it's like Mark Cuban said,
if I was a Gen Z today,

00:05:31.220 --> 00:05:35.969
I would be doing one thing with my life,
living, breathing, and sleeping AI.

00:05:35.980 --> 00:05:38.779
 What would you have your kids study in school?

00:05:40.240 --> 00:05:42.379
 I would, that's a great question.

00:05:43.740 --> 00:05:46.949
My son has a hard time writing,
so I go to school

00:05:46.960 --> 00:05:48.869
and the teachers tell me how important it is.

00:05:48.880 --> 00:05:50.709
And I have to tell you,
I'm sitting there going,

00:05:50.720 --> 00:05:52.789
he's never going to write.

00:05:52.800 --> 00:05:59.429
Why are we having this meeting? I would
have him learn two things. One is anything to do

00:05:59.440 --> 00:06:04.769
with computers, coding, math, that kind
of stuff. But more importantly, creativity,

00:06:04.780 --> 00:06:10.329
the ability to think
outside the box, I think, will always have

00:06:10.340 --> 00:06:14.309
a premium. So being a creative person
that can create things

00:06:14.320 --> 00:06:18.589
that the computer cannot. Yes, I guess,
but you can envision, I mean,

00:06:18.600 --> 00:06:19.749
computers are going to get creative

00:06:19.760 --> 00:06:20.069
 too.

00:06:20.080 --> 00:06:23.089
But I think
if you have the ability to see things

00:06:23.100 --> 00:06:26.969
others don't and stay slightly ahead
of the market, I mean,

00:06:26.980 --> 00:06:28.989
I think we're
going to have some problems

00:06:29.000 --> 00:06:32.329
with some of the stocks for a short while,
but there'll be massive opportunities.

00:06:32.340 --> 00:06:36.889
I asked ChatGPT to write a paragraph
in the style of my two books,

00:06:36.900 --> 00:06:42.789
and the paragraph that it came up
with scared me because it made me laugh.

00:06:42.800 --> 00:06:45.609
It was actually pretty funny.
Was it better than you?

00:06:45.620 --> 00:06:47.609
I don't know if it was better
than me but I mean

00:06:47.620 --> 00:06:52.909
if you're not a discerning consumer,
I don't know. I mean, I think we've got to be

00:06:52.920 --> 00:06:57.609
discerning consumers. And that scared me,
because if you're replacing, we're

00:06:57.620 --> 00:07:01.449
out of time, creative thought
with just AI slop, it's going to leave

00:07:01.460 --> 00:07:04.269
all of us, I think, in a worse place.
But I've got to go.

00:07:04.280 --> 00:07:07.849
But think, every day we see people
on Shark Tank trying to live their American dream.

00:07:07.860 --> 00:07:10.949
Yeah, you can start a business out
of your basement, you can use AI

00:07:10.960 --> 00:07:12.669
and do functions you couldn't do

00:07:12.680 --> 00:07:14.489
 in the past.
I want to write a book.

00:07:14.500 --> 00:07:18.169
Thank you so much for joining us.
Thank you.

00:07:18.180 --> 00:07:19.439
 Really good to have you.
What a treat.
