FYI - For Your Innovation: Tesla Earnings, Hollywood Strikes, AI Updates | The Brainstorm EP 07

ARK Invest - 7/26/23 - 34m

Welcome to the Brainstorm, a podcast and video series from ARK Invest.

Tune in every week as we react to the latest in innovation and reflect on how short-term

news impacts our long-term views.

To learn more, visit ark-invest.com.

ARK Investment Management LLC is an SEC-registered investment advisor. ARK and Public are unaffiliated entities and do not have a relationship with respect to either firm marketing or selling the products or services of the other, and therefore ARK disclaims responsibility for any loss that may be incurred by Public's clients or customers.

The information provided in the show is for informational purposes only, and should not

be used as the basis for any investment decision, and is subject to change without notice.

It does not constitute either explicitly or implicitly any provision of services or products by ARK, and investors should determine for themselves whether a particular investment management service is suitable for their investment needs.

All statements made regarding companies or securities are strictly beliefs and points of view held by ARK and/or show guests, and are not endorsements by ARK of any company or security or recommendations by ARK to buy, sell, or hold any security.

Historical results are not indications of future results.

Certain of the statements contained in the show may be statements of future expectations

and other forward-looking statements that are based on ARK's current views and assumptions,

and involve known and unknown risks and uncertainties that could cause actual results, performance

or events to differ materially from those expressed or implied in such statements.

ARK assumes no obligation to update any forward-looking information.

ARK and its clients, as well as its related persons, may, but do not necessarily, have financial interests in securities or issuers that are discussed.

Certain information was obtained from sources that ARK believes to be reliable; however, ARK does not guarantee the accuracy or completeness of any information obtained from any third party.

Hi everyone, welcome back to another episode of the brainstorm.

Episode 7, keep throwing up the fingers.

Today we have three exciting topics and a special guest.

We have Tasha Keeney, director of investment analysis and institutional strategies on to

talk about our first topic, which is Tesla earnings and FSD updates.

Then we'll get into Netflix and the ongoing strikes, and we'll end with some updates on

the AI landscape.

So let's open it up to Sam and Tasha.

Tasha, why don't you start?

Just give us a quick rundown of the quarter for Tesla and some of these FSD updates we

got.

Yeah, so Tesla reported last week, and I think a lot of the concern is over what we'd call shorter-term things. So mainly, how are margins going to be affected by price cuts, we heard that there would be factory updates that will affect production, and there's this kind of looming concern over demand. But I'd say from the ARK side, thanks to Sam's work, we are not concerned about demand for electric vehicles. We think they're the future, and Tesla is still the leader there.

On the autonomy side, what was really interesting was, so one, I mean, I think, like just zooming

out over time, Sam, like FSD autonomy is such a focus.

And if you look back five years ago, it was barely talked about at all.

So I think, I think, you know, that's from our perspective, that's great, because we

think it'll be, you know, over 60% of the enterprise value of the company over the next

five years.

So I'm happy to see it talked about so much.

We did hear Elon say he thinks there's going to be quasi-infinite demand for the RoboTaxi, right?

We think that, you know, we've estimated the total addressable market is more like $11

trillion, let's say, over the next decade for RoboTaxi or autonomous ride hail.

And we heard Tesla say that they think their RoboTaxi vehicle could have, in terms of cars produced per hour, the highest production rate of any car in history, which I think just highlights their commitment to this.

Tasha, what do you say?

I feel like one of the biggest pushbacks is that full self-driving is just never going to happen.

It's impossible.

Yeah.

Well, I think, so we get asked this question a lot.

So our midpoint is that Tesla will launch RoboTaxi service in 2024.

So that's next year.

Like, why do we think that?

Well, okay, Tesla has said, and Elon has said many times, that he thinks this year they will solve for full self-driving. Meaning, right now it's beta software, so you have to keep your hands on the wheel. What I imagine full self-driving to be is that you actually don't have to pay attention as the driver anymore.

So one, we've always been a little bit more conservative than Elon. So we think, okay, if they cross that threshold this year, what's the likelihood they'll actually launch a service, which does take some operational and logistics bandwidth to do?

But I also think there are a lot of things that give us confidence. There's managing Tesla's timeline. There's looking at the videos that we see online, posted on Twitter by users. Some of them actually don't have the steering wheel nag feature on, so their hands are actually off. And they're completing what look like start-to-finish ride-hail rides with Uber on their phone. Or at least barely touching the wheel, maybe at the very beginning and the very end.

So it seems like the FSD capability, at least in certain geographies, right?

Maybe not everywhere.

Maybe California, where it works the best.

It's becoming, it looks really close to possible.

And then the third thing that I'd mention is just, you know, all the updates that we've

seen this year in AI broadly.

We know that advances in diffusion models and large language models, well, yes, it's not exactly the same, but a lot of those advances should in some way port over to other industries like autonomy.

And I think actually, diffusion is a good example, because we know that Tesla is moving

towards that even more so.

And I think, Nick, before you ask a question here, I see you want to.

I think the other thing that's important to point out is that there are already autonomous

cars out there that are charging money for rides, right?

So just the basic assumption that it's impossible and will never happen needs to be thrown out the window, because there are over 10 cities right now where you have autonomous cars.

And so the question isn't, is this possible?

We know it's possible.

The question is how quickly can this scale?

And I think, you know, the exciting thing about Tesla's approach is the potential for

how quickly it can scale.

But Nick, over to you.

Yeah, I have just a question on the OEM announcement and Tesla potentially licensing FSD.

You should be taking that as a good sign that FSD is getting close to full-scale release

if they're in talks with other OEMs to license it, right?

And also, what's an OEM?

Yes, original equipment manufacturer.

That's an automaker.

And yeah, I'm glad you brought that up, Nick, because, one, it was a very important takeaway from the call. I think, I don't know, there are two ways to look at this.

It's like the traditional automakers have not been very good at autonomy.

So I don't know, do I actually trust their judgment as to what their next move is?

You know, maybe not, but to your point, yes.

It's like Tesla has to be showing them something that they think is interesting.

But you know, I also just think it's kind of like the supercharger network is the gateway

drug in this situation, right?

So we saw a lot of automakers sign up for that.

It seems it's better for them, and it's also obviously better for Tesla. It gives them kind of a nose under the hood into other electric vehicles.

And I don't think that every traditional automaker will be successful in autonomy. According to Sam's and my work, we think that industry will consolidate quite a bit.

So I'm not surprised to see it.

And I want to switch gears here.

Maybe we'll come back to FSD, but we did also get news on Cybertruck.

Elon saying demand is so far off the hook.

You can't even see the hook.

What are your thoughts?

Sam, I know you've looked at it.

Tasha, curious your thoughts there as well.

Yeah.

I mean, from the Google trends, anytime Cybertruck is mentioned, you can see the search demand

for it is pretty crazy.

It surpasses that of the Model Y, which is very rare considering that the Model Y is

in production and a car that you can actually buy.

And the Cybertruck is just starting production.

And then the interesting thing there is if you look at the states where that search volume

is highest, it is middle of the country.

And that's great because a lot of EV buyers right now are on the coast, and that's where

a lot of Tesla's customers are.

So these are people who, you know, you would imagine are using trucks more, potentially

for work as well.

And so to see that demand there is great.
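For anyone who wants to eyeball that Google Trends comparison themselves, here is a rough sketch using the unofficial pytrends library; the exact search terms, timeframe, and tooling are illustrative assumptions, not the methodology Sam actually used.

```python
# Illustrative sketch: compare US search interest for Cybertruck vs. Model Y
# using the unofficial pytrends wrapper around Google Trends.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)
keywords = ["Tesla Cybertruck", "Tesla Model Y"]  # assumed search terms
pytrends.build_payload(keywords, timeframe="today 12-m", geo="US")

# Relative interest over time (0-100 scale, normalized by Google).
over_time = pytrends.interest_over_time()
print(over_time.tail())

# Relative interest by US state, to see where Cybertruck interest is highest.
by_state = pytrends.interest_by_region(resolution="REGION")
print(by_state.sort_values("Tesla Cybertruck", ascending=False).head(10))
```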

Yeah.

That camo wrap on the Cybertruck, that's a chef's kiss.

That is a cool-looking vehicle.

Yeah.

I mean, I think it's incredible.

Obviously, I think Musk is hedging it and saying it's a lot of new technology.

And we know how technology ramps go and can be slow to scale.

So you do need to balance that.

And I think this quarter is another good reminder of just how short-term the market is to what

Tasha's saying.

The questions from the Wall Street investment professionals are about production in the next quarter, the next two quarters, what's pricing going to do. Meanwhile, you have a company that's ramping at over 50% growth year over year and is in the best cost position, cutting prices to drive demand if they need to and still remaining profitable.

So I think this is kind of a classic case of if you look short-term, you can find things

and say, oh, we want more, we want more.

But if you take a step back and you just appreciate this picture and the progress they're making,

it's pretty remarkable.

And then maybe as a last topic here, Tasha, I know you and Daniel have been doing work

on this.

But the rate of data collection for Tesla and full self-driving, what did we learn in

this quarter and why is it so meaningful?

Yeah.

So everyone go follow our associate Daniel McGuire on Twitter.

He's doing a lot of great work.

And what Sam's talking about is we've looked at Waymo and Cruise, which, to give them credit, are the two companies that actually have commercial autonomous vehicles on the road today. But they're doing it with somewhat of a geofenced strategy in certain cities where they're launching, and they have at most, I believe, hundreds of cars on the road, although Cruise does have plans to scale that to over a million in the fleet by the end of the decade, by 2030.

So okay, but how quickly are they actually collecting data? And when I say collecting data, I mean the option to look at the miles that these cars are driving and pull interesting situations out of them. So you're not collecting every data point. But for Waymo, Cruise, and Tesla, comparing Tesla's FSD miles, miles driven in FSD by customers, how long did it take each of these companies to get to their most recent additional million-mile mark? In other words, how quickly can they drive a million miles?

OK, for Waymo, it's close to two years.

For Cruise, it's like a month and a half based on their most recent announcement.

For Tesla, they're actually collecting over two million miles of FSD data per day.

And that's just FSD, by the way.

They have hundreds of thousands of drivers that are opted into this right now, but they

have millions of cars on the road.

So they're able to test other features in shadow mode on a very large fleet. But just FSD data alone is over two million miles a day, compared to a million miles in a month and a half or in a couple of years for the others.

I mean, I think that just shows the immense data advantage that Tesla has that we think

will continue to propel them as one of the most interesting AI plays out there.
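To make that comparison concrete, here is a rough back-of-the-envelope sketch using the approximate figures quoted above (about two years and about a month and a half per million miles for Waymo and Cruise, and over two million FSD miles per day for Tesla); these are conversational approximations, not precise disclosures.

```python
# Rough comparison of implied data-collection rates, using the
# approximate figures mentioned in the conversation.
MILLION = 1_000_000

# Time each company reportedly took to add its most recent million miles.
days_per_million = {
    "Waymo": 2 * 365,   # ~2 years
    "Cruise": 45,       # ~1.5 months
}

# Implied miles per day for the geofenced fleets.
rates = {name: MILLION / days for name, days in days_per_million.items()}

# Tesla's customer fleet, per the quoted figure.
rates["Tesla (FSD beta)"] = 2_000_000  # >2 million FSD miles per day

for name, miles_per_day in rates.items():
    print(f"{name:>18}: ~{miles_per_day:,.0f} miles/day")

# Ratio of Tesla's rate to each geofenced fleet's rate.
for name in days_per_million:
    print(f"Tesla vs {name}: ~{rates['Tesla (FSD beta)'] / rates[name]:,.0f}x")
```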

Tasha, I want to just wrap on one last question here.

Elon, and I think it was in the investor letter note, talked about this next generation

vehicle platform.

And so you talked about the data advantage.

But what about on the manufacturing side?

Because he said this would be the fastest produced vehicle in history.

I know we don't have a lot of information on it yet, but curious what your thoughts are

in hearing the tidbits we got this last quarter.

Yeah, so it's a question of whether he means a particular model. Toyota, I think, produces eight to nine million cars a year, but that's across multiple models. But in any case, what does this mean? Again, it just shows Tesla's overwhelming commitment to actually making autonomy happen. Yes, we do see that Cruise wants to have a million cars on the road in the next 10 years.

But I mean, Tesla's planning on ramping this quickly.

And if they are successful on the software side, if they do cross that full autonomy

mark, I mean, just the fact that they are setting up today, the capability to make that

scale happen really matters, right?

Because they're not going to be the first; Waymo and Cruise were the first to launch. But Tesla might be the first to scale.

Yeah, that's a great point to end on.

And Tasha, we thank you for coming on.

We'll have to have you back again to talk all things Tesla and maybe 3D printing.

We can do a lot.

Maybe in just a couple months when full self-driving launches in general.

We'll do the episode in the back of Tesla while it's driving us around.

Exactly.

I can't wait.

Okay.

Good to see you guys.

All right.

Yeah.

Bye, Tasha.

Okay.

Let's move on here.

We'll go to Netflix, the ongoing strikes.

I'll tee it up and then Sam, feel free to pepper me with questions as you see fit.

So what is the backdrop?

We had Netflix earnings.

We had the CEO talking about the ongoing strikes.

And to give some background here on the strikes, for the first time in 63 years, we have both

actors and writers in Hollywood striking.

And as to the reason, I want to quote Ben Thompson from a piece he wrote about the ongoing strikes: the most important driver of unrest between studios and talent has always been technological paradigm shifts. And this time is no different.

And I think this one sentence really gives you a good understanding of what's happening

here.

And in the last 63 years, we've had a lot of innovation in the entertainment space.

In the 60s, when they were striking, the reason was television. Actors wanted a cut of television.

And what came out of that were residual checks.

Actors got paid as their shows were played on television.

Now we have streaming companies; enter Netflix.

Netflix came in, they made a big splash, and they really changed how actors and the industry

viewed residuals.

And a part of it was, and this is not the full story, but a part of it was writing upfront

checks to the studios and production companies.

And so paying upfront for content, but then, you know, lowering the residual or having,

in some cases, no residual checks on the back end once those shows were on the platform.

And so now we're at a point with Netflix and the streaming companies where there is unrest.

I think one of the demands right now is that the actors, and the writers as well, want 2% of all streaming company revenue.

And that would be audited by a third party.

So if you have a show, a third party should be able to audit and see how much that show

contributes to revenue and signups.

So there's a lot going on, it's a very complex situation.

I've done my best to summarize, I'm probably missing a ton here.

But at the heart of it, I think, you have this technological change in streaming, and then a little bit in AI too, which is kind of this boogeyman waiting in the closet, and you have actors and writers trying to get ahead of it and wanting a bigger piece of the pie.

Yeah, Nick, there's so many things with this that I feel like are being missed by the writers

and the studios.

And obviously, you know, you need fair pay, that makes sense.

But you know, just this share of, you know, revenue for streams, that doesn't make sense

anymore, because it's not one-to-one, right?

You have this backlog of content, and that's, you know, there to reduce churn, and so it's

not as directly correlated, I would say, as kind of those residual checks are.

But my question to you, especially as you've talked about gaming and all of this stuff: are they just squabbling over here while you've got mobile and user-generated content coming as this tidal wave? It's people fighting over a shrinking pie as the world is changing. Forget streaming, there are so many other things happening here. It seems like they're not fighting over a slice of a bigger pie; they're fighting with each other over slices of a shrinking pie, and maybe that's what's causing the arguments to arise in the first place.

Yeah, and I think, you know, again, I will say it's a very complex issue, right?

And you have so many moving pieces.

You talk about user-generated content with YouTube, you talk about this shrinking pie

within premium theatrical releases.

I mean, we just saw over this weekend that Barbie and Oppenheimer did very well, but it's not very typical right now to have a box office weekend like we just had; the hit-driven nature of theatrical releases, I think, has been exacerbated post-COVID.

And so you're right to say that it is a shrinking pie, and they might be missing the larger,

you know, piece here, maybe missing the forest for the trees, but also I think they do recognize

the forest, right?

That's why they're trying to get, you know, more favorable terms when it comes to this

shrinking pie, because potentially, you know, it continues to shrink here.

But your point around user-generated content and YouTube, and then when you bring in AI, that's really where I want to focus this conversation, because I think that's the real change that's coming.

I think streaming is kind of this intermediary moment in time, but AI is going to completely

change how we view and consume and create content.

And that's what I think should be really scaring everyone when you bring AI into the equation.

That allows everyone in the world to become a content creator, and a content creator with

the tools and services that had previously been gate-kept by studios, right?

Like you can't, as an individual creator, get access to, you know, high fidelity visual

overlays and some of these FX, you know, things we see in the large theatrical releases.

With AI, that becomes a reality in a not-so-distant future, right?

So that's what I think is really dangerous.

I don't think you need to be...

You said everyone should...

No one should be scared, right?

It's like there's going to be some big winners here.

And I think, you know, one of the things that we discussed in the brainstorm was people

who have IP, right?

I think it was this past week that there was that tool release from The Simulation that generates a whole South Park episode, and you can put yourself in it. If you've got these strong IP characters and brands, and you can leverage those and let people generate content and everything like that, that's potentially very exciting for those companies.

And actors and actresses and writers, right?

They all have individual IP and brands that they can leverage in a world where AI allows

anyone to create.

Will you allow that?

It's kind of like the first episode of Black Mirror this past season.

Like, are you going to allow people to create with your brand?

Probably not.

It could get quite dangerous and you could have yourself or your brand doing something

that you don't want if you just kind of allow it out there.

But I do think...

Yeah, you're right.

I probably misspoke in saying that, you know, people should be scared.

This is an amazing opportunity.

And it's an equivalent, and probably larger, opportunity compared to when we changed over with distribution, right?

Like if you look at the last 10 years, now everyone has access to global distribution.

And again, rewind the clock back to the 60s, no one had access to global distribution except

for, you know, the companies at the time that controlled distribution.

Now everyone has access to YouTube.

Everyone has a phone.

You can create and upload.

But now what's changing with AI is you're now allowing for the creation to scale in the

same way that distribution did.

And so you're giving those tools to create.

So the cost of creation, I think, is going down to zero in almost every category over

time and that is very meaningful for this entire space.

And that's something, you know, we'll probably be having the same conversation again in 10

years when people say, hey, what are we doing here?

We need to be, you know, guarding our IP against this AI force that has, you know, started

to generate content across the globe.

Right.

And I think Brett Winton tweeted out that the cost to write, I think it was comparing to roughly 90th-percentile-GRE-quality writing, has gone down to, you know, 2 cents for 500 words.

So that's pretty close to free and we're getting there.
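As a quick sanity check on that figure, here is a hypothetical back-of-the-envelope calculation; the freelance rate used for comparison is an assumed placeholder, not a number from the episode.

```python
# Cost per word implied by the ~2 cents per 500 words figure quoted above.
ai_cost_per_500_words = 0.02  # dollars
ai_cost_per_word = ai_cost_per_500_words / 500
print(f"AI cost per word: ${ai_cost_per_word:.5f}")                        # $0.00004
print(f"AI cost per million words: ${ai_cost_per_word * 1_000_000:,.0f}")  # ~$40

# Purely hypothetical comparison point: a freelance rate of $0.10 per word.
human_cost_per_word = 0.10
print(f"Implied cost decline vs. that rate: ~{human_cost_per_word / ai_cost_per_word:,.0f}x")
```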

Some other analogies that were thrown out there, and I'm interested in your takes here, were comparing the studios and writers to either the Ivy League or to kind of a Michelin star, where you have these institutions, they're high quality, but what makes high quality, high quality? Is it really exclusivity, versus those people being the best of the best?

And, you know, it's just such a steep talent curve that only the best get into those writers'

rooms as well.

Or is this kind of, you know, a democratization of what's going on and McDonald's is the huge

company and they make a lot of money and they're not a Michelin star restaurant.

Yeah.

I'll try to give the analogy.

I'll stick to food because I'm a bit of a foodie myself, but it's like, if you look at Hollywood, you'd say what they produce are Michelin-star-quality dishes.

Right.

So what AI is going to allow just the everyday content creator to do is put out, you know, hamburgers and hot dogs.

But what scales is hamburgers and hot dogs, right? McDonald's, to your point, is the largest restaurant chain globally, I think. And because you can reach that larger audience, that's why it's expansive to the market.

I don't think AI is bad for content creation.

I think it's great.

It's going to expand the market.

It's, you know, a cycle: the more creators you have, the more viewers you have, and on and on we go.

So this is overall, I think, a positive thing, but it has to be done in the right way.

And that's where it's going to get murky, right?

Like there are going to be periods over the next few years of, hey, we shouldn't be allowing this type of content to be out there, but it's going to be really hard to rein it in once you just throw these tools out there. And maybe that brings us to the last topic here.

Before we go to the last topic, we've got to get the people in on Nick Grous's law of content and see what they think of it.

So Nick's been saying this, you know, for probably like four years now and we've had

some debates over it and I think we've refined it to a pretty good spot.

So Nick, what's your rule of everything for content?

Yeah, I think.

Or for everything.

Is it a rule of the universe?

I think if, I mean, we can spend another 15, 20 minutes talking about this, but the way

that I've described it is people and systems naturally build to low input, high output,

unless acted on by another force.

And what I mean by that is you want to put into something the lowest amount of energy

and get the highest amount of dopamine or calories out of it, right?

So more people watch sports than play it.

And that's because, you know, to get to the professional level is extremely hard, but

to watch professional sports is quite easy.

So I'll leave it at that, but I think you can take that and look at most things, especially

in the content space and say, oh, that does make a lot of sense.

Look at TikTok, right?

It took YouTube videos that were 30 minutes in length, made them 30 seconds in length,

and now TikTok has scaled to YouTube level of engagement.

And so I think it's kind of that understanding, and it kind of gets back to my pessimistic

view on people and including myself that we tend to be quite lazy, but we want to still

experience life and have great moments, but we tend to be lazy in how we get that.

So I think it's a good framework for viewing content, and especially over the last 10 years.

Look at what social media has done.

We've shortened our attention spans to fractions of a second at this point.

That's right.

And Unscripted TV has definitely taken a stranglehold of streaming as well, and I would say that's

probably likely due to that low-input, high-output framing.

Right.

Right.

All right.

So now we'll get to our last topic.

We'll be quite brief on this, because I think we've talked a lot about AI and its implications,

but Sam, maybe tee it up and talk to us about what Meta is doing with their Llama model.

Llama 2.

So this is an open-source AI model.

And first, the most important question here: what is Llama? Llama actually just stands for Large Language Model Meta AI.

So it's really just an acronym for that, a fun acronym at that.

And so it's open-sourcing kind of the equivalent to ChatGPT, not quite as performant.

But the interesting thing here is that it's available at no cost for both research and

commercial use.

So this is, you know, what they say is arming the rebels.

So you have OpenAI out there, which is, you know, best in class.

You've got Google out there with Bard as well.

And this is Facebook coming in saying, look, here's a pretty good model.

Anyone can use it except companies with more than 700 million active users.

So essentially they're saying anyone can use this except for the other big players in the

space who are trying to kind of commercialize on their own.

And when you look at the cost declines that are happening in AI, this is a perfect example.

So in April of 2022, Google announced the PaLM model, which was 540 billion parameters and was, you know, best in class at the time. And then this past week, Meta releases Llama 2, very similar in performance to that PaLM model, but with 70 billion parameters. So it's 8x faster and an eighth of the inference cost.

So you can essentially think of it as they're taking this model, and it'll be able to run on your smartphone, hopefully.

They've got some partnerships underway for that.

And you'll have a best in class language model from 2022 that used to cost $200,000 to infer.

So it's pretty dramatic, you know, the iPhone as the supercomputer in your pocket only becoming

more and more true with AI.
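To put rough numbers on that PaLM versus Llama 2 comparison, here is a small illustrative calculation; it assumes, simplistically, that inference cost scales roughly linearly with parameter count, which is a rule-of-thumb approximation rather than a claim from the episode.

```python
# Rough parameter-count comparison between PaLM (April 2022) and Llama 2 (July 2023).
palm_params = 540e9    # ~540 billion parameters
llama2_params = 70e9   # ~70 billion parameters (largest Llama 2 variant)

size_ratio = palm_params / llama2_params
print(f"Llama 2 70B is roughly 1/{size_ratio:.1f} the size of PaLM 540B")  # ~1/7.7

# If inference cost scaled linearly with parameters (a simplifying assumption),
# a similarly capable model at roughly 1/8 the size would cost roughly 1/8 as much to run.
print(f"Approximate inference-cost reduction: ~{size_ratio:.0f}x")
```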

And that partnership, if people are interested, is with Qualcomm, that's one of the companies.

It's actually Microsoft and Qualcomm.

That's great.

And maybe this is another example of this low input, high output.

And it gets to this, I think, internal debate we've been having about, you know, what happens

when you release these large language models and they're free to use for most people?

Well, there's your low input, right?

No cost.

And you have high output because it's equivalent in most respects to other models people are paying for.

And I think it's a really interesting debate, you know, and who would have thought Meta of

all companies would now be, you know, hailed as open source in nature, right?

They're doing it with Threads.

They're doing it with large language models.

It's pretty interesting.

Well, and also from an investment perspective, thinking about this, right? What is the business model? So you have OpenAI, which is charging $20 a month for, you know, a best-in-class large language model.

Now, you've got one that's not quite as good, but essentially free to use.

And so do these open-source models commoditize? I'll tie it back in with the first thing we talked about.

It's like, why do we think Tesla is valuable?

It's this unique data set and capability that they're going to unlock.

So if you look just at this past week, you have Microsoft. They're planning to charge $30 for, what is it, Copilot in Office. And so that'll be $30 a month for a ChatGPT integration into kind of Excel and all of the Office suite.

You have Salesforce, again, unique data, unique use case, who I think was going to charge

$50 a month.

And so it does seem like value is kind of accruing to people who have unique data sets

where they can offer this capability.

But you know, Nick, is there still a spot for kind of this foundation model, large language

model, or do you think it goes more verticalized here?

It's unclear to me.

And I think we'll have to leave it there and maybe bring on Frank Downing to talk about,

you know, the opportunity there next week because I don't want to speak out of turn

because I think that is a larger debate to be had.

So I'm not going to give an answer, because I'm not fully informed yet on where I think this is headed, because it is just so fresh. So we'll have to get Frank on to debate this a little bit more, maybe even do two guests, have Frank and Brett.

Sounds good.

All right.

Thank you everyone for joining us.

And before we go, Nick, you wanted to answer some questions from the audience or give that

opportunity?

Yeah.

Well, one, we want to see who is still watching to the very end and show appreciation for them.

And so what we're thinking about doing, and please give us feedback if you think this

is boring or lame, but I'm going to ask Sam a question.

I'm going to ask him what his favorite color is.

That will be the code word.

If you comment or tweet with the code word, which is going to be his favorite color and

a question that is appropriate and we can actually answer, we will answer it at the end

of the next show for next week.

So Sam, what is your favorite color?

We're going hunter green.

So I look forward to seeing if people listened to this, and we look forward to answering your questions.

See everyone next week.

See everyone next week.

Thank you.

ARK believes that the information presented is accurate and was obtained from sources that ARK believes to be reliable. However, ARK does not guarantee the accuracy or completeness of any information, and such information may be subject to change without notice from ARK.

Historical results are not indications of future results.

Certain of the statements contained in this podcast may be statements of future expectations and other forward-looking statements that are based on ARK's current views and assumptions and involve known and unknown risks and uncertainties that could cause actual results, performance, or events to differ materially from those expressed or implied in such statements.

Machine-generated transcript that may contain inaccuracies.