FYI - For Your Innovation: Character AI, FSD Beta v12, Walmart & Drones | The Brainstorm EP 12

ARK Invest 8/29/23 - Episode Page - 29m - PDF Transcript

Welcome to The Brainstorm, a podcast and video series from ARK Invest.

Tune in every week as we react to the latest in innovation and reflect on how short-term

news impacts our long-term views.

To learn more, visit ark-invest.com.

ARK Investment Management LLC is an SEC-registered investment advisor.

ARK and Public are unaffiliated entities and do not have a relationship with respect to

either firm marketing or selling the products or services of the other, and therefore ARK

disclaims responsibility for any loss that may be incurred by Public's clients or customers.

The information provided in this show is for informational purposes only and should not

be used as the basis for any investment decision and is subject to change without notice.

It does not constitute either explicitly or implicitly any provision of services or products

by ARK, and investors should determine for themselves whether a particular investment

management service is suitable for their investment needs.

All statements made regarding companies or securities are strictly beliefs and points

of view held by ARK and/or show guests and are not endorsements by ARK of any company

or security or recommendations by ARK to buy, sell, or hold any security.

Historical results are not indications of future results.

Certain of the statements contained in this show may be statements of future expectations

and other forward-looking statements that are based on ARK's current views and assumptions

and involve known and unknown risks and uncertainties that could cause actual results, performance

or events to differ materially from those expressed or implied in such statements.

ARK assumes no obligation to update any forward-looking information.

ARK and its clients, as well as its related persons, may, but do not necessarily, have

financial interests in securities or issuers that are discussed.

Certain information was obtained from sources that ARK believes to be reliable; however,

ARK does not guarantee the accuracy or completeness of any information obtained from any third

party.

Hello, everyone, and welcome back to another episode of The Brainstorm, episode 12.

Today we have Tasha Keeney as well as Andrew Kim.

We are going to talk all things AI, both on the text side and the FSD side, and then we will

also talk about some drones.

Andrew, let's start with you.

Can you take us through character AI and this text-based entertainment medium that may

be emerging?

Sure.

I guess since the launch of ChatGPT last winter, text-based AI and all the experiences

that have been built on top of these LLMs have been all the rage, and if you've been

paying attention to this space, then you would be familiar with character.ai, which essentially

is a consumer-facing entertainment platform or maybe even marketplace that features all

these different customizable AI chatbots.

As a user, you can go on and create your own chatbots and customize them to your liking,

or you can find chatbots that mimic celebrities, for example, or fictional characters, or can

also serve productive use cases, like a coding assistant or a therapist, or even full-on

game experiences, like a role-playing game in a text format.

It's been gaining a lot of traction.

I think desktop unique visitors averaged around 15 million last month.

They also launched a mobile app this past spring, and I think that's around a couple

million in DAUs, so it seems to have generated at least early product market fit, and it's

super exciting to see how this space will evolve.

As this space evolves, I'm curious what your thoughts are on where this goes, does this

have legs, and can you really point to this entertainment medium?

Is there something in the past that looked similar to this?

Where have we seen this before?

I think you brought up a great example on Friday when we had the brainstorm, so I would

love for you to talk about that, and then curious what your opinion is on where this

goes from here.

Sure, sure.

I think the last example that I gave with character AI includes text-based adventure

games, and we've definitely seen that before.

If you think about 1980, Infocom released Zork, which was arguably the first text-based

role-playing game that took place on the early personal computers, as the industry was shifting

from mainframes to personal computers, with the rise of Apple and the like.

It was all the rage back then, too.

It got rave reviews in Rolling Stone, for example, and in Time, and the company Infocom hit around

$10 million in revenue, peaking there in 1984.

What we saw there was a big mixture of both internal and more macro-related turbulences,

one being it was a recessionary time.

I think in early 1985, we saw a wave of tech recessions and layoffs.

We also saw Infocom after creating Zork 1, 2, and 3, and I think also Hitchhiker's Guide

to the Galaxy, by that point, the company tried to pivot to enterprise software, like

a database management software called, I think, Cornerstone.

That was happening amid the tech recession, so demand wasn't really there, and they were

faced with a lot of losses, and they were eventually acquired by Activision.

I think, finally, and probably most importantly, by 1985, we saw the launch of the NES by Nintendo,

and by 1986, I think Sega released its first console as well.

This gave rise to computer graphics in the video game industry, and Infocom didn't

really pivot. They stuck to their place in text-based adventure games, and faced with this

overall disruption, with audio-visual becoming a very core part of the video game

experience, that led to Infocom shuttering operations by the end of the decade.

This is not to say that character AI is another Infocom, but I think it is really interesting

to consider that the video game industry went through this pivot from text-based to audio

and video-based medium within the span of a single decade, and we seem to be at that

text portion today with generative AI. Who's to say that the same thing isn't going to

happen within AI? Given that, we might like to think that a bigger market-sizing opportunity

is going to be in this next wave of generative AI.

That leads me to a question that I'd love to ask for our audience, Andrew. Who is this

for? Is it gamers who use this? Who should care about this?

Well, for character AI, I think it kind of targets a more niche audience who care a lot

about celebrities, for example, or care about fictional characters, at least anecdotally.

When I'm looking at memes of screenshots of character.ai conversations, it seems to focus

around those two specific categories. That being said, Snapchat rolled out My AI, which

is a companion chatbot, back during the same time, I want to say earlier this year,

and that kind of made waves as well as just a generally humorous experience, but

I'm not so sure that these generative AI chatbots have reached true product-market fit

with a lot of people, in that the primary way that these chatbots are generating entertainment

value is through the meme value of it, in that people are trying to break it or trying

to mess it up or trying to make the chatbots say something they're not supposed to. Then

there's entertainment value there, and the primary entertainment value comes from cross

posting, cross sharing onto different social media. I think there's also that question

of thinking about these different social platforms, and is the true value in being in that platform

itself, or does the main entertainment value come from sharing it onto other incumbent

platforms? This type of text-based AI hasn't really figured it out quite yet, and we're

excited to see if the introduction of audio or video can change that.

Got you. Well, definitely a space to watch, and we'll be keeping up with everything that

you write, and you're using these platforms, Andrew, I assume. Have you tried character

AI yet? What are your personal thoughts?

I think it adheres to the character well for the first few minutes of chatting. I was talking

with Waluigi from the Super Mario Bros. world, and it sounds like Waluigi for the first couple

of chats, but then it starts to lose the original character. I do wonder if the team there is

working on extending, making sure it adheres to the character for as long as possible.

Sometimes, it really depends on the individual chatbots, and some of them seem to be really

realistic because you just have a lower barrier to mimicry, and that got me thinking about Yoshi,

for example. You can't really imagine Yoshi, from the Super Mario Bros., having a coherent conversation with

you, but you can have a coherent conversation with Elon Musk,

for example. There are two very different standards there in terms of believability. I've definitely

tried it, but I wouldn't say I'm an avid user of it yet.

Gotcha. Makes sense. I think you brought up an interesting segue into our next topic here.

You mentioned Elon Musk. We have Tasha on to talk about some new updates in the FSD world,

and Andrew, thank you again for coming on. This was a great segment. Thank you.

We'll continue to watch and see how this evolves. Tasha, let's switch gears here. Let's

go to another end of the spectrum in the AI world. FSD, can you give us a breakdown

of this recent weekend? It seems Elon Musk is testing out features live on his other

company's platform, x.com. Curious what your thoughts are. If you can break this down for

us, it would be great.

This Friday night, I believe, Elon went live on x.com, showing a video of him and, well,

he didn't show the person in the passenger seat, but it sounds a lot like the head of

Autopilot, whose name is Ashok Elluswamy. They were test driving version 12 of FSD, Tesla's

full self-driving software. Why do we care about watching this video? Well, it's not

out with consumers yet. This is something that the average person has not seen yet. It

was a roughly 45-minute video. He drives around Palo Alto. It's clear that this system isn't

perfect, but what's different about this versus what's in consumer cars today is that this

latest generation of FSD relies much less on, let's say, manually

coded or rules-based AI. It's more that the data itself is curated, versus the code,

to get the behavior you want. The way to think of this, or the way it was described, is, okay,

the car is not explicitly coded to know what a stop sign is. It's just shown videos of

people correctly responding to stop signs, which, by the way, hardly anyone does. Very few

people actually come to zero miles per hour at a stop sign. They find those

videos that show the correct driver behavior and then feed those to the system to train

it what to do. It learns directly off video. It's closer to this video-in, driving-out system

that Elon has always wanted to build.
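The contrast Tasha is drawing here can be sketched in a few lines of code. This is purely illustrative (all names are hypothetical, and this is in no way Tesla's actual architecture): in the old rules-based style, engineers hand-code behavior; in the end-to-end style, a single learned model maps video to driving, and behavior is shaped by curating the training data rather than editing code.

```python
# Hypothetical sketch of rules-based vs. end-to-end driving policies.
# Not Tesla's code; just the shape of the idea described above.

def rules_based_policy(scene):
    # Old style: an engineer writes an explicit rule per situation.
    if "stop_sign" in scene["detections"]:
        return {"throttle": 0.0, "brake": 1.0}
    return {"throttle": 0.3, "brake": 0.0}

def end_to_end_policy(video_frames, model):
    # New style: one learned model maps raw video directly to controls.
    return model(video_frames)

def curate(clips, is_correct_behavior):
    # Behavior is changed by curating data, not code: keep only clips
    # where the human driver responded correctly (e.g., fully stopped).
    return [clip for clip in clips if is_correct_behavior(clip)]
```

The key design point is where the human effort goes: in the first function it goes into the `if` statements, while in the second it goes into the `curate` step that selects the examples the model learns from.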

Yeah. I saw a few tweets and I think it was a quote by Elon saying, this is all neural

nets, nothing but nets. Is that kind of what you're hinting at here? This is not rules-based.

This is, as you mentioned, data in and then you get correct driving behavior out.

Exactly. If you think of the full autonomy stack, there's perception, planning, control.

There's different pieces that add up to this puzzle of actually getting the car to drive

itself. It used to be that this method, the more totally neural-net-based, end-to-end,

deep learning type of method, was used in pieces of the stack but not the full stack.

That's the major change. Again, it's more like video in, driving out. It's probably closer

to that ideal than they ever have been before. What's impressive about it is that it seems

like it works. There was one intervention point, and we should probably talk about that.

In the 45-minute drive, they attempt to go to the Google Maps address of Mark Zuckerberg's

house and then drive away. They go to and from the Tesla headquarters. It mostly works,

and it's not even in shadow mode yet. Shadow mode is when they take a new software

feature and run it in the background of the consumer Tesla cars to see when the

output the simulation produces would differ from what the driver does, and then they use

that to train the system. They haven't even done that step yet. The fact that there was only

one intervention I think is relatively impressive.

What was the intervention?

They're at an intersection with multiple stop lights. Basically, it appears as if the car

is assuming that the stop light that's actually meant for the left-hand turn lane is

meant for the vehicle itself. The car was going straight, not turning left, but it abided by

the left-hand-turn stop light. First of all, it was very minor, although I've seen a number of

headlines about this. The car moves maybe a foot forward before Elon stops it; he recognizes

it right away. I saw some headlines that were like, it was going to run a red light. There's

still someone behind the wheel training this thing. I don't think it was close to endangering

anyone, but yes, it's not perfect yet.

This is a side note, but I was thinking about this the other day. Humans have interventions

with each other all the time. If you have a passenger, sometimes that passenger will speak

up and say, hey, you just missed the exit. Obviously, humans aren't perfect, but it's

interesting to think that we intervene with each other when we're driving in the same

car pretty often. Maybe I'm just a bad driver, because I feel like I get yelled at all

the time for missing exits because I space out. I think that is kind of interesting.

It was just a thought that popped into my head. This actually happens to humans quite often.

Yes. You're actually touching on something that was also in this video, which Elon says that the

system technically could work without maps. In other words, he said, well, you could just give it

a GPS point, and it might not know how to drive there. It might go down a dead end and then realize,

oh, I'm at a dead end, and then turn back around to get to the correct route to where you need to be.

For the first version of autonomous cars that we will see, what is important?

Well, are they safer than humans? We're already seeing that with autopilot. Our analysis suggests

that a car with full self-driving enabled is already safer than a human-driven Tesla. It's not just

safer than the national average, but safer than a human-driven Tesla. We use Tesla's

self-reported accident data to verify that. Then, can it drive in harsh weather scenarios?

That's something people care about a lot. Well, humans aren't that good at driving in harsh weather.

I would guess that the car is going to be cautious. Ashok mentioned in the video that it'll likely slow

down in a rainstorm similar to what we would do, but it's trained on human-driving data.

It'll mostly try to mimic what we already do. I don't think it'll be perfect.

As you said, maybe it'll take a wrong turn sometimes. I think it'll likely be better than a human,

and that's what matters. Then, ultimately, cheaper than a human, too. We think what's going to drive this

market is that a RoboTaxi could actually be cheaper than driving a personal car, but certainly

cheaper than ride-hail prices today. You could imagine ride-hail prices produce a nice price ceiling:

a RoboTaxi could charge the roughly $4-per-mile ride-hail rate, but it could be as low as less than a dollar per mile at scale.
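The pricing point above is easy to make concrete with back-of-the-envelope arithmetic. The figures below are just the rough numbers mentioned in the conversation (roughly $4 per mile for ride-hail today, under $1 per mile for a robotaxi at scale), not precise estimates.

```python
# Back-of-the-envelope trip-cost comparison using the rough per-mile
# figures from the discussion. Purely illustrative, not a forecast.

RIDE_HAIL_PER_MILE = 4.00  # approximate ride-hail price ceiling today
ROBOTAXI_PER_MILE = 1.00   # "as low as less than a dollar" at scale

def trip_cost(miles, per_mile):
    return miles * per_mile

miles = 10
print(trip_cost(miles, RIDE_HAIL_PER_MILE))  # 40.0
print(trip_cost(miles, ROBOTAXI_PER_MILE))   # 10.0
```

On a 10-mile trip, that gap between roughly $40 and roughly $10 (or less) is the margin that could make a robotaxi cheaper than ride-hail and, potentially, than owning a personal car.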

My last question on this topic: the distinction between Hardware 3 and Hardware 4. I know there

was some commentary from Elon about this. I think this drive was done on Hardware 3 and not Hardware 4.

Could you explain that distinction, what the hope is with Hardware 4 and why this was done on Hardware 3?

I actually think it was great that it was done on Hardware 3, because there's been this question of,

what if I have a previous generation? Hardware 4 is Tesla's latest in-house-produced chip,

the compute that goes into the car itself; think of it as the brains of the autonomous car.

It's actually good to show it on the previous generation system because that was a question.

Will autonomy first only work on Hardware 4 vehicles? Will it work on Hardware 3?

If this is already running in a Hardware 3 car, which you can think of as basically slightly less optimized,

each incremental version of hardware that they come out with is going to be more optimized to what they think

will be the best solution to run full autonomy.

I think that showing that it works in a previous generation of hardware kind of gives everyone hope that,

when Tesla told me that my vehicle had the hardware necessary to become a RoboTaxi,

seemingly that was true, assuming that they solve full autonomy, which we think that they will.

I think that's good. And, to really get into the weeds, we know that Tesla's moving more towards using

things like diffusion models, which I've heard will work better on next-generation hardware,

but also be better to backwards-integrate on Hardware 3 because of the memory and compute power that it has.

All in all, I think that might have been kind of a sigh of relief to people watching.

Yeah, very exciting. Very exciting. I'm sad that Mark Zuckerberg wasn't home.

Maybe we would have got a sneak peek of their upcoming fight.

I know. Yeah, we're all waiting. Yeah. They took it off of Google Maps, so probably wasn't his house,

but it's clearly on his mind.

Okay, our last topic, Tasha. Let's talk about drones and Walmart.

Can you explain this recent news? I feel like we're talking about the future.

We're living in the future, full self-driving drones, character-based AIs that you can speak to.

But yeah, let's touch on drones.

Yeah, I've been writing about drones a lot in the ARK Invest Newsletter.

The most recent news is that Walmart has teamed up with Wing, which is Google's drone delivery project.

Walmart has now teamed up with Wing, with a company called Flytrex, with DroneUp, as well as with Zipline for drone delivery.

This news is that Wing is basically running a pilot drone delivery program.

There are two Supercenters in Texas, in the Dallas area, where you can get Walmart drone delivery within a six-mile radius of the Supercenter.

This allows Walmart to add to its existing number of stores that offer drone delivery.

I think it's 36, and that's out of roughly 5,000 locations for Walmart.

They have a lot of stores in the US, as you can imagine.

I think it's a meaningful number of deliveries here.

Walmart has said that so far they've completed 10,000.

Zipline and Wing, as a whole, have done deliveries in the hundreds of thousands.

But Walmart has certainly done a lot more than Amazon, which is rumored to have done only roughly 100 drone deliveries.

Different solutions, they're working with a number of outside parties to get it done.

But I think it means they're serious about this, and I'm certainly excited for it.

The delivery bill when you order food in New York is embarrassing at this point.

I'd certainly like to skip some of those fees with drone delivery.

One thing that you said last week that stuck with me was dinner plate level accuracy when it comes to drone delivery.

Is that what we should expect in these early days with this partnership?

Or is this more experimental still?

It'll be dropped somewhere in your lawn?

What exactly are they delivering as well?

Good question on what they're delivering.

I'm not sure quite yet, but we've seen in the past that Hamburger Helper was a super popular item to get in Arkansas and some of the locations that were testing it out.

Dinner-plate-level accuracy is what Zipline claims they can do in your backyard.

But Zipline and Wing have a similar system.

A lot of them have a similar system where the drone kind of hovers and then lowers something down on a string.

So with Zipline, their latest drone, actually a smaller drone comes out of a big mother drone, let's call it, and lowers down to then ultimately release the package.

I think with Wing, it's something that looks closer to a McDonald's box that lowers down.

So it'll be curious to see how accurate it is versus competitors.

You know, what I'm most curious about is that Walmart said in their release that this would be beyond line of sight.

So if you care about drones, you care about the beyond line of sight requirements because those are what is really holding the industry back right now.

The FAA allows for some pilot programs to exist, but when you get the restriction removed, it often comes with caveats.

Beyond line of sight means, for everyone out there, that I as a pilot, or as a remote operator, don't actually have to physically see the drone.

But often it's the case now that it's technically beyond line of sight, but you might still have to have someone on the ground, an observer, who has to be able to physically see it, which is as silly as it sounds.

Or, you know, you're restricted to a very small area of operation.

So I'm wondering, you know, what's happening here.

So yeah, it'll be curious to see, to your point, Nick, what is the consumer experience like?

You know, what does beyond line of sight actually mean in this case?

And then, who will Walmart ultimately go with? Because this is what they do, you know, they partner with a lot of different technology companies.

But if one of them is successful, you assume they're going to double down on that partner.

So you know, I think that Wing is a really interesting competitor.

It's Zipline's number one competitor, but I think that, you know, Zipline is a formidable company to watch here in the space.

All right, so two companies to watch in the space.

And then you have kind of the mega tech players all testing their, you know, level of competency in the drone delivery space, which is really interesting, to your point.

You know, Walmart is going at it with a, hey, let's test and see who can do this, probably most cost-efficiently, but also with accuracy when it comes to deliveries.

Do you think they'll ever get into kind of a vertical integration strategy of maybe acquiring one of these, or, you know, maybe in-housing some of this? Or probably not?

That's a really good question.

You can certainly imagine that if one of these companies was successful, which by the way isn't easy, because it's not just getting the drone to do the delivery correctly and accurately.

It's making this whole logistics dance work of getting packages where they need to be, on time, when consumers expect them.

That's really hard to do even with non-autonomous systems.

So you imagine that if a company does that well, it might be an attractive target for Walmart.

You know, I'd certainly be interested in these companies as standalone entities, because Walmart is, you know, the Goliath in the space.

So having that partnership is really meaningful, but, you know, we see delivery of all sorts.

So not just packages from Walmart, but food delivery as well.

As I was saying, that's where I think, you know, we see all the exorbitant fees today.

So, you know, Zipline's partnered with Sweetgreen, and Wing is working with Postmates.

So, you know, if it is a Kiva-robot situation where Walmart takes it in house, as Amazon has done with some technologies.

That'd be interesting.

But for now, I think it's smart that Walmart hasn't done that.

Right.

Because it's not working out for Amazon for whatever reason.

So working with outside entities seems to have been the right move.

Yeah, interesting. All right, well, a lot to look forward to, a lot to keep our eye on: FSD, drones, character AI.

Tasha, thank you for stepping in today, helping me out co-hosting. Sam is on vacation enjoying life.

We wish him, you know, nothing but the best. Happy vacation.

But yeah, thank you for coming on, Tasha. This was awesome.

Thanks for having me, Nick.

Yeah. Thank you, everyone. That is our show. We'll see you next week, or potentially not, actually; it is Labor Day next week.

So we'll see. Maybe we'll update everyone in the comment section on YouTube or Spotify or x.com.

But we'll try to get a show out next week because innovation doesn't stop and we're here to report on what we're seeing in the space.

So thank you, everyone.

Thank you.

Machine-generated transcript that may contain inaccuracies.

If you know ARK, then you probably know about our long-term research projections, like estimating where we will be 5-10 years from now! But just because we are long-term investors, doesn’t mean we don’t have strong views and opinions on breaking news. In fact, we discuss and debate this every day. So now we’re sharing some of these internal discussions with you in our new video series, “The Brainstorm”, a co-production from ARK and Public.com. Tune in every week as we react to the latest in innovation. Here and there we’ll be joined by special guests, but ultimately this is our chance to join the conversation and share ARK’s quick takes on what’s going on in tech today.

This week, Associate Portfolio Manager Nick Grous is joined by Director of Investment Analysis & Institutional Strategies Tasha Keeney and Research Associate Andrew Kim. Together they discuss Character AI, Full Self-Driving Beta v12 and Tesla, and the latest in Walmart's partnerships with drone delivery companies.







Key Points From This Episode:

Character AI
Tesla FSD Beta v12
Walmart and Drones

FSD is Full Self Driving.

DAU is Daily Active User.

LLM is Large Language Model.

For more updates on Public.com:

Website: https://public.com/

YouTube: @publicinvest

Twitter: https://twitter.com/public