AI Hustle: News on Open AI, ChatGPT, Midjourney, NVIDIA, Anthropic, Open Source LLMs: Anthropic Unveils Premium Paid Features for Chatbot – What's New?

Jaeden Schafer & Jamie McCauley - 10/5/23 - Episode Page - 11m - PDF Transcript

Welcome to the OpenAI podcast, the podcast that opens up the world of AI in a quick and

concise manner.

Tune in daily to hear the latest news and breakthroughs in the rapidly evolving world

of artificial intelligence.

If you've been following the podcast for a while, you'll know that over the last six

months I've been working on a stealth AI startup.

Of the hundreds of projects I've covered, this is the one that I believe has the greatest

potential.

So today I'm excited to announce AIBOX.

AIBOX is a no-code AI app building platform paired with the App Store for AI that lets

you monetize your AI tools.

The platform lets you build apps by linking together AI models like ChatGPT, Midjourney,

and ElevenLabs, and it will eventually integrate with software like Gmail, Trello, and Salesforce

so you can use AI to automate every function in your organization.

To get notified when we launch and be one of the first to build on the platform, you

can join the wait list at AIBOX.AI, the link is in the show notes.

We are currently raising a seed round of funding.

If you're an investor that is focused on disruptive tech, I'd love to tell you more

about the platform.

You can reach out to me at jaden at AIBOX.AI, I'll leave that email in the show notes.

The first thing I want to say here is: Anthropic, of course, is the AI company established

by former OpenAI staff, and it has unveiled Claude Pro.

This is its first consumer-centric subscription plan for Claude 2, which is a text-analysis

chatbot.

This is priced very interestingly at $20 a month in the US or £18 in the UK.

The plan offers five times the usage of the free version, the ability to send

significantly more messages, priority access during high-traffic periods, and early

access to new features.

The thing that's interesting here is the $20 a month.

This is the same price point as ChatGPT Plus, which is how you get GPT-4.

It's interesting: they're not discounting themselves, they're not saying, hey, we're

a knockoff.

They're saying, look, we're just as good as them.

I think that price point is really good.

They can't go higher, because ChatGPT and GPT-4 are obviously the premier, most popular tools.

You can't go more expensive than that unless you're doing enterprise, but for consumers

I wouldn't go more expensive.

If you go less expensive, then you seem like a knockoff that can't quite do as much.

I think it was a smart move on their part to put it at the same price point.

The thing I will say about it is that you get five times the usage of the free version.

This is a great move.

For those that don't know, one of Claude 2's main selling points is its really big context

window, so you can upload super long prompts.

You can give it an entire document or a PDF and then ask it questions about it.

I've seen people upload small books and ask it questions about the book, and it will

give them an answer.

I, for one, use it when ChatGPT's context window is not big enough.

Typically I'll post three or four news articles and ask it to consolidate them, like,

give me five bullet points based on these four different news articles about X, Y, and

Z topic, and ChatGPT says, hey, your question's too long, so I paste it all into Claude

and I'll usually get a great response out of it.
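For anyone who wants to reproduce that workflow outside the chat interface, here is a minimal sketch using the official anthropic Python SDK. This isn't something covered in the episode; the file names and the model string are illustrative placeholders, and you'd need an ANTHROPIC_API_KEY set in your environment.

```python
from pathlib import Path
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Placeholder file names: a few saved news articles to consolidate.
articles = [Path(p).read_text() for p in ("article1.txt", "article2.txt", "article3.txt")]

prompt = (
    "Here are several news articles:\n\n"
    + "\n\n---\n\n".join(articles)
    + "\n\nGive me five bullet points consolidating the key facts across all of them."
)

# Claude 2's large context window is what makes stuffing whole articles into a
# single prompt practical; the model name below is illustrative, not prescriptive.
response = client.messages.create(
    model="claude-2.1",
    max_tokens=1024,
    messages=[{"role": "user", "content": prompt}],
)
print(response.content[0].text)
```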

That's typically when I use it, and that long context is one of its biggest features.

As for overall quality, I've heard different opinions; in our Facebook group, some people

say, oh, Claude is better at X, Y, and Z, and I've actually had some people say they've

switched to using it exclusively.

Personally, I still like GPT-4 a little bit more for most things.

Maybe it depends on what you're specifically using it for, but when I need something long

that ChatGPT can't handle, that's where I'm going to go.

Interestingly, the price point is, well, obviously the same as ChatGPT's.

Since its launch in July, Claude has gained attention for a lot of its advanced features.

Anthropic noted in a blog post, quote: since launch in July, users tell us they've chosen

Claude as their day-to-day AI assistant for its longer context windows, faster outputs,

complex reasoning capabilities, and more.

The company also kind of mentioned that Claude Pro would enable subscribers to make use of

the chatbot's advanced model five times more than what is available on the free tier, like

I mentioned, right?

And addressing why there are limits on messages, Anthropic clarified that the substantial

computational requirements of running Claude 2 pretty much mandated these restrictions.

Users can send up to 100 messages every eight hours, which is a boost from the 50 messages

per three hours that ChatGPT Plus subscribers get, and I am a ChatGPT Plus subscriber.

So they also specify, quote: a model as capable as Claude 2 takes a lot of powerful computers

to run, especially when responding to large attachments and long conversations.

We set these limits to ensure Claude can be made available to many people to try for free

while allowing power users to integrate Claude into their daily workflows.

So running AI chatbots obviously is not a trivial financial endeavor.

This is incredibly expensive.

OpenAI once reportedly shelled out, I think, around $700,000 daily to maintain ChatGPT.

And this amounts to around $21 million a month.

So obviously, message limits in Claude Pro can be reached kind of quickly, especially

if the conversation includes substantial attachments.

So for instance, if a user uploads the complete text of the Great Gatsby, they'd find their

ability to send messages capped at roughly 20 for the subsequent eight-hour period.

So this is because Claude 2 revisits the entire conversation, including attachments,

with each new message, right?

So it's actually re-reading all the previous conversation, everything previously in there.

And if you're putting in something as long as the text of an entire book, it's definitely

going to use more.
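As a rough, back-of-the-envelope illustration of why a book-sized attachment eats the quota so quickly (the character count and the characters-per-token ratio below are approximations, not figures from Anthropic):

```python
# Rough sketch: Claude re-reads the whole conversation, attachment included, on
# every turn, so per-message input cost scales with the attachment size.
GATSBY_CHARS = 270_000        # approximate length of The Great Gatsby in characters
CHARS_PER_TOKEN = 4           # common rule-of-thumb conversion, not an exact figure

attachment_tokens = GATSBY_CHARS // CHARS_PER_TOKEN   # ~67,500 tokens per turn

# n short follow-up questions cost on the order of n * attachment_tokens of input,
# which is why the cap lands around 20 messages per eight-hour window.
for n_messages in (1, 5, 10, 20):
    total = n_messages * attachment_tokens
    print(f"{n_messages:>2} messages -> ~{total:,} input tokens reprocessed")
```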

So Anthropic definitely has, I think, some broader aspirations, targeting the development

of a next-generation algorithm aimed at AI self-teaching.

This was actually revealed in their recent investor pitch deck.

So the ultimate goal is to craft virtual assistants capable of a range of tasks,

including email management, research, and creative output, among a bunch of other functionalities.

Very, very interesting, right?

And since its launch in 2021, spearheaded originally by OpenAI's former VP of Research,

Dario Amodei, Anthropic has raised over $1.4 billion in funding at a valuation in

the single-digit billions.

So obviously, a very impressive figure, but it's still far from their estimated $5 billion

that they need over the next two years to realize their vision for AI.

So that's what Anthropic is saying: if we really want to hit our vision for AI and do what

we would like to do, we're going to need $5 billion over the next two years, which is a

fairly big ask, but it's not that out of the question considering Microsoft gave OpenAI

over $10 billion.

So you know, I mean, that's still half of what OpenAI has already done, and they have

a couple years to kind of get this done.

So assuming they get mass adoption and they do some really impressive things, I don't

see any reason why they wouldn't be able to get to that number.

So revenue from offerings like Claude Pro is slated to kind of fund computational capacity.

The company's roadmap indicates a reliance on, quote, tens of thousands of GPUs to train

its models, which is essentially earmarking, I think, nearly a billion dollars for infrastructure

spending within the next 18 months.

So while Anthropic definitely enjoys the patronage of thousands of customers and partners,

including a subscription-based generative AI app from Quora that offers Claude 2 and its

less powerful but faster counterpart, Claude Instant (speed is kind of why you'd use that one),

it still has some pretty stiff competition.

So for those that don't know, you can go to poe.com, a platform that Quora owns, and

you can try out Claude today for free.

That's typically how I use it when I need it for one-off things.

If it ever comes to the point where I very frequently need to analyze large documents,

then I'd probably just get a subscription to it.

It's kind of a no-brainer.

It's really good.

I would assume that if ChatGPT or GPT-4 improves quality a bit and comes out with larger

context windows, then maybe that kills Claude, I don't know.

I do really like having multiple options in the arena, right?

I always hate having one player in the space that completely dominates, and then you get

to sort of a monopoly thing, and a lot of times the quality of the products suffer.

So I think there are a bunch of other competitors in the field.

You have Cohere, you have AI21 Labs, of course OpenAI is huge, and I think what's really

interesting is that OpenAI actually anticipates generating a billion dollars in revenue

next year.

So obviously big players in the space are making big moves, but I think Anthropic has a

really bright future, and I'm excited to follow along on their journey.

If you are looking for an innovative and creative community of people using ChatGPT, you

need to join our ChatGPT Creators Community.

I'll drop a link in the description of this podcast.

We'd love to see you there, where we share tips and tricks about what is working in ChatGPT.

It's a lot easier than a podcast because you can see screenshots and share and comment

on things that are currently working.

So if this sounds interesting to you, check out the link in the description, we'd love to

have you in the community.

Thanks for joining me on the OpenAI podcast.

It would mean the world to me if you would rate this podcast wherever you listen to your

podcasts, and I'll see you tomorrow.

Machine-generated transcript that may contain inaccuracies.

In this episode, we dive into the exciting release from Anthropic, where they've introduced a range of premium paid features for their chatbot platform. Explore the latest enhancements and capabilities that are set to revolutionize the chatbot landscape, offering users new and powerful tools. Join us for an in-depth look at the future of conversational AI and what these premium features mean for both businesses and chatbot enthusiasts.


Get on the AI Box Waitlist: https://AIBox.ai/
Join our ChatGPT Community: https://www.facebook.com/groups/739308654562189/
Follow me on Twitter: https://twitter.com/jaeden_ai