AI Hustle: News on Open AI, ChatGPT, Midjourney, NVIDIA, Anthropic, Open Source LLMs: Unlocking AI's Impact on Business Analytics with Kelly Cherniwchan, CEO of Chata.ai

Jaeden Schafer & Jamie McCauley 10/5/23 - Episode Page - 32m - PDF Transcript

Welcome to the OpenAI podcast, the podcast that opens up the world of AI in a quick and

concise manner.

Tune in daily to hear the latest news and breakthroughs in the rapidly evolving world

of artificial intelligence.

If you've been following the podcast for a while, you'll know that over the last six

months I've been working on a stealth AI startup.

Of the hundreds of projects I've covered, this is the one that I believe has the greatest

potential.

So today I'm excited to announce AIBOX.

AIBOX is a no-code AI app building platform paired with the App Store for AI that lets

you monetize your AI tools.

The platform lets you build apps by linking together AI models like ChatGPT, Midjourney,

and ElevenLabs. Eventually it will integrate with software like Gmail, Trello and Salesforce

so you can use AI to automate every function in your organization.

To get notified when we launch and be one of the first to build on the platform, you

can join the wait list at AIBOX.AI, the link is in the show notes.

We are currently raising a seed round of funding.

If you're an investor that is focused on disruptive tech, I'd love to tell you more

about the platform.

You can reach out to me at jaden at AIBOX.AI, I'll leave that email in the show notes.

Welcome to the AI Chat podcast, I'm your host, Jaeden Schafer.

Today on the podcast, we have the pleasure of speaking to Kelly Cherniwchan, who is the

CEO and founder of Chata.AI, which is a cutting-edge platform specializing in NLP-driven self-service

analytics that really empowers business users to make data-driven decisions.

Before founding Chata.AI, he spent nearly a decade as COO at Circle Cardiovascular Imaging

Incorporated, where he executed strategic initiatives in the cardiac imaging industry

from financial planning to innovation and product management.

Kelly's diverse experience also extends to new company formation and risk management,

making him a seasoned executive with a deep understanding of technology, finance, and

business strategy.

Welcome to the show today, Kelly.

Thanks for having me. That's quite the bio, or something.

Well, you're the s**t.

Okay, I'll give you a secret, which I haven't told anyone yet, but how I get these bios,

if someone doesn't send me their bio, which I never ask, so usually they don't, is I actually

will go to LinkedIn.

I will just copy their entire experience page, give it to ChatGPT and say, give me

a good bio based on it.

I found it's great at putting it all together, and then I don't

have to read through it on the first go-around.

So yeah, that's my secret in case anyone needs to make a quick bio.

Good way to do it.

There you go.

So thank you so much for coming on the show today, Kelly.

The first thing I wanted to kick this off and ask you, can you explain to everyone a

little bit about your background and what got you interested and started in the tech

space, the AI space, right off the bat, you know, at the beginning?

Yeah, no, absolutely.

So I actually was going to go into investment banking.

So I was doing my master's, I was in business school, focused on financial derivatives and

the mathematics around them, and I graduated in 2007, and that was kind

of really when machine learning and those things were starting to pick up again.

And so I ended up taking a job at a company creation group with a small seed fund.

It was based out of the technology transfer office at U of C, and that's when one of the

projects that came across the desk was Circle Cardiovascular Imaging, in the very early stages

of it.

So thought it had a lot of commercial potential.

The market was still very niche, but sometimes you want to be the big fish in the small pond,

knowing that that pond is going to get bigger. I jumped into things from there, started

building the background more around, you know, understanding machine learning, how it works

practically in the real world, both opportunities and challenges.

And, you know, as a founding group, we saw where we needed to go.

We knew that technology had to be built to get there.

And we just all aligned on that whole piece.

And we were actually one of the early adopters of convolutional neural nets; like, maybe if

you look in the history books, we were one of the first medical imaging companies to actually

use them for image segmentation and quantification as well.

So yeah, that's kind of the background that the mathematics from the finance side

gave.

I'll say the mathematics background, to be able to, you know, understand how neural networks

work and understand a lot of those different pieces, made it an easy transition.

And then you just have to learn software.

Not that I code on a day-to-day basis at all, but at least I can hold a conversation

with you.

Very cool.

Very cool.

So talk to us a little bit about, you know, you have this, this experience where you were

working at that company.

What made you decide to kind of jump on this opportunity with Chata.ai?

What made you kind of inspired to start this company?

Yeah, no, I'll tell you the exact date.

So we had just done a deal with a very large imaging scanner device manufacturer.

And so I'd been talking to one of the co-founders who's still the CEO of the company, Greg

Grodnik, just an amazing entrepreneur, and really, really a good mentor.

And so I was talking to him about an opportunity that I saw, something we struggled with at

the last few board meetings that we had, which was we would show these displays

or these dashboards and reports in the board meeting.

And what typically happened was there were about, I don't know, 10 questions that just got

fired right off the bat.

And a lot of the time, we weren't able to answer those questions because they were pretty

detailed questions, right?

So then it's the typical, okay, well, we'll get back to you on that one because we got

to go and, you know, run another report, go get something out of the database and return

that information.

And that's when I had started doing some work in computational linguistics, right?

So this was the earlier days of NLP, and I thought there was a potential

that we could use natural language and actually have that translated to database query language

in a robust way.

And so, you know, we worked on a transition plan, wanted to start a company

again, similar blueprint, knew where things were going to go, but there was still technology

that needed to be developed to get there.

There was de-risking that needed to be done, everything else.

And then so we began a multi-year research and development journey towards what we've

built today.

That's incredible.

Talk to me a little bit about like your process.

So you want to start this company, I'm assuming you spoke with your mentor about this.

You're going to have to leave your old company.

Was this a spin-off company?

Or was this a brand new thing?

Did you go look for co-founders?

Did you look for investors?

What was the process of starting that?

Yeah, it was not a subsidiary or anything, it was just my own thing, my own money to

start as well.

So it was just something that, you know, sometimes getting co-founders together, everybody

has to be totally aligned and I had like a very, very specific, a strong vision of kind

of where I was expecting things to go.

I talked to a few other people as well, but I don't think we were totally aligned right

off the bat.

So I got things going, outsourced a little bit of work.

And then at that point is when you start putting the team around you.

So effectively they're not founders off the bat, but your first three to four people in the company

are co-founders as far as I'm concerned, right?

Our CTO, Reg, for example, who was down in Las Vegas as well at the AI4 conference,

you know, he's effectively a co-founder as well.

So yeah, did that, and again, just aligned everything together, made sure that we weren't

just crazy and this was something that was actually achievable, not just, you know, as

you see in the AI space nowadays, everybody wants this like magical AI that will do everything

for them, which, you know, is still a pipe dream now, but where could we actually apply

the technology itself and really provide value in real-world scenarios, right?

And so he came from the telecom background, handling, you know, billions and billions

of rows of data coming into a system, how that gets managed, and, you know, we aligned

to what type of market we wanted to go for and where we could make the biggest impact.

Very cool.

Talk to me a little bit about, like, explain, I guess, to the audience a little bit about

exactly what Chata does and, you know, the problem that it's solving for customers.

Yeah, so the easiest way to think about what we do is, we're just like Google Translate,

except instead of translating human language to human language, we translate human language to database query

language, okay? And we do that through two core technologies, or two core pieces of IP.

The first is the buzz term of today, which is generative AI. But we used to call it natural

language generation. That's how you can kind of tell who's been in the NLP space, yeah, because, I mean,

it's just a different way of saying it. I mean, okay, architecturally, it's a little bit different.

But the point is, we use generative AI to build high-quality training data, because every

single data warehouse or database is unique in structure, unique in business logic,

everything else. So it's not something you could just plug a pre-trained model into

and have it just work out of the box. Okay, so we connect to a warehouse,

and we have our own technology, which generates high-quality training

data, right? And the old adage in AI is that a model is only as good as the training data it's

actually learned on. So we do that. And then we also have another system, which is an automated

training controller. So there's a lot of different things like automated hyperparameter tuning,

as well as the ability to create new training data for itself as well. So we've built everything

into a really strong pipeline to be able to rapidly build custom language models for

our customers' proprietary databases and data warehouses, and then also deploy those models

in their environment. So not one piece of proprietary data ever leaves the entire system.
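
To make that pipeline a bit more concrete, here is a minimal, purely illustrative sketch in Python. It is not Chata.ai's actual system; the table, columns, templates, and function names are hypothetical, and it only shows the general shape of generating (question, SQL) training pairs from a warehouse schema, which a custom model could then be trained on.

```python
# Purely illustrative sketch -- not Chata.ai's actual pipeline.
# Idea: walk a (hypothetical) warehouse table and emit synthetic
# (natural-language question, SQL) pairs that a custom model could train on.

from dataclasses import dataclass


@dataclass
class Column:
    name: str         # column name in the warehouse
    label: str        # human-friendly phrasing, e.g. "units sold"
    aggregable: bool  # whether SUM makes sense for this column


# Hypothetical schema fragment for one table.
TABLE = "sales"
COLUMNS = [
    Column("revenue", "revenue", True),
    Column("units_sold", "units sold", True),
    Column("region", "region", False),
]


def generate_training_pairs(table: str, columns: list[Column]) -> list[tuple[str, str]]:
    """Emit simple (question, SQL) pairs from templates.

    A real system would use a generative model and far richer templates
    (joins, filters, date ranges); this only shows the shape of the data.
    """
    dimensions = [c for c in columns if not c.aggregable]
    measures = [c for c in columns if c.aggregable]
    pairs = []
    for m in measures:
        for d in dimensions:
            question = f"What is the total {m.label} by {d.label}?"
            sql = (
                f"SELECT {d.name}, SUM({m.name}) AS total_{m.name} "
                f"FROM {table} GROUP BY {d.name}"
            )
            pairs.append((question, sql))
    return pairs


if __name__ == "__main__":
    for question, sql in generate_training_pairs(TABLE, COLUMNS):
        print(question, "->", sql)
```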

Nice, nice. That's very cool. Wow, that's an incredible technology there.

Talk to me a little bit about how you're helping in reducing data

request backlogs for businesses. I understand that's probably a big feature, right?

Yeah, that's the primary one. When we think about an ideal customer profile,

there's really two key areas. One, it's an enterprise or even an ISV who's servicing

their customers, but they have a big data request backlog, right? I mean, you can only put so much information

in dashboards, and those dashboards are aggregated views that data then gets fed into. So when you

aggregate views, you've lost all that detailed information and the ability to join all that

detailed information between tables and everything else. And so you constantly have these analysts

having to go write custom reports. If the database query language is SQL for them, they're writing custom

SQL reports. Business users ask for more: they see something in their dashboard, and then

there's something else they want to know, but they can't drill down or drill through those

aggregated views. So it's this, you know, this backlog like you talked about.

So that's ICP part number one. The second one is personas that require on-demand access. Okay,

so this is why the financial markets are a really important vertical we're working in.

Retail is another one; you can't wait two weeks for information to get back. And in certain cases,

it's two months with a lot of enterprises for customer reports. You need to make decisions

quicker, right? Now, on that backlog side, that's the data analyst backlog. What we've found, and

what the research has shown, is there's the shadow backlog, we call it the shadow backlog,

which is an order of magnitude more questions that non-technical business users have,

and they never end up asking because, you know, the data analysts are way too busy,

it's going to take too long for them to get the data back. They just stop asking, right? I mean,

it's human nature, you and I probably do it, you know, all the time. And so that

shadow backlog, when we talk about tightening economic conditions and everything else,

that is where you drive the ROI from the data warehouse and digital transformation investments

that these enterprises have actually made. And so from a positioning standpoint, that is where we go. And,

between you and me, the analysts, well, I guess there's probably some analysts listening

as well, but they will resonate with the fact that there's nothing more boring than writing basic,

you know, SQL statements to grab very simple reports for business people.

Now, they work on advanced analysis, and in certain organizations the data analysts are also

doing machine learning projects. That's the stuff that adds ultimate

value. So we're kind of there to clean up all that: hey, you have data requests, we'll handle the

data request volume; the really complex analytics stuff, that's what the data teams do,

they love it, we're aligned from there and move forward. Very cool. Yeah, that makes a lot

of sense. And you know, you mentioned those analysts there. So I believe like your solution,

it's really designed for non technical business users, but also it sounds like analysts really

love it. So what features have kind of made it appeal to both of those demographics?

Well, the main, I mean, yeah, the main features: there's a lot of data analysts out there who

are incredibly proficient with database query language building. So in that case, we can

help from there. But the system is API-first, like the system is more of a developer platform

than anything else. Okay, like, it can be embedded anywhere, everything else. So

there's things like Microsoft Excel plugins, and also data analysts who have models built

within Microsoft Excel, and there's a lot of them, right? Their pain point is they just can't

get the source data into their Excel file. So the plugin allows them to query the warehouse,

which then brings the data directly into Excel. And all they have to do is highlight the cells

they want from there. Primarily, though, it is the non-technical user; they're basically just stuck.
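
As a rough picture of what an API-first, embed-anywhere integration like this could look like, here is a hypothetical sketch: the endpoint URL, payload fields, and response shape are invented for illustration and are not Chata.ai's documented API.

```python
# Hypothetical sketch of embedding a natural-language-to-SQL service via an API.
# The URL, payload fields, and response shape are invented for illustration;
# this is not a documented Chata.ai API.

import requests

API_URL = "https://example.invalid/api/v1/query"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"                          # placeholder credential


def ask_warehouse(question: str) -> list[dict]:
    """Send a natural-language question and return result rows from the warehouse."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"question": question},
        timeout=30,
    )
    response.raise_for_status()
    payload = response.json()
    # Imagined response: the generated SQL plus the rows it returned.
    print("Generated SQL:", payload.get("sql"))
    return payload.get("rows", [])


if __name__ == "__main__":
    for row in ask_warehouse("Total revenue by region last quarter"):
        print(row)
```

An Excel plugin or any other front end could sit on top of the same kind of call, which is the sense in which a system like this is a developer platform rather than a single app.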

But the analysts like it, because again, it takes that basic reporting backlog off of their

desks, right? And when you think about an organization who says, look, we have to do

this faster, we have to get information to our non-technical decision makers faster,

the only way to solve that problem, without the kind of solutions that we work on, is to hire more

people, right? And that's the only way to do it, right? So we're not there to say, hey,

now you don't need all these people; we're there to say, to eliminate this backlog,

you don't have to bring on a bunch more headcount for something that technology can handle.

So, yes, that makes a lot of sense. And I can see some serious benefit to businesses from that.

So I've noticed you have mentioned that you've done some partnerships, partnerships with

some big names, Microsoft, Snowflake, a handful of others. Talk to me a little bit about how

you initiated those conversations with those companies, what those partnerships look like

today, and maybe how those are helping you and your customers. Yeah, so, you know, partnering

with Snowflake is more of a new thing, but they're a really good, you know, they

have strong partner potential because, you know, they are a very well-respected brand name.

And to be honest with you, the majority of our projects so far are actually Snowflake data

warehouses, right? You're seeing a big, big increase in Databricks as well.

But a lot of them are really around Snowflake, where they can take their SAP data and

their Salesforce data and put it together in Snowflake. And now you have the warehouse

that has all that information that you can query from there. Microsoft is very much

a commercial partnership, which is, you know, you have these big, big

consumption amounts that organizations pay for. They want to look for value-add systems and tools

that they can establish within the enterprise to drive more ROI on their investments from

that standpoint. So we are part of Microsoft Transact. So a purchase of our system can work

via, you know, those consumption commitments that are becoming more and more

important on a day-to-day basis. And, you know, we use lots of systems; we're multi-cloud,

we can deploy in a bunch of different clouds, but we definitely see especially Azure a lot in the

enterprise. And so we've been able to leverage some of those tools and systems with model

deployments and everything else. So we have some strong technical ties there, but Microsoft has

been a really, really good partner, you know, opened some good doors for us, which is always,

as you know, in startup land, one of the most important things.

Totally, 100%. So talk to me a little bit about, like, who your ideal customer is for

this product. Is this something that's primarily used, like you've done partnerships with these big

companies, is this something primarily used by large corporations? Do small businesses use this?

Who's your, you know, who's your, your typical customer for this?

You know, enterprises for sure. Look, when we were first de-risking the model, or de-risking

the actual technology, we pre-trained some models for the QuickBooks data warehouse and the Xero

data warehouse, and actually Stripe as well. All right, because they published their object models,

everything else. And as part of that, to get the most possible users and the most possible queries

through the system, you deploy that, right? So we created an app, put an app around it,

and deployed it from there. The system did extremely well, but the distribution channels

for getting that out to the small businesses normally go through resellers at the end of the

day. And so, I wouldn't say the product-market fit, but I would say the market distribution

fit was not super strong. But that was never where we were intending to go. We always intended to go to

the enterprise, right? As a key validation step, and then also be able to deploy the entire

system out so that do-it-yourself people could actually use our technology to build their own

proprietary language models internally within the system. So we started that way, but we

absolutely target enterprises with those two factors, as mentioned before: the data analyst

backlog, and the need for on-demand data access. And right now we're in the nice spot

that we're getting so much inbound demand on this, which is, you know, well, you know, I think when

we were talking at the AI4 booth, you saw a bunch of other people even came by as well.

And that's because of the advent of ChatGPT, and now even some of the ways people are finding

us on searches is via that as well, because we don't do all of the things that these autoregressive

LLMs, these pre-trained ones, do. We have our very specific area, which is database query generation,

which those systems actually don't do very well, and it's highly risky for them to do that. Right? So we've

formed a nice little area in there. And so in the enterprise, as you know, ChatGPT and these

other big LLMs, they're being adopted, they're adding tons of value, all those other

pieces. So we're driving as part of that wave. And we are getting lots of inbound because of that.

That's very cool. The buzz that's being created, that's amazing. And I think it's really interesting,

too, like, you obviously are not brand new in the space, you know, you've been looking at the space

for a long time, working in the space for a long time. For you, what do you think is one of the

biggest differentiators between, you know, Chata versus maybe some of these new players that

are trying to come into the space today, where do you see, you know, the most value and differentiation?

Yeah. So a lot of the new entrants trying to use the LLMs,

they're actually not doing anything different than what the business intelligence tools have done

for the last year and a half, two years. All right. So there's Tableau Ask Data, Sisense

has a system, a few of the other ones. But when you look into the docs, they have the language

model, or they have the interface, based on a view, right, which typically is limited in the

number of columns; maybe Power BI is the same thing. So those new systems coming in, what they do with

the LLMs is they create a view and then prompt the LLM with that view. Okay, so you don't have to worry

about joins, complex subqueries aren't there, there's a whole bunch of things that just aren't

supported, which, like you said, we've been doing this for a long time. There are a lot of really

difficult, practical things when you're building database queries. But the biggest challenge

is when you're using generative AI. Again, we use generative AI on the training data side of things.

If you use it on the inference side of things, you need to be absolutely consistent every time,

because if you have a bad join in a database query, you can bring that entire system down.
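
For context on the approach Kelly describes other tools taking, here is a generic sketch of prompting a model with a single pre-built view. It is not any particular vendor's implementation; the view name, columns, and helper function are invented for illustration.

```python
# Generic illustration of the "create a view, then prompt the LLM with that view"
# approach described above -- not any specific vendor's implementation.

VIEW_NAME = "sales_summary"
VIEW_COLUMNS = ["region", "product", "month", "total_revenue"]  # one flat, aggregated view


def build_prompt(question: str) -> str:
    """Build a text-to-SQL prompt scoped to a single pre-aggregated view.

    Because the model only sees this one view, it never has to reason about
    joins or subqueries across the warehouse -- which is also why it cannot
    answer questions that need detail the view has already aggregated away.
    """
    schema = f"View {VIEW_NAME}({', '.join(VIEW_COLUMNS)})"
    return (
        "You translate questions into SQL over the schema below.\n"
        f"{schema}\n"
        f"Question: {question}\n"
        "SQL:"
    )


if __name__ == "__main__":
    print(build_prompt("What was total revenue by region last month?"))
```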

Right. So we all know because we've all worked with these things and work with the big LLMs on

a day to day basis, they don't need to be 100% perfect, nor are they, we just need to get 90,

95% of the way there, we can take that stuff, copy paste, you know, change it, however we see fit,

right. And then we go from there. But generating database queries is very different. You need

to have the system designed and trained for that particular structure. So even a simple question

like, you know, the change in monthly sales for product XYZ in region XYZ during campaign XYZ,

like that detailed information that we work with, that exact question at two companies of the exact same type

would be two fundamentally different database queries, because it depends on the schema.

And you can't prompt an LLM with an entire warehouse schema, right? Right. It's too much,

there's conflict, you just don't get consistency. So a lot of the people that have been trying this area now,

and to be totally frank with you, we're not the first choice in a lot of these cases. A lot of

these large organizations, they've tried to go with this. And then they come to us, and I

can't name any names. But actually, a few in the last two weeks have come to us like, we've tried

this, it just doesn't work. It's not consistent. We've tried all the guardrails and everything

else we can. It's not going to do the database stuff. It does our document search and everything

else we need, which is really, really what it's quite strong at. But we need a different solution.

Can you integrate in this type of entire environment together? And that's when we say

absolutely, it's the API; you just call our API when you want a query that

needs information from a data warehouse, and that query will make the joins, do the calculations,

anything else. Very cool, very cool. So I can see some massive value that this is obviously

adding. But I'm super curious, you've been at this for a while. But because there's this whole

new kind of wave of hype and interest and a lot of companies, as you say, are kind of reaching out to

you now, what's kind of new or what's next on the horizon for Chata.ai? Are there any upcoming

features or improvements that you guys are looking at that you can share with us?

Yeah, I mean, we're always trying to improve. So I'll give you a new technology that we're just

rolling out now. In the past, we were able to train a custom language model in,

depending upon the size of the warehouse, about 48 hours; that's what the training would

actually take. Okay, so that's what it takes from there. And then you iterate on the model,

because people want to add new columns or new tables or anything like this. That's the way

things are set up, for iteration, for sure. So that 48 hours now, with this new rollout,

we've actually dropped that to one hour. Wow. So we can have a language model trained in one hour.

The thing nobody talks about in the space is how expensive it is to have large language models

actually run, let alone train them. Yeah, right. So now all of a sudden a team can iterate on models,

and they've only got to wait one hour for it to actually train to be able to use it right away. Okay, so I can't

go too deep into the technology, but it's about, instead of using a sledgehammer to hit

every single nail in, you can have specialty hammers that can work together to still get the

job done, as long as they can work together. So, as you pointed out before, we've been working on this a

long time, we've gone through lots of different iterations, tried lots of different things.

The last system we had has been working incredibly well. This is a way that we make things much

quicker, much more accurate. And then the second piece would be around, there's a lot of information

that can be generated from these questions that people are actually asking, right? And so we have

a really, really strong research direction in understanding that, trying to help people reason

through certain analysis, and those types of pieces as well. So we have a really robust

R&D plan. For our core system, I mean, now is the time. We hit the market three years

ago, where we were early to market. People weren't quite ready for something like this, but now,

because of our friends at OpenAI and Microsoft, and now everybody else, they're pushing the

conversational side, because the chatbots kind of wrecked it for everybody, right? Everyone had

that conversational, "I don't want to touch it" attitude toward the technology. And so now it's,

oh, this stuff actually works. And so now things are pushing forward. But there's a very bright

future in that area. Sure. And that's the reason why we're raising a $10 million US

round, because we have this pipeline to execute on, which is absolutely amazing, very, very good

brands. But we have to keep that R&D engine pumping, because we've got a lot of really strong

ideas, and we were ahead in a lot of areas, right? So you know, when GPT-4 came out, everybody was

like, oh, that's not one model. That's actually a bunch of different models that are kind of working

together. That's exactly the architecture. Yeah. Right? Now, don't get me wrong, I'm not saying

that ours is that massive and can do all the things there. But for our little

corner of the earth, natural language to database query language, that's the philosophy

that we took. And our team is very brilliant. They're very strong. And they come up with a lot

of really interesting ideas. And as a startup, we can execute quite quickly on those ideas.

That's very cool. And one thing I do want to highlight that you said there, because I think

it's really interesting, you know, I know you can't go too much into the tech, but you mentioned

not using a sledgehammer to do something, but kind of smaller, more accurate hammers. I think you

guys are absolutely on the right track there. A bunch of research that we've seen out of Google

DeepMind, for example, they just trained a new chess model where essentially it beat AlphaZero,

because what they did is they spun up 16 individual, like, chess model players, and they trained

them all with slightly different styles. And then instead of having one kind of major sledgehammer

that would play the game of chess, they have the 16 all going against each other, or like collaborating

and deciding which of the 16 had the best move. The same thing with OpenAI.

That's exactly it. It's the ability that these different models can pass information back and

forth. You're exactly correct. That is really cool. And in addition, when details of OpenAI's GPT-4

architecture were leaked, it was reported that they have 16 experts within GPT-4, and when you submit your

queries, it collaborates across all of them. So I think 100% you guys are going in the right direction.
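
To make the "specialty hammers" idea a bit more concrete, here is a tiny, generic mixture-of-experts sketch. It is not Chata.ai's architecture and not GPT-4's; the experts and the gating rule are toy examples invented purely to show how several small, specialized models can be combined.

```python
# Generic mixture-of-experts sketch -- not Chata.ai's or GPT-4's actual design.
# Several small "expert" functions each handle part of the problem, and a
# simple gate decides how much weight each expert's output gets per input.

import numpy as np


def expert_linear(x: np.ndarray) -> np.ndarray:
    return 2.0 * x   # toy expert, weighted more heavily near zero by the gate


def expert_quadratic(x: np.ndarray) -> np.ndarray:
    return x ** 2    # toy expert, weighted more heavily for large inputs


EXPERTS = [expert_linear, expert_quadratic]


def gate(x: np.ndarray) -> np.ndarray:
    """Return softmax weights over experts for each input (a toy gating rule)."""
    scores = np.stack([-np.abs(x), np.abs(x)], axis=-1)  # favor expert 0 near zero
    exp = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return exp / exp.sum(axis=-1, keepdims=True)


def mixture(x: np.ndarray) -> np.ndarray:
    """Weighted combination of expert outputs, one weight vector per input."""
    weights = gate(x)                                     # shape (n, num_experts)
    outputs = np.stack([e(x) for e in EXPERTS], axis=-1)  # shape (n, num_experts)
    return (weights * outputs).sum(axis=-1)


if __name__ == "__main__":
    x = np.array([0.1, 1.0, 3.0])
    print(mixture(x))  # each output blends the experts according to the gate
```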

I'm super excited about your $10 million round. I think the companies that I see

right now raising successfully and really killing it in this industry are companies

like yours that have been in this for a long time. It's maybe not the newest thing,

but it's something that has a really solid tech background. And now you're taking this to the

next level, you're in the perfect position to have a really powerful launch pad, pushing forward

your technology. And, you know, the tech's already there. Now it's time to take it to the

next level. And so yeah, I'm really excited for you guys. I think you'll be incredibly successful

in that. Listen, Kelly, thank you so much for coming on the podcast today for sharing your

insights. Really excited about what you guys are building over at Chata AI. If people want to try

out Chata or if they want to get in contact with you, what is the best way for them to do that?

Yeah, I mean, they can go to our website or they can just reach out directly to me as well.

I can leave my email contact information with you when you post that. And yeah, I'm

happy to talk to potential partners, investors, whomever, or just people who are generally interested

in the area; it's always great conversations for sure. Wonderful. I'll leave that in the description

for the show notes. So to the listeners, thank you so much for tuning into the AI chat podcast.

Make sure to rate us wherever you listen to your podcasts and have a wonderful rest of your day.

If you are looking for an innovative and creative community of people using ChatGPT,

you need to join our ChatGPT creators community. I'll drop a link in the description to this

podcast. We'd love to see you there, where we share tips and tricks of what is working in ChatGPT.

It's a lot easier than a podcast as you can see screenshots, you can share and comment on things

that are currently working. So if this sounds interesting to you, check out the link in the

comment. We'd love to have you in the community. Thanks for joining me on the OpenAI podcast.

It would mean the world to me if you would rate this podcast wherever you listen to your podcasts,

and I'll see you tomorrow.

Machine-generated transcript that may contain inaccuracies.

Explore the future of AI's transformative influence on business analytics in this engaging conversation with Kelly Cherniwchan, CEO of Chata.ai. Discover how AI-driven insights are reshaping the way companies analyze data, make decisions, and drive success. Join us as we delve into the cutting-edge developments at the intersection of AI and business analytics with an industry leader.


Get on the AI Box Waitlist: https://AIBox.ai/
Join our ChatGPT Community: ⁠https://www.facebook.com/groups/739308654562189/⁠
Follow me on Twitter: ⁠https://twitter.com/jaeden_ai⁠