AI Hustle: News on Open AI, ChatGPT, Midjourney, NVIDIA, Anthropic, Open Source LLMs: Scaling AI: Exploring the Future with UbiOps CTO Victor Pereboom

Jaeden Schafer & Jamie McCauley 10/10/23 - Episode Page - 29m - PDF Transcript

Welcome to the OpenAI podcast, the podcast that opens up the world of AI in a quick and

concise manner.

Tune in daily to hear the latest news and breakthroughs in the rapidly evolving world

of artificial intelligence.

If you've been following the podcast for a while, you'll know that over the last six

months I've been working on a stealth AI startup.

Of the hundreds of projects I've covered, this is the one that I believe has the greatest

potential, so today I'm excited to announce AIBOX.

AIBOX is a no-code AI app building platform paired with the App Store for AI that lets

you monetize your AI tools.

The platform lets you build apps by linking together AI models like ChatGPT, Midjourney, and ElevenLabs.

Eventually, we'll integrate with software like Gmail, Trello, and Salesforce so you

can use AI to automate every function in your organization.

To get notified when we launch and be one of the first to build on the platform, you

can join the wait list at AIBOX.AI, the link is in the show notes.

We are currently raising a seed round of funding.

If you're an investor that is focused on disruptive tech, I'd love to tell you more

about the platform.

You can reach out to me at jaeden@aibox.ai; I'll leave that email in the show notes.

Welcome to the AI Chat podcast. I'm your host, Jaeden Schafer.

Today on the podcast, we have the pleasure of being joined by Victor Pereboom, co-founder and CTO at UbiOps, which enables data scientists and engineers to quickly turn algorithms into scalable, robust, and secure end-to-end applications without requiring knowledge of cloud infrastructure, microservices, automated scaling, or DevOps practices.

Super excited to have you on.

Welcome to the show today, Victor.

Thanks.

Great to be here.

Super excited.

A question I wanted to kick this off with: I'm wondering if you can give us a little bit about your background. Did you always know you were interested in AI and this space, or was this something you found throughout your career journey? Walk us a little through the journey that brought you here.

Sure, sure.

My interest in AI and data science started more than 10 years ago. I was still at university studying aerospace engineering at the time and did a lot in autonomous control, which is the more computer-science, data-focused side of aerospace, I would say.

Okay.

And it was around the time that big data became a topic and more people started exploring

what's possible with machine learning and those techniques.

And I don't know, I've always been interested in software and computers and all that. So just out of curiosity, I started diving into the topic, and it struck me: this is really powerful technology.

So I also started taking courses and reading up on what was happening in data science, AI, and big data back then, and how it could be applied to what I was working on at university.

That sustained my interest in the subject, and later, once I was more well-versed in the topic, I had the opportunity with a friend from university to start a company in the space. We actually started working on a range of topics, solving machine learning problems for customers.

Especially around the topic of predictive maintenance, which is really cool, because based on data like IoT sensor data you can predict whether a device or component is going to break down at some point, whether there will be any failures. We did a lot of that work in railway infrastructure, trying to predict failures there. So we were solving a big problem back then.

That was really interesting to work on, and from there we built a company around it that was more like professional services and consultancy in the space, while also trying to productize the things we were developing: the AI solutions we were building, like the different predictive maintenance models.

And we just ran into the same issue over and over again: hey, we need some kind of standardized infrastructure to take the models, which are basically just pieces of Python code with statistics in them, and turn them into working, live applications that we could sell to the businesses that need them, because that's what they were eventually going to use in the field. We couldn't just ship the model and say, okay, good luck, our work is done; we needed something more.

So for predictive maintenance, let's say, the end users of the application were the actual engineers who knew everything about the railway infrastructure but not much about data science and AI. They needed something they could pull up on an iPad when they were out there and just see what was going on.

To get that whole system up and running, we needed to host our AI models, let the data flow through them, and put the entire IT side and cloud infrastructure in place. And for every application we were building and every customer we were engaging with, we had to do the same thing over and over again. So that's how we started building the AI and machine learning serving infrastructure that we have now turned into a product and a company.

That's incredible.

That's super fascinating.

That was a very long intro, by the way.

Yeah, well, I love it.

You gave the whole story.

So diving into that a little bit, and what that looked like at the beginning: what were some of your first steps when you decided you wanted to make UbiOps? Had you been working on some of this technology in the background so it was a natural launch, where you'd seen some of these issues and thought, okay, we need to make something completely brand new? And how did that partnership with your co-founder go? What did that look like?

Yeah, so by the time we turned UbiOps into a product, we were a company of about 15 people, most of them focused on data science and machine learning.

We realized that this was a problem. The deployment and productionizing of AI and machine learning, turning the models we had into working applications, was not only a challenge we were running into; we saw it everywhere. Other data analytics teams and other companies we spoke to all said: yeah, we run into this problem of having good data science and data analytics capabilities and the people in house to do that, but at some point turning it into a reliable product and service is really hard. Data science teams are not very at home in cloud computing and IT software in general, and to build a stable solution you need software engineers. So it started more as an internal product, an internal platform we were using to deploy our own solutions.

And then at some point we decided, hey, the market timing is right to also help other teams achieve the same goals. So we started talking to a lot of those teams at the companies we were working with, asking: if you had a platform like this, would it be useful? That's how we started defining the scope of the UbiOps platform. From there we did internal development, started with a beta, and started getting it out there into the market. Very exciting times.

One thing I'd love to ask specifically about the platform you're building: how does UbiOps simplify the process of turning algorithms into scalable applications for data scientists?

Right, great question. It's a platform that sits between the data science code and the cloud and IT infrastructure. The main goal is making it as easy as possible for a data scientist to take their AI or machine learning model code and push it to our platform. We do everything from building a software container around it and scheduling it, to all the API management, traffic queuing, and so on. As soon as you push your code to our platform, you get an API endpoint in return, which turns it into a scalable cloud-based service. You can just use the API endpoint to send data to your AI or machine learning model and get the response in return.

Then you have a connector to plug it into any application: it could be an app, some internal data system, a queue, or whatever backend you're using. So suddenly your AI model becomes a scalable service that lives in the cloud, or at least in UbiOps, which can run in the cloud or on premise too. Making that step easy means that, as a data science team, you can build and ship AI products and solutions much faster than you could before.

Because before, especially at larger organizations and companies, there was always this really awkward handover between data teams and IT teams: hey, we developed these models, we need to run them, can you help us run this in the cloud? And then suddenly things start to become really messy. With our platform, the IT team gets guarantees of stability: it can run on Kubernetes or plug into your existing cloud tenant and APIs.

And the interface for data scientists is very easy to use. So for them it becomes very easy to create a model catalog, have all your MLOps capabilities, have all the features you need like model versioning, and build really great workflows and pipelines out of the different models you have. Suddenly building and shipping AI applications, and not just models, becomes very easy. And that's our main goal.
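The push-your-code, get-an-endpoint workflow Victor describes can be sketched from the client side like this. The URL scheme, header names, and payload shape below are hypothetical placeholders invented for illustration, not UbiOps's actual API.

```python
import json
import urllib.request

def build_inference_request(base_url, deployment, token, payload):
    """Build an HTTP request for a model served behind an API endpoint.

    The URL layout and auth header here are illustrative assumptions,
    not the real UbiOps REST interface.
    """
    url = f"{base_url}/deployments/{deployment}/requests"
    body = json.dumps({"data": payload}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Token {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Once a model is deployed, the client side is just: send data, get a response.
req = build_inference_request(
    "https://api.example.com/v1", "image-classifier", "MY_TOKEN",
    {"image_url": "https://example.com/cat.jpg"},
)
print(req.full_url)
# Actually sending it would be: urllib.request.urlopen(req) -> JSON prediction
```

The point of the sketch is the separation of concerns: the application only ever sees an HTTP endpoint, while containerization, scheduling, and scaling happen behind it.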

Very cool.

Yeah.

And I mean, it sounds like you're doing a fairly good job of that, and I see a lot of people are quite excited about some of the things you're working on. One thing I'd be curious about from you personally: I've noticed that you've been on the dev network advisory board for AI and machine learning. Off the top of your head, how do you see the role of advisory boards in shaping the future of AI? Obviously this is a really big new field, and a lot of people are wondering how it's going to shake out, so I'd love to get your take on that.

Yeah.

I think it's great, especially for conferences and events, to have input from people from the industry and the field, to know what the latest topics are to include in conference tracks and what emerging topics to maybe include next year. I think it's great that they're getting that input from people working in the field: startup founders, company founders, but also people working at larger companies like Meta or Google. And it's really nice to have all these different opinions to shape what the relevant items are to include in a conference or event.

Very cool.

Yeah.

No, I think you're spot on with that, and getting all those different opinions, especially from the field, helps make for robust knowledge sharing and solutions. I did notice UbiOps offers zero-DevOps solutions. I'm wondering if you can explain how this benefits companies, specifically maybe startups.

Yeah.

So we noticed that for startups and scale-ups we are a powerful platform, because they usually reach a point where, let's say, I'm a startup or scale-up building a product based on AI, right? Say my offering is based on image recognition and I'm building a mobile app for that. You can build a beta fairly quickly; you can put quite a lot of stuff together and get a beta of your product out. But it's a very different game to build a product that 10,000, or maybe a hundred thousand, or a million people are going to use, because then you suddenly need scalable infrastructure, not just for your app but for all the AI and all the logic. And that's what we can give you out of the box.

So as soon as you start using UbiOps for deploying your models and integrating them into your application or the product you're building, you're not just ready for your beta; you're also ready for scaling, for the future, basically. You don't have to refactor your whole infrastructure and backend, which saves a lot of DevOps and engineering resources, and it also speeds up your time to market, so you can ship products much faster. It saves you all the DevOps overhead, and that's a big deal in building scalable, production-grade systems right now.

Very cool.

That's awesome.

Yeah.

I'm sure that's a massive help to a lot of different people in this space looking for this kind of technology. Speaking of which, something I'd love to get your thoughts on: are there any specific companies, use cases, or people that have had success with your product that you could share?

Yeah, sure. We're serving quite a broad range of different companies and use cases. Some are more like enterprises; Bayer Crop Science, for example, is building computer vision applications for agriculture, which is really cool. We also have a lot of very cool startups and scale-ups among our users. Some are working on deepfake detection, which is amazing. And in healthcare especially, image recognition and image-based applications are extremely powerful, along with personalized medicine applications. So there are tons of great products being built on UbiOps right now.

Very cool.

That's awesome.

Yeah.

Exciting stuff, and a lot of industries where I think you're going to see some big gains from this. I'm wondering, what are your thoughts on the future of AI infrastructure, specifically in the context of multi-cloud and on-premise solutions?

Yeah, that's a big topic right now, and it's really interesting, because there are a couple of drivers changing the way we consume and look at infrastructure. First it was: you choose a cloud, right? A few years ago you chose a cloud, you stuck with it, and that's where you deployed everything. But now, especially with GenAI and LLMs, you suddenly can't get GPUs everywhere all the time; they just sell out in certain regions. And compliance becomes a topic, especially for enterprises: okay, I want to keep my data and my compute inside my own organization. So suddenly you have all these different demands for where stuff should run and how you deal with compute, and that really drives the adoption of hybrid cloud and multi-cloud infrastructures.

There are companies that say: I want to use my own infrastructure, because maybe I bought some GPUs and have them in my office or in a data center, and I want to use those for training, but I want to do inference in the cloud because it scales better. But you want to control it all from one control plane and one interface. So we recently released capabilities to deploy your models and workloads across hybrid and multi-cloud architectures, which is really cool, because then you can suddenly lift and shift models from your on-premise data center, burst training workloads to the cloud, or pick whichever zone has GPU availability. You can connect different compute environments to the same control plane and use that to optimize for cost, for resource availability, and for compliance. You suddenly have a lot more flexibility in what your compute landscape should look like. And that's a whole different thing from a few years ago, when you just stuck with one cloud tenant and built everything in there, right? Because that is quite limiting, especially if these are challenges you're facing.
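The "one control plane over many compute environments" idea can be sketched as a simple placement policy. The environment list, field names, and numbers below are invented for illustration and are not how UbiOps actually schedules workloads.

```python
from dataclasses import dataclass

@dataclass
class ComputeEnv:
    name: str
    gpus_free: int            # GPUs currently available in this environment
    cost_per_gpu_hour: float  # marginal cost of running here
    compliant: bool           # e.g. allowed to hold this workload's data

def place_workload(envs, gpus_needed):
    """Pick the cheapest compliant environment with enough free GPUs."""
    candidates = [
        e for e in envs
        if e.compliant and e.gpus_free >= gpus_needed
    ]
    if not candidates:
        raise RuntimeError("no environment can host this workload")
    return min(candidates, key=lambda e: e.cost_per_gpu_hour)

envs = [
    ComputeEnv("on-prem-dc", gpus_free=2, cost_per_gpu_hour=0.0, compliant=True),
    ComputeEnv("cloud-eu-west", gpus_free=16, cost_per_gpu_hour=2.5, compliant=True),
    ComputeEnv("cloud-us-east", gpus_free=32, cost_per_gpu_hour=1.8, compliant=False),
]

# Small job: the already-paid-for on-prem GPUs win on cost.
print(place_workload(envs, gpus_needed=1).name)
# Large job: on-prem is full, so burst to the compliant cloud region.
print(place_workload(envs, gpus_needed=8).name)
```

The same trade-offs Victor lists (cost, GPU availability, compliance) show up here as filter and sort criteria; a real control plane would refresh these numbers continuously rather than hard-code them.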

Right, right.

Yeah.

That makes a lot of sense.

Something I'd love to hear your opinion on: where do you see the future of AI in healthcare specifically? I know you have some experience in this space. I've seen some incredible innovations, right? AI being used for drug discovery and all sorts of things. Where do you think this all goes?

Yeah.

I think healthcare is an area where AI has tremendous potential.

And you already see a lot of very innovative applications being built in the healthcare space. I think the most tangible right now has been in computer vision and image recognition: automatic detection in pathology and medical imaging of things like cancer cells, and all those different applications. Those actually perform really well, and I think it's a really maturing field.

The tricky thing in healthcare is the data, the sensitivity of the data, and that's been an issue for a long time now. Data usually sits in different silos, and it's hard to break them down. For imaging it's quite straightforward, because there's a data stream from a machine and you pass it through an image recognition model, which can also run on premise. But if you want to combine large bodies of data for more advanced machine learning tasks, say patient files and things, it suddenly becomes very tricky. Still, there's a lot of potential there. So I think the challenging part of doing AI in healthcare is on the data side, but there's already a lot happening right now.

I think also, as you say, drug discovery is a big area for AI, and there's a lot of data in those spaces. And for the future, I think there's a bridge from the hybrid multi-cloud part that is also powerful here: it can help in the healthcare space, because it suddenly becomes easier to bring your models to where the data is instead of pulling the data out of its silos and migrating it to the cloud. That quickly becomes sensitive, and people start pushing back on that movement for security and compliance reasons. So it's a very exciting space for AI, and I think we're going to see a lot of cool things.

Yeah, I mean, personally, it's one of the areas I'm most excited about, just seeing the ways they speed up drug discovery, or all sorts of interesting things with proteins and enzymes. It's crazy stuff.

And personalized medicine will probably be a big thing in the next 10 to 20 years, as soon as you can fine-tune things, get more information about individuals, have all that data, and figure out: what's the best treatment for this patient in this condition? That's, I think, really exciting stuff.

Totally. I think in maybe five, 10, 15, 20 years we're going to look back on the time when everyone took the same pill or had the same treatment as like the Stone Ages, right? Everything's going to be personalized: you're going to run biometric scans on your body and your DNA and all sorts of things, and get treatments custom to that. It'll be super exciting.

I would love to get from you a piece of advice for aspiring AI and machine learning experts going into the field. What's a piece of advice you feel like you could give them?

Oh, wow. I think it's changing so fast, right? The whole field, compared to five years ago. I already hear people talking about machine learning 1.0 and the new era of foundation models and generative AI. I do think that a lot, let's say 99%, of the business cases and use cases still require traditional machine learning and statistics. At the same time, now that we have more powerful techniques and more powerful hardware, I think there's a lot of potential there.

So I think it's definitely worthwhile to still understand the basics of how machine learning works before you jump into the space of LLMs, GenAI, transformer models, and foundation models. It's really exciting stuff, but it's still, in a way, just very advanced statistics: you increase the number of parameters in a model and the complexity of the relationships between them, but it's based on the same foundational statistical principles as the older machine learning stuff.

So it depends a bit on what you want to do. Right now everyone is looking at GenAI and LLMs and asking: okay, let's see what we can do with this and how it can be applied. But we're near the top of the hype curve right now, so that will also quiet down a bit. And then I think it will help boost the potential for more traditional machine learning approaches, where there's still a lot of ground to cover and a lot of cool business cases to solve and turn into applications and products, to make this world a little better.

So as an aspiring data scientist, it's good to have an idea of the foundations and principles of how all this stuff works. But eventually, when you work at a company, it's about solving a business case in the right way. Whatever technique works is usually the best; I've seen in the past that sometimes just a linear regression can get you quite far. But right now I think there are tons of exciting opportunities with transformers and LLMs, and we haven't really figured out all the applications yet. So there's a lot of exciting stuff to do there.

Very cool.

Yeah.

There's so much to come; so much has already happened, and so much more to come. Really exciting. Victor, thank you so much for coming on the AI Chat podcast and sharing your insights, perspectives, and background. It's been a phenomenal time. If people want to find out a little more about what you're building at UbiOps, or maybe look at using some of your tools and implementing them into their tech stack, what's the best way for them to find you?

Yeah.

So you can go to ubiops.com and actually create a free account, or feel free to reach out or connect with me on LinkedIn if you have any questions or want to chat; I'm definitely open to that. So just look at our website, connect with someone from our team, and we'd be excited to get in touch.

Very cool.

And to the listeners, I'll leave a link to UbiOps in the show notes so you can go check it out there. Once again, thank you so much for coming on, Victor. And to the listener: thank you so much for tuning in to the AI Chat podcast.

Make sure to rate us wherever you get your podcasts.

If you are looking for an innovative and creative community of people using ChatGPT, you need to join our ChatGPT creators community. I'll drop a link in the description of this podcast. We'd love to see you there, where we share tips and tricks about what is working in ChatGPT. It's a lot easier than a podcast, as you can see screenshots and share and comment on things that are currently working. So if this sounds interesting to you, check out the link in the description.

We'd love to have you in the community.

Thanks for joining me on the OpenAI podcast. It would mean the world to me if you would rate this podcast wherever you listen to your podcasts, and I'll see you tomorrow.

Machine-generated transcript that may contain inaccuracies.

Join us for an illuminating conversation with Victor Pereboom, CTO of UbiOps, as we explore the exciting frontier of scaling AI. Discover the cutting-edge technologies and strategies that are shaping the future of AI deployment and integration. Don't miss this episode for invaluable insights into the evolving landscape of AI scalability!


Get on the AI Box Waitlist: https://AIBox.ai/
Join our ChatGPT Community: https://www.facebook.com/groups/739308654562189/
Follow me on Twitter: https://twitter.com/jaeden_ai