
Creating your own AI with Custom GPTs

Episode Transcript

Hi, I'm Ray Poynter.

And I'm Will Poynter.

And together we're the founders of ResearchWise AI.

And today we're going to be having a look at custom GPTs.

So, Will, perhaps to get us started, what is a custom GPT?

Thank you, Ray.

Like a lot of things in the market research world, sorry, in the AI world right now, there are a lot of new terms appearing, and they're being used in a multitude of different ways.

But the best way to boil down what a custom GPT is: it's a standalone assistant that either you can customize, or someone else has customized, using very little code or no code at all.

And when I say customize, I mean you can give it your own instructions.

You can tell it to speak in a certain way, achieve a certain task, provide it with a custom knowledge base or access to some other tooling.

And custom GPTs, are they just an OpenAI thing, part of ChatGPT?

Or are there other things around that are custom GPTs?

So, there are others.

OpenAI has named its custom GPT product Custom GPTs, which is very smart.

And as I say, there is a big fight over the names of things right now in the AI war or AI race, I prefer to call it, but there are other products.

So, Eads and AI offers a similar product called, I think, Yoda AI, though I think they're trying to move away from that name, by the feel of their website.

Hugging Face offers Assistants.

And then there's even CustomGPT.ai.

So, that's a whole company that's named itself around custom GPTs.

And there are more.

I'm sure lots of people listening can mention a few more.

So, if we focus for the moment on the Open AI product, how does that look?

What are the options in terms of custom GPTs?

So, anyone with a ChatGPT account can go look at custom GPTs.

I think that includes free accounts, but certainly any of the paid accounts have access to custom GPTs.

And you can either create your own custom GPT, or you can use someone else's from the library of GPTs that have been made public.

As I say, if you create your own, you'll be able to provide it with your own instructions, provide it with its own knowledge base, which is a RAG system, for those who have followed our episode on RAG.

And it can integrate with other external tools.

So, it could, for instance, look up the current weather.

Or if you provided it with a tool that generates Excel workbooks, it could generate an Excel workbook for you.


And then there's the library of custom GPTs that have been made public by other people.

One I've used was an expert in styling React components, which is not one of my skills.

And it had a lot more up-to-date documentation around that styling.

So, I was able to ask it more challenging questions than I can the standard ChatGPT, whose knowledge cutoff was a while ago.

Even when I get it to search the internet, it might search one or two pages, but not the right pages.

And so forth.

So, this was more focused towards that.

And I know that you've experimented with a couple of custom GPTs from the library with a more fun spin to them.

Yeah.

So, I've used one for creating cartoons.

I've used a Canva version for visualization, and ones for writing quotes.

So, I mean, there's a lot of tools out there pre-built.

And quite often, they link to commercial software.

So, they'll do a little bit for you.

And if that's not enough, then you can go through to the host platform.

Canva is like that, for example.

Yeah.

So, in that case, I think we're seeing companies like Canva quite savvily make sure they've got a presence in custom GPTs.

So, when people discuss it, they're making the Canva experience easier from all angles and then pulling you in towards Canva.

So, you mentioned you can create your own.

So, can you talk us through how people do that and what some of the use cases are?

Why would they do that?

Yeah.

So, let me start with the why.

And then we can use an example to cover the how.

I've been compiling over the last 24 hours different use cases in my head.

And it's an almost endless list when you keep thinking about it.

So, a very, very standard one for a company might be an internal HR assistant or compliance and governance assistant.

We all know companies that are full of policies and you want to remember how you submit something maybe to accounts payable or you request annual leave or you request a special type of leave or you do something else.

You can feed these custom GPTs with your own knowledge base.

So, these are policies that are specific to your company and then you can make that available to all of your employees.

That's a fairly routine RAG-like system.

You take all the docs that already exist, upload them in a few clicks, and then it will use those docs as its knowledge base.

And you would provide it with some instructions that say: only use those docs. If, for instance, your company has not provided a whistleblower policy, don't go and read one off the internet and pretend that's the answer.
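To make that grounding idea concrete, here is a minimal Python sketch of the kind of instruction and prompt assembly involved. The policy snippets, wording, and `build_prompt` helper are all invented for illustration; in an actual custom GPT you would simply paste the instruction text into the Instructions field and upload the policy documents.

```python
# Illustrative policy excerpts standing in for a company's uploaded docs.
POLICY_SNIPPETS = [
    "Annual leave: submit requests via the HR portal 10 working days ahead.",
    "Accounts payable: invoices go to the AP inbox with a PO number attached.",
]

# The "only use those docs" instruction described above, spelled out.
INSTRUCTIONS = (
    "Answer using ONLY the policy excerpts below. If the excerpts do not "
    "cover the question (for example, if there is no whistleblower policy), "
    "say so plainly rather than inventing an answer or fetching one from "
    "the internet."
)

def build_prompt(question, snippets=POLICY_SNIPPETS):
    """Assemble the grounded prompt an HR assistant would receive."""
    context = "\n".join(f"- {s}" for s in snippets)
    return f"{INSTRUCTIONS}\n\nPolicy excerpts:\n{context}\n\nQuestion: {question}"

print(build_prompt("How do I request annual leave?"))
```

The point is simply that the model's answers are constrained to the supplied knowledge, with an explicit fallback behaviour when the knowledge doesn't cover the question.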

But to put a more market research spin on it, for instance, we could use a custom GPT to test a discussion guide.

So, we could build personas into custom GPTs and then have conversations with them to test whether our interview guide is suitable, whether it's going to get us what we need. You know, this would be an early, free test before involving real people.

Equally, we could fill it full of the findings, the insights from a study, and have it be an interactive, conversational way of sharing those findings.

So, some people learn really well from charts, some people love reading a written report, some people want to have it in their ear, more sort of spoken to them.

And some people will learn best by a conversation.

And it just is another method in which we could deliver that information.

Yeah, so we have a product called Virtual Ray, and that is a RAG-based system.

But the very first version we did, we built using a custom GPT.

Maybe talk about why we moved away from using a custom GPT, because I think that gives us some insight into some of the challenges with them.

As well as the benefits.

Yes.

So, well, just before I say why we moved away, that's another good use case, prototyping.

With OpenAI, it is pretty much a no-code solution. Some of the alternatives are low-code.

And that means that if you've got an idea for something that might work, you can prototype it in minutes to hours, which is nearly impossible with anything coded.

Yes, an engineer can set something up on their laptop and it can generally run within an hour.

But the moment you need to put it in the cloud, the moment you need to send it to someone else to test, you're adding time in.

So it's a fantastic prototyping tool where appropriate.

And I would always recommend people, if they can, start with something like that.

Why to move away?

By being a no-code solution, it is limited.

It can't do all the things that the more complex frameworks can do.

For instance, the current versions don't use reasoning models, so they don't have chain of thought.

And they don't take a multi-agent approach.

So some of these assistants we speak to have actually got multiple assistants inside them.

So, let's say, a qual analysis expert and a quant analysis expert and a fact checker and a tone of voice checker, for instance.

All of these things can exist as separate AIs working together.

In a custom GPT, you can't have that complexity.

It is a single set of instructions which will always dilute its power.
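To make the multi-agent idea just described a bit more concrete, here is a toy Python sketch. Each "agent" is only a placeholder function; in a real framework, each one would be a separate LLM call with its own instructions, which is exactly the complexity a single custom GPT can't express.

```python
def qual_analyst(transcripts):
    """Placeholder for an agent with qual-analysis instructions."""
    return f"themes extracted from {len(transcripts)} transcripts"

def quant_analyst(tables):
    """Placeholder for an agent with quant-analysis instructions."""
    return f"differences found across {len(tables)} tables"

def fact_checker(draft):
    """Placeholder for an agent that verifies claims in the draft."""
    return draft + " [fact-checked]"

def tone_checker(draft):
    """Placeholder for an agent that enforces a tone of voice."""
    return draft + " [tone: house style]"

def run_pipeline(transcripts, tables):
    """Orchestrate the agents: analyse in parallel roles, then check the draft."""
    draft = qual_analyst(transcripts) + "; " + quant_analyst(tables)
    return tone_checker(fact_checker(draft))

print(run_pipeline(["interview 1", "interview 2"], ["crosstab"]))
```

The orchestration step, deciding which agent runs when and how their outputs combine, is the part that needs code or a multi-agent framework rather than a single set of instructions.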

And you're also limited in, for instance, how it builds its tools and knowledge base.

Because you're doing that without code, as I say, it's fantastic to get up and running.

But we really improved the answer rate of Virtual Ray, even in its first couple of weeks, by changing the way in which the knowledge base was chunked up into separate pieces of knowledge and how then it was indexed and findable.

So custom GPTs allowed me to build the knowledge base in under an hour, but it was only so-so.

Then a couple of days of coding allowed me to build a knowledge base that had a greater answer rate.
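To show what "chunking" means in practice, here is a minimal Python sketch of one of the simplest strategies, fixed-size chunks with overlap. The sizes are illustrative, and this is not Virtual Ray's actual code; real systems often chunk along semantic boundaries such as paragraphs or sections, which is part of what a couple of days of coding buys you.

```python
def chunk_text(text, chunk_size=200, overlap=50):
    """Split text into overlapping character chunks for a RAG index.

    The overlap means a fact straddling a chunk boundary still appears
    whole in at least one chunk, which helps retrieval find it.
    """
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    chunks = []
    step = chunk_size - overlap
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += step
    return chunks

doc = "Annual leave requests must go via the HR portal. " * 20
pieces = chunk_text(doc, chunk_size=120, overlap=30)
print(f"{len(pieces)} chunks, first chunk {len(pieces[0])} chars")
```

Each chunk would then be indexed (typically embedded as a vector) so the assistant can retrieve the most relevant pieces per question; changing chunk size, overlap, and boundaries is exactly the kind of tuning a no-code tool doesn't expose.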

So it is a great prototyping tool, and it may be a great tool for certain situations.

As I said, for an HR bot, for instance, or a customer service bot.

It may work perfectly.

But, you know, as a standing piece of advice, we start with the cheap, fast thing and then move from there into the more complex, expensive things.

And I think at the moment, if you were to create some deliverables for your clients and create a custom GPT so they could access them (by the way, that doesn't really raise too many privacy concerns, but we can talk about that later), they would need a ChatGPT account to be able to access it.

And I think that's probably one of the other limitations at the moment.

You're quite right.

To be able to experiment with AI, with almost any product, you need to be able to create an account.

This is more so than anything else that I can think of out there, because AI costs so much for the companies that host it.

So I've just double-checked: I can edit a video, including stripping off the audio or cutting it, online in my browser for free now.

Editing video was a very, very demanding task that people spent thousands of pounds on very expensive computers to do, and to some extent still do.

But the idea of editing in your browser for free was just unheard of 10 years ago.

Now the area where that sort of free access seems unusual is AI, because AI costs so much more than editing video.

It costs so much more than doing anything else.

So you will have to create accounts to experiment with things.

And therefore, these tools that are suitable for prototyping, like OpenAI's custom GPTs, Hugging Face Assistants, and so forth, will always make you create an account.

But once you create your own assistant via code, you can make the choice as to whether you want them to have an account or not.

Now, there is a huge risk.

Virtual Ray right now does not require an account to speak to, but that is because we are keeping a very watchful eye on the number of people speaking to it.

And thankfully, you don't have 10 million followers.

But if you did, we probably would have to put an account requirement around it immediately, because having thousands and thousands of conversations with it would incur quite a large bill for us.

So you're absolutely right.

To use a custom GPT, you need a ChatGPT account.

This is a limitation, but it's also a limitation we're not going to get out of anytime soon with a lot of these tools.

Someone's got to foot the bill, and the providers want to know who's going to foot it.

Yeah, that's a really good point.

So I think that's given us a nice introduction to custom GPTs.

Is there anything you can share with us about the other products, if people are wanting to try a custom GPT, but not the OpenAI ChatGPT route?

Yeah, so I liked Hugging Face Assistants.

Hugging Face is actually the place where developers hang out, for lack of a better term, in the AI space.

For those who have heard of GitHub for software, it's like that.

It is your repository for information.

It's your community.

But Assistants is very approachable as a no-code solution, and it seems you can start for free, whereas some of the others are built for companies that are definitely going to be churning out a lot of custom GPTs with their specialized tools, and will cost you $250 a month or more.

I would certainly not recommend experimenting with them.

I'm sure they're very good, but not for the sake of experimentation.

Excellent.

And if people do want to know more about custom GPTs, we've got a course coming up on April the 9th, I think.

So if you'd like to find out about that, go to researchwiseai.com.

Go to the Academy, and you can check out the upcoming course.

Or if you're listening to this podcast later, then you can check out how you can access it retrospectively.

Thank you, Ray.

And that's it for today.

So thank you, Will.

Thank you for listening.