
Prompt Engineering Best Practices: Using Plugins [AI Today Podcast] AI transcript and summary - episode of podcast AI Today Podcast: Artificial Intelligence Insights, Experts, and Opinion

· 17 min read

Go to PodExtra AI's episode page (Prompt Engineering Best Practices: Using Plugins [AI Today Podcast]) to play and view complete AI-processed content: summary, mindmap, topics, takeaways, transcript, keywords and highlights.

Go to PodExtra AI's podcast page (AI Today Podcast: Artificial Intelligence Insights, Experts, and Opinion) to view the AI-processed content of all episodes of this podcast.

View full AI transcripts and summaries of all podcast episodes on the blog: AI Today Podcast: Artificial Intelligence Insights, Experts, and Opinion

Episode: Prompt Engineering Best Practices: Using Plugins [AI Today Podcast]

Author: AI & Data Today
Duration: 00:07:30

Episode Shownotes

Plugins for Large Language Models (LLMs) are additional tools or extensions that enhance the LLM's capabilities beyond its base functions. In this episode hosts Kathleen Walch and Ron Schmelzer discuss this topic in greater detail. Can I use plugins with ChatGPT? Plugins can access external databases, perform specific computations, or interact with other software and APIs to fetch real-time data, execute code, and more. Continue reading Prompt Engineering Best Practices: Using Plugins [AI Today Podcast] at Cognilytica.

Full Transcript

00:00:01 Speaker_01
The AI Today podcast, produced by Cognilytica, cuts through the hype and noise to identify what is really happening now in the world of artificial intelligence.

00:00:10 Speaker_01
Learn about emerging AI trends, technologies, and use cases from Cognilytica analysts and guest experts.

00:00:22 Speaker_02
Hello, and welcome to the AI Today podcast. I'm your host, Kathleen Walch.

00:00:26 Speaker_00
And I'm your host, Ron Schmelzer. You know, one of the great things about AI is that, hey, it really is part of everybody's conversation.

00:00:34 Speaker_00
As you may have been following us recently, we've been producing a series on prompt engineering because most people's entry point today to artificial intelligence really is generative AI, where they're typing in prompts and they're getting responses, whether it's text or images or audio now, lots of things.

00:00:53 Speaker_00
It's appearing either because they're explicitly going to these tools like ChatGPT, Google Gemini, Microsoft Copilot, or maybe because it's becoming embedded in the tools they already use, with a little sparkle icon.

00:01:03 Speaker_00
So everybody's learning and using prompts and they're learning the benefits and the drawbacks of AI-based systems. And so there are best practices.

00:01:11 Speaker_00
There are things that people are realizing that work and some things that work better and some things that don't. So we've been really spending some time here on our AI Today podcast with you, our audience. Now, we understand that many of you

00:01:23 Speaker_00
are skilled technical people building models from scratch. That's what the CPM-AI methodology is all about, the Cognitive Project Management for AI methodology. We have so many podcasts on that topic, you should probably listen to them all. But for a lot of people that are not

00:01:39 Speaker_00
data scientists, machine learning engineers, or even project managers implementing AI. They're using AI, but in this way.

00:01:46 Speaker_00
So I think this is why we're filling in that gap of knowledge to help people learn how to make more effective use of what we believe will be a tool that everybody has to learn to do their job in the future.

00:01:57 Speaker_02
Exactly. And so we've brought this up on previous podcasts, but it's always worth repeating. You know, many years ago, the question was, will AI take my job? And I think that the answer now is no, but people who use AI will.

00:02:09 Speaker_02
And that's why we think it's so important to make sure that we have a podcast series on this, that we continue to talk about it with our audience.

00:02:16 Speaker_02
So if you haven't done so already, subscribe to AI Today so you can get notified of all the upcoming episodes, both in this series as well as the other episodes that we have.

00:02:25 Speaker_02
We have some wonderful interviews coming up, and we also are talking about additional topics as well.

00:02:30 Speaker_02
So for today's podcast, we want to talk about using plugins, because for many of you, if you have used these large language models, you know that they are limited in their training data and also limited in their ability to access things like your documents and your data, maybe, you know, your organizational data.

00:02:51 Speaker_02
And they also have limitations on what they can do with text-based prompts and data.

00:02:57 Speaker_02
So plugins have become one way that people are now, you know, adding these additional tools or extensions to enhance the large language model's capabilities beyond just its base functions.

00:03:10 Speaker_02
So some people are just using the base functions, but now a best practice is to use plugins. And you'll want to do this so that you can

00:03:18 Speaker_02
do a variety of different things. They help with things like accessing external databases or performing specific computations.

00:03:25 Speaker_02
We talk about, you know, some people go in and they're trying to do these math equations with just, you know, general prompting and it's not working, right? So we say, well, maybe it's because that's not the right tool.

00:03:37 Speaker_02
So if you have these specific plugins, that can help. Or maybe you need to interact with some other software and different APIs to fetch data in real time, execute code, a bunch of different reasons.

00:03:48 Speaker_02
So plugins are a best practice to help augment your large language model.

00:03:54 Speaker_00
Yeah. And so there's lots of reasons why this might happen. You know, large language models have been trained on a lot of language. That's what, that's the definition of a large language model.

00:04:03 Speaker_00
And so by default, they're really focusing on trying to answer the broad range of what everybody might be asking. And of course, also the training data has a cutoff date.

00:04:15 Speaker_00
It's not like the training and the models are updated every minute. I mean, the models are updated, and when they are, they bring in new training data.

00:04:25 Speaker_00
But not all that data is going to be recent. And if you ask for something like, what is the price of the stock ticker for, you know, Apple or something, it's not going to tell you today's price. How would it know that? Right.

00:04:36 Speaker_00
So what you can do is you could say, hey, you know, LLM, when someone is asking for this kind of information or requesting this kind of data, I'm going to give you a way to get that data. And that's what these plugins are for.

00:04:51 Speaker_00
So as mentioned, one of those things is it provides access to real-time information. So we can have an LLM that, you know, provides access to stock quotes or to some specific information from some database somewhere.

00:05:03 Speaker_00
And that plugin, if you use that plugin with the LLM, will provide access to that information.
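
To make that concrete: here is a minimal sketch of what "giving the LLM a way to get that data" can look like in code, assuming the OpenAI Python SDK's chat-completions tool-calling interface; the get_stock_quote helper and its hard-coded price are hypothetical stand-ins for a real market-data lookup.

    import json
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    def get_stock_quote(ticker: str) -> str:
        # Hypothetical helper: a real plugin would call a market-data API here.
        return json.dumps({"ticker": ticker, "price": 189.84})

    tools = [{
        "type": "function",
        "function": {
            "name": "get_stock_quote",
            "description": "Look up the latest price for a stock ticker symbol.",
            "parameters": {
                "type": "object",
                "properties": {"ticker": {"type": "string"}},
                "required": ["ticker"],
            },
        },
    }]

    messages = [{"role": "user", "content": "What is Apple trading at right now?"}]
    first = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)

    # Assuming the model chose to call the tool, run it and hand the result back.
    call = first.choices[0].message.tool_calls[0]
    args = json.loads(call.function.arguments)
    messages.append(first.choices[0].message)
    messages.append({"role": "tool", "tool_call_id": call.id, "content": get_stock_quote(**args)})

    final = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)
    print(final.choices[0].message.content)  # e.g. a sentence quoting the fetched price

The point, as described above, is that the model never fetches anything itself; the plugin does the lookup, and the model just decides when to ask for it and how to phrase the answer.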

00:05:10 Speaker_00
And similarly, there might be very domain-specific information, like the LLM may not have been trained to know specific legal information or medical information or whatever it is, you know, education.

00:05:20 Speaker_00
Maybe you are trying to have it answer questions about your college courses and it doesn't know your college courses.

00:05:27 Speaker_00
So you can provide that specific set of answers and responses and examples and context within the plugins to say, hey, any questions about our college campus's courses, here's all the context that you need.

00:05:41 Speaker_00
And so people don't need to do that themselves. Because the alternative to doing a plug-in is like, yes, you can do this yourself, but then you have to do it yourself.

00:05:51 Speaker_00
So if you want to ask questions about the college campus's courses that are offered that quarter, then you would have to somehow upload that information to the LLM and then you would have to ask those questions.

00:06:04 Speaker_00
But because the GPT plug-in has been added, then it knows that there's already an external source of that information. You don't have to do all that work.
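
As a rough illustration of the "do it yourself" alternative described here, the sketch below splices retrieved catalog entries into the prompt by hand; the in-memory catalog, the course entries, and the naive keyword matching are all invented for the example. A plugin packages this retrieval step so users don't have to wire it up themselves.

    # Hypothetical course catalog; a real deployment would query the registrar's
    # system or a document store instead of a hard-coded dict.
    CATALOG = {
        "CS101": "Intro to Programming, Mon/Wed 10am, Prof. Rivera",
        "CS240": "Machine Learning Foundations, Tue/Thu 2pm, Prof. Chen",
    }

    def retrieve_courses(question: str) -> str:
        # Very naive keyword retrieval, just to show where the context comes from.
        words = [w.lower().strip("?.,") for w in question.split()]
        hits = [f"{code}: {info}" for code, info in CATALOG.items()
                if any(w in info.lower() for w in words)]
        return "\n".join(hits) or "No matching courses found."

    question = "When does the machine learning course meet?"
    prompt = (
        "Answer using only the course catalog below.\n\n"
        f"Catalog:\n{retrieve_courses(question)}\n\n"
        f"Question: {question}"
    )
    # 'prompt' would then be sent to the LLM as the user message.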

00:06:12 Speaker_00
But let's look at a few other things that we can do besides that access to real-time information and adding some of that specialized knowledge through these plugins.

00:06:20 Speaker_02
Exactly. So we can have interactivity and also dynamic content. So people are now using plugins so that large language models are able to interact with other applications and services.

00:06:31 Speaker_02
So you think about maybe they need this for dynamic content generation, or they want to automate certain tasks, or they want to interact or enhance user interactions that they wouldn't be able to do with just the base large language model.

00:06:46 Speaker_02
And there's other reasons such as customization and personalization.

00:06:52 Speaker_02
So developers and users now can tailor the capabilities of large language models to their specific needs through plugins that they're not able to do with just the base large language model.

00:07:03 Speaker_02
So now businesses and individuals can really use the same base model, but then customize it with plugins.

00:07:09 Speaker_02
So if you want to have specifics about your business, because every business is unique, and so maybe you have a specific onboarding process or you have a specific way of doing things, then now you're able to do that with the use of plugins.

00:07:23 Speaker_02
And we've also seen people use plugins because it helps with improving performance and efficiency.

00:07:29 Speaker_02
So some tasks may just be beyond the scope of that, you know, base large language model and really require computational resources that are more efficiently handled externally.

00:07:40 Speaker_02
So plugins, you can use them to help offload some of these tasks for these specialized services or databases. And then it's going to help the performance and the speed of a large language model as well, because people want the results pretty,

00:07:53 Speaker_02
pretty instantly, you know, we're not, we don't want to wait for that long. So when we use plugins, we're able to just improve that performance, improve efficiency, get results in a much more timely manner.

00:08:03 Speaker_00
Yeah, now if you've had experience with ChatGPT's premium, the GPT-4 model, but also all the other premium features, you will know that there are actually some plugins that are sort of built in, or at least available to you as part of the ChatGPT service.

00:08:21 Speaker_00
One of those, of course, is the live browsing. It'll browse with Bing. You do have to make sure to enable it, so you go to the pull-down and enable the live browsing, and then you could say,

00:08:31 Speaker_00
You know, who are the current senators on the blah, blah, blah committee? It'll go out and it'll do a fetch for you, with Bing in that case. It'll come back with the results and it'll do the analysis. I actually did that exact case for something.

00:08:45 Speaker_00
There's also another built-in plugin called Advanced Analytics. If you upload a spreadsheet, you can ask questions about it. It doesn't use the LLM directly, it uses the plug-in to basically figure out the data.

00:08:58 Speaker_00
And then it uses the LLM, of course, to answer the questions on the data. There's a few of them, the few models that are sort of built in. There's a lot of other ones that you can get from third parties that will do things that other people have built.
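
A minimal sketch of that division of labor, assuming pandas is available; the file name and column names are invented for illustration. The computation happens outside the model, and the model only has to turn the already-computed numbers into prose.

    import pandas as pd

    # Hypothetical uploaded spreadsheet with 'region' and 'revenue' columns.
    df = pd.read_csv("sales_q3.csv")

    # The plugin side does the arithmetic the LLM is bad at...
    summary = df.groupby("region")["revenue"].sum().sort_values(ascending=False)

    # ...and the LLM only has to describe the already-computed result.
    prompt = (
        "Here are Q3 revenue totals by region:\n"
        f"{summary.to_string()}\n\n"
        "Write a two-sentence summary highlighting the strongest region."
    )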

00:09:09 Speaker_00
Those third-party plugins may or may not also require account access and that sort of stuff, I don't know how each of them deals with that. But there are ones, for example, that will let you run code snippets.

00:09:18 Speaker_00
So you can have the LLM answer some question, but then actually execute some task, like update a website, or generate some code, or, you know, do some sort of application, some sort of software development, which is kind of cool, kind of neat.

00:09:32 Speaker_00
You know, like, hey, ask a question and then maybe make a blog post about it just directly from the GPT interface. It's kind of, the more I talk about these things, I'm like, I've got to be using more of these myself.

00:09:40 Speaker_00
There's some that will even let you do things like create PDFs or create slides, PowerPoint slides, just from the results of your query. Of course, now, there are also those tools that use GPT that you can get from the slide companies themselves.

00:09:54 Speaker_00
So it's kind of a little bit of where do you want to use that service, but it's available there. There are other ones that will do things like integrate with some of the real-time tools like Slack or Asana or Trello.

00:10:04 Speaker_00
So you can basically have it assign tasks. It's kind of neat for project managers to ask a question and actually have it go and do the assignment, if you trust the system to do that, right? Or manage your calendar.

00:10:17 Speaker_00
Some plug into your calendaring system so you can ask questions about what's coming up in your calendars. Maybe even set up a schedule, set up a meeting. I mean, the more I talk about things, I'm like, this is pretty powerful.

00:10:28 Speaker_00
I'm like, we need to be using all this ourselves. There used to be a company, or there still is a company called x.ai, and all it did was respond to emails with scheduling. Now you can just do that within GPT directly.

00:10:41 Speaker_00
And there are other tools that do things like translation and localization or do things like integration with enterprise applications.

00:10:49 Speaker_00
There are some that basically will connect you to the cloud, which will allow you to access files, say, in Dropbox or Drive or something like that.

00:10:56 Speaker_00
And then there are other ones that also let you query databases to say, hey, find this information in this database. And others that even help you build new AI models using GPT, which is kind of crazy.
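
For the database case, here is a minimal sketch of the kind of function such a plugin might wrap, using Python's built-in sqlite3 module; orders.db and its schema are hypothetical. Opening the database read-only is a deliberate safety choice, since the SQL may ultimately come from the model.

    import sqlite3

    def query_orders(sql: str) -> str:
        # Hypothetical read-only lookup the LLM could call as a tool,
        # registered the same way as the stock-quote example earlier.
        conn = sqlite3.connect("file:orders.db?mode=ro", uri=True)
        try:
            rows = conn.execute(sql).fetchall()
        finally:
            conn.close()
        return "\n".join(str(row) for row in rows)

    # Example: the model asks for last week's order count.
    print(query_orders("SELECT COUNT(*) FROM orders WHERE created_at >= date('now', '-7 day')"))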

00:11:08 Speaker_00
So, the thing is, we say this is the best practice because, as much power as there is in LLMs and generative AI, there's even more power when you throw this stuff on top of it.

00:11:18 Speaker_00
It's like the first time you've ever seen a supermarket. You're like, I can't believe there's all this food on the shelf.

00:11:25 Speaker_00
And, you know, the one thing you want to do is try everything, but of course you can't. So you kind of have to just keep coming back.

00:11:30 Speaker_00
And like, you know, as long as you keep trying different things in different store aisles, you know, you'll discover new things that you may like and things you don't like if you don't like kombucha, you know, but you'll never know because it's there on the shelf.

00:11:41 Speaker_00
So.

00:11:42 Speaker_02
Exactly. And so that's why we like to share this because one, maybe you, you know, do use plugins and that's great. This is a refresher. But if you don't use them, these are some examples of why and how you can.

00:11:56 Speaker_02
And so again, we always say, you know, generative AI is one application of the seven patterns of AI. And so when you're

00:12:04 Speaker_02
when you're running and managing AI projects, even if you don't think of using these large language models as projects, they are little mini projects.

00:12:11 Speaker_02
And so we always say, start with the business understanding, make sure you're solving a real problem, then figure out your data understanding, and then make sure that your data is prepped. And so with this, that just helps with that.

00:12:21 Speaker_02
And it's like, what is the reason for doing this? Well, maybe you're trying to save time, you know, and you don't need to be going into all of these different systems anymore.

00:12:29 Speaker_02
Wouldn't it be so nice if it sends a Slack message, and it also updates whatever project management or task management software you're using, and also adds it to people's calendars, all within your prompt? That would be wonderful.

00:12:44 Speaker_02
And then, yes, of course, we always say AI is never set it and forget it. So you have to go in and be double-checking and making sure that this works, but it saves a ton of time.

00:12:51 Speaker_02
So, you know, how can you become much more efficient at your job by having AI help with these certain tasks? So whenever we talk about it, we get so jazzed up. We think that this is really powerful.

00:13:05 Speaker_02
And that's why we are in the middle of our best practices series, our podcast on prompt engineering, because we want to let everybody know, you know, what is the best practice for right now?

00:13:19 Speaker_02
So if you haven't done so already, subscribe to AI Today so you get notified of all of our upcoming episodes. This is podcast four in our six-part series. Part five is going to be on the hack-and-track best practice.

00:13:31 Speaker_02
So, how you use a spreadsheet or another method to track which prompts work well as you experiment with different possibilities, and, especially if you're in a group setting, how do you share your prompts for others to use as well?

00:13:47 Speaker_02
We also have some really wonderful interviews coming up and other things, so definitely subscribe.

00:13:53 Speaker_02
And if you are interested in learning more about the CPM-AI methodology, which is the Cognitive Project Management for AI methodology, we provide a training and certification, so you can go to Cognilytica.com slash CPM-AI to learn more.

00:14:06 Speaker_02
Like this episode and want to hear more? With hundreds of episodes and over 3 million downloads, check out more AI Today podcasts at aitoday.live.

00:14:16 Speaker_02
Make sure to subscribe to AI Today if you haven't already on Apple Podcasts, Spotify, Stitcher, Google, Amazon, or your favorite podcast platform. Want to dive deeper and get resources to drive your AI efforts further?

00:14:29 Speaker_02
We've put together a carefully curated collection of resources and tools. Handcrafted for you, our listeners, to expand your knowledge, dive deeper into the world of AI, and provide you with the essential resources you need.

00:14:41 Speaker_02
Check it out at aitoday.live slash list. This sound recording and its contents are copyright by Cognilytica. All rights reserved. Music by Matsu Gravas. As always, thanks for listening to AI Today, and we'll catch you at the next podcast.