In the previous posts, we learned about the changes introduced in Semantic Kernel 1.0 that affected the project setup and the usage of semantic functions, now called prompt functions. In this post, we’ll focus on native functions: we’ll learn which changes we have to apply and look at some new features that were introduced. This post will be shorter than the others, since the changes were minor, mainly the renaming of a few attributes and methods. For this reason, we’re going to cover both native functions and OpenAI plugins in a single post. However, these changes are the gateway to the most important changes we’re going to see in the next post: function calling and planners.
But let’s focus on native plugins for the moment. The starting point is the same one we used in the original examples: two flavors of the same plugin (a native one and an OpenAI one) that call the DataUSA APIs to get information about the US population.
Upgrading the native plugin
The first step when you upgrade your project to Semantic Kernel 1.0 is to change the code of the plugin, since the attributes used to turn a method into a native function have changed. Let’s take a look at the new code:
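Here is a minimal sketch of the updated plugin, assuming the DataUSA call and the GetPopulation method from the original series (the class name and implementation details are simplified):

```csharp
using System.ComponentModel;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;

public class UnitedStatesPlugin
{
    private readonly HttpClient _client = new();

    // KernelFunction marks the method as a native function;
    // Description helps the model understand what it does.
    [KernelFunction, Description("Get the United States population for a given year")]
    public async Task<string> GetPopulation(
        [Description("The year of the population to retrieve")] string year)
    {
        // DataUSA exposes population figures through a simple REST endpoint.
        var url = $"https://datausa.io/api/data?drilldowns=Nation&measures=Population&year={year}";
        return await _client.GetStringAsync(url);
    }
}
```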
The code of the function is the same one we used in the previous post, but the attribute has changed: the KernelFunction attribute has replaced the old SKFunction one.
Adding the plugin to the kernel
The way you add a native plugin to the kernel has changed as well. You no longer need to create a new instance of the plugin class: instead, you can use the ImportPluginFromType<T>() method, which accepts a generic parameter with the type of the plugin we want to add. Here is the complete example:
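A minimal sketch of the setup, assuming an Azure OpenAI deployment (deployment name, endpoint, and API key are placeholders):

```csharp
using Microsoft.SemanticKernel;

var builder = Kernel.CreateBuilder();
builder.AddAzureOpenAIChatCompletion(
    deploymentName: "your-deployment-name",               // placeholder
    endpoint: "https://your-resource.openai.azure.com/",  // placeholder
    apiKey: "your-api-key");                              // placeholder
var kernel = builder.Build();

// The kernel creates the plugin instance for us based on the type.
kernel.ImportPluginFromType<UnitedStatesPlugin>();
```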
Now we can call the plugin in the usual way. The only difference to keep in mind (which we have already learned about in the previous post about prompt functions) is that functions are now stored in the Plugins collection of the kernel:
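For example, assuming the plugin and function names from the sketch above:

```csharp
// Functions are retrieved through the kernel's Plugins collection.
var result = await kernel.InvokeAsync(
    kernel.Plugins["UnitedStatesPlugin"]["GetPopulation"],
    new KernelArguments { ["year"] = "2020" });

Console.WriteLine(result.GetValue<string>());
```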
Importing an OpenAI plugin
In another post of the original series, we learned how to reuse a plugin built for OpenAI in Semantic Kernel. These plugins are simply wrappers around REST APIs, with two peculiar features:
- They are described by an OpenAPI definition.
- They have an OpenAI manifest, a JSON file that describes the goal of the plugin and the location of the OpenAPI definition, which OpenAI then uses to figure out how to call the plugin (see the sketch after this list).
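As a reference, the OpenAI manifest (ai-plugin.json) looks roughly like this; the field values below are illustrative:

```json
{
  "schema_version": "v1",
  "name_for_human": "US Population",
  "name_for_model": "population",
  "description_for_human": "Get information about the US population.",
  "description_for_model": "Plugin for retrieving US population data by year.",
  "auth": { "type": "none" },
  "api": {
    "type": "openapi",
    "url": "https://example.com/openapi.yaml"
  }
}
```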
Semantic Kernel 1.0 brings a few changes to this scenario. The first one is that OpenAI plugins are now managed by a different NuGet package: you must also add the Microsoft.SemanticKernel.Plugins.OpenApi package to your project before using this feature. Another difference is that you must now use the ImportPluginFromOpenAIAsync() method instead of the old ImportOpenAIPluginFunctionsAsync() one. The final important change to consider is that OpenAI plugin support has been marked as experimental: as such, you will need to suppress the compiler warning, otherwise the project won’t build. The following code shows an updated example:
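A minimal sketch, assuming the kernel from the previous example and a placeholder manifest URL (the exact SKEXP diagnostic ID may vary between Semantic Kernel versions):

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Plugins.OpenApi;

// OpenAI plugin support is experimental: without suppressing the warning,
// the project won't build. The diagnostic ID may differ in your version.
#pragma warning disable SKEXP0042
await kernel.ImportPluginFromOpenAIAsync(
    "UnitedStatesPlugin",                                        // plugin name (illustrative)
    new Uri("https://example.com/.well-known/ai-plugin.json"));  // placeholder manifest URL
#pragma warning restore SKEXP0042
```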
The whole snippet of code that uses classes from the Microsoft.SemanticKernel.Plugins.OpenApi namespace has been wrapped inside a #pragma directive, which suppresses the warning.
Once we have imported the plugin, the rest of the code to execute the included function (GetPopulation) is the same as what we saw for native functions.
Wrapping up
In this post, we have learned about the minor changes we have to apply to our code to use native and OpenAI plugins. Compared to the other posts, the changes were mainly cosmetic, with the goal of aligning the naming of methods and classes with the rest of the kernel. If you have found this post too short, don’t worry: the next post will be much longer :-) We’re going to learn, in fact, about the big changes that affected the planner and the way you can automatically orchestrate AI workflows.
In the meantime, you can find the updated sample on GitHub.