The Kong API Gateway, like any other API gateway provider, is one of the essential ways to protect our application ecosystem, standardize requests, and guarantee compliance with a host of regulations and standards. To help us work with AI, Kong also provides an AI Proxy plugin that lets us do a lot of things; among the most important, it handles the quirks of the different LLM providers seamlessly, along with API keys, logs, and metrics. The other great thing it offers is the ability to standardize requests to AI providers. In this video I explore that functionality, some quirks of its own, a few things that didn't work, and how I managed to ask an AI, in this case Gemini as a random pick, in what year The Smashing Pumpkins were founded as a band. Will the answer match reality or not? Still up for debate, I guess. But that is just for fun. The point of this video is to get us up and running as fast as possible with one of the possibilities this plugin has to offer. As usual, stay tech, keep programming, be kind and have a good one everyone! Cheers!
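For reference, the kind of setup the video walks through could be sketched as a declarative Kong config: a service, a route, and the AI Proxy plugin pointed at Gemini. This is only a sketch under my assumptions (Kong OSS 3.8+ in DB-less mode; the auth fields, the query-parameter key placement, and the model name `gemini-1.5-flash` reflect my reading of the plugin docs and may differ in your Kong version):

```yaml
_format_version: "3.0"
services:
  - name: ai-service
    # The upstream URL is effectively overridden by the ai-proxy plugin,
    # which routes the request to the configured LLM provider instead.
    url: http://localhost:32000
    routes:
      - name: ai-route
        paths:
          - /ai
plugins:
  - name: ai-proxy
    service: ai-service
    config:
      route_type: llm/v1/chat
      auth:
        # Gemini takes the API key as a query parameter rather than a header
        param_name: key
        param_value: "<YOUR_GEMINI_API_KEY>"
        param_location: query
      model:
        provider: gemini
        name: gemini-1.5-flash
```

With something like that in place, you would send a chat request in the OpenAI-style format the plugin standardizes on, e.g. `curl -s http://localhost:8000/ai -H 'Content-Type: application/json' -d '{"messages":[{"role":"user","content":"In what year were The Smashing Pumpkins founded?"}]}'`.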
---
Chapters:
00:00:00 Start
00:00:17 Intro
00:01:34 Install Kong API Gateway OSS
00:03:34 Configuring the service
00:03:46 Configuring the route
00:04:20 Configuring the AI Proxy plugin
00:05:13 Getting an API key from Gemini
00:06:22 Resuming the AI Proxy plugin configuration
00:07:02 Asking questions to Gemini via the Kong API Gateway's AI Proxy plugin
00:10:10 Closing notes
00:10:31 See you in the next video!
00:10:45 End credits and disclaimer

---
As a short disclaimer, I'd like to mention that I'm not associated or affiliated with any of the brands shown, displayed, or mentioned in this video.
---
All my work and personal interests are also discoverable on other sites:
If you have any questions about this video, please leave a comment in the comment section below and I will be more than happy to help you or discuss any related topic you'd like.