Introduction to Cato API - Setup to Your First Real API Call
Transcript: Hey, guys. It's Josh Snow and Brian Anderson. Today, we're gonna be talking about the Cato API. I'm really stoked for this because the API is extremely powerful at Cato, and there's a lot you can do with it. But we're gonna really help demystify that. And so today, we're gonna talk about what the Cato API is, and then how you can make your first API call. So, Brian, take it away. Let's walk through what the Cato API is and some of the possibilities. Awesome. Thank you. Let's first frame the conversation around what we're looking to achieve, what the outcomes are, why we would even use the Cato API. So if you're familiar with the Cato platform, obviously, there is a global private backbone we wanna connect to. We need to perform different types of tasks like creating sites, maybe provisioning rules, and maybe getting data out of the platform: security events, security incidents, and that sort of thing. So typically, that starts off with version control systems, and maybe you have other platforms and frameworks out there that help enable the automation of provisioning configurations, like Terraform, Pulumi, or Azure DevOps; there's a whole suite of tool sets out there. And then secondarily, there might be other teams that would use other solutions like Ansible, Chef, Puppet, or Salt to provision new rules, for example, or make exceptions or modify network configurations. And then lastly, there's the question of what we do with the data that we have. In the Cato platform, we see all ports, all protocols, a very rich dataset. How do we make that actionable? We might want to invoke an ITSM system like ServiceNow, or maybe send data to Splunk, or to different business units, or to your network operations team or security operations team that may be interacting with your data in the platform that way.
So all the arrows that we see here in this diagram are APIs that enable the interactions between these different platforms. That's awesome. All I see is possibility. And what's great today is we're gonna give you the foundational elements to generate your own API call, which will start you on your journey of building any of the things that we just showed. Some of it's gonna be off the shelf that you can build on; other parts, you're gonna start to build your own. And that's what we're gonna walk you through today. Awesome. Yeah. So starting at the high level here, we always start with the documentation. This is our Cato management API documentation, and many folks in the industry are familiar with REST APIs or, historically, SOAP APIs. This is a GraphQL API. It still works as an HTTP request; you still send a POST request. But the way GraphQL works is it has the concept of queries, which are the way you pull and retrieve information or data, and then something called mutations, which are the way you create, update, or delete different configurations. So this is the documentation. Feel free to sift through that if you want some exciting reading for the weekend. We're gonna do a separate video on this so that you can consume it on its own if that's something you're interested in. Awesome. We'll also do separate videos on something called the playground. The playground is an off-the-shelf tool set exposed by GraphQL that lets you test your own API calls, so we'll do a separate session on how to use it. You may also be familiar with a client called Postman, a very common API testing tool that can also help automate certain tasks, so we'll have a separate session on Postman too. But today, we're gonna be covering a tool on github.com/catonetworks called the Cato API Explorer.
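Before diving into the Explorer, here's a minimal sketch to make the query/mutation distinction concrete: a GraphQL call is just an HTTP POST with a JSON body. The account ID and field selection below are simplified placeholders, not the exact Cato schema, so check the management API docs for the real shapes.

```python
import json

# A "query" operation reads data; a "mutation" creates, updates, or
# deletes configuration. This is a placeholder read operation, not the
# verbatim Cato schema.
read_operation = """
query {
  accountSnapshot(accountID: "12345") {
    sites { id }
  }
}
"""

# The request body GraphQL servers expect: the operation text plus any
# variables, serialized as JSON.
payload = {"query": read_operation, "variables": {}}
body = json.dumps(payload)
```

A mutation is sent the same way; only the operation text changes.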
The Cato API Explorer is an application that runs on top of Docker, so there's an assumption that you already have Docker installed. We're gonna have a separate session on how to install that, but essentially, all you do is download Docker from docker.com and start it up, and you can then work through standing up the Cato API Explorer locally on your system. Cool. Super easy. So initially, we can either download all of this code directly from GitHub, or really all that you need is this one file, the docker-compose.yaml file. So I could either download that one file from here or just copy its text. I put that into a file on my local workstation. So in a folder that I've created, I've got a docker-compose.yaml file, and I've added this text to that file. That's all that I've done so far, apart from having Docker running on my system. Then, in a terminal window, I make sure that I'm in this folder, the same folder where this docker-compose file is. The file points to where we publish our container. So once Docker is running, we run docker compose pull. What this does is tell Docker to look at the file that you have and pull the image down from where we publish it, in what's called the container registry. So now I've pulled this container down. The next command that I'm gonna run is docker compose up -d. What that does is stand up the container, and then it runs as a background daemon process, so it continues to run on my system. Now, one important thing to note is this port configuration, which you can leave as the default: port 8080.
On my workstation, when I go into a browser, that is the port I'm gonna be accessing this application on. So when I go to a browser window, I should be able to load localhost port 8080, and that is how I get access to my application. Essentially, now the Explorer is running locally. The first thing that I wanna do in the Explorer is configure an API key. What does that mean, and how does that work? If I transition back over to the Cato management application, under the resources tab, under API keys, I can create a brand new key, and I can give it whatever name I want. I can define whether or not I want this to be a view or an edit key. Then, if I wanna restrict access for my API key to only come from certain IPs, I can do that, which is a good security practice. And then we can choose how long we want this key to be good for, so it's not a forever key that lives in the system. Once I create this and hit save, I get prompted to copy that key down. What I do then is go add a new key here in the Explorer and put in my information. There's a couple of different options for servers you can use: there's a US one, there's an Ireland one, there's an India one. Essentially, it works like this: put in the name for my key and the account ID, which, by the way, in the Cato management application is probably easiest to find in the URL here. If I add those attributes, whenever I save my credentials, the API Explorer will test those credentials and make sure that they're valid. So now I've essentially connected the API Explorer to my Cato management application tenant. So now you're ready to make API calls at this point. Right? That's exactly right. That's awesome.
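As a sketch of what those saved credentials do under the hood, here is roughly how a call authenticates. The endpoint URL and `x-api-key` header name reflect Cato's public documentation at the time of writing, but verify them (and the region-specific server) for your tenant; the query itself is a placeholder.

```python
import os
import json
import urllib.request

# Keep the key out of your code: read it from an environment variable,
# which is also what the Explorer's generated Python examples assume.
api_key = os.environ.get("CATO_API_KEY", "<your-key>")
account_id = os.environ.get("CATO_ACCOUNT_ID", "12345")

# US endpoint per Cato's public docs; regional servers differ.
API_URL = "https://api.catonetworks.com/api/v1/graphql2"

payload = json.dumps({
    "query": "query snap($id: ID!) { accountSnapshot(accountID: $id) { id } }",
    "variables": {"id": account_id},
}).encode()

req = urllib.request.Request(
    API_URL,
    data=payload,
    headers={"Content-Type": "application/json", "x-api-key": api_key},
)
# urllib.request.urlopen(req)  # uncomment to actually send the call
```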
So in the Explorer, the interesting thing is that every time you refresh this page, it pulls down what's called our production schema. What is that? It's all of the API calls that are supported for the Cato management application. So today we might have, you know, fifty or a hundred; tomorrow it might be five hundred. All of those are now available in an easy drop-down showing what calls are available to me in the platform, and all that I have to do is simply select one. What this client does, which is different from either Postman or the GraphQL playground, is it automatically renders all the parameters and all of the arguments that I would need to construct one of my API calls. And down here in the middle, just to share with everyone, this is every possible parameter that is available to be selected for what's called the account snapshot. Now the account snapshot is a call that gives you a snapshot of the account: all your sites, all your sockets, all your users, that sort of thing. And I can choose here whether or not I wanna filter for certain sites. Notice that when I selected this, it automatically dropped in what's called site IDs. That's cool. So I don't need to go look those up and construct my payload. But what's interesting here is if I hit execute, I now get a live response back from the API. So in this few-minute session, I've just created a key, and now I'm already making API calls against the Cato management application interactively. And it's even just the sites you wanted. So if you only wanted certain sites, that's all you would get from this specific call. That's cool. That's exactly right. So I can uncheck this, and it removes those sites from my payload, and now I interactively get back all of the sites from my entire environment. Another interesting thing here, down in this bottom right section: these are code examples.
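The snapshot-with-filter behavior just demonstrated corresponds to GraphQL along these lines. This is a sketch only: the exact argument names come from the schema the Explorer renders for you, so treat the ones below as approximate.

```python
# When the site IDs box is checked, the Explorer passes a siteIDs variable
# and the query returns only those sites; unchecking it drops the variable
# and the whole account comes back.
filtered_query = """
query snapshot($accountID: ID!, $siteIDs: [ID!]) {
  accountSnapshot(accountID: $accountID) {
    sites(siteIDs: $siteIDs) { id }
  }
}
"""

variables_one_site = {"accountID": "12345", "siteIDs": ["1001"]}
variables_all_sites = {"accountID": "12345"}  # no filter: every site

payload = {"query": filtered_query, "variables": variables_one_site}
```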
We have a Cato CLI, which we'll talk about in a moment, and the Explorer generates code examples for me incorporating whatever parameters I select. We also have Python examples, where I can literally copy and paste this syntax into a Python script on my local system. I run it and it works, provided that I have my authentication, my API key, in an environment variable and so forth, and it'll walk you through how to do that. And then lastly, we even have a curl example. Now, curl is simply a command-line way to run a web request. That's it. And so if I paste this into a terminal window or a shell script and put in my key, which I could choose to unmask if I wanted, it will work. And that's exactly the same call that I just selected from this drop-down here. So it's that easy now; we are interacting and making calls against our own CMA tenant. That's huge. So now you've gone all the way from creating the API key to whatever individual call you wanna make, and you can even add those request variables, which I've always found tricky as a new API user. Like, where do I change a variable? I messed up a bracket or something. This just makes that process easy. And so once you have this, Brian, you just copy that into your terminal interface to start making the API calls. Right? That's exactly right. I can take this and copy it into a terminal window, or I can put it into a bash script, or I can use the Python script; Python is a very, very common programming language that a lot of people use for integrations and automation work. But on to the next call I'd like to talk about. We already covered the account snapshot, which gives you data about the account, users, and insights and so forth. There's also a general-purpose call called the entity lookup call.
The way the Cato API is structured, entity lookup covers a lot of the read operations you're looking for. For example, if I wanna look up my interfaces or my site ranges, I can pick what I wanna look up, and then I can run this and it gives me back a list of whatever thing I'm selecting from that list. So here are all the site ranges I have in my account, for example; this is a demo account we're running here. I can also choose to look up maybe, you know, all of my VPN users that might be in my environment, or even my sites and so forth. So this gives me a list of all of those. And you can kind of validate the response to see, alright, I'm putting together this bigger Chef script or this provisioning playbook, or these are the events I want, say. So you can start to validate here what you're putting together in a bigger view by understanding these responses. Say, oh yeah, that looks good, now I'm going to take that code and add it to my larger playbook. I know it's kind of a rookie way to say it, but is that a fair assessment? What you want as a developer is you first wanna see what the response looks like, so that when I start to develop code, I know what object I'm looking to get back. I know that it's data, and then there's a child node called entityLookup: data.entityLookup.items. This is a list, an array of items. So I would know the data structure I'm going to be parsing in my response, and then I would know how to make that useful for whatever kind of task I'm working on, whether it's importing something, or maybe exporting something and then doing something with that. Maybe I wanna pull down all the sites that we have, or lists of applications, or whatever it is; this helps enable you to test interactively what that call is gonna look like. Yeah. So say I were to go in and look up a list of sites.
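The `data.entityLookup.items` structure described above can be walked like this. The sample response is hand-written for illustration; the real fields on each item vary by entity type, so only the nesting pattern should be relied on.

```python
# A trimmed, hand-written example of an entityLookup-style response. The
# key point is the nesting: data -> entityLookup -> items, where items is
# an array you iterate over.
sample_response = {
    "data": {
        "entityLookup": {
            "items": [
                {"entity": {"id": "101", "name": "Branch-NYC", "type": "site"}},
                {"entity": {"id": "102", "name": "Branch-LDN", "type": "site"}},
            ]
        }
    }
}

items = sample_response["data"]["entityLookup"]["items"]
names = [item["entity"]["name"] for item in items]
print(names)  # → ['Branch-NYC', 'Branch-LDN']
```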
Let's talk about how to use this a little more interactively. Maybe I'm looking for network interfaces, for example. And so these are all the network interfaces in my environment, but I actually get a lot of them back, and maybe I don't want all of these at one time, so maybe I wanna create a filter for a particular site. I wanna filter my network interfaces. So in here, this gives me the ability to edit the entity input. What I wanna do is filter for a site, and I'm gonna filter on the value of site ID. When I hit add here, notice that it adds this to the parent, and now it actually passes this down into my variables here, and into all these different code generation tools, where now, if I run that request again, I only get back the one interface that is for that site. Yeah. So what I did is I just created that kind of dynamic filter on the fly from the data I got back from the API. Right? Awesome. This has been super helpful, Brian. In future videos, we are going to start to stitch this together for other integrations, putting this together inside of deeper scripts, so that you can take this and really shine with the Cato API. But this is such a cool tool. Thanks for introducing it, Brian, because it allows people like me, who kind of know APIs, to be really dangerous with it. And I think for a lot of people it takes that intimidation level away. So I really appreciate what you built here, and I'm excited to use it and hope everyone else is too. Yep. One last little thing we wanted to cover here would be the CLI. So what I've done so far is I've generated a request, and it gave me example syntax. I can simply install the Cato CLI in the terminal window by running pip3 install catocli, if you have Python installed on your system.
Then I would run catocli configure, and that walks me through setting up my credentials. If I put in my credentials, just like the ones I got from the Cato management application, I can now take the exact query that I got from the Explorer, run it in the terminal window, and get the exact same output. They're designed to work hand in hand. That's huge. I don't need to write a lot of code. I just need the command to run, and I can use the Explorer I have here to create any payload I want for any API call. Then I literally just copy this out, drop it in a terminal or a shell script, and you're off and running. Yeah. So then you can just start iterating, iterating, iterating. Right? That's huge. Alright, Brian. Thanks so much. Appreciate your time. Look for the next videos, guys. Thanks so much. Thanks, everyone. ---- Let us know what you think and what you'd like to learn more about.
You Ask a Good Question: Top 5 Applications Per Site, by Total Bandwidth
The Ask: I’d like to be able to see the top 5 applications, per site, by total bandwidth. Basically, this graph multiple times.

API Guy answer: My solution is a multi-query approach.

Step 1: Use this appStats query to get the list of site names and their total traffic:

Step 2: Iterate over each site, calling an appStats() query for each one, with the site name as the filter. Here’s an example for one site:

You will then need to calculate the percentages based on the total for each site from the first query.
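The final top-5-and-percentages step can be sketched in plain Python. This assumes you've already flattened the appStats responses into per-site records; the field names (`site`, `app`, `bytes`) stand in for whatever dimensions and measures your query actually returns.

```python
from collections import defaultdict

# Hypothetical flattened appStats results: one record per (site, app) pair.
records = [
    {"site": "NYC", "app": "Zoom",   "bytes": 500},
    {"site": "NYC", "app": "O365",   "bytes": 300},
    {"site": "NYC", "app": "Slack",  "bytes": 200},
    {"site": "LDN", "app": "Zoom",   "bytes": 100},
    {"site": "LDN", "app": "GitHub", "bytes": 900},
]

def top_apps_per_site(records, n=5):
    """Return {site: [(app, bytes, pct_of_site_total), ...]} for the top-n apps."""
    by_site = defaultdict(list)
    for r in records:
        by_site[r["site"]].append(r)
    out = {}
    for site, rows in by_site.items():
        total = sum(r["bytes"] for r in rows)
        top = sorted(rows, key=lambda r: r["bytes"], reverse=True)[:n]
        out[site] = [
            (r["app"], r["bytes"], round(100 * r["bytes"] / total, 1)) for r in top
        ]
    return out

result = top_apps_per_site(records, n=2)
print(result["NYC"])  # → [('Zoom', 500, 50.0), ('O365', 300, 30.0)]
```

Note that the percentage is computed against each site's own total, which is why the first query's totals are needed.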