Recent Discussions
Permission errors when testing Cato API with Python
Hi all, I am currently working on a project to automate workflows in Cato with Python. I have already set up and reviewed my API permissions, and they should inherit the permissions of my account, which can view and edit most resources. However, I still get this error:

HTTP 200
{
    "errors": [
        {
            "message": "permission denied",
            "path": ["licensing", "licensingInfo"],
            "extensions": {"code": "Code104"}
        }
    ],
    "data": {
        "licensing": {
            "licensingInfo": null
        }
    }
}

I've been scouring the documentation for specific troubleshooting steps but couldn't find the answers I'm looking for. Any chance some folks could give me a quick guide on how to ensure my API keys have the right permissions? This is the sample script I'm testing, by the way; it pulls available licensing information for monitoring.

import os
import json
import asyncio

import aiohttp

API_KEY = os.getenv("CATO_API_KEY")
API_URL = "https://api.catonetworks.com/api/v1/graphql2"

QUERY = """
{
    licensing(accountId: <ID_HERE>) {
        licensingInfo {
            globalLicenseAllocations {
                ztnaUsers {
                    total
                    allocated
                    available
                }
            }
        }
    }
}
"""

async def main():
    headers = {
        "x-api-key": API_KEY,
        "Content-Type": "application/json"
    }
    async with aiohttp.ClientSession(headers=headers) as session:
        async with session.post(API_URL, json={"query": QUERY}) as resp:
            print("HTTP", resp.status)
            print(json.dumps(await resp.json(), indent=4))

asyncio.run(main())
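
For reference, this is the direction I plan to take for error handling once the key permissions are sorted out. It is only a minimal sketch that reuses the same endpoint and header as the script above; the run_query helper and the PermissionError it raises are my own scaffolding, not part of any Cato SDK.

import os

import aiohttp

API_KEY = os.getenv("CATO_API_KEY")
API_URL = "https://api.catonetworks.com/api/v1/graphql2"

async def run_query(query: str) -> dict:
    # GraphQL endpoints usually answer HTTP 200 even when a field fails,
    # so the "errors" array has to be inspected explicitly.
    headers = {"x-api-key": API_KEY, "Content-Type": "application/json"}
    async with aiohttp.ClientSession(headers=headers) as session:
        async with session.post(API_URL, json={"query": query}) as resp:
            resp.raise_for_status()  # catches transport-level failures only
            body = await resp.json()

    for err in body.get("errors", []):
        if "permission denied" in err.get("message", ""):
            path = ".".join(str(p) for p in err.get("path", []))
            code = err.get("extensions", {}).get("code", "unknown")
            raise PermissionError(f"API key lacks access to '{path}' ({code})")
    return body.get("data", {})

Calling asyncio.run(run_query(QUERY)) with the query from the script above would then either return the data block or fail loudly with the exact path and error code that lacks permission.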

Cato Connect Event: AMA with Professional Services - February/March 2026
Did you join our last AMA with Professional Services and want more? Did you miss the last one and have been waiting for us to drop more dates? Well, your request is our command, and we are back with another event for our customers and partners.

During these live AMAs with members of our talented Professional Services team we'll cover topics like:
- Implementing Cato and getting as much out of your purchase as possible
- Best practices we've seen across real-world environments
- AI Security (new, exciting topic!)
- Your questions... seriously, bring them

Choose from the two available sessions, whichever works best for you: February 24th, 2026 at 11am EST or March 12th, 2026 at 3pm JST.

Here's how to get the most out of it:
- Register for the February 24th or March 12th meeting to get the calendar invite and join us live
- Post your questions below in the comments; we'll answer pre-submitted ones first, before tackling live chat during the session
- See a question you like? Give it a "like" to help it rise to the top

Note: We won't be able to look at specific CMA instances; demos will be done using internal environments.

That's it: register, post your questions, and we'll see you there!

Presenters:
- Steven Wong, Professional Services Engineer
- Mihai Radoveanu, Principal Consultant, Professional Services, Italy
- Rob Pfrogner, Principal Consultant, Professional Services, US
- Special guest: Robin Johns, Worldwide AI Security SME

If you run into any issues, @mention me or email us at community@catonetworks.com

Application File Name Upload
Hi, we are monitoring uploads to external cloud storage that are not compliant with our company policies. We have noticed that the file name is only present in Gmail upload events; for WhatsApp, Google Drive, and other services, only a hashed file path is provided. Is there any possibility, or anything on the roadmap, to see the file name for these apps? Thank you, David.

API for Creating Users in CMA
We don't have an IdP environment, so we need to manually provision a large number of users in CMA. I couldn't find any API call in the API Reference that would allow us to do this. Is there an API that can be used to create/register users? I apologize if I have overlooked it in the documentation.

Block access to local/home network for Cato Client – force all traffic through Cato tunnel
Hi everyone, we are using the Cato Client (Windows/macOS) for remote users and would like to fully block access to the local/home network when the client is connected.

Goal:
- No access to local LAN subnets (e.g. 192.168.0.0/16, 10.0.0.0/8, printers, NAS, routers, IoT, etc.)
- No split tunneling or local breakout
- All traffic should be forced through the Cato tunnel

We checked the following areas but could not find a clear way to block local LAN access on the endpoint:
- Client Connectivity Policy
- Network Rules
- Internet / WAN / LAN Firewall

Questions:
- Is it possible to block local/home network access for Cato Clients purely within Cato (endpoint-based), so that local LAN traffic is not reachable at all?
- If yes: which policy / feature is required (e.g. Client Advanced Controls, specific license, feature flag)?
- If no: is the recommended approach to enforce this via endpoint controls (e.g. OS firewall / MDM) in combination with Always-On and no split tunneling? A rough sketch of what we have in mind for that fallback is shown below.

Any guidance or best practice from real-world deployments would be highly appreciated. Thanks in advance!
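
In case it helps the discussion, this is the kind of OS-firewall fallback we are considering if Cato cannot enforce it natively. It is only a sketch under a few assumptions: the rules are pushed via MDM or a startup script running elevated, the block is scoped to physical (lan/wireless) interface types so that private ranges reached through the Cato tunnel are not caught, and the subnet list matches what should be unreachable; none of this is a Cato-provided mechanism.

import subprocess

# RFC1918 plus link-local; adjust to whatever should be unreachable locally.
LOCAL_SUBNETS = "192.168.0.0/16,10.0.0.0/8,172.16.0.0/12,169.254.0.0/16"

def add_block_rule(interface_type: str) -> None:
    # Add an outbound Windows Firewall block rule for local subnets on one
    # interface type. Requires an elevated prompt / SYSTEM context.
    subprocess.run(
        [
            "netsh", "advfirewall", "firewall", "add", "rule",
            f"name=BlockLocalLAN_{interface_type}",
            "dir=out",
            "action=block",
            f"remoteip={LOCAL_SUBNETS}",
            # Assumption: the Cato tunnel adapter is not classified as
            # 'lan' or 'wireless', so tunneled traffic stays unaffected.
            f"interfacetype={interface_type}",
            "enable=yes",
        ],
        check=True,
    )

if __name__ == "__main__":
    for iface in ("lan", "wireless"):  # physical adapters only
        add_block_rule(iface)

Whether the Cato adapter really falls outside the lan/wireless interface types is something we still need to verify, so treat the scoping as an open question rather than a recommendation.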

DNS Forwarding When Overriding Account-Level DNS Settings
Since I cannot leave comments on the KB, I am writing this down for others who may face the same issue.
https://support.catonetworks.com/hc/en-us/articles/12710391725981-Centralized-Management-of-SDP-User-DNS-Settings-with-the-DNS-Settings-Policy#UUID-13385199-3a2b-70d3-5da2-ea4ebb98e5dd
The article lists the following under Known Limitations: "DNS Forwarding is not supported if you override Account Level DNS settings." In practice, this limitation only applies when using an untrusted DNS server. If you use a trusted DNS server (such as 8.8.8.8), DNS Forwarding can still be used even when overriding the account-level settings.

LDAP To SCIM Migration
We are planning to migrate from Cato Directory Services LDAP & User Awareness to Cato SCIM user provisioning, and we would like feedback from anyone who has performed this migration and whether they encountered any issues along the way. We currently have a few domains, over 3,500 users (not everyone has an SDP license), a mixture of Entra-joined and non-Entra-joined devices, and SSO for VPN users.

I'm trying to understand how users will be mapped to the workstations they log in from and identified, since Cato currently taps into the DCs' Event Viewer to map users to computers and LAN IPs. We have shared computers where an SDP license is not needed, as these are fixed computers. We see the user login events, but not the details of the system they are logging in from or the LAN IP.

Will there be problems if we migrate one domain first and wait a week or two to iron out any bugs? Should the Always-On Windows RegKey be removed from all systems prior to the migration?

Office mode for Mac users
We have the Always-On policy enabled for all users, and it is causing some trouble for Mac users. Most of our users are on Windows: when they come to the office behind the Socket, the client detects Office Mode automatically, users do not need to enter credentials, and they get network connectivity just fine. However, our Mac users need to enter credentials in the Cato Client for it to detect Office Mode; if they do not enter credentials, they have no network connection. Our Mac users are not happy with this, since it adds some inconvenience when they are in the office. I am wondering if anyone has the same challenge and what possible workarounds there are.

Regarding the execution interval of the Azure Functions template for Cato log integration
I'd like to confirm something about Azure Functions processing.

■ Requirements
To forward Cato SASE logs to an Azure Log Analytics workspace, I'm using the following Cato log integration template:
https://github.com/catonetworks/cato-sentinel-connect/tree/main

The Azure Functions specs are as follows:
- OS: Linux
- Plan: App Service Plan
- Size: P1v3
- Type: Custom Handler
- Trigger: Timer trigger (30-second interval)

The following logs are targeted for integration:
- CommonSecurityLog: approximately 2.5-5 MB per 30 seconds (300-600 MB per hour)
- CatoAuditEngine_CL: less than 0.01 MB per 30 seconds

■ Question
I'm using a 30-second timer trigger, but the actual execution interval is 2 minutes. (The execution interval can be confirmed by counting the "Functions Execution Count" metric.) Please confirm the following three points:
1. Is the change in execution interval due to the large log volume?
2. What should I do to set the execution interval to 30 seconds? Would scaling up Azure Functions be effective?
3. Even if execution takes a long time, is the log integration still being performed without any problems? Are any logs being missed?

Note that in the test environment (where the log volume per 30 seconds is less than 0.01 MB for both tables), execution is performed every 30 seconds.
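
For reference, this is roughly what I would expect a 30-second timer schedule to look like in the function.json binding of a custom-handler function; the actual file in the cato-sentinel-connect template may differ, so the binding name and layout here are assumptions to check against the repo.

{
    "bindings": [
        {
            "name": "timer",
            "type": "timerTrigger",
            "direction": "in",
            "schedule": "*/30 * * * * *"
        }
    ]
}

The six-field NCRONTAB expression means "every 30 seconds", but as far as I understand, a new timer run does not start while the previous one is still executing, so a run that takes longer than 30 seconds would stretch the observed interval regardless of the schedule.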