Recording: Ask Me Anything with Professional Services - February 2026
Missed the live session? Here's the full rundown of every question asked, summarized for quick reading, plus the recording for deeper context.

Our experts this session: Robin Johns, David Tudor, and Mihai Radoveanu.

## AI Security Questions

**How will Cato help identify MCPs, AI agents, and all the new AI tools popping up daily?**
Cato is introducing an AI Security module (GA expected early Q2) that will provide:

- Local AI usage discovery (MCP servers, local agents)
- Cloud AI usage discovery (ChatGPT, Copilot, etc.)
- Model inventories and device discovery for homegrown AI

Early access may be available around mid-March.

**Will users be able to test early versions?**
Yes. Cato expects to offer trial availability around general release (early Q2).

**Can customers see how each AI app uses data (free vs. enterprise)?**
Yes. Cato can differentiate free, paid, and enterprise versions of tools like ChatGPT or Copilot by analyzing traffic, authentication headers, or API connections.

**Can existing AI-related firewall and CASB rules be removed once AI Security is enabled?**
Technically yes, but Cato recommends keeping them during the transition. Move them to "monitor" mode first before deleting.

**Can Cato block or warn users about risky AI sites?**
Yes. Through web firewalling and AI Security policies, admins can:

- Block sites
- Redirect users
- Show user education prompts
- Apply rules per site, category, or group

**Can Cato enforce guardrails on AI prompts?**
Yes. Prompt policies can:

- Detect PII
- Block sensitive data
- Anonymize inputs
- Detect intent (e.g., self-harm, illegal activity, jailbreak attempts)
- Trigger "Are you sure?" notifications

**Does this work with embedded Copilot inside Microsoft apps (Teams, Word, Excel, etc.)?**
Yes. Cato can audit and monitor AI usage across the Microsoft ecosystem, including embedded Copilot prompts.

**Can Cato block file uploads or screenshots to AI tools?**
Partially. Today, Cato can block the upload action. Later in 2026, OCR-based inspection of files and images is on the roadmap. DLP is still recommended for full file handling.

**Can Cato monitor email-based prompt injection attacks?**
Yes. AI Security can detect prompt-injection attempts, including those originating from email content.

**Can it help discover vulnerable code or libraries in homegrown AI apps?**
Yes. Cato can inspect your AI pipelines, models, datasets, and knowledge bases, and detect:

- PII in training data
- Vulnerable base models
- Insecure tools/endpoints
- Risky GPTs or agent configurations

**Will AI Security support SOAR-like capabilities?**
Eventually. Partners already offer SOAR-like services today; Cato may expand here in the future.

**Can Cato detect internal MCP servers (e.g., engineers running local Docker containers)?**
Yes. Cato can detect MCP traffic using Layer 7 signatures and app analysis.

**Will the browser plugin be locked so users can't remove it?**
Yes. Deployment via MDM allows admins to make the plugin non-removable.

**Does the ZTNA client need to be connected for AI/user identification?**
No. As long as the client is installed and running, Cato can identify the user.

## Identity & SCIM / LDAP Migration Questions

**Can customers migrate from LDAP to SCIM gradually?**
Yes, you can run LDAP and SCIM in parallel. SCIM entries override LDAP where both exist.

**Do SCIM provisioning and SSO use the same application in Entra?**
No. The SSO app handles authentication; the SCIM provisioning app handles user and group sync. Both coexist.

**Can two SCIM provisioning apps run at the same time?**
No. If you rebuild the SCIM app (e.g., because MS Graph v1 was deprecated), you must replace the old app, not run both.

**How are users detected when synced through SCIM?**
User awareness requires:

- The user synced through SCIM
- The ZTNA client installed (no login needed)

The ZTNA client provides identity signals via the endpoint.

**If a user without a ZTNA license has the client, can they connect?**
No. They will be identified, but they cannot remotely connect.

## API & Logging Questions

**Why is Arctic Wolf only receiving IPS/security events and not network events?**
Check the API key permissions. Old API keys had limited controls; new RBAC-enabled keys allow specifying full access. Updating the key typically resolves this. Cato recommends using the API Explorer and the Cato CLI to validate what should be visible.

**Does Cato offer API discovery and monitoring?**
Not fully today, but you can use:

- API Explorer
- MCP server logs
- AI Security (for AI-driven API calls)

More native API discovery is expected in future releases.

## Miscellaneous Questions

**Can Cato support SOAR workflows for automated response?**
Yes, through partners today, and potentially natively in the future.

## Links discussed in the video

- https://learn.microsoft.com/en-us/microsoftsearch/semantic-index-for-copilot
- https://learn.microsoft.com/en-us/copilot/microsoft-365/microsoft-365-copilot-privacy
- https://support.catonetworks.com/hc/en-us/sections/28000327077789-Migrating-from-LDAP-to-SCIM-User-Provisioning
- https://support.catonetworks.com/hc/en-us/articles/28000333704861-Preparing-to-Migrate-to-SCIM-Part-1
- https://docs.arcticwolf.com/bundle/m_cloud_detection_and_response/page/configure_cato_sse_360_for_arctic_wolf_monitoring.html
- Creating API keys: https://support.catonetworks.com/hc/en-us/articles/4413280536081-Generating-API-Keys-for-the-Cato-API
- https://github.com/catonetworks/cato-api-explorer
- https://github.com/catonetworks/cato-mcp-server
- https://github.com/catonetworks/cato-cli
- https://connect.catonetworks.com/
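A quick way to sanity-check what an API key can see is to run a small query against the GraphQL endpoint, for example inside the API Explorer mentioned above. The query below is a representative sketch: `entityLookup` and its fields come from Cato's public GraphQL schema, and the account ID is a placeholder, so confirm the exact shape against the current schema before relying on it.

```graphql
# Illustrative query: list the sites visible to the current API key.
# accountID 12345 is a placeholder for your own account number.
query {
  entityLookup(accountID: 12345, type: site) {
    items {
      entity {
        id
        name
      }
    }
  }
}
```

If the key lacks the needed permissions, the response's `errors` array will say so, which helps diagnose cases like the Arctic Wolf one above.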
Greenfield/Brownfield Deployments for Cato Network Sites
Have you ever found yourself managing dozens or even hundreds of Cato Networks sites manually through the Cato Management Application (CMA), wishing there were a better way to maintain consistency, version control, and automation? Cato brownfield deployments (also called Day 2 operations) solve exactly this problem by letting you bring your existing Cato infrastructure under Terraform management without recreating everything from scratch.

This guide walks you through exporting existing Cato site configurations, modifying them as needed, and importing them into Terraform state for infrastructure-as-code (IaC) management.

## Why This Matters

- **Version control:** Track all infrastructure changes in Git
- **Consistency:** Ensure standardized configurations across all sites
- **Automation:** Enable CI/CD pipelines for network infrastructure
- **Disaster recovery:** Quick restoration from configuration backups
- **Bulk updates:** Modify multiple sites simultaneously with confidence

## What Is a Cato Brownfield Deployment?

In infrastructure terminology:

- **Greenfield deployment:** Building infrastructure from scratch with no existing resources
- **Brownfield deployment:** Managing and updating existing infrastructure that's already running in production; in this case, sites that are already configured in the Cato Management Application (CMA)

NOTE: Bulk export and import of sites for brownfield deployments apply to physical socket site deployments (X1500, X1600, X1600_LTE, X1700), as virtual socket sites for cloud deployments include separate cloud resources that are covered by the Terraform modules found here.
For Cato Networks, a brownfield deployment means:

- You already have socket sites, network interfaces, and network ranges configured in the CMA
- You want to start managing, or take over the configuration of, these existing resources using Terraform
- You don't want to delete and recreate everything (which would cause network downtime)
- You need to import existing configurations into Terraform state

The socket-bulk-sites Terraform module, combined with the Cato CLI (catocli), makes this process straightforward and safe.

## Prerequisites

Before starting, ensure you have the following installed on your machine:

- Terraform
- Python
- Cato CLI
- Git (optional)

NOTE: It is a best practice to use a version control system to track changes in code and configuration files; this example shows how to do so with the Git CLI client and GitHub.

### Validate Required Tools

```shell
# Python 3.6 or later
python3 --version

# Terraform 0.13 or later
terraform --version

# Cato CLI tool
pip3 install catocli

# Git (recommended for version control)
git --version
```

### Pro Tip

Add the following to your ~/.bashrc or ~/.zshrc file to create aliases that shorten the various Terraform commands:

```shell
cat >> ~/.bashrc << 'EOF'
alias tf='terraform'
alias tfap='terraform apply --auto-approve'
alias tfapp='terraform apply --auto-approve -parallelism=1'
alias tfdap='terraform destroy --auto-approve'
alias tfdapp='terraform destroy --auto-approve -parallelism=1'
alias tfclear='rm -rf .terraform* && rm terraform.tfstate*'
alias tffmt="tf fmt -recursive"
EOF
source ~/.bashrc
```

### Cato API Credentials

You'll need:

- **API token:** Generated from the Cato Management Application. Refer to Generating API Keys for the Cato API. NOTE: Save the token securely (you won't be able to view it again).
- **Account ID:** Your Cato account number, found in Account > Account Info or in the CMA URL, for example: https://system.cc.catonetworks.com/#/account/{account_id}/

## Cato Brownfield Deployment Overview

The Cato brownfield deployment workflow consists of four main phases:

1. **Export:** Cato Management Application → catocli → CSV/JSON files
2. **Import:** CSV/JSON files → Terraform state (catocli import command)
3. **Modify:** Edit CSV/JSON files with desired changes (optional)
4. **Manage:** Terraform state → terraform apply → update CMA

### Components

- **Cato CLI (catocli):** Command-line tool for exporting and importing configurations
- **socket-bulk-sites module:** Terraform module that processes CSV/JSON files
- **Terraform state:** Tracks which resources are managed by Terraform
- **Cato Management Application:** The source of truth for your actual network configuration

## Step-by-Step Implementation

### Step 1: Configure the Cato CLI

First, configure the CLI with your API credentials:

```shell
# Interactive configuration (recommended for first-time setup)
catocli configure

# Or configure with environment variables
export CATO_TOKEN="your-api-token-here"
export CATO_ACCOUNT_ID="your-account-id"
```

Verify your configuration:

```shell
# View current configuration
catocli configure show

# List your sites to confirm access
catocli entity site list
```

### Step 2: Create Your Project Directory

Organize your Terraform project with a clear structure:

```shell
# Create project directory
mkdir cato-brownfield-deployment
cd cato-brownfield-deployment

# Initialize git repository (optional)
git init
```

### Step 3: Set Up Terraform Configuration

Create your main Terraform configuration file (main.tf):

```hcl
terraform {
  required_version = ">= 0.13"
  required_providers {
    cato = {
      source  = "catonetworks/cato"
      version = "~> 0.0.46"
    }
  }
}

provider "cato" {
  baseurl    = "https://api.catonetworks.com/api/v1/graphql2"
  token      = var.cato_token
  account_id = var.account_id
}
```

NOTE: Please refer to the Intro to Terraform instructional video for a guide on setting up authentication, defining Terraform variables, and managing environment variables such as your API token, so you can securely initialize the Cato Terraform provider.

## Working with CSV Format

The CSV format is ideal when you want to:

- Edit configurations in Excel or Google Sheets
- Separate site metadata from network ranges
- Have human-readable, easily diff-able files

### Export to CSV

```shell
# Export all socket sites to CSV format
catocli export socket_sites \
  -f csv \
  --output-directory=config_data_csv
```

This creates:

- socket_sites_{account_id}.csv: main site configuration
- sites_config{account_id}/{site_name}_network_ranges.csv: per-site network ranges

### Add the CSV Module to Terraform

Update your main.tf to include the CSV module with the path to your files:

```hcl
# CSV-based site management
module "sites_from_csv" {
  source                               = "catonetworks/socket-bulk-sites/cato"
  sites_csv_file_path                  = "socket_sites_12345.csv"
  sites_csv_network_ranges_folder_path = "sites_config_12345/"
}
```

### Validate CSV Site Location Syntax

If you have added sites to your CSV or updated site addresses (country code, city, state code, timezone), use the following to validate the site location syntax as a pre-flight check before applying changes:

```shell
catocli import validate_site_location my_socket_sites.csv -f=csv
```

Example output:

```
Loading site data from my_socket_sites.csv...
Loaded 4 sites
========================================================
VALIDATION RESULTS
========================================================
[✗] Site 1: My X1500 Site (CSV line 2)
    Location: Wyoming, Usxyz, US-MN
    Timezone: America/Chicago
    Status: INVALID - Location not found: Wyoming, Us22, US-MN
========================================================
SUMMARY
========================================================
Total sites processed: 4
Valid sites: 1 (25.0%)
Invalid sites: 1 (25.0%)
Skipped sites: 2 (50.0%)
========================================================
SKIPPED ROWS (all location fields empty)
========================================================
- My X1500 Site (CSV line 3)
- My X1600 Site (CSV line 5)
========================================================
HOW TO FIX INVALID LOCATIONS
========================================================
Use the following catocli query to search for valid locations:
catocli query siteLocation -h
```

### Validate CSV Required Fields and Import into Terraform State

```shell
# Initialize Terraform
terraform init

# Validate the CSV has all required fields before attempting import
catocli import socket_sites_to_tf \
  --data-type csv \
  --csv-file config_data_csv/socket_sites.csv \
  --csv-folder config_data_csv/sites_config/ \
  --module-name module.sites_from_csv --validate

# Import existing resources into Terraform state
catocli import socket_sites_to_tf \
  --data-type csv \
  --csv-file config_data_csv/socket_sites.csv \
  --csv-folder config_data_csv/sites_config/ \
  --module-name module.sites_from_csv \
  --auto-approve

# Review (should show no changes if the import was successful)
terraform plan
```

## Working with JSON Format

The JSON format is ideal when you want to:

- Use programmatic tools to manipulate configurations
- Keep all configuration in a single file
- Work with JSON-aware editors and validation tools

### Export to JSON

```shell
# Export all socket sites to JSON format
catocli export socket_sites \
  -f json \
  --output-directory=config_data
```

## Best Practices

### 1. Version Control Everything

Use a version control system to manage changes to your configuration files. In this example, the Git client is used to track infrastructure file changes:

```shell
# Initialize repository
git init
git add main.tf
git commit -m "Initial Terraform configuration"
```

### 2. Regular Exports and Backups

Create automated backup scripts to regularly export your configuration (sites_backup.sh):

```shell
#!/bin/bash
DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR="backups/$DATE"
mkdir -p "$BACKUP_DIR"
catocli export socket_sites -f json --output-directory="$BACKUP_DIR"
```

## Troubleshooting

### Issue: Import Fails with "Resource Already Exists"

Symptom: `Error: Resource already exists in state`

Solution:

```shell
# List all items in Terraform state
terraform state list

# Show Terraform state
terraform show

# Remove the resource from state and re-import
terraform state rm 'module.sites_from_csv.cato_socket_site["Your Cato Site Name Here"]'
```

### Issue: Plan Shows Unexpected Changes

Symptom: `Plan: 0 to add, 25 to change, 0 to destroy`

Solution:

```shell
# Export fresh configuration from the CMA
catocli export socket_sites -f json --output-directory=config_data_verify

# Compare with your current configuration
diff config_data/socket_sites.json config_data_verify/socket_sites.json
```

## Conclusion

Brownfield deployments for Cato Networks enable you to bring existing infrastructure under version-controlled, automated management without disruption.
By following this guide, you can:

- Eliminate manual configuration errors through automation
- Maintain consistency across hundreds of sites
- Accelerate deployments from days to minutes
- Improve disaster recovery with infrastructure-as-code backups
- Enable collaboration through Git-based workflows
- Ensure compliance with standardized configurations

## Key Takeaways

- **Start small:** Begin by exporting a single site, validate the process, then scale
- **Test first:** Always run terraform plan before terraform apply -parallelism=1
- **Version control:** Git is essential for tracking changes and enabling rollbacks
- **Automate backups:** Regular exports provide disaster recovery capability
- **Document everything:** Clear documentation enables team collaboration

## Additional Resources

- Cato API Essentials - Videos
- Cato Terraform Provider
- Socket-Bulk-Sites Terraform Module
- Cato CLI
- Cato API Documentation
- Learning Center: Using Terraform with Cato Cloud
- Online JSON Formatter

Happy Infrastructure-as-Code Management!
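The "plan shows unexpected changes" troubleshooting step above boils down to diffing a fresh export against the copy you track in Git. Here is a minimal, self-contained sketch of that check; the file contents are fabricated stand-ins for real `catocli export socket_sites` output, and in practice config_data/ would come from your repository while config_data_verify/ would come from a fresh export.

```shell
#!/bin/sh
# Simulate a tracked export and a fresh export, then diff them.
# In real use, replace the printf lines with catocli export runs.
mkdir -p config_data config_data_verify
printf '{"sites": [{"name": "Branch-01", "connection_type": "SOCKET_X1500"}]}\n' \
  > config_data/socket_sites.json
printf '{"sites": [{"name": "Branch-01", "connection_type": "SOCKET_X1600"}]}\n' \
  > config_data_verify/socket_sites.json

# diff exits non-zero when the files differ, so the else branch fires on drift.
if diff -u config_data/socket_sites.json config_data_verify/socket_sites.json; then
  echo "no drift detected"
else
  echo "drift detected: reconcile before running terraform apply"
fi
```

Wiring this into CI gives you an early warning whenever the CMA and your tracked configuration fall out of sync.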
Terraform Modules with Cato: Simplifying and Scaling Network Deployments
In this video, we introduce Terraform modules with Cato and show how they simplify, standardize, and scale Cato deployments.

You'll learn how Terraform modules help you:

- Combine multiple Cato resources into reusable building blocks
- Standardize corporate firewall rules and remote user configurations
- Reduce Terraform code by packaging common Cato use cases into modules

This session is ideal for engineers looking to manage Cato environments more efficiently using Infrastructure as Code (IaC), whether you're just getting started with Terraform or looking to scale existing deployments.

References:

- Cato Terraform Registry
Cato SDK 101: Introduction & Building Your First Queries
Welcome to your first look at the Cato SDK 🚀

In this video, we introduce the SDK, walk through setup, and guide you through building your first real queries. Perfect for developers, SEs, analysts, or anyone starting with the platform.

What you'll learn:

- What the Cato SDK is and how it works
- How to install and authenticate the SDK
- The structure of clients, queries, and models
- How to build and run your first queries
- ⚡ Common mistakes to avoid
- 📚 Where to find docs and next steps
Mastering Cato Go SDK Queries: A Practical Guide for Developers
Learn how to leverage the Cato Go SDK to query data programmatically and build powerful automations around the Cato SASE platform.

This session walks through:

- Initializing the SDK
- Authenticating securely
- Performing real-world queries
- Interpreting responses

Whether you're building internal tools, integrations, or custom workflows, this video gives you the foundation you need to work confidently with the Go SDK.
Dynamic Resources with Cato & Terraform: Automate and Scale Your Infrastructure
Take your Cato Networks automation to the next level with Terraform! In this video, we dive deep into dynamically managing and updating your Cato resources with Terraform, enabling faster, scalable, and automated infrastructure management.

What you'll learn:

- How to update Cato resources dynamically: sites, hosts, domains, and groups
- Using bulk provisioning to quickly build and modify multiple Cato objects
- How Terraform and Cato work together to create a responsive, adaptive security environment
- Practical examples of Terraform configuration, authentication, and resource updates

By the end, you'll know how to integrate Terraform into your Cato environment to automate routine changes and respond dynamically to your organization's evolving needs.

Perfect for: Network engineers, DevOps professionals, and IT admins seeking to simplify large-scale configuration management with Infrastructure as Code (IaC).

Resources mentioned:

- Cato Terraform Provider Docs
- Terraform Docs
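A common pattern for managing resources dynamically is to describe sites as data and let Terraform's `for_each` stamp out one resource per entry. The sketch below illustrates the idea; the `cato_socket_site` attribute names and values here are illustrative, so check the Cato Terraform provider documentation for the exact schema before using it.

```hcl
# Define sites as data, then create one resource per map entry.
variable "branch_sites" {
  type = map(object({
    native_range = string
    local_ip     = string
  }))
  default = {
    "Branch-01" = { native_range = "10.1.0.0/24", local_ip = "10.1.0.1" }
    "Branch-02" = { native_range = "10.2.0.0/24", local_ip = "10.2.0.1" }
  }
}

resource "cato_socket_site" "branch" {
  for_each        = var.branch_sites
  name            = each.key
  connection_type = "SOCKET_X1500"
  site_type       = "BRANCH"

  native_range = {
    native_network_range = each.value.native_range
    local_ip             = each.value.local_ip
  }
}
```

With this shape, adding or removing a site becomes a one-line change to `branch_sites` followed by `terraform apply`.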
Intro to Terraform & Cato: Setup, Authentication, and Your First Terraform Apply
Ready to automate your Cato Networks setup with Terraform? In this video, we'll walk you through everything you need to start using Terraform with Cato, from initial setup to your very first infrastructure deployment.

What you'll learn:

- How to install and configure Terraform
- Setting up authentication between Terraform and Cato Networks
- Understanding the Cato Terraform provider
- Running your first terraform apply to deploy real configurations

By the end of this session, you'll have a working Terraform environment integrated with Cato, ready to manage network resources as code!

Perfect for: Network engineers, DevOps professionals, and Cato administrators looking to bring Infrastructure as Code (IaC) into their workflow.

Resources mentioned:

- Terraform Docs
- Cato Terraform Provider Docs
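For the authentication step, Terraform reads any environment variable named `TF_VAR_<name>` as the value of the matching input variable, which keeps secrets like the Cato API token out of your `.tf` files. A minimal sketch, assuming a configuration that declares `cato_token` and `account_id` variables; the values shown are placeholders.

```shell
# Supply sensitive inputs via Terraform's TF_VAR_ convention instead of
# hard-coding them; Terraform maps TF_VAR_cato_token to var.cato_token.
export TF_VAR_cato_token="your-api-token-here"
export TF_VAR_account_id="12345"

# Terraform picks these up automatically on plan/apply; verify they are set:
echo "$TF_VAR_account_id"
```

Run `terraform plan` afterwards and Terraform will use these values without ever prompting or writing them to disk.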
Deploying the Cato Sentinel Connector | Full Installation & Setup Guide for Azure Sentinel
In this video, we'll walk you through the complete setup of the Cato Sentinel Connector, connecting your Cato Networks environment to Microsoft Azure Sentinel for unified visibility and smarter threat detection.

What you'll learn:

- Setting up your Azure environment (resource group, Log Analytics workspace, Sentinel)
- Creating your Cato API key and finding your account ID
- Deploying the ARM template / Azure Function App to ingest Cato events, CEF, audit logs, and XDR stories
- Configuring data ingestion and filters for efficient log collection
- Installing the Sentinel workbook to visualize Cato data
- Best practices and tuning tips for ingestion and workspace setup

Who it's for: Security engineers, SOC analysts, and IT professionals using Cato Networks who want to enhance visibility through Azure Sentinel.

Prerequisites:

- Active Cato Networks account with API access
- Azure subscription with Sentinel enabled
- Permissions to deploy ARM templates and Function Apps

By the end of this tutorial, you'll have a fully operational integration between Cato Networks and Azure Sentinel, empowering your team with advanced insights and real-time threat correlation.

GitHub repository: https://github.com/catonetworks/cato-sentinel-connect
Introduction to Terraform - From Setup to Your First Apply
Welcome to this hands-on introduction to Terraform, the powerful Infrastructure as Code (IaC) tool that lets you automate cloud resource provisioning with ease.

In this video, you'll learn:

- How to install Terraform on your system
- What Terraform state and commands are and how they work
- How to run your first terraform apply to deploy real infrastructure

By the end of this session, you'll have a working Terraform environment and a solid understanding of how to manage your infrastructure efficiently and reproducibly.

Perfect for: Beginners, DevOps engineers, cloud professionals, and anyone looking to get started with Infrastructure as Code.

Resources mentioned:

- Terraform Official Documentation: https://developer.hashicorp.com/terraform/docs
- Terraform Download: https://developer.hashicorp.com/terraform/install
Introducing the Cato GraphQL API Playground
Explore how the Cato GraphQL API Playground streamlines API exploration, query development, and team collaboration.

In this video, you'll learn how to:

- Interactively explore the GraphQL schema and available API operations
- Write and test queries in real time with syntax highlighting
- Debug and troubleshoot API responses efficiently
- Save query history and share examples with your team for better collaboration

Whether you're a developer, tester, or network engineer, the Cato GraphQL API Playground helps you accelerate development and improve API reliability.
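A good first query to paste into any GraphQL playground is schema introspection. Introspection is part of the GraphQL specification itself, so the query below works against any compliant endpoint and lists every top-level operation the schema exposes:

```graphql
# Standard GraphQL introspection: enumerate the available query operations.
query ListOperations {
  __schema {
    queryType {
      fields {
        name
        description
      }
    }
  }
}
```

From there, you can drill into the arguments and return types of any operation that looks interesting before writing a real query.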