# Greenfield/Brownfield Deployments for Cato Network Sites
Have you ever found yourself managing dozens or even hundreds of Cato Network sites manually through the Cato Management Application (CMA), wishing there were a better way to maintain consistency, track changes in version control, and automate? Cato brownfield deployments (also known as Day 2 operations) solve exactly this problem by enabling you to bring your existing Cato infrastructure under Terraform management without recreating everything from scratch.

This comprehensive guide walks you through exporting existing Cato Network site configurations, modifying them as needed, and importing them into Terraform state for infrastructure-as-code (IaC) management.

## Why This Matters

- **Version Control:** Track all infrastructure changes in Git
- **Consistency:** Ensure standardized configurations across all sites
- **Automation:** Enable CI/CD pipelines for network infrastructure
- **Disaster Recovery:** Quickly restore from configuration backups
- **Bulk Updates:** Modify multiple sites simultaneously with confidence

## What is a Cato Brownfield Deployment?

In infrastructure terminology:

- **Greenfield Deployment:** Building infrastructure from scratch with no existing resources
- **Brownfield Deployment:** Managing and updating existing infrastructure that is already running in production; in this case, sites that are already configured in the Cato Management Application (CMA)

NOTE: Bulk export and import of sites for brownfield deployments apply only to physical socket site deployments (X1500, X1600, X1600_LTE, X1700); virtual socket sites for cloud deployments include separate cloud resources that are covered by the Terraform modules found here.

For Cato Networks, a brownfield deployment means:

- You already have socket sites, network interfaces, and network ranges configured in the CMA
- You want to start managing, or take over the configuration of, these existing resources using Terraform
- You don't want to delete and recreate everything (which would cause network downtime)
- You need to import existing configurations into Terraform state

The socket-bulk-sites Terraform module, combined with the Cato CLI (catocli), makes this process straightforward and safe.

## Prerequisites

Before starting, ensure you have the following installed on your machine:

- Install Terraform
- Install Python
- Install Cato CLI
- Install Git (optional)

NOTE: It is a best practice to use a version control system to track changes in code and configuration files. This example highlights how to use the Git CLI client and GitHub to do so.

### Validate Required Tools

```bash
# Python 3.6 or later
python3 --version

# Terraform 0.13 or later
terraform --version

# Cato CLI tool
pip3 install catocli

# Git (recommended for version control)
git --version
```

### Pro Tip

Add the following to your `~/.bashrc` or `~/.zshrc` file to create aliases that make running the various Terraform commands easier:

```bash
cat >> ~/.bashrc << 'EOF'
alias tf='terraform'
alias tfap='terraform apply --auto-approve'
alias tfapp='terraform apply --auto-approve -parallelism=1'
alias tfdap='terraform destroy --auto-approve'
alias tfdapp='terraform destroy --auto-approve -parallelism=1'
alias tfclear='rm -rf .terraform* && rm terraform.tfstate*'
alias tffmt="tf fmt -recursive"
EOF
source ~/.bashrc
```

### Cato API Credentials

You'll need:

- **API Token:** Generated from the Cato Management Application. Refer to Generating API Keys for the Cato API. NOTE: Save the token securely (you won't be able to view it again).
- **Account ID:** Your Cato account number, found in Account > Account Info or in the CMA URL, for example: `https://system.cc.catonetworks.com/#/account/{account_id}/`
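Once you have both values, one option is to export them as environment variables for your current shell session rather than typing them into commands or committing them to files. This is only a sketch: the `CATO_TOKEN` / `CATO_ACCOUNT_ID` names are the ones catocli reads (see Step 1 below), and the `TF_VAR_` pair is an optional convenience that assumes your Terraform variables are named as shown in Step 3.

```bash
# Example only: export credentials for the current shell session.
# CATO_TOKEN / CATO_ACCOUNT_ID are the variables catocli reads (see Step 1).
export CATO_TOKEN="your-api-token-here"
export CATO_ACCOUNT_ID="your-account-id"

# Optional: Terraform automatically picks up variables prefixed with TF_VAR_,
# so these can back the var.cato_token / var.account_id inputs used in Step 3.
export TF_VAR_cato_token="$CATO_TOKEN"
export TF_VAR_account_id="$CATO_ACCOUNT_ID"
```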
## Cato Brownfield Deployment Overview

The Cato brownfield deployment workflow consists of four main phases:

- **Phase 1: Export** - Cato Management Application → catocli → CSV/JSON files
- **Phase 2: Import** - CSV/JSON files → Terraform state (catocli import command)
- **Phase 3: Modify** - Edit CSV/JSON files with desired changes (optional)
- **Phase 4: Manage** - Terraform state → terraform apply → update CMA

### Components

- **Cato CLI (catocli):** Command-line tool for exporting and importing configurations
- **socket-bulk-sites module:** Terraform module that processes CSV/JSON files
- **Terraform state:** Tracks which resources are managed by Terraform
- **Cato Management Application:** The source of truth for your actual network configuration

## Step-by-Step Implementation

### Step 1: Configure Cato CLI

First, configure the CLI with your API credentials:

```bash
# Interactive configuration (recommended for first-time setup)
catocli configure

# Or configure with environment variables
export CATO_TOKEN="your-api-token-here"
export CATO_ACCOUNT_ID="your-account-id"
```

Verify your configuration:

```bash
# View current configuration
catocli configure show

# List your sites to confirm access
catocli entity site list
```

### Step 2: Create Your Project Directory

Organize your Terraform project with a clear structure:

```bash
# Create project directory
mkdir cato-brownfield-deployment
cd cato-brownfield-deployment

# Initialize git repository (optional)
git init
```

### Step 3: Set Up Terraform Configuration

Create your main Terraform configuration file (`main.tf`):

```hcl
terraform {
  required_version = ">= 0.13"
  required_providers {
    cato = {
      source  = "catonetworks/cato"
      version = "~> 0.0.46"
    }
  }
}

provider "cato" {
  baseurl    = "https://api.catonetworks.com/api/v1/graphql2"
  token      = var.cato_token
  account_id = var.account_id
}
```

NOTE: Please refer to the Intro to Terraform instructional video for a guide on how to set up authentication, define Terraform variables, and manage environment variables such as your API token to securely initialize the Cato Terraform provider.
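The provider block above references `var.cato_token` and `var.account_id`, which still need to be declared somewhere in your configuration. A minimal sketch of a matching `variables.tf` follows (variable names are assumed to mirror the provider block; the heredoc style matches the alias example earlier):

```bash
# Example only: declare the two input variables used by the provider block.
cat > variables.tf << 'EOF'
variable "cato_token" {
  description = "Cato API token used to authenticate the provider"
  type        = string
  sensitive   = true # requires Terraform 0.14 or later
}

variable "account_id" {
  description = "Cato account ID"
  type        = string
}
EOF
```

Values can then be supplied via `TF_VAR_cato_token` / `TF_VAR_account_id` environment variables or a `terraform.tfvars` file kept out of version control.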
## Working with CSV Format

The CSV format is ideal when you want to:

- Edit configurations in Excel or Google Sheets
- Separate site metadata from network ranges
- Have human-readable, easily diff-able files

### Export to CSV

```bash
# Export all socket sites to CSV format
catocli export socket_sites \
  -f csv \
  --output-directory=config_data_csv
```

This creates:

- `socket_sites_{account_id}.csv` - Main site configuration
- `sites_config_{account_id}/{site_name}_network_ranges.csv` - Per-site network ranges

### Add CSV Module to Terraform

Update your `main.tf` to include the CSV module with the path to your files:

```hcl
# CSV-based site management
module "sites_from_csv" {
  source                               = "catonetworks/socket-bulk-sites/cato"
  sites_csv_file_path                  = "socket_sites_12345.csv"
  sites_csv_network_ranges_folder_path = "sites_config_12345/"
}
```

### Validate CSV Site Location Syntax

If you have added sites to your CSV or updated their addresses (country code, city, state code, timezone), use the following to validate the site location syntax as a pre-flight check before applying changes:

```bash
catocli import validate_site_location my_socket_sites.csv -f=csv
```

Example output:

```text
Loading site data from my_socket_sites.csv...
Loaded 4 sites
========================================================
VALIDATION RESULTS
========================================================
[✗] Site 1: My X1500 Site (CSV line 2)
    Location: Wyoming, Usxyz, US-MN
    Timezone: America/Chicago
    Status: INVALID - Location not found: Wyoming, Us22, US-MN
========================================================
SUMMARY
========================================================
Total sites processed: 4
Valid sites: 1 (25.0%)
Invalid sites: 1 (25.0%)
Skipped sites: 2 (50.0%)
========================================================
SKIPPED ROWS (all location fields empty)
========================================================
- My X1500 Site (CSV line 3)
- My X1600 Site (CSV line 5)
========================================================
HOW TO FIX INVALID LOCATIONS
========================================================
Use the following catocli query to search for valid locations:
catocli query siteLocation -h
```

### Validate CSV Required Fields and Import into Terraform State

```bash
# Initialize Terraform
terraform init

# Validate the CSV has all required fields before attempting the import
catocli import socket_sites_to_tf \
  --data-type csv \
  --csv-file config_data_csv/socket_sites.csv \
  --csv-folder config_data_csv/sites_config/ \
  --module-name module.sites_from_csv --validate

# Import existing resources into Terraform state
catocli import socket_sites_to_tf \
  --data-type csv \
  --csv-file config_data_csv/socket_sites.csv \
  --csv-folder config_data_csv/sites_config/ \
  --module-name module.sites_from_csv \
  --auto-approve

# Review (should show no changes if the import was successful)
terraform plan
```

## Working with JSON Format

The JSON format is ideal when you want to:

- Use programmatic tools to manipulate configurations
- Keep all configuration in a single file
- Work with JSON-aware editors and validation tools

### Export to JSON

```bash
# Export all socket sites to JSON format
catocli export socket_sites \
  -f json \
  --output-directory=config_data
```

## Best Practices

### 1. Version Control Everything

Use a version control system to manage changes to your configuration files; in this example, the Git client is used to track infrastructure file changes:

```bash
# Initialize repository
git init
git add main.tf
git commit -m "Initial Terraform configuration"
```

### 2. Regular Exports and Backups

Create an automated backup script to regularly export your configuration (`sites_backup.sh`):

```bash
#!/bin/bash
DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR="backups/$DATE"
mkdir -p "$BACKUP_DIR"
catocli export socket_sites -f json --output-directory="$BACKUP_DIR"
```

## Troubleshooting

### Issue: Import Fails with "Resource Already Exists"

Symptom: `Error: Resource already exists in state`

Solution:

```bash
# List all items in the Terraform state
terraform state list

# Show the Terraform state
terraform show

# Remove the resource from state and re-import
terraform state rm 'module.sites_from_csv.cato_socket_site["Your Cato Site Name Here"]'
```

### Issue: Plan Shows Unexpected Changes

Symptom: `Plan: 0 to add, 25 to change, 0 to destroy`

Solution:

```bash
# Export a fresh configuration from the CMA
catocli export socket_sites -f json --output-directory=config_data_verify

# Compare it with your current configuration
diff config_data/socket_sites.json config_data_verify/socket_sites.json
```
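Unexpected plan changes are often just drift between the CMA and your last export. If you want to catch drift before it surprises you, one option is a scheduled read-only plan; the sketch below uses Terraform's `-detailed-exitcode` flag (the plan file name is only an example, and credentials are assumed to already be available as environment variables):

```bash
#!/bin/bash
# Example only: run a read-only plan and report whether drift exists.
# -detailed-exitcode: 0 = no changes, 1 = error, 2 = changes pending
terraform plan -detailed-exitcode -out=drift.tfplan
case $? in
  0) echo "No drift detected" ;;
  2) echo "Drift detected - review drift.tfplan before applying" ;;
  *) echo "terraform plan failed" ;;
esac
```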
## Conclusion

Brownfield deployments for Cato Networks enable you to bring existing infrastructure under version-controlled, automated management without disruption. By following this guide, you can:

- Eliminate manual configuration errors through automation
- Maintain consistency across hundreds of sites
- Accelerate deployments from days to minutes
- Improve disaster recovery with infrastructure-as-code backups
- Enable collaboration through Git-based workflows
- Ensure compliance with standardized configurations

## Key Takeaways

- **Start Small:** Begin with exporting a single site, validate the process, then scale
- **Test First:** Always use `terraform plan` before `terraform apply -parallelism=1`
- **Version Control:** Git is essential for tracking changes and enabling rollbacks
- **Automate Backups:** Regular exports provide disaster recovery capability
- **Document Everything:** Clear documentation enables team collaboration

## Additional Resources

- Cato API Essentials - Videos
- Cato Terraform Provider
- Socket-Bulk-Sites Terraform Module
- Cato CLI
- Cato API Documentation
- Learning Center: Using Terraform with Cato Cloud
- Online JSON Formatter

Happy Infrastructure-as-Code Management!