
Python Client Optimization Guide

This guide explains the caching and rate limiting features in the cyber-trackr-live Python client.

🚀 Quick Start

Basic Installation

```bash
pip install cyber-trackr-live
```

```bash
# Install with optional optimization dependencies
# (quoted so the brackets survive shells like zsh)
pip install "cyber-trackr-live[optimizations]"

# Or install manually
pip install cyber-trackr-live hishel ratelimit
```

📦 Optimization Features

1. HTTP Caching (hishel)

What it does:

  • Caches API responses in persistent storage
  • Speeds up repeated requests dramatically (30-40x in the benchmarks below)
  • Follows the RFC 9111 HTTP caching standard
  • Respects HTTP cache headers

Usage:

```python
from cyber_trackr_helper import CyberTrackrHelper

# Enable caching with a 1 hour TTL
helper = CyberTrackrHelper(
    enable_cache=True,
    cache_ttl=3600  # 1 hour in seconds
)

# First call hits the API
stigs = helper.list_stigs()  # ~300ms

# Second call hits the cache
stigs = helper.list_stigs()  # ~10ms (30x faster!)
```

Configuration Options:

```python
helper = CyberTrackrHelper(
    enable_cache=True,
    cache_dir="~/.cache/my_app",  # Custom cache location
    cache_ttl=7200,  # 2 hours
    cache_backend='filesystem'  # 'filesystem', 'sqlite', or 'redis'
)
```

Cache Backends:

  • filesystem (default): Fast, simple, no external dependencies
  • sqlite: Better for concurrent access, query capabilities
  • redis: Best for distributed systems, shared cache

Recommended TTLs:

```python
from cyber_trackr_helper.cache import CacheConfig

# Static data (rarely changes)
CacheConfig.RMF_CONTROLS_TTL  # 24 hours
CacheConfig.CCI_LIST_TTL      # 24 hours

# Lists (updated periodically)
CacheConfig.DOCUMENT_LIST_TTL  # 1 hour
CacheConfig.SCAP_LIST_TTL      # 1 hour

# Individual documents (may change)
CacheConfig.DOCUMENT_DETAIL_TTL     # 15 minutes
CacheConfig.REQUIREMENT_DETAIL_TTL  # 15 minutes
```
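The TTL semantics behind these constants can be illustrated with a minimal, self-contained cache. This is a sketch of the expiry logic only, not hishel's actual implementation:

```python
import time

class TTLCache:
    """Minimal cache whose entries expire `ttl` seconds after being stored."""

    def __init__(self, ttl: float):
        self.ttl = ttl
        self._store = {}  # key -> (value, stored_at)

    def set(self, key, value):
        self._store[key] = (value, time.monotonic())

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, stored_at = entry
        if time.monotonic() - stored_at > self.ttl:
            del self._store[key]  # expired: behave like a cache miss
            return None
        return value

cache = TTLCache(ttl=0.1)  # 100ms TTL, shortened for demonstration
cache.set("/stig", ["stig_a"])
fresh = cache.get("/stig")   # within TTL: cache hit
time.sleep(0.15)
stale = cache.get("/stig")   # past TTL: entry has expired
```

A longer TTL (like the 24-hour constants above) simply trades freshness for fewer API calls.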

2. Rate Limiting

What it does:

  • Prevents overwhelming the API with too many requests
  • Automatically sleeps between requests if needed
  • Configurable limits per second/minute
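The sleep-between-requests behavior can be sketched with a minimal, self-contained limiter. This is an illustration of the mechanism, not the library's actual implementation:

```python
import time

class SimpleRateLimiter:
    """Allow at most `per_second` calls per second by sleeping between calls."""

    def __init__(self, per_second: float):
        self.min_interval = 1.0 / per_second
        self._last = 0.0

    def wait(self):
        # Sleep just long enough to keep calls `min_interval` apart
        now = time.monotonic()
        remaining = self.min_interval - (now - self._last)
        if remaining > 0:
            time.sleep(remaining)
        self._last = time.monotonic()

limiter = SimpleRateLimiter(per_second=10)
start = time.monotonic()
for _ in range(3):
    limiter.wait()  # calls are spaced >= 100ms apart
elapsed = time.monotonic() - start
```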

Usage:

```python
# Limit to 10 requests per second
helper = CyberTrackrHelper(
    rate_limit_per_second=10
)
```

For Batch Operations:

```python
from cyber_trackr_helper.rate_limit import RateLimiter

# Conservative rate limiting for batch fetching
@RateLimiter.for_batch_operation()
def fetch_many_stigs():
    # Your batch fetching logic
    pass
```

Custom Rate Limiting:

```python
from cyber_trackr_helper import RateLimiter

@RateLimiter.custom(calls_per_second=5)
def careful_operation():
    # Slower, more respectful operation
    pass
```
🎯 Recommended Configuration

```python
from cyber_trackr_helper import CyberTrackrHelper

# Best practice configuration
helper = CyberTrackrHelper(
    enable_cache=True,
    cache_ttl=3600,  # 1 hour for most operations
    cache_backend='filesystem',  # Or 'redis' for production
    rate_limit_per_second=10  # Respectful rate limiting
)

# Now use the helper normally
stigs = helper.list_stigs()
ac_controls = helper.get_control_family('AC')
```

📊 Performance Comparison

| Operation | Without Cache | With Cache | Speedup |
| --- | --- | --- | --- |
| `list_stigs()` | ~300ms | ~10ms | 30x |
| `list_all_documents()` | ~250ms | ~8ms | 31x |
| `get_control_family('AC')` | ~200ms | ~5ms | 40x |
| `search_ccis('password')` | ~400ms | ~12ms | 33x |
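These numbers can be reproduced in spirit with a self-contained timing sketch. Here `functools.lru_cache` stands in for the HTTP cache and a 50ms sleep stands in for network latency; `fetch_stigs` is a hypothetical stand-in, not a library function:

```python
import time
from functools import lru_cache

@lru_cache(maxsize=None)
def fetch_stigs():
    # Stand-in for a network call like helper.list_stigs()
    time.sleep(0.05)  # simulate ~50ms of API latency
    return ("stig_a", "stig_b")

start = time.perf_counter()
fetch_stigs()  # cold call: pays the simulated latency
cold_ms = (time.perf_counter() - start) * 1000

start = time.perf_counter()
fetch_stigs()  # warm call: served from the cache
warm_ms = (time.perf_counter() - start) * 1000

print(f"cold={cold_ms:.1f}ms warm={warm_ms:.3f}ms")
```

Real speedups depend on network latency and the cache backend, which is why the figures above vary by operation.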

🛡️ Best Practices

1. Use Caching for Read-Heavy Workloads

```python
# Good for dashboards, reports, analysis
helper = CyberTrackrHelper(enable_cache=True, cache_ttl=3600)
```

2. Disable Caching for Real-Time Monitoring

```python
# Good for monitoring latest updates
helper = CyberTrackrHelper(enable_cache=False)
```

3. Use Conservative Rate Limits for Batch Operations

```python
# When fetching many requirements
helper = CyberTrackrHelper(rate_limit_per_second=5)
complete_stig = helper.fetch_complete_stig(name, version, release, delay=0.2)
```

4. Clear Cache When Needed

```python
from pathlib import Path
import shutil

cache_dir = Path.home() / '.cache' / 'cyber_trackr_live'
if cache_dir.exists():
    shutil.rmtree(cache_dir)  # Clear all cached data
```

5. Different TTLs for Different Data

```python
# Static reference data - cache longer
helper_static = CyberTrackrHelper(enable_cache=True, cache_ttl=86400)  # 24 hours
rmf_controls = helper_static.get_control_family('AC')

# Dynamic STIG data - cache shorter
helper_dynamic = CyberTrackrHelper(enable_cache=True, cache_ttl=900)  # 15 minutes
stig = helper_dynamic.get_document(name, version, release)
```

🔧 Advanced: Custom Cache Configuration

```python
from cyber_trackr_helper.cache import create_cached_client, CacheConfig
from cyber_trackr_api_client import Client

# Create custom cached client
cached_client = create_cached_client(
    base_url="https://cyber.trackr.live/api",
    cache_dir="/tmp/my_cache",
    ttl=7200,  # 2 hours
    backend='sqlite'
)

# Use with helper (advanced)
helper = CyberTrackrHelper()
helper.client = cached_client
```

📝 Environment-Specific Recommendations

Development

```python
# Fast iteration, short cache
helper = CyberTrackrHelper(
    enable_cache=True,
    cache_ttl=300,  # 5 minutes
    rate_limit_per_second=20  # Faster for dev
)
```

Testing

```python
# Longer cache to avoid API calls during tests
helper = CyberTrackrHelper(
    enable_cache=True,
    cache_ttl=86400,  # 24 hours
    rate_limit_per_second=100  # Fast for unit tests
)
```

Production

```python
# Balanced caching and respectful rate limiting
helper = CyberTrackrHelper(
    enable_cache=True,
    cache_ttl=3600,  # 1 hour
    cache_backend='redis',  # Shared cache
    rate_limit_per_second=10  # Respectful to API
)
```

🐛 Troubleshooting

Cache Not Working

```python
# Check if optimizations are installed
from cyber_trackr_helper import _OPTIMIZATIONS_AVAILABLE
print(f"Optimizations available: {_OPTIMIZATIONS_AVAILABLE}")

# If False, install:
# pip install hishel ratelimit
```

Cache Location

```python
from pathlib import Path

cache_dir = Path.home() / '.cache' / 'cyber_trackr_live'
print(f"Cache directory: {cache_dir}")
print(f"Cache exists: {cache_dir.exists()}")
```

Rate Limiting Too Slow

```python
# Increase rate limit
helper = CyberTrackrHelper(rate_limit_per_second=20)
```

Released under the Apache-2.0 License.