We are seeking a Security Data Architect with strong Cribl knowledge to drive the design and evolution of advanced security‑telemetry data ecosystems. In this role, you will define the end‑to‑end architecture and engineering strategy required to orchestrate, normalize, and transform large‑scale, highly diverse security data flows.
You will guide the modernization of ingestion patterns across 100+ legacy and emerging cybersecurity data sources, ensuring resilient, secure, and high‑fidelity movement of telemetry between platforms. As a principal contributor to our SIEM modernization initiative, you will architect and oversee data pipelines built on Cribl, Vector, and other platforms.
- Architect scalable, reusable security‑telemetry pipelines using Cribl, NiFi, Vector, and related platforms, ensuring consistent ingestion across 100+ diverse data sources.
- Develop platform‑agnostic ingestion frameworks and modular patterns supporting multiple protocols and destinations (syslog, HTTP, Event Hubs, Snowflake, ADX, etc.).
- Define multi‑year ingestion and transformation roadmaps, including modernization phases, platform standards, and scalable architectural guardrails.
- Set enterprise governance models for schema evolution, onboarding new data sources, transformation quality, and versioning.
- Drive platform consolidation and rationalization, identifying redundant ingestion patterns and unifying them into enterprise‑wide frameworks.
- Create reference architectures, reusable design patterns, and standardized pipeline blueprints adopted by all engineering teams.
- Provide technical mentorship to senior engineers, guiding architectural thinking and deep system‑design approaches.
- Influence cross‑organizational strategy, aligning ingestion and transformation capabilities with threat‑detection, compliance, SIEM modernization, and data‑analytics roadmaps.
- Evaluate emerging technologies, assessing fit, integration patterns, and long‑term viability for enterprise‑scale telemetry processing.
- Lead adoption of OCSF‑based normalization, including field mapping, schema validation, and portable transformation templates.
- Implement advanced data transformation logic (filtering, enrichment, routing, format conversion) using Groovy, Python, or JavaScript while enforcing strict governance and security controls.
- Ensure complete data lineage and traceability across ingestion, transformation, and storage layers, including metadata tagging and audit‑ready tracking.
- Integrate pipeline‑level observability: health monitoring, error handling, transformation failure alerts, and anomaly detection.
- Validate high‑fidelity data delivery to analytics and SIEM platforms, minimizing data loss, duplication, and drift.
- Lead cross‑functional design sessions, technology evaluations, and architecture reviews for large‑scale security telemetry ecosystems.
- Maintain centralized documentation for ingestion patterns, schema definitions, transformations, and governance standards.