Senior Data Engineer

Overview

On Site
Depends on Experience
Accepts corp to corp applications
Contract - W2
Contract - 1 Month(s)

Skills

Apache Kafka
Apache NiFi
Auditing
Authentication
Cloud Computing

Job Details

We are looking for a Senior Data Engineer for our client in Bellevue, WA.
Job Title: Senior Data Engineer
Job Location: Bellevue, WA
Job Type: Contract
Job Description:
Pay Range: $60/hr - $63/hr
  • The Lead Data Ingestion & Pipeline Engineer will be responsible for designing, implementing, and maintaining scalable, modular, and reusable data flow pipelines across complex, multi-source telemetry environments.
  • The role involves leading architecture decisions, building ingestion frameworks, normalizing schemas, and ensuring high-fidelity data delivery to analytics platforms while maintaining governance, security, and compliance standards.
Key Responsibilities:
  • Lead the architecture, design, and implementation of scalable and reusable data pipelines using Cribl, Apache NiFi, Vector, and other open-source platforms.
  • Develop platform-agnostic ingestion frameworks and template-driven architectures for various input types (syslog, Kafka, HTTP, Event Hubs, Blob Storage) and output destinations (Snowflake, Splunk, ADX, Log Analytics, Anvilogic).
  • Spearhead the creation and adoption of a schema normalization strategy using the Open Cybersecurity Schema Framework (OCSF), including field mapping, transformation templates, and schema validation logic.
  • Design and implement custom data transformations and enrichments using scripting languages such as Groovy, Python, or JavaScript, enforcing robust governance and security controls (SSL/TLS, client authentication, input validation, logging).
  • Ensure end-to-end traceability and lineage of data across ingestion, transformation, and storage, including metadata tagging, correlation IDs, and change tracking for audit readiness.
  • Collaborate with observability and platform teams to integrate pipeline health monitoring, transformation failure logging, and anomaly detection mechanisms.
  • Oversee and validate data integration efforts to ensure high-fidelity delivery into downstream analytics platforms with minimal data loss, duplication, or transformation drift.
  • Lead technical working sessions to evaluate and recommend best-fit tools, technologies, and practices for managing structured and unstructured security telemetry data at scale.
  • Implement data transformation logic including filtering, enrichment, dynamic routing, and format conversions (JSON, CSV, XML, Logfmt) for more than 100 data sources (a minimal illustrative sketch of this kind of transformation follows this list).
  • Maintain a centralized documentation repository covering ingestion patterns, transformation libraries, naming standards, schema definitions, data governance procedures, and platform-specific integration details.
  • Coordinate with security, analytics, and platform teams to ensure pipeline logic supports threat detection, compliance, and data analytics requirements.
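To make the transformation and normalization responsibilities above more concrete, the following is a minimal sketch, in Python (one of the scripting languages named in this role), of the kind of logic involved: filtering records, mapping raw fields to a normalized nested schema, and attaching enrichment metadata for lineage. All field names and the mapping itself are hypothetical illustrations, not the official OCSF schema or any specific client pipeline.

```python
import json
from datetime import datetime, timezone

# Hypothetical source-to-normalized-schema field mapping (dotted paths denote
# nested destinations). Real OCSF attribute names come from the official schema;
# these are illustrative only.
FIELD_MAP = {
    "src_ip": "src_endpoint.ip",
    "dst_ip": "dst_endpoint.ip",
    "user": "actor.user.name",
    "event_time": "time",
}

def set_nested(record: dict, dotted_key: str, value) -> None:
    """Assign a value into a nested dict using a dotted path like 'actor.user.name'."""
    node = record
    parts = dotted_key.split(".")
    for part in parts[:-1]:
        node = node.setdefault(part, {})
    node[parts[-1]] = value

def normalize(raw: dict, source: str):
    """Filter, map, and enrich one raw telemetry record.

    Drops records with no timestamp, remaps known fields into a nested schema,
    and attaches provenance metadata (source tag, ingestion time, correlation id)
    to support traceability and audit readiness.
    """
    if "event_time" not in raw:  # simple filtering rule
        return None
    event = {}
    for src_field, dest_field in FIELD_MAP.items():
        if src_field in raw:
            set_nested(event, dest_field, raw[src_field])
    event["metadata"] = {
        "source": source,
        "ingested_at": datetime.now(timezone.utc).isoformat(),
        "correlation_id": raw.get("id", "unknown"),
    }
    return event

if __name__ == "__main__":
    sample = {
        "id": "abc-123",
        "event_time": "2024-01-01T00:00:00Z",
        "src_ip": "10.0.0.1",
        "dst_ip": "10.0.0.2",
        "user": "jdoe",
    }
    print(json.dumps(normalize(sample, source="firewall-syslog"), indent=2))
```

In a production pipeline this logic would typically live as template-driven configuration or scripted stages inside tools such as Cribl, Apache NiFi, or Vector rather than a standalone script; the sketch only illustrates the mapping and enrichment pattern.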
Qualifications and Skills:
  • Proven experience designing and implementing large-scale, modular data pipelines for multi-source telemetry or analytics environments.
  • Strong expertise in Cribl, Apache NiFi, Vector, or other open-source data ingestion platforms.
  • Hands-on experience with schema normalization strategies, preferably using OCSF or similar frameworks.
  • Proficiency in scripting languages such as Python, Groovy, or JavaScript for custom data transformations.
  • Strong understanding of data governance, security controls, and audit compliance.
  • Experience ensuring end-to-end data traceability, lineage, and metadata management.
  • Ability to collaborate effectively with cross-functional teams and lead technical discussions.
  • Strong documentation and communication skills.
  • Experience with analytics platforms and cloud data stores (Snowflake, Splunk, ADX, Log Analytics, Anvilogic).