Senior Data Engineer (Security, Cribl, Vector)

Overview

On Site
$50 - $53 hr
Contract - W2
Contract - Independent

Skills

Platform Data Engineer
Data Integration Architect
Security Data Engineer
Senior Infrastructure Data Engineer
Sr. Cribl / Vector Engineer
Security Analytics Data Engineer
Telemetry Pipeline Architect

Job Details

Danta Technologies is hiring a Senior Data Engineer (Security Telemetry, Cribl, Vector Specialist). This position is located in Bellevue, WA, and requires on-site presence. Please review the job description below; if you are interested, send your updated resume.

Only independent candidates with valid U.S. work authorization are eligible for this position.

Role: Sr. Data Engineer (Security Telemetry, Cribl, Vector Specialist)
Location: Bellevue, WA, Onsite
Duration: Contract, 6+ months, with the possibility of extension. Duration is subject to change based on project requirements and/or the client's sole discretion.
Client Industry: Telecommunication
No. of open positions: 3
Pay Rate: $50/hr - $53/hr on W2, all-inclusive, on Danta Technologies payroll

Job Description:
The client is looking for a highly experienced data engineer with deep expertise in telemetry data ingestion and transformation, particularly using Cribl and Vector. This role goes beyond typical data engineering into data pipeline architecture, security telemetry, and platform integration, making it a hybrid of data engineering, security data operations, and observability engineering.

The role calls for a Sr. Data Engineer with architect-level experience in security telemetry ingestion and processing at scale, using tools such as Cribl and Vector. The ideal candidate is not just a pipeline builder but an architect and enabler, capable of building reusable, secure, observable, and scalable data-flow frameworks that support complex enterprise telemetry environments.

  • Lead the architecture, design, and implementation of scalable, modular, and reusable data flow pipelines using Cribl, Apache NiFi, Vector, and other open-source platforms, ensuring consistent ingestion strategies across a complex, multi-source telemetry environment.
  • Develop platform-agnostic ingestion frameworks and template-driven architectures to enable reusable ingestion patterns, supporting a variety of input types (e.g., syslog, Kafka, HTTP, Event Hubs, Blob Storage) and output destinations (e.g., Snowflake, Splunk, ADX, Log Analytics, Anvilogic).
  • Spearhead the creation and adoption of a schema normalization strategy, leveraging the Open Cybersecurity Schema Framework (OCSF), including field mapping, transformation templates, and schema validation logic designed to be portable across ingestion platforms (a minimal field-mapping sketch follows this list).
  • Design and implement custom data transformations and enrichments using scripting languages such as Groovy, Python, or JavaScript, while enforcing robust governance and security controls (SSL/TLS, client authentication, input validation, logging).
  • Ensure full end-to-end traceability and lineage of data across the ingestion, transformation, and storage lifecycle, including metadata tagging, correlation IDs, and change tracking for forensic and audit readiness.
  • Collaborate with observability and platform teams to integrate pipeline-level health monitoring, transformation failure logging, and anomaly detection mechanisms.
  • Oversee and validate data integration efforts, ensuring high-fidelity delivery into downstream analytics platforms and data stores, with minimal data loss, duplication, or transformation drift.
  • Lead technical working sessions to evaluate and recommend best-fit technologies, tools, and practices for managing structured and unstructured security telemetry data at scale.
  • Implement data transformation logic, including filtering, enrichment, dynamic routing, and format conversions (e.g., JSON, CSV, XML, Logfmt), to prepare data from 100+ sources for downstream analytics platforms (see the routing sketch after this list).
  • Contribute to and maintain a centralized documentation repository, including ingestion patterns, transformation libraries, naming standards, schema definitions, data governance procedures, and platform-specific integration details.
  • Coordinate with security, analytics, and platform teams to understand use cases and ensure pipeline logic supports threat detection, compliance, and data analytics requirements.
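
For illustration only, and not part of the client's stated requirements: a minimal Python sketch of the kind of template-driven, OCSF-style field mapping and schema validation the normalization bullet describes. The mapping table, field names, and validation rules here are hypothetical assumptions, not the client's actual schema.

```python
# Hypothetical sketch: template-driven field mapping toward an OCSF-like shape.
# FIELD_MAP, REQUIRED_FIELDS, and all field names are illustrative assumptions.

from datetime import datetime, timezone
from typing import Any

# Reusable mapping template: raw source field -> dotted OCSF-style target field.
FIELD_MAP: dict[str, str] = {
    "src_ip": "src_endpoint.ip",
    "dst_ip": "dst_endpoint.ip",
    "ts": "time",
    "msg": "message",
}

REQUIRED_FIELDS = {"time", "message"}  # assumed validation rule


def set_nested(record: dict[str, Any], dotted_key: str, value: Any) -> None:
    """Write a value into a nested dict using a dotted path like 'src_endpoint.ip'."""
    keys = dotted_key.split(".")
    node = record
    for key in keys[:-1]:
        node = node.setdefault(key, {})
    node[keys[-1]] = value


def normalize(raw: dict[str, Any]) -> dict[str, Any]:
    """Map a raw event onto the template, tag it, and validate required fields."""
    out: dict[str, Any] = {
        "metadata": {"ingest_ts": datetime.now(timezone.utc).isoformat()}
    }
    for src, dst in FIELD_MAP.items():
        if src in raw:
            set_nested(out, dst, raw[src])
    missing = REQUIRED_FIELDS - out.keys()
    if missing:
        raise ValueError(f"schema validation failed; missing fields: {sorted(missing)}")
    return out


if __name__ == "__main__":
    event = {"src_ip": "10.0.0.5", "dst_ip": "10.0.0.9", "ts": 1717430400, "msg": "login failed"}
    print(normalize(event))
```

The point the sketch tries to capture is portability: the mapping lives in data (FIELD_MAP) rather than in platform-specific code, so the same template could, in principle, be translated into a Cribl pipeline function or a Vector remap transform.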
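Likewise for illustration only: a hedged Python sketch of the filtering, enrichment, dynamic routing, and format conversion described in the routing bullet. Route names, predicates, and event fields are illustrative assumptions.

```python
# Hypothetical sketch of per-event enrichment, routing, and format conversion.
# ROUTES, DEFAULT_ROUTE, and all field names are illustrative assumptions.

import csv
import io
import json
from typing import Any, Callable

# Routing table: the first matching predicate decides the destination.
ROUTES: list[tuple[Callable[[dict[str, Any]], bool], str]] = [
    (lambda e: e.get("severity", 0) >= 7, "splunk"),        # high severity -> SIEM
    (lambda e: e.get("source") == "netflow", "snowflake"),  # flow data -> warehouse
]
DEFAULT_ROUTE = "log_analytics"


def enrich(event: dict[str, Any]) -> dict[str, Any]:
    """Attach pipeline metadata (e.g., an environment tag) before routing."""
    return {**event, "env": "prod"}


def route(event: dict[str, Any]) -> str:
    """Return the destination name for the first matching route, else the default."""
    for predicate, destination in ROUTES:
        if predicate(event):
            return destination
    return DEFAULT_ROUTE


def to_csv_line(event: dict[str, Any]) -> str:
    """Example format conversion: flatten one event to a single CSV line."""
    buf = io.StringIO()
    csv.writer(buf).writerow([event.get(k, "") for k in sorted(event)])
    return buf.getvalue().strip()


if __name__ == "__main__":
    raw = {"source": "syslog", "severity": 8, "message": "port scan detected"}
    event = enrich(raw)
    print(route(event), json.dumps(event), to_csv_line(event), sep="\n")
```

In a production pipeline this logic would typically live in the pipeline tool itself (e.g., Cribl routes and pipelines, or Vector transforms) rather than in standalone Python; the sketch only shows the shape of the decision logic.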



Notes: All qualified applicants will receive consideration for employment without regard to race, color, religion, religious creed, sex, national origin, ancestry, age, physical or mental disability, medical condition, genetic information, military and veteran status, marital status, pregnancy, gender, gender expression, gender identity, sexual orientation, or any other characteristic protected by local law, regulation, or ordinance.

Benefits: Danta offers all W2 employees a compensation package that is competitive in the industry. It consists of competitive pay, the option to elect healthcare insurance (Dental, Medical, Vision), major holidays, and paid sick leave as per state law.

The rate/salary range depends on numerous factors, including qualifications, experience, and location.
