Senior DataOps Engineer

  • Dearborn, MI
  • Posted 1 day ago | Updated 1 hour ago

Overview

On Site
$67 - $72 /hr
Contract - Independent
Contract - Long Term

Skills

Data Architecture
Endpoint Security
Google Cloud Platform
Data Governance
Cloud Infrastructure
Extract Transform Load (ETL)
BigQuery
Network Security
Python

Job Details




Stefanini Group is hiring!

Stefanini is looking for a Senior DataOps Engineer in Dearborn, MI (onsite).

For quick apply, please reach out to Pawan Rawat Singh at /



We are seeking a highly skilled and experienced Senior DataOps Engineer to join our EPEO DataOps team. This role will be pivotal in designing, building, and maintaining robust, scalable, and secure telemetry data pipelines on Google Cloud Platform (GCP). The ideal candidate will have a strong background in DataOps principles, deep expertise in Google Cloud Platform data services, and a solid understanding of IT operations, especially within the security and network domains. You will enable real-time visibility and actionable insights for our security and network operations centers, contributing directly to our operational excellence and threat detection capabilities.



Responsibilities

  • Lead the design, development, and implementation of high-performance, fault-tolerant telemetry data pipelines for ingesting, processing, and transforming large volumes of IT operational data (logs, metrics, traces) from diverse sources, with a focus on security and network telemetry.
  • Architect and manage data solutions using a comprehensive suite of Google Cloud Platform services, ensuring optimal performance, cost-efficiency, and scalability. This includes leveraging services such as Cloud Pub/Sub for messaging, Dataflow for real-time and batch processing, BigQuery for analytics, Cloud Logging for log management, and Cloud Monitoring for observability (see the illustrative sketch after this list).
  • Drive the adoption and implementation of DataOps best practices, including automation, CI/CD for data pipelines, version control (e.g., Git), automated testing, data quality checks, and robust monitoring and alerting.
  • Develop specialized pipelines for critical security and network data sources such as VPC Flow Logs, firewall logs, intrusion detection system (IDS) logs, endpoint detection and response (EDR) data, and Security Information and Event Management (SIEM) data (e.g., Google Security Operations / Chronicle).
  • Implement and enforce data governance, compliance, and security measures, including data encryption (at rest and in transit), access controls (RBAC), data masking, and audit logging to protect sensitive operational data.
  • Continuously monitor, optimize, and troubleshoot data pipelines for performance, reliability, and cost-effectiveness, identifying and resolving bottlenecks.
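For illustration only (not part of the formal requirements): a minimal sketch of the kind of streaming telemetry pipeline described above, using the Apache Beam Python SDK to read security telemetry from Cloud Pub/Sub and write it to BigQuery. The project, subscription, table, and field names here are hypothetical placeholders, not details taken from this posting.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_flow_log(message: bytes) -> dict:
    # Decode one JSON-encoded flow-log record from Pub/Sub into a BigQuery row.
    record = json.loads(message.decode("utf-8"))
    return {
        "timestamp": record.get("timestamp"),
        "src_ip": record.get("src_ip"),
        "dest_ip": record.get("dest_ip"),
        "action": record.get("action"),
    }


# Streaming pipeline options; project and region are placeholders.
options = PipelineOptions(streaming=True, project="example-project", region="us-central1")

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            subscription="projects/example-project/subscriptions/vpc-flow-logs")
        | "ParseRecords" >> beam.Map(parse_flow_log)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:security_telemetry.vpc_flow_logs",
            schema="timestamp:TIMESTAMP,src_ip:STRING,dest_ip:STRING,action:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
    )

A sketch like this could be run locally with Beam's DirectRunner or submitted to Dataflow with --runner=DataflowRunner; either way, the placeholder resource names would need to point at real Pub/Sub and BigQuery resources.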



Experience Required

Code Assessment, Google Cloud Platform, Data Architecture, Endpoint Security, Data Governance, Cloud Infrastructure, Extract Transform Load (ETL), BigQuery, Network Security, Python

  • Proven experience as a DataOps Engineer, Data Engineer, or similar role, with a strong focus on operationalizing data pipelines.
  • Expertise in designing, building, and optimizing large-scale data pipelines for both batch and real-time processing.
  • Strong understanding of DataOps principles, including CI/CD, automation, data quality, data governance, and monitoring (see the brief example after this list).
  • Proficiency in programming languages commonly used in data engineering, such as Python.
  • Experience with Infrastructure as Code (IaC) tools (e.g., Terraform) for managing cloud resources.
  • Solid understanding of data modeling, schema design, and data warehousing concepts (e.g., star schema).
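As an illustration of the automated data quality checks mentioned above (again, not a requirement of the role): a minimal sketch using the google-cloud-bigquery client to flag an excessive null rate in a telemetry table. The project, table, column, and threshold are hypothetical.

from google.cloud import bigquery


def null_rate(client: bigquery.Client, table: str, column: str) -> float:
    # Fraction of rows in `table` where `column` is NULL (0.0 for an empty table).
    query = f"""
        SELECT SAFE_DIVIDE(COUNTIF({column} IS NULL), COUNT(*)) AS null_rate
        FROM `{table}`
    """
    row = next(iter(client.query(query).result()))
    return row.null_rate or 0.0


if __name__ == "__main__":
    client = bigquery.Client(project="example-project")  # placeholder project
    rate = null_rate(client, "example-project.security_telemetry.vpc_flow_logs", "src_ip")
    # Fail the pipeline's quality gate if more than 1% of rows are missing a source IP.
    assert rate < 0.01, f"Null rate for src_ip is {rate:.2%}"

A check like this would typically run as one step of a CI/CD or scheduled validation job, alongside schema and freshness checks.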



Experience Preferred

  • 8+ years of experience in data engineering, with at least 4 years in a Senior or Lead role focused on DataOps or cloud-native data platforms.
  • Collaborate closely with IT operations, security analysts, network engineers, and other data stakeholders to understand data requirements and deliver solutions that meet business needs.
  • Mentor junior engineers and contribute to the team's technical growth.
  • Create and maintain comprehensive documentation for data pipelines, data models, and operational procedures.



Education

Bachelor's or Master's degree in Computer Science, Data Engineering, Information Technology, or a related quantitative field.



*Listed salary ranges may vary based on experience, qualifications, and the local market. Some positions may also include bonuses or other incentives.*

Stefanini takes pride in hiring top talent and developing relationships with our future employees. Our talent acquisition teams will never make an offer of employment without having a phone conversation with you. Those conversations will involve a description of the job for which you have applied. We will also speak with you about the process, including interviews and job offers.



About Stefanini Group

The Stefanini Group is a global provider of offshore, onshore, and nearshore outsourcing, IT digital consulting, systems integration, application services, and strategic staffing to Fortune 1000 enterprises around the world. We have a presence across the Americas, Europe, Africa, and Asia, and serve more than four hundred clients across a broad spectrum of markets, including financial services, manufacturing, telecommunications, chemical services, technology, the public sector, and utilities. Stefanini is a CMM Level 5 IT consulting company with a global presence.



#LI-PS27

#LI-ONSITE
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.