DataOps Engineer – Regulated Cloud & Data Hub Services
Location: Indianapolis, IN
About the Role
This role will support enterprise data and analytics initiatives within a HIPAA-compliant, regulated cloud environment, enabling scalable and secure data hub services. The DataOps Engineer will play a key role in ensuring seamless data delivery, pipeline orchestration, monitoring, and operational excellence across modern AWS and Databricks-based data platforms.
The position focuses on bridging data engineering, DevOps, and platform operations, enabling continuous integration, deployment, and monitoring of data workflows within modern data lake/lakehouse architectures.
Job Summary
The DataOps Engineer will be responsible for building, managing, and optimizing end-to-end data operations, ensuring data pipelines are reliable, secure, and production-ready. The role involves working with cloud platforms, orchestration tools, and observability frameworks to enable continuous data delivery and operational scalability in a regulated healthcare/pharma environment.
This position requires strong expertise in data pipeline orchestration, monitoring, automation, and cloud infrastructure, with a particular focus on data quality, governance, compliance, and system reliability.
Key Responsibilities:
- Design, implement, and manage DataOps frameworks to support scalable and reliable data pipelines
- Build and maintain CI/CD pipelines for data workflows, enabling automated testing, deployment, and version control
- Develop and manage workflow orchestration using tools such as Apache Airflow, AWS Step Functions, or equivalent
- Monitor, troubleshoot, and optimize data pipelines and platform performance across environments
- Implement observability and monitoring solutions (e.g., CloudWatch, Datadog, Prometheus, Grafana)
- Ensure data quality, validation, and governance standards are enforced across pipelines
- Support secure data processing with PII/PHI protection, encryption, and access controls (IAM/RBAC)
- Collaborate with data engineers, architects, and business teams to enable seamless data delivery
- Automate manual processes to improve efficiency, scalability, and cost effectiveness
- Manage and optimize AWS-based cloud infrastructure supporting data platforms
- Support incident response, root cause analysis, and failure recovery mechanisms
- Establish and promote best practices for DataOps, DevOps, and data lifecycle management
- Enable secure and compliant data operations aligned with HIPAA and enterprise standards
- Support ingestion and processing of healthcare data sources (e.g., APIs, claims, EHR data)
Minimum Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or a related STEM field
- 5+ years of experience in Data Engineering / DataOps / Platform Engineering
- Strong experience with Python and SQL for data processing and automation
- Hands-on experience with AWS (S3, Glue, Lambda, Step Functions, Redshift, CloudWatch, IAM)
- Experience with Databricks (Spark, Delta Lake, workflows, notebooks)
- Experience with workflow orchestration tools (e.g., Apache Airflow)
- Experience building and managing CI/CD pipelines (GitHub Actions, Jenkins, or similar)
- Strong understanding of data pipelines, ETL/ELT, and modern data architectures (Lakehouse)
- Experience with monitoring, logging, and observability frameworks
- Knowledge of data governance, security, and compliance (HIPAA preferred)
Preferred Qualifications:
- Experience with real-time/streaming technologies (Kafka, Kinesis, Flink)
- Familiarity with data quality frameworks (Great Expectations, dbt tests, etc.)
- Experience in healthcare, pharma, or other regulated industries
- Knowledge of infrastructure as code (Terraform, CloudFormation)
- Exposure to data observability and lineage tools
- Certifications such as AWS Certified Data Engineer, AWS Certified Solutions Architect, or Databricks Certified Data Engineer
What You Should Bring:
- Strong problem-solving and analytical skills
- Ability to work in fast-paced, regulated production environments
- Strong collaboration and communication across technical and business teams
- A mindset focused on automation, reliability, compliance, and continuous improvement
- Attention to detail and commitment to delivering secure, high-quality, and maintainable solutions