Senior Data Engineer

• Posted 23 hours ago • Updated 23 hours ago
Full Time
On-site

Job Details

Skills

  • Enterprise Architecture
  • Pivotal
  • Open Source
  • Engineering Support
  • Data Integrity
  • Version Control
  • Continuous Integration
  • Continuous Delivery
  • Collaboration
  • Data Integration
  • Kubernetes
  • Mentorship
  • Data Engineering
  • Workflow
  • Performance Tuning
  • Operations Management
  • Testing
  • SQL
  • Python
  • Data Quality
  • Automated Testing
  • Orchestration
  • Managed Services
  • Machine Learning (ML)
  • Extract, Transform, Load (ETL)
  • Artificial Intelligence
  • GitHub
  • Debugging
  • Test Cases
  • Real-time
  • Data Processing
  • Streaming
  • Data Modeling
  • Warehouse
  • IT Management
  • Microsoft Exchange

Summary

Overview

Job Purpose

We're seeking a talented Senior Data Engineer to join our Enterprise Architecture team in a cross-cutting role that will help define and implement our next-generation data platform. In this pivotal position, you'll lead the design and implementation of scalable, self-service data pipelines with a strong emphasis on data quality and governance. This is an opportunity to shape our data engineering practice from the ground up, working directly with key stakeholders to build mission-critical ML and AI data workflows.

We emphasize building systems that are maintainable, scalable, and focused on enabling self-service data access while maintaining high standards for data quality and governance. The ideal candidate is a problem-solver who enjoys working on complex data systems and is passionate about data quality. You thrive in collaborative environments but can also work independently to deliver solutions. You're comfortable working directly with technical and non-technical stakeholders and can communicate complex technical concepts clearly. Most importantly, you're excited about creating systems that empower others to work with data efficiently and confidently.

Responsibilities
  • Design, build, and maintain our on-premises data orchestration platform using best-of-breed open-source tools
  • Create self-service capabilities that empower teams across the organization to build and deploy data pipelines without extensive engineering support
  • Implement robust data quality testing frameworks that ensure data integrity throughout the entire data lifecycle
  • Establish data engineering best practices, including version control, CI/CD for data pipelines, and automated testing
  • Collaborate with ML/AI teams to build scalable feature engineering pipelines that support both batch and real-time data processing
  • Develop reusable patterns for common data integration scenarios that can be leveraged across the organization
  • Work closely with infrastructure teams to optimize our Kubernetes-based data platform for performance and reliability
  • Mentor junior engineers and advocate for engineering excellence in data practices
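
As an illustration of the automated data quality testing described above, a minimal sketch in plain Python (the rule names and sample records are hypothetical, not drawn from this role's actual stack):

```python
# Minimal sketch of a rule-based data quality check for a pipeline stage.
# Rule names and sample records are hypothetical illustrations.

def check_record(record, rules):
    """Return the names of the rules this record violates."""
    return [name for name, rule in rules.items() if not rule(record)]

rules = {
    "id_present": lambda r: r.get("id") is not None,
    "price_non_negative": lambda r: isinstance(r.get("price"), (int, float))
                                    and r["price"] >= 0,
}

records = [
    {"id": 1, "price": 9.5},     # passes both rules
    {"id": None, "price": -2},   # violates both rules
]

# Collect violations per record index; clean records are omitted.
failures = {i: check_record(r, rules)
            for i, r in enumerate(records) if check_record(r, rules)}
print(failures)  # {1: ['id_present', 'price_non_negative']}
```

In a real pipeline, checks like these would typically run as a gating step between ingestion and load, failing the run or quarantining bad rows.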

Knowledge and Experience
  • 5+ years of professional experience in data engineering, with at least 2 years working on enterprise-scale data platforms
  • Demonstrated experience with orchestrating workflows, performance optimization, and operational management
  • Strong understanding of data transformation techniques, including experience with testing frameworks and deployment strategies
  • Experience with stream processing frameworks and technologies
  • Proficiency with SQL and Python for data transformation and pipeline development
  • Familiarity with containerized application deployment
  • Experience implementing data quality frameworks and automated testing for data pipelines
  • Ability to work cross-functionally with data scientists, ML engineers, and business stakeholders

Preferred Knowledge and Experience
  • Experience with self-hosted data orchestration platforms (rather than managed services)
  • Background in implementing data contracts or schema governance
  • Knowledge of ML/AI data pipeline requirements and feature engineering
  • Experience leveraging AI tools (e.g., GitHub Copilot, Cursor, Claude Code) to debug code, develop unit tests, and generate test cases from requirements documents
  • Experience with real-time data processing and streaming architectures
  • Familiarity with data modeling and warehouse design principles
  • Prior experience in a technical leadership role

#LI-HR1 #LI-ONSITE


Intercontinental Exchange, Inc. is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to legally protected characteristics.
  • Dice Id: 90922487
  • Position Id: 24059147
