Job Responsibilities:
Design, develop, and maintain data pipelines using AWS Redshift, Databricks, and Snowflake
Ingest, transform, and curate data from multiple sources
Optimize solutions for performance, reliability, scalability, and cost
Implement data quality checks, monitoring, and observability for production pipelines
Perform root cause analysis on data incidents and pipeline failures, and implement preventive fixes to avoid recurrence
Wrangle heterogeneous data; perform exploration and discovery in pursuit of new business insights
Collaborate with analytics and business teams to deliver trusted datasets
Mentor other team members in their efforts to build data engineering skill sets
Assist team management in defining projects, including helping estimate, plan, and scope work
Prepare and contribute to presentations required by management
Required Qualifications:
4+ years developing cloud solutions as a Data Engineer
Bachelor's Degree in Computer Science, Computer Engineering, or a related field
Clear and concise verbal and written communication, including the ability to explain technical concepts to non-technical audiences
Proven experience developing solutions with:
AWS (Redshift, S3, Step Functions, EventBridge, CloudWatch)
Databricks (Spark, Delta Lake, Apache Iceberg, Unity Catalog)
Snowflake
Strong proficiency in SQL, including writing and optimizing large-scale analytics queries
Strong proficiency in Python for custom data processing
Familiarity with CI/CD, version control (Git), and automated testing for data pipelines
Familiarity with Infrastructure as Code (Terraform, or similar)
Solid understanding of data warehousing and dimensional modeling
Strong focus on code quality, with the ability to design and execute thorough tests and write detailed, comprehensive testing documentation
Ability to manage work across multiple projects; strong organizational skills
Ability to quickly learn new technologies and understand data flows and business concepts
Excellent problem-solving skills, attention to detail, and the ability to work independently
Proven track record of proactively identifying, performing root cause analysis on, and mitigating production data issues
Ability to conduct effective code reviews
Ability to interview without the use of AI tools
At least 18 years of age
Legally authorized to work in the United States
Preferred Qualifications:
Hands-on experience integrating AI tools/solutions into data workflows to improve efficiency, automation, or developer productivity