Data Engineer

Overview

On Site
Full Time

Skills

Management
Warehouse
Reporting
Usability
Data Quality
Data Science
Analytics
Training
Data Storage
Data Engineering
SQL
Python
Java
Scala
Cloud Computing
Amazon Web Services
Amazon Redshift
Snowflake
Databricks
Microsoft Azure
Data Modeling
Data Warehouse
Extract, Transform, Load (ETL)
ELT
Orchestration
Communication
Collaboration
Real-time Streaming
Apache Kafka
Amazon Kinesis
Continuous Integration
Continuous Delivery
Terraform
Machine Learning (ML)
Workflow
Insurance
Financial Services

Job Details

  • Key Responsibilities
    • Design, build, and maintain scalable ETL/ELT pipelines to support analytics and machine learning workloads.
    • Develop and manage robust data models and warehouse structures that support self-service analytics and reporting.
    • Work with stakeholders across the business to understand data requirements and ensure data availability, accuracy, and usability.
    • Implement and monitor data quality and validation checks to maintain trust in our data assets.
    • Collaborate closely with data science and analytics teams to ensure data infrastructure supports model training and deployment.
    • Optimize data storage and query performance across cloud-based and relational systems.
    • Stay current with emerging data engineering tools and architectures and advocate for best practices across the team.
  • Required Qualifications
    • 5+ years of experience in data engineering, data infrastructure, or related fields.
    • Proficiency in SQL and at least one programming language (e.g., Python, Java, Scala).
    • Experience working with cloud data platforms (e.g., AWS Redshift, Snowflake, BigQuery, Databricks, Azure).
    • Strong knowledge of data modeling, data warehousing, and building ETL/ELT pipelines.
    • Familiarity with modern data orchestration tools (e.g., Airflow, dbt).
    • Excellent communication and collaboration skills.
  • Preferred Qualifications
    • Experience with real-time data streaming technologies (e.g., Kafka, Kinesis).
    • Familiarity with CI/CD for data pipelines and infrastructure-as-code (e.g., Terraform).
    • Experience supporting machine learning workflows and model deployment.
    • Background in insurance, financial services, or other highly regulated industries.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.