Overview
On Site
DOE
Contract - W2
Skills
Analytical Skills
Recovery
Collaboration
Sprint
Documentation
Data Flow
Python
Data Engineering
Writing
Cloud Computing
Snowflake
ETL (Extract, Transform, Load)
Data Modeling
Data Warehouse
Data Integration
Git
Version Control
Linux
Unix
Agile
Conflict Resolution
Problem Solving
Attention To Detail
Orchestration
Apache Airflow
Microsoft Power BI
Data Visualization
Big Data
Apache Hadoop
Apache Spark
Database Performance Tuning
Microsoft Azure
Databricks
Job Details
Job Summary
We are seeking a skilled and innovative Data Engineer to join our fast-paced, agile development teams. In this role, you will collaborate with cross-functional squads to design, build, and maintain scalable data pipelines and solutions using modern cloud technologies such as Databricks and Snowflake. You'll work closely with developers, product experts, and site reliability engineers in a highly collaborative, international environment.

Key Responsibilities
Design, develop, and deploy scalable ETL pipelines using Python and Databricks
Extract, transform, and load data from a variety of sources to meet analytical and operational needs
Build reliable, performant, and monitored data pipelines with error handling and recovery strategies
Collaborate with stakeholders to understand data requirements and translate them into technical solutions
Participate in Agile ceremonies (sprint planning, daily stand-ups, retrospectives)
Write clean, testable, and well-documented code
Conduct code reviews and maintain high-quality standards across all development efforts
Utilize REST APIs to integrate data sources and services
Maintain up-to-date documentation for data flows, processes, and systems
Optimize and automate recurring data engineering tasks

Required Qualifications
Proficiency in Python for data engineering, including writing efficient, maintainable code
Experience with Databricks and building data pipelines in a cloud environment
Hands-on experience with Snowflake or other modern data warehouse technologies
Strong understanding of ETL concepts, data modeling, and data warehousing best practices
Familiarity with REST APIs and data integration techniques
Experience working with Git or similar version control systems
Solid understanding of Linux/Unix environments
Proven ability to work in an Agile environment and contribute to cross-functional teams
Excellent problem-solving skills and attention to detail

Preferred Qualifications (Nice to Have)
Experience with data orchestration tools such as Apache Airflow
Familiarity with Power BI or other data visualization platforms
Exposure to big data technologies (e.g., Hadoop, Spark)
Background in database performance tuning or administration

Education: Bachelor's Degree
Certifications: Azure Data Engineer Associate, Databricks Certification