Hi,
The following requirement is open with our client.
Title: Data Engineer
Location: Omaha, NE or Chicago, IL (Onsite)
Duration: 12+ Months
Rate: $42/hr on W2
Position: W2
Relevant Experience (in Yrs.): 5–8
Job Description:
• 5–8 years of experience in data engineering or a related role.
• Advanced proficiency in SQL for complex data transformation and analysis.
• Hands-on experience with cloud-based data platforms such as Databricks, Snowflake, or similar tools.
• Experience with ETL/ELT tools and frameworks (e.g., Informatica, Talend, dbt, or equivalent).
• Strong proficiency in Python and/or PySpark for data processing and pipeline development.
• Strong understanding of data modeling, database design principles, and building curated datasets for analytics and operational use cases.
• Experience with DevOps practices and Git-based development (branching strategies, pull requests, code reviews).
• Experience implementing CI/CD for data pipelines/workflows and managing deployments across environments.
• Knowledge of the client's domain is a plus.
• Familiarity with orchestration and workflow tools (e.g., Databricks Workflows, Airflow, or similar) is preferred.
• Familiarity with Infrastructure as Code (e.g., Terraform, CloudFormation) and/or containerization concepts is a plus.
• Strong problem-solving skills, attention to detail, and ability to troubleshoot complex issues end-to-end.
• Excellent communication skills and ability to collaborate across technical and non-technical teams.
Key Responsibilities:
• Lead the design, development, and maintenance of scalable data pipelines that process and integrate data from multiple sources into the Enterprise Data Platform.
• Build pipelines and workflows as code using modern engineering practices (version control, code reviews, automated testing, reusable components).
• Define and implement patterns for CI/CD for data pipelines (automated builds, tests, deployments, and environment promotion).
• Partner with data scientists, analysts, and business teams to gather requirements and translate them into robust data solutions.
Must-Have Skills:
Databricks, Snowflake, PySpark