Onsite - W2 Only - Python Data Engineer

Overview

On Site
Depends on Experience
Accepts corp to corp applications
Contract - W2
Contract - 12 Month(s)

Skills

Amazon S3
Amazon Web Services
Pandas
Data Modeling
Python
Data Integration
Data Engineering

Job Details

We are currently seeking an experienced Data Engineer to join the Big Data and Advanced Analytics department. The Data Engineer will work closely with business domain experts to create an Enterprise Data Lakehouse supporting data analytics use cases for the midstream oil and gas operations, engineering, and measurements business units.

Must Haves:

  • Software development / software engineering experience
  • 5+ years of experience with Python
  • Experience with Data Build Tool (DBT)
  • Knowledge of Data Lakehouse technologies such as Apache Iceberg or Delta Lake
  • Experience working with S3 object storage

Nice to Haves:

  • Python UI development experience (e.g., Dash)
  • Dremio experience
  • Kubernetes/AWS EKS
  • AWS Cloud experience

Responsibilities include:

  • Design and implement reliable data pipelines to integrate disparate data sources into a single Data Lakehouse
  • Design and implement data quality pipelines to ensure data correctness and build trusted datasets
  • Design and implement a Data Lakehouse solution that accurately reflects business operations
  • Assist with data platform performance tuning and physical data model support including partitioning and compaction
  • Provide guidance on data visualization and reporting efforts to ensure solutions are aligned with business objectives

The successful candidate will meet the following qualifications:

  • 5+ years of experience as a Data Engineer designing and maintaining data pipeline architectures
  • 5+ years of programming experience in Python and ANSI SQL
  • 2+ years of development experience with DBT (Data Build Tool)
  • Experience with various data modeling methods such as Star Schema, Snowflake Schema, and Data Vault design
  • Experience in implementing a Data Lakehouse using a Medallion Architecture with Apache Iceberg on S3 Object Storage
  • Experience in various data integration patterns including ELT, Pub/Sub, and Change Data Capture
  • Experience with common Python Data Engineering packages including Pandas, NumPy, PyArrow, Pytest, Scikit-Learn, and Boto3
  • Excellent communication skills with experience presenting complex concepts to technical and non-technical stakeholders
  • Experience in software development practices such as Design Principles and Patterns, Testing, Refactoring, CI/CD, and version control
  • Experience with Dremio, Apache Airflow, and Airbyte is preferred

Upon offer, candidate must be on the submitting vendor's W2: Yes

Face to face interview required: No

Candidate must be authorized to work without sponsorship: No

Background Check: No

Drug Screen: No

End Client: Enterprise Products Partners / RE

With Regards,

Talent Acquisition Specialist

263 Tresser Blvd, One Stamford Plaza,

Stamford, CT 06901

Skype ID: adrikjordan4u

