Overview
On Site
Depends on Experience
Accepts corp to corp applications
Contract - Independent
Contract - W2
Contract - 12 Month(s)
Skills
Python
ELT/ETL frameworks
Upstream
WITSML
ProdML
LAS
SCADA historian data
seismic
well logs
WellView/OpenWells datasets
CI/CD
Snowflake
Databricks
Delta Lake
SQL
cloud services (Azure or AWS)
BI reporting tools (Power BI / Spotfire)
Job Details
Job Title: Sr. Level Upstream Data Engineer - Strong Python (Architect)
Duration: 12+ Months
Location: Spring, TX
Description:
The Upstream Data Engineer will design, develop, and optimize enterprise data solutions that support drilling, reservoir engineering, completions, production optimization, and broader subsurface workflows. This role combines advanced data engineering expertise with deep functional knowledge of upstream oil and gas to enable high-quality analytics and accelerate operational decision making.
Key Responsibilities
- Architect, build, and maintain scalable data pipelines for drilling, reservoir, and production datasets leveraging Python and modern ELT/ETL frameworks
- Ingest, harmonize, and curate industry data sources such as WITSML, ProdML, LAS, SCADA historian data, seismic, well logs, and WellView/OpenWells datasets
- Design and implement robust data models in Snowflake and Databricks to support operational reporting, subsurface analytics, AI/ML, and reservoir engineering workflows
- Utilize open table formats such as Apache Iceberg to support efficient data lineage, versioning, governance, and incremental processing
- Collaborate with drilling, geoscience, and reservoir engineering stakeholders to translate business requirements into reusable technology solutions
- Apply orchestration, CI/CD, and DevOps practices to ensure reliability and automation across cloud environments
- Improve data product performance, availability, quality, and compliance aligned with upstream data governance standards and PPDM/O&G reference models
- Troubleshoot and support production data pipelines and ensure secure, optimized access to datasets
Required Qualifications
- Bachelor's degree in Petroleum Engineering, Computer Science, Data Engineering, or a related technical discipline
- Proven experience working directly within upstream oil and gas domains such as drilling operations, reservoir management, completions, or production engineering
- Strong Python programming skills and experience building reusable transformation frameworks
- Hands-on experience with Snowflake and Databricks, including Delta Lake or similar distributed processing capabilities
- Experience with open data lakehouse architectures and formats (Apache Iceberg preferred)
- Proficiency in SQL, cloud services (Azure or AWS), distributed compute concepts, and data ingestion frameworks
- Solid understanding of the well lifecycle, subsurface engineering concepts, and upstream operational KPIs
Preferred Skills
- Experience with Cognite Data Fusion for contextualization and integration of operational, engineering, and IT data to enable analytics and AI solutions
- Familiarity with OSDU data platform or PPDM standards for upstream data governance
- Experience building analytics-ready datasets for data science and real-time operational decision support
- Knowledge of BI reporting tools such as Power BI or Spotfire used in E&P environments
- Exposure to real-time data ingestion from drilling rigs, control systems, or production operations