Job Details
Job Title: Senior Data Warehouse Engineer (E-Business Suite & Python)
Location: Remote
Compensation: Depends on Experience
Employment Type: Full-time
Skills: Amazon Redshift, Data Warehouse Engineer, Python, E-Business Suite
Overview:
We are seeking a Senior Data Warehouse Engineer with a background in Oracle E-Business Suite (EBS) and Python-based data engineering to design, develop, and maintain enterprise data solutions. The ideal candidate will have hands-on experience building and optimizing data pipelines, integrating ERP data into warehouse environments, and enabling analytics and reporting across business domains such as Finance, Supply Chain, HR, and Manufacturing.
Key Responsibilities:
- Design, develop, and maintain data warehouse architectures supporting enterprise analytics and reporting.
- Extract, transform, and load (ETL/ELT) data from Oracle E-Business Suite (EBS) and other enterprise systems into the warehouse (e.g., Snowflake, BigQuery, Redshift, or Oracle DW).
- Develop efficient and reusable Python scripts for data ingestion, transformation, and validation.
- Build and optimize SQL and PL/SQL queries, stored procedures, and views for high-performance data access.
- Work with business analysts, functional teams, and ERP specialists to understand data models and reporting needs.
- Ensure data quality, lineage, and consistency across multiple systems.
- Participate in data modeling (conceptual, logical, and physical) for warehouse and reporting layers.
- Automate workflows and data pipelines using Airflow, DBT, or similar orchestration tools.
- Troubleshoot and optimize data pipelines for performance and scalability.
- Implement best practices for data governance, metadata management, and security.
Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 8+ years of experience in data warehousing and ETL development, including hands-on work with Oracle E-Business Suite data structures and APIs.
- Strong SQL and PL/SQL skills, with experience in query optimization and complex joins.
- Proficiency in Python for data processing, automation, and integration tasks.
- Experience with ETL tools such as Informatica, ODI, Talend, or custom Python-based ETL frameworks.
- Solid understanding of data modeling principles (Kimball or Inmon methodologies).
- Experience with data orchestration tools (e.g., Apache Airflow, Control-M, or Azure Data Factory).
- Familiarity with cloud-based data platforms (Snowflake, Redshift, BigQuery, or Oracle Cloud).
- Knowledge of EBS modules (Finance, SCM, HR, Manufacturing) and underlying data structures.
Preferred Qualifications:
- Experience integrating EBS data with modern BI tools (Power BI, Tableau, or Oracle Analytics).
- Working knowledge of API integrations (REST/SOAP) to extract EBS or external system data.
- Exposure to DataOps practices, Git-based version control, and CI/CD for data pipelines.
- Understanding of data governance frameworks and tools (e.g., Collibra, Alation, or data catalogs).
- Strong problem-solving, analytical thinking, and communication skills.