Overview
On Site
$Competitive
Contract - W2
Skills
SAP ECC
SAP ERP
Finance
Human Resources
Decision-making
Storage
High Availability
RESTful
Microsoft Exchange
Authentication
OAuth
Real-time
Scripting
Data Analysis
Data Profiling
Business Rules
Reporting
Metadata Management
Regulatory Compliance
Data Security
Version Control
Management
Documentation
Collaboration
Technical Support
SQL
Stored Procedures
Query Optimization
Performance Tuning
Data Governance
Data Quality
Amazon Web Services
Google Cloud
Google Cloud Platform
Data Integration
Workflow
Problem Solving
Conflict Resolution
SAP HANA
Cloud Computing
Enterprise Resource Planning
Python
PySpark
Data Processing
Microsoft
ADF
Continuous Integration
Continuous Delivery
DevOps
GitHub
Computer Science
Data Engineering
SAP
Microsoft Azure
Extract, Transform, Load (ETL)
Pentaho
Informatica
API
Computerized System Validation
PDF
Thought Leadership
Debugging
Screening
Job Details
Project Overview:
The client's current on-premises Enterprise Resource Planning (ERP) system, SAP ECC 6.0, has been in use for over a decade and is nearing technological obsolescence, prompting the client to select and implement a new ERP system.
The objective is to implement a system that integrates all business functions, including finance, operations, and human resources, into a cohesive platform.
This implementation aims to enhance organizational efficiency, improve data accuracy, and provide real-time reporting capabilities.
The goal is to streamline processes, reduce operational costs, and support informed decision-making across all departments.
Responsibilities:
Cloud Data Engineering & Integration: Design and implement data pipelines across AWS, Azure, and Google Cloud.
Develop SAP BTP integration with cloud and on-premises systems.
Ensure seamless data movement and storage between cloud platforms.
ETL & Data Pipeline Development: Develop and optimize ETL workflows using Pentaho, Microsoft ADF, or equivalent ETL tools.
Design scalable and efficient data transformation, movement, and ingestion processes.
Monitor and troubleshoot ETL jobs to ensure high availability and performance.
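For illustration only, a minimal Python sketch of the extract-transform-load shape such a job takes; the file, column, and table names are hypothetical, and the role's actual pipelines would be built in Pentaho PDI or Azure Data Factory:

```python
# Minimal illustrative ETL sketch (hypothetical file and table names).
import sqlite3
import pandas as pd

def run_etl(source_csv: str = "erp_export.csv", target_db: str = "staging.db") -> int:
    # Extract: read the raw export.
    df = pd.read_csv(source_csv)

    # Transform: normalize column names and drop rows missing the key field.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.dropna(subset=["invoice_id"])  # assumed key column

    # Load: write to a staging table, replacing any previous run.
    with sqlite3.connect(target_db) as conn:
        df.to_sql("stg_invoices", conn, if_exists="replace", index=False)
    return len(df)

if __name__ == "__main__":
    print(f"Loaded {run_etl()} rows")
```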
API Development & Data Integration: Develop and integrate RESTful APIs to support data exchange between SAP and other platforms.
Work with API gateways and authentication methods such as OAuth, JWT, and API keys.
Implement API-based data extractions and real-time event-driven architecture.
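As a hedged illustration of the OAuth-secured API work described above, the following Python sketch acquires a token via the client-credentials flow and calls a REST endpoint; all URLs and credentials are placeholders, not the client's real services:

```python
# Illustrative REST client using the OAuth 2.0 client-credentials flow.
import requests

TOKEN_URL = "https://auth.example.com/oauth/token"  # placeholder
API_URL = "https://api.example.com/v1/materials"    # placeholder

def get_token(client_id: str, client_secret: str) -> str:
    # Exchange client credentials for a bearer token.
    resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials"},
        auth=(client_id, client_secret),
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def fetch_materials(token: str) -> list:
    # Call the API with the bearer token in the Authorization header.
    resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()
```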
Data Analysis & SQL Development: Write and optimize SQL queries, stored procedures, and scripts for data analysis, reporting, and integration.
Perform data profiling, validation, and reconciliation to ensure data accuracy and consistency.
Support data transformation logic and business rules for ERP reporting needs.
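A minimal example of the profiling and validation work this covers, assuming a hypothetical SQLite staging table named stg_invoices with invoice_id and amount columns:

```python
# Illustrative data-profiling pass: row counts, null rates, duplicate keys.
import sqlite3

def profile_table(db_path: str = "staging.db", table: str = "stg_invoices") -> dict:
    with sqlite3.connect(db_path) as conn:
        cur = conn.cursor()
        total = cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        null_amounts = cur.execute(
            f"SELECT COUNT(*) FROM {table} WHERE amount IS NULL"
        ).fetchone()[0]
        dup_keys = cur.execute(
            f"SELECT COUNT(*) FROM (SELECT invoice_id FROM {table} "
            f"GROUP BY invoice_id HAVING COUNT(*) > 1)"
        ).fetchone()[0]
    return {"rows": total, "null_amounts": null_amounts, "duplicate_keys": dup_keys}
```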
Data Governance & Quality (Ataccama, Collibra): Work with Ataccama and Collibra to define and enforce data quality and governance policies.
Implement data lineage, metadata management, and compliance tracking across systems.
Ensure compliance with enterprise data security and governance standards.
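To make the idea of an enforced data-quality rule concrete, here is a plain-Python sketch of the kind of rule a governance tool such as Ataccama evaluates; the field names and checks are assumptions, not the client's actual policies:

```python
# Sketch of simple data-quality rules, expressed in plain Python.
from typing import Callable

RULES: dict[str, Callable[[dict], bool]] = {
    "amount_non_negative": lambda rec: (rec.get("amount") or 0) >= 0,
    "currency_is_iso": lambda rec: len(str(rec.get("currency", ""))) == 3,
    "invoice_id_present": lambda rec: bool(rec.get("invoice_id")),
}

def validate(record: dict) -> list[str]:
    """Return the names of all rules the record violates."""
    return [name for name, rule in RULES.items() if not rule(record)]

# Example: validate({"invoice_id": "INV-1", "amount": -5, "currency": "USD"})
# returns ["amount_non_negative"]
```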
Cloud & DevOps (AWS, Azure, Google Cloud Platform): Utilize Azure DevOps and GitHub for version control, CI/CD, and deployment automation.
Deploy and manage data pipelines on AWS, Azure, and Google Cloud.
Work with serverless computing (Lambda, Azure Functions, Google Cloud Functions) to automate data workflows.
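A minimal sketch of a serverless trigger of the sort described above: an AWS Lambda handler that starts a downstream pipeline when a file lands in S3. The Step Functions state machine ARN is a placeholder:

```python
# Minimal AWS Lambda handler sketch reacting to an S3 upload event.
import json
import boto3

sfn = boto3.client("stepfunctions")
PIPELINE_ARN = "arn:aws:states:us-east-1:123456789012:stateMachine:etl"  # placeholder

def handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Start one pipeline execution per uploaded object.
        sfn.start_execution(
            stateMachineArn=PIPELINE_ARN,
            input=json.dumps({"bucket": bucket, "key": key}),
        )
    return {"status": "ok"}
```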
Collaboration & Documentation: Collaborate with SAP functional teams, business analysts, and data architects to understand integration requirements.
Document ETL workflows, API specifications, data models, and governance policies.
Provide technical support and troubleshooting for data pipelines and integrations.
 
Required Skills & Experience:
7+ years of experience in Data Engineering, ETL, and SQL development.
Hands-on experience with SAP BTP Integration Suite for SAP and non-SAP integrations.
Strong expertise in Pentaho (PDI), Microsoft ADF, and API development.
Proficiency in SQL (stored procedures, query optimization, performance tuning).
Experience working with Azure DevOps, GitHub, and CI/CD for data pipelines.
Good understanding of data governance tools (Ataccama, Collibra) and data quality management.
Experience working with AWS, Azure, and Google Cloud Platform (GCP) for data integration and cloud-based workflows.
Strong problem-solving skills and ability to work independently in a fast-paced environment.
 
Preferred Qualifications:
Experience working on SAP S/4HANA and cloud-based ERP implementations.
Familiarity with Python and PySpark for data processing and automation.
Experience with Pentaho, Microsoft ADF, or equivalent ETL tools.
Knowledge of event-driven architecture.
Familiarity with CI/CD for data pipelines (Azure DevOps, GitHub Actions, etc.).
 
Education and Certifications:
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related technical field.
 
Nice-to-have certifications:
Azure Data Engineer Associate
SAP Certified Associate - Integration Developer
SAP BTP Integration Suite
Azure Data Factory or other ETL tools (Pentaho, Informatica)
Expertise connecting to various data sources (APIs, CSV, PDFs)
Demonstrated thought leadership in debugging and resolving coding issues
Ability to work independently
An in-person interview is required after the first screening interview.