Overview
On Site
Depends on Experience
Full Time
50% Travel
Skills
Data Engineer
SAP BW
Business Warehouse
Databricks
Azure
AWS
PySpark
Job Details
Data Engineer with SAP BW | Dallas, TX | Full-Time
4+ years of relevant work experience in data engineering or an equivalent software engineering role. 2+ years of experience implementing big data processing technologies such as AWS, Azure, or Google Cloud Platform alongside Apache Spark and Python. Experience writing and optimizing SQL queries against large-scale, complex datasets in a business environment.
Key Responsibilities
- Design, build, and maintain scalable ETL/ELT pipelines and data integration frameworks using Databricks, PySpark, and SQL.
- Collaborate with cross-functional teams (Data Architects, Analysts, and SAP teams) to design data models and ensure smooth data flow across systems.
- Extract and integrate data from SAP BW (Business Warehouse) and other enterprise systems into cloud data platforms for analytics and reporting.
- Build data ingestion pipelines to bring SAP and non-SAP data into data lakes or data warehouses (Snowflake, Delta Lake, etc.).
- Optimize performance of large-scale data processing workflows and ensure reliability, scalability, and governance.
- Support data modeling, transformation, and lineage tracking to ensure data consistency and quality.
- Automate and orchestrate data workflows using Airflow, ADF, or similar orchestration tools.
- Work closely with BI and analytics teams to enable self-service data access and reporting.
- Stay current with new tools, technologies, and best practices in modern data engineering and SAP data integration.
Required Skills and Qualifications:
Educational Background:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Certifications in Databricks, Azure, or SAP BW are an added advantage.
Technical Skills:
- Strong experience with Python, PySpark, and SQL for data transformation and analytics.
- Hands-on experience with Databricks, Azure Data Factory, or equivalent cloud data engineering tools.
- Solid understanding of data warehousing concepts, data modeling, and performance tuning.
- Working experience or knowledge in SAP BW data extraction and integration (e.g., working with InfoProviders, DSOs, or Open Hubs).
- Experience connecting SAP BW data to cloud or non-SAP systems for analytics purposes.
- Familiarity with cloud platforms (Azure, AWS, or Google Cloud Platform) and modern data stack components.
- Experience with data orchestration/versioning tools like Airflow, Git, or Databricks Workflows.
Preferred Qualifications:
- 5+ years of experience in data engineering or data integration roles.
- 1-2 years of exposure to SAP BW environments and cross-system data extraction.
- Experience in data migration or modernization projects involving SAP and cloud data platforms.
- Strong problem-solving and collaboration skills in agile delivery environments.
- Knowledge of data warehousing concepts and tools (e.g., Snowflake, Redshift).
- Experience with data versioning and orchestration tools like Git, Airflow, or Dagster.
- Solid understanding of Big Data ecosystems (Hadoop, Hive, etc.).