Overview

Job Details
Location: Mechanicsburg, PA (Onsite)
# of Positions: 2
Duration: 12+ months

Responsibilities
* Design, develop, and optimize data pipelines in Databricks using SQL, Python, and PySpark.
* Implement ETL/ELT processes to ingest, transform, and deliver data from multiple sources.
* Ensure pipeline scalability, performance, and reliability across the full lifecycle.
* Deliver structured, high-quality datasets to support Tableau dashboards and advanced analytics.
* Implement data validation, error handling, and logging for data quality and traceability.
* Collaborate with architects, business analysts, and existing Informatica developers to migrate legacy ETL workflows to Databricks.
Skills

* Strong, hands-on experience with Databricks pipelines and the Lakehouse architecture.
* Proficiency in SQL, Python, and PySpark.
* Proven experience designing and maintaining scalable ETL/ELT pipelines.
* Strong problem-solving, analytical, and communication skills.
Infowave Systems is an equal opportunity employer that is committed to diversity and inclusion in the workplace.