Overview
Hybrid
Depends on Experience
Contract - W2
Contract - 12 Months
Able to Provide Sponsorship
Skills
Python
SQL
Azure Data Factory
Databricks
PySpark
Job Details
ONLY W2 - NO CORP-TO-CORP
OPEN TO SPONSORSHIP
Location: Washington, DC
Hybrid Onsite: 3-4 days onsite per week from Day 1
MUST HAVE:
- Strong hands-on experience with Python, SQL, Azure Data Factory, Databricks, and PySpark.
- Proven expertise in data migration and transformation projects.
- Solid understanding of cloud-based data engineering on Azure.
- Excellent problem-solving and communication skills.
- Ensure security, compliance, and performance best practices are followed across the data stack.
Responsibilities:
- Design, develop, and optimize data pipelines using Azure Data Factory, Databricks, and PySpark.
- Migrate on-premises data solutions to the Azure cloud stack.
- Ensure data quality, scalability, and performance across ETL processes.
- Collaborate with cross-functional teams to support business data needs.