Job Details
Job Title: Software Engineer
Location: Glendale, AZ
Duration: 12 Months
"Must be legally authorized to work in the U.S. without employer sponsorship"
Major Responsibilities:
- Design, develop, and optimize scalable and efficient data processing pipelines and architectures within Azure Data Lake and Databricks, leveraging best practices for performance and maintainability.
- Implement and manage complex ETL (Extract, Transform, Load) processes to seamlessly integrate data from diverse sources (e.g., databases, APIs, streaming platforms) into Azure Data Lake, ensuring data quality and consistency.
- Develop and maintain interactive dashboards and reports using SAP Business Objects and Power BI, translating complex data into actionable business insights, with a focus on performance optimization and data accuracy.
- Leverage Azure Data Factory for data orchestration, workflow automation, and scheduling, ensuring reliable and timely data delivery.
- Implement and maintain Azure Security & Governance policies, including access control, data encryption, and compliance frameworks, to ensure data protection and adherence to industry best practices.
- Optimize data storage and retrieval mechanisms within Azure, including performance tuning of Databricks clusters and Azure SQL databases, to improve query performance and scalability.
- Collaborate effectively with cross-functional teams (e.g., business analysts, data scientists, product managers) to understand business requirements, translate them into technical solutions, and communicate technical concepts clearly.
- Implement data quality checks and validation rules throughout the data pipeline to ensure data accuracy, completeness, and consistency.
- Monitor, troubleshoot, and enhance existing data solutions, proactively identifying and resolving performance bottlenecks and data quality issues.
- Create and maintain comprehensive technical documentation, including design specifications, data flow diagrams, and operational procedures, to facilitate knowledge sharing and team collaboration.
Education and Experience Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field and 2-5 years of experience, or 5+ years of relevant work experience.
Required Knowledge, Skills, and Abilities:
- 4+ years of hands-on experience in data engineering, data warehousing, and cloud-based data platforms.
- Deep expertise in Azure Data Lake, Azure Data Factory, Azure Security & Governance, Databricks, and SAP Business Objects.
- Strong proficiency in SQL, including complex query writing, query optimization, and performance tuning.
- Proven experience in developing and maintaining Power BI dashboards and reports.
- Hands-on experience with Azure services such as Azure Synapse Analytics, Azure SQL Database, and Azure Blob Storage.
- Solid understanding of data modeling concepts, ETL processes, and big data frameworks (e.g., Spark).
- Experience in optimizing and managing large-scale datasets in cloud environments.
- Experience developing and maintaining ETL packages using SSIS and reports using SSRS.
- Strong analytical and problem-solving skills with a keen attention to detail.
- Excellent communication and collaboration skills.
Preferred Qualifications:
- Master's degree in a relevant field.
- Familiarity with machine learning models and data science concepts.
- Understanding of DevOps practices and CI/CD pipelines for data applications.
- Experience with data governance tools and frameworks.
- Experience with other cloud platforms (e.g., AWS, Google Cloud Platform).