Overview
Skills
Job Details
Onsite role in Detroit, MI from day one. Only candidates local to MI. No C2C.
Essential Job Functions:
- Understand requirements and engage with the team to design and deliver projects.
- Design and implement data lakehouse projects within Azure.
- Design and develop across the application lifecycle utilizing Microsoft Azure technologies.
- Participate in design, planning, and necessary documentation.
- Participate in Agile ceremonies including daily standups, scrums, retrospectives, demos, and code reviews.
- Hands-on Python development and Azure data pipeline work.
- Engage with the team to develop and deliver cross-functional products.
- PostgreSQL knowledge is a plus.
Key Skills
a. Data Engineering and SQL
b. Python
c. PySpark
d. Azure Data Lake and Azure Data Factory (ADF)
PostgreSQL knowledge is a plus.
Minimum Qualifications and Job Requirements:
- Bachelor's degree in Computer Science.
- 7 years of hands-on experience in designing and developing distributed data pipelines.
- 5 years of hands-on experience in Azure data service technologies.
- 5 years of hands-on experience in Python, SQL, object-oriented programming, ETL, and unit testing.
- Experience with data integration with APIs, Web services, Queues
- Experience with Azure DevOps and CI/CD, as well as agile tools and processes including JIRA and Confluence.
- Working experience writing SQL queries from scratch.
Other Responsibilities:
* Document and maintain project artifacts.
* Maintain comprehensive knowledge of industry standards, methodologies, processes, and best practices.
* Complete required training, e.g., Privacy and Code of Conduct.
* Promptly report any known or suspected loss, theft, or unauthorized disclosure or use of PI to the General Counsel/Chief Compliance Officer or Chief Information Officer.
* Adhere to the company's compliance program.
* Safeguard the company's intellectual property, information, and assets.