Azure DevOps

  • Columbus, OH
  • Posted 19 days ago | Updated 17 days ago

Overview

On Site
65
Accepts corp to corp applications
Contract - CON_HIRE_CORP

Skills

Strong background as a framework developer with a focus on data DevOps, demonstrating expertise in the SDLC. Proficiency in reading code in PySpark, understanding data pipelines, and optimizing workflows for cost efficiency. Experience with Azure Data Factory, Azure Databricks, SQL, and Snowflake. Familiarity with Azure practices, including the ability to write code and build infrastructure in the Azure cloud environment.

Job Details

Role: Azure DevOps

Location: Remote

Send profiles to jobs1@

Job Description:

As a Framework Developer (Data DevOps) at our retail client, you will play a pivotal role in supporting their data modernization initiative by maintaining and enhancing the existing framework and DevOps practices. You will collaborate closely with the client's DevOps team and contribute to optimizing data workflows for improved efficiency and cost-effectiveness. This role offers a unique opportunity to work with cutting-edge cloud technologies and make a meaningful impact on the client's data infrastructure.

In this role, you will:

  • Work hands-on to support and enhance the client's existing framework and DevOps practices, focusing on data batch processing.

  • Provide backup support to the offshore team as needed, ensuring smooth operations outside regular hours.

  • Collaborate with cross-functional teams to find solutions, design work, document processes, and provide guidance to junior developers.

  • Utilize your expertise in Azure Data Factory, Azure Databricks, SQL, Snowflake, and PySpark to optimize data workflows and pipelines.

  • Implement best practices in Azure cloud infrastructure, writing code, and building scalable and reliable solutions.

  • Actively contribute to a team of 10 developers, maintaining a high standard of excellence in all aspects of framework development and DevOps practices.

The Ideal Candidate:

  • Strong background as a framework developer with a focus on data DevOps, demonstrating expertise in the SDLC.

  • Proficiency in reading code in PySpark, understanding data pipelines, and optimizing workflows for cost efficiency.

  • Experience with Azure Data Factory, Azure Databricks, SQL, and Snowflake.

  • Familiarity with Azure practices, including the ability to write code and build infrastructure in the Azure cloud environment.

Nice-to-Have Skills:

  • Experience with Power BI

  • Experience with Kafka