Azure Data Engineer

Contract W2, Contract Corp-To-Corp, 12 Months
Depends on Experience
Work from home available

Job Description


Please review the job description below and reply if you are interested.


Job Title: Azure Data Engineer

Location: New York, NY / Remote

Contract: Long term contract

What you will be doing:
• Build and maintain business functions in Python using the PySpark library, ensuring platform independence.
• Integrate business logic solutions with the PySpark framework running on Azure Databricks, Google Cloud Dataproc, and on-premises Spark environments.
• Integrate business logic with upstream and downstream systems and applications, including RDBMSs, file systems, Hive, Delta Lake, Azure Data Lake, Azure Event Grid, Azure Functions, Azure Event Hubs, etc.
• Cooperate with PySpark framework backbone developers so that business logic Python programs are plug-and-play with the PySpark framework.
• Build and maintain the DevOps process, integrating the CI/CD pipeline through Jenkins, GitLab, Nexus Repository, and Checkmarx to ensure development and deployment integration and quality control with automated testing and security governance.
• Publish data processing and modeling results/datasets to RDBMSs on Azure SQL and expose microservices through RESTful Azure Function APIs.
• Manage and maintain Databricks job execution through the Azure job scheduler and/or other orchestration tools.

Here's what you will need to know/have:
• A minimum of 2 years of professional, hands-on data management/operations and/or analytics experience
• A minimum of 3 years of professional, hands-on experience building enterprise-level, full-stack applications, including data preparation, data processing, and data publication and consumption via UI and microservices
• A minimum of 2 years of experience building enterprise-level solutions in an Azure cloud environment with Azure Functions, Azure Databricks, Azure Event Hubs, Azure Event Grid, Azure Data Lake, and Azure Data Factory
• A minimum of 1 year of hands-on coding experience in Python and/or PySpark on Azure Databricks
• A minimum of 1 year of experience with Snowflake
• Demonstrated experience interacting with and influencing decision-making by non-analytical business audiences
• Excellent problem-solving skills along with experience constructing data management and automation solutions, including building applications/processes to address business problems
• Proficiency with data access, manipulation, and retrieval from large databases using SQL on RDBMSs such as Teradata, Oracle, and SQL Server
• Experience with data access, manipulation, and statistical analysis in Python and/or PySpark
• Experience using Terraform and/or similar infrastructure-as-code tools to provision environments on Azure
• Experience with DevOps CI/CD using GitLab, Jenkins, Nexus Repository, Checkmarx, and/or other quality control or security governance tools



Kalyan Gowd

US IT Recruiter

Dice Id : ittb
Position Id : 7136543
Originally Posted : 2 months ago

Similar Positions

Data Engineer
  • HonorVet Technologies
  • New York, NY, USA
AZURE DATA ENGINEER - 100% REMOTE (Junior To Mid-Level)
  • Provish Consulting
  • Denver, CO, USA
Azure data lakes
  • Smart TechLink Solutions Inc.
  • Jackson, MI, USA
Azure Data Lead
  • 1 Point System
  • Louisville, KY, USA
Azure Data Engineer
  • IT Trailblazers, LLC
  • Houston, TX, USA
Azure Data Architect Redmond, WA 28093
  • PRIMUS Global Services Inc.
  • Redmond, WA, USA
Azure Data Engineer
  • Denken Solutions
  • St. Louis, MO, USA
Data Engineer
  • Atyeti
  • New York, NY, USA
Azure Data Architect
  • Axtria Inc
  • Philadelphia, PA, USA