Data Engineer

Overview

On Site
Depends on Experience
Contract - W2
Contract - 12 Month(s)
No Travel Required
Able to Provide Sponsorship

Skills

ADF
Banking
Data Storage
Python
Azure
NoSQL
RDBMS
SQL Azure

Job Details

Role: Data Engineer

Location: Raleigh, NC (Onsite)

Duration: 12+ Months

 

Responsibilities:

  • Design, build, and maintain scalable ETL data pipelines using ADLS Gen2, Azure Blob Storage, Azure Databricks, Python, Azure SQL Database, and Azure Cosmos DB.
  • Perform deep database performance tuning, query optimization, and indexing strategies across Azure SQL Database and Cosmos DB.
  • Lead database migration initiatives, including the strategic migration from Azure SQL Database and Azure Cosmos DB to PostgreSQL.
  • Implement efficient data models, manage data storage solutions using ADLS Gen2, and build real-time data streaming using ADF, Spark, and Confluent Kafka.
  • Develop and maintain data quality checks, monitoring, and alerting systems. Ensure data solutions comply with regulatory and security standards (e.g., PCI DSS).
  • Optimize batch processing workflows for intra-day and end-of-day banking operations.

 

Requirements:

  • 10 years of hands-on experience in data engineering with strong database expertise.
  • At least 5 years of experience implementing an operational data store (ODS) for the retail banking space that captures transactions across multiple systems, providing a high-throughput, low-latency solution using RDBMS or NoSQL technologies.
  • Hands-on experience implementing solutions using ADLS Gen2, Azure Blob Storage, Azure Databricks, Azure SQL Database, and Azure Cosmos DB.
  • Hands-on experience writing queries against a normalized data model for the retail banking space. Expert-level skills in database performance tuning, query optimization, and execution plan analysis.
  • Proven experience with large-scale database migrations (Azure SQL Database to PostgreSQL preferred).
  • Strong programming skills in Python for data processing and automation.
  • Experience with data modeling, ETL/ELT processes, and data warehouse concepts.
  • Excellent problem-solving skills with the ability to diagnose and resolve complex data pipeline issues.