Azure Databricks Engineer with Snowflake, Python, and SQL

Overview

Hybrid
$60 - $80
Contract - W2
Contract - Independent
Contract - 1 Year(s)
25% Travel

Skills

Databricks
PySpark
SQL
Snowflake Schema
Python
ADF
Data Migration
Data Integration
Extract, Transform, Load (ETL)
Finance
Financial Reporting
Legacy Systems
Microsoft Azure
Data Warehouse
Scripting

Job Details

We are looking for an Azure Databricks Engineer with Snowflake, Python, and SQL experience.

Domain: Capital Markets

A face-to-face client interview is a must.

Responsibilities:

  • Design, build, and maintain scalable and efficient Snowflake data warehouse solutions tailored for financial industry applications.
  • Develop complex SQL queries, stored procedures, and scripts for data manipulation and retrieval.
  • Optimize Snowflake performance through effective data modeling, partitioning, and indexing strategies.
  • Collaborate with financial analysts and business stakeholders to understand data requirements and deliver precise reporting and analytical solutions.
  • Ensure data integrity and security within the Snowflake environment, complying with industry regulations and standards.
  • Manage data migration projects from legacy systems to Snowflake, ensuring minimal disruption to business operations.
  • Provide technical expertise and support for Snowflake to internal teams, including troubleshooting and resolving issues.
  • Stay up to date with Snowflake features and updates, continuously improving and expanding our data warehouse capabilities.
  • Develop and maintain documentation for Snowflake processes, systems, and procedures.

 

Requirements:

  • 8-12+ years of proven experience (senior level) in Snowflake data warehouse development.
  • Strong SQL expertise and SQL query optimization skills.
  • Strong knowledge of data warehousing concepts and best practices.
  • Familiarity with Snowflake's architecture, features, and best practices.
  • Experience with data modeling, ETL development, and data integration techniques.
  • Experience with Azure Data Factory (ADF), Databricks, and other Azure services.
  • Knowledge of financial industry standards, regulatory requirements, and data security protocols.
  • Proficiency in Python and PySpark for data processing and automation.

 

It would be great if you also had:

  • Experience with financial reporting, risk management, or investment analytics.
  • Familiarity with other data integration tools and BI platforms.
  • Relevant certifications such as the Snowflake SnowPro Core Certification.
  • Databricks certification.

About Crea Services LLC