Azure Data Engineer || Iselin NJ or NYC, NY || Capital Markets Experience || LOCALS ONLY

  • Woodbridge Township, NJ
  • Posted 14 hours ago | Updated 14 hours ago

Overview

Hybrid
Depends on Experience
Contract - W2
Contract - Independent
Contract - 12 Month(s)

Skills

Azure
Data Engineer
Snowflake
SQL
Python
ADF
Azure Databricks
PySpark
Data Warehouse

Job Details

Hi,
Please review the JD and let me know if interested.
 
Role: Azure Data Engineer
Location: Iselin, NJ / NYC, NY (3 days/week hybrid onsite from Day 1)
Duration: 12+ months
W2 only
LOCALS ONLY
 
15+ Years of Experience
Strong Capital Markets domain experience is required.
Two professional references are required at the time of submission.
 
Must Have:
Snowflake
SQL
Python
ADF
Azure Databricks
PySpark
Data Warehouse Concepts
 
Job Description:
This position is for a Cloud Data Engineer with a background in Python, PySpark, SQL, and data warehousing for enterprise-level systems. The role calls for someone who is comfortable working with business users and brings business analyst expertise.
 
Major Responsibilities:
Build and optimize data pipelines for efficient data ingestion, transformation, and loading from various sources while ensuring data quality and integrity.
Design, develop, and deploy Spark programs in the Databricks environment to process and analyze large volumes of data.
Experience with Delta Lake, DWH, data integration, cloud, design, and data modeling.
Proficient in developing programs in Python and SQL.
Experience with data warehouse dimensional data modeling.
Work with event-based/streaming technologies to ingest and process data.
Work with structured, semi-structured, and unstructured data.
Optimize Databricks jobs for performance and scalability to handle big data workloads.
Monitor and troubleshoot Databricks jobs; identify and resolve issues or bottlenecks.
Implement best practices for data management, security, and governance within the Databricks environment.
Experience designing and developing Enterprise Data Warehouse solutions.
Proficient in writing SQL queries and programs, including stored procedures, and in reverse engineering existing processes.
Perform code reviews to ensure fit to requirements, optimal execution patterns, and adherence to established standards.
 
Skills:
9+ years of Python coding experience.
5+ years of SQL Server-based development with large datasets.
5+ years of experience developing and deploying ETL pipelines using Databricks PySpark.
Experience with a cloud data warehouse such as Synapse, BigQuery, Redshift, or Snowflake.
Experience in data warehousing: OLTP, OLAP, dimensions, facts, and data modeling.
Previous experience leading an enterprise-wide cloud data platform migration, with strong architectural and design skills.
Experience with cloud-based data architectures, messaging, and analytics.
Cloud certification(s).
Any experience with Airflow is a plus.
 
 
Regards,
Palak Rajora
Senior Technical Recruiter
MetaSense, Inc
 

About MetaSense, Inc.