Data Engineer with ETL & Snowflake

Overview

On Site
Depends on Experience
Contract - W2

Skills

Spark
Scala
Cloudera
SQL
GCP
AWS
AZURE
ETL
Python
PySpark

Job Details

Role: Data Engineer with ETL & Snowflake

Location: TX, NJ, Chicago

Type of Hire: W2

Job Description:

Primary Skills: Strong Data Engineer with expertise in Spark/Scala, Cloudera, SQL, and any cloud platform (Google Cloud Platform / AWS / Azure)

  • 8+ years of experience in ETL development and data engineering.
  • 6-8+ years of experience with Snowflake SQL; advanced SQL expertise.
  • 6-8+ years of data warehouse experience: hands-on knowledge of methods to identify, collect, manipulate, transform, normalize, clean, and validate data; star schemas; normalization/denormalization; dimensions; aggregations.
  • 6-8+ years working in reporting and analytics environments: development, data profiling, metric development, CI/CD, production deployment, troubleshooting, query tuning, Atlassian Bitbucket and Bamboo.
  • 6+ years of Python; advanced Python expertise.
  • 6+ years on any cloud platform; AWS preferred.
  • Hands-on experience with AWS Lambda, S3, SNS/SQS, and EC2 is the bare minimum.
  • 6+ years with any ETL/ELT tool (Informatica, Fivetran, dbt, Airflow, etc.) - must.
  • 6+ years developing functional metrics in any specific business domain.
  • 4+ years of experience in PySpark.
  • 6-8+ years of experience in SQL.
  • Excellent hands-on experience with Python.

Secondary Skills Required:

  • Strong proficiency in SQL, Python, or Scala for data manipulation.
  • Expertise in AWS cloud services (DMS, Glue, Redshift, S3, Lambda, Step Functions, EMR).
  • Data Engineering: Experience with data ingestion, ETL, and data warehousing.
  • Databases: Knowledge of relational databases.
  • Experience in data integration (ETL/ELT) development using multiple languages (e.g., Python, PySpark, Scala) and data transformation (e.g., dbt).
  • Experience with AWS-based data services technologies (e.g., Glue, RDS, Athena, etc.) and Snowflake CDW, as well as BI tools (e.g., Power BI).
  • Cloud Platforms: Experience with AWS
  • Data Modelling: Ability to design and implement data models for analytical and reporting purposes.
  • Security: Understanding of data security principles and best practices.
  • Communication and Collaboration: Excellent communication and collaboration skills.

Certification Preferred:

  • Azure AZ-900 (Azure Fundamentals)
  • Snowflake
  • Databricks

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About Brilliant Infotech Inc.