Data Engineer - AWS, Snowflake, SQL, DBT, Python

Overview

Hybrid
$80 - $85 per hour
Contract - Independent
Contract - W2
Contract - 24 Month(s)

Skills

Agile
Amazon EC2
Amazon S3
Amazon SQS
Amazon Web Services
Analytical Skill
Apache Hadoop
Apache Hive
Apache Spark
Attention To Detail
Boomi
Business Intelligence
Cloud Computing
Communication
Computer Engineering
Computer Science
Conflict Resolution
Data Engineering
Data Integration
Data Lake
Data Management
Data Marts
Data Quality
Data Validation
Data Warehouse
Database
ELT
Extract, Transform, Load (ETL)
Extraction
GitHub
Informatica
Informatica PowerCenter
Java
Management
Meta-data Management
Multitasking
NoSQL
Object-Oriented Programming
Problem Solving
Python
Relational Databases
SQL
Scala
Scripting
Scrum
Snowflake Schema
Streaming
Talend

Job Details

Must Have - AWS, Snowflake, SQL, DBT, Python

The Data Engineer will develop ETL and data pipelines using AWS, Snowflake, and DBT. The ideal candidate has experience building data pipelines using an ELT methodology.
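For illustration only (not part of the formal requirements), below is a minimal sketch of the ELT pattern described above, using the Snowflake Python connector: raw data is loaded into the warehouse first, then transformed in place, the step a dbt model would typically own. The account credentials, S3 stage, and table names are hypothetical placeholders.

  # ELT sketch (illustrative only): load raw S3 data into Snowflake, then transform
  # it inside the warehouse. All identifiers below are hypothetical placeholders.
  import snowflake.connector

  conn = snowflake.connector.connect(
      account="your_account",
      user="your_user",
      password="your_password",
      warehouse="TRANSFORM_WH",
      database="ANALYTICS",
  )
  try:
      cur = conn.cursor()
      # Extract/Load: copy raw files from an external S3 stage into a landing table.
      cur.execute("""
          COPY INTO RAW.ORDERS_LANDING
          FROM @RAW.S3_ORDERS_STAGE
          FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
      """)
      # Transform: build a cleaned staging table in the warehouse
      # (in practice this SELECT would live in a dbt model).
      cur.execute("""
          CREATE OR REPLACE TABLE STAGING.STG_ORDERS AS
          SELECT order_id, customer_id, TRY_TO_DATE(order_date) AS order_date, amount
          FROM RAW.ORDERS_LANDING
          WHERE order_id IS NOT NULL
      """)
  finally:
      conn.close()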

Qualifications

The requirements listed below are representative of the qualifications necessary to perform the job.

Education and Experience
  • Bachelor's degree (or foreign equivalent) in Computer Science, Computer Engineering, or a related field.
  • 8+ years of experience with Data Engineering, ETL, data warehouse/data mart development, and data lake development.
  • Advanced SQL knowledge and experience with relational databases, including query authoring and working familiarity with a variety of databases.
  • Experience working with cloud data warehouses such as Snowflake, Google BigQuery, or Amazon Redshift.
  • Experience with AWS cloud services: EC2, S3, Lambda, SQS, SNS, etc.
  • Experience with cloud integration tools such as Matillion, Dell Boomi, Informatica Cloud, Talend, or AWS Glue.
  • Experience with GitHub and its integration with ETL tools for version control.
  • Experience with Informatica PowerCenter, various scripting languages, SQL, and querying tools.
  • Familiarity with modern data management tools and platforms including Spark, Hadoop/Hive, NoSQL, APIs, Streaming, and other analytic data platforms
  • Experience with object-oriented/functional scripting languages (Python, Java, Scala, etc.) is a plus.
  • Experience with Agile/Scrum is valuable.
  • Ability to work with business as well as IT stakeholders.
  • Strong written and oral communication skills with the ability to work with users, peers, and management.
  • Strong interpersonal skills.
  • Ability to work independently and as part of a team to successfully execute projects.
  • Highly motivated, self-starter with problem-solving skills.
  • Ability to multitask and meet aggressive deadlines efficiently and effectively.
  • Extraordinary attention to detail and accuracy.

Responsibilities

  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using cloud integration ETL tools, a cloud data warehouse, SQL, and AWS technologies.
  • Design and develop ELT, ETL, and event-driven data integration architecture solutions.
  • Work with Data Analysts, Data Architects, BI Architects, Data Scientists, and Data Product Owners to establish an understanding of source data and determine data transformation and integration requirements.
  • Troubleshoot and tune complex SQL
  • Utilize on-premises and cloud-based ETL platforms, cloud data warehouses, AWS, GitHub, various scripting languages, SQL, querying tools, data quality tools, and metadata management tools.
  • Develop data validation processes to ensure data quality (a minimal sketch follows this list).
  • Work both independently and collaboratively as part of the team to successfully execute projects.
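As an illustration of the data validation responsibility above, here is a minimal sketch of a Python check runner against Snowflake; the table, column, and check names are hypothetical placeholders, and real pipelines would typically express such checks as dbt tests or a dedicated data quality tool.

  # Data validation sketch (illustrative only): run simple data quality checks
  # against Snowflake and report failures. All identifiers are hypothetical.
  import snowflake.connector

  CHECKS = {
      "stg_orders_has_rows": "SELECT COUNT(*) FROM STAGING.STG_ORDERS",
      "no_null_order_id": "SELECT COUNT(*) FROM STAGING.STG_ORDERS WHERE order_id IS NULL",
      "no_duplicate_order_id": "SELECT COUNT(*) - COUNT(DISTINCT order_id) FROM STAGING.STG_ORDERS",
  }

  def run_checks(conn):
      """Return the names of failed checks."""
      failures = []
      cur = conn.cursor()
      for name, sql in CHECKS.items():
          count = cur.execute(sql).fetchone()[0]
          # The first check expects at least one row; the others expect zero bad rows.
          passed = count > 0 if name == "stg_orders_has_rows" else count == 0
          if not passed:
              failures.append(name)
      return failures

  if __name__ == "__main__":
      conn = snowflake.connector.connect(
          account="your_account", user="your_user", password="your_password",
          database="ANALYTICS",
      )
      try:
          failed = run_checks(conn)
          if failed:
              raise SystemExit(f"Data validation failed: {failed}")
          print("All data quality checks passed.")
      finally:
          conn.close()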
