Data Engineer

Overview

On Site
Depends on Experience
Contract - W2

Skills

Data Warehouse
Data Engineering
Snowflake
AWS
DataStage
Talend
SSIS
Python
Informatica

Job Details

Role: Data Engineer

Type of Employment: Contract for 6 months with the possibility of extension

Location: NJ / NY

Description:

  • Inbound data integrations, e.g., source data replication, ETL, and view creation for dashboards and data feeds.
  • Outbound data integrations, e.g., flat file data feeds to S3/SFTP, Data Shares, etc. (an outbound feed is sketched below).
  • The engineer will be responsible for working with Client data architects to design, build, and test data integrations, provide support, and ensure meticulous documentation of the solutions provided. The ideal candidate will have a strong background in enterprise data warehouse architecture, experience with the Snowflake Cloud Data Warehouse, and proficiency in DBT ETL processes.
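For illustration only, a minimal sketch of one outbound integration of the kind described above: unloading a reporting view to an external S3 stage with Snowflake's COPY INTO command. The stage, schema, and view names are hypothetical placeholders, not Client objects.

    -- Unload a reporting view to an external S3 stage as gzipped CSV.
    -- Stage, schema, and view names are illustrative placeholders.
    COPY INTO @analytics_s3_stage/feeds/daily_orders/
      FROM (SELECT * FROM reporting.v_daily_orders)
      FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP FIELD_OPTIONALLY_ENCLOSED_BY = '"')
      HEADER = TRUE
      OVERWRITE = TRUE;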

ESSENTIAL DUTIES AND RESPONSIBILITIES:

  1. Design and deliver end-to-end data integration solutions with the Enterprise Cloud Data Warehouse.
  • Data replication from a variety of internal and external data sources into the Snowflake Data Warehouse Raw layer.
  • Collaborate with cross-functional teams to understand business requirements and translate them into effective data models.
  • Experience with AWS services (S3, EventBridge, CloudWatch, etc.) is required.
  2. Experience with the Snowflake Cloud Data Warehouse platform is required:
  • Experience using Snowflake capabilities such as Snowpipes, Tasks, Streams, Stored Procedures, and Data Replication (a combined pattern is sketched below).
  • Conform to CLIENT best practices for data auditing, user access, and data security.
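For illustration, a minimal sketch of the Snowpipe/Stream/Task pattern named above, assuming a hypothetical external S3 stage and raw/staging schemas; all object names are placeholders rather than Client standards.

    -- Snowpipe continuously loads files from an external S3 stage into the Raw layer.
    CREATE PIPE raw.orders_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw.orders
      FROM @raw.orders_s3_stage
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

    -- A Stream tracks rows newly ingested into the raw table.
    CREATE STREAM raw.orders_stream ON TABLE raw.orders;

    -- A Task periodically moves new rows downstream, but only when the stream has data.
    CREATE TASK raw.load_orders_task
      WAREHOUSE = etl_wh
      SCHEDULE = '15 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('raw.orders_stream')
    AS
      INSERT INTO staging.orders
      SELECT order_id, customer_id, order_ts, amount  -- columns are illustrative
      FROM raw.orders_stream;

    -- Tasks are created suspended; resume to start the schedule.
    ALTER TASK raw.load_orders_task RESUME;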
  3. Support client project deliverables:
  • Support the implementation and integration of data solutions for clients.
  • Support data mapping activities to align business requirements with data warehouse structures.
  4. Design and implement ETL processes:
  • Experience using DBT ETL is required (a minimal model is sketched below).
  • Experience using Qlik Replicate and Qlik Compose is optional (less preferred).
  • Experience using CI/CD pipelines and code repositories such as Bitbucket is preferred.
  • Develop and implement efficient ETL processes to support data integration from different engagement platforms.
  • Optimize data workflows for performance and scalability.
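As a sketch of the DBT requirement above, a hypothetical incremental model; the source, unique key, and column names are illustrative only.

    -- models/staging/stg_orders.sql
    {{ config(materialized='incremental', unique_key='order_id') }}

    select
        order_id,
        customer_id,
        order_ts,
        amount
    from {{ source('raw', 'orders') }}
    {% if is_incremental() %}
      -- on incremental runs, pick up only rows newer than the current target
      where order_ts > (select max(order_ts) from {{ this }})
    {% endif %}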
  5. Demonstrate proficiency in SQL; Python scripting skills are a valuable addition:
  • Write and optimize SQL queries for data extraction, transformation, and loading (an upsert example is sketched below).
  • Python skills for automation and scripting are desirable.
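As one example of the kind of loading query involved, a sketch of a Snowflake MERGE upsert from a staging table into a warehouse dimension; all table and column names are hypothetical.

    -- Upsert staged customer rows into the conformed dimension table.
    MERGE INTO warehouse.dim_customer AS t
    USING staging.customers AS s
      ON t.customer_id = s.customer_id
    WHEN MATCHED THEN UPDATE SET
      t.name = s.name,
      t.email = s.email,
      t.updated_at = CURRENT_TIMESTAMP()
    WHEN NOT MATCHED THEN INSERT (customer_id, name, email, updated_at)
      VALUES (s.customer_id, s.name, s.email, CURRENT_TIMESTAMP());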
  6. Knowledge of Life Sciences or Healthcare business processes is advantageous:
  • Familiarity with Patient Services and Specialty Pharmacy systems and business processes is a plus.
  7. Exhibit strong documentation abilities:
  • Familiarity with SDLC processes and documentation.
  • Experience using work management and documentation platforms such as Jira and Confluence is a plus.
  • Document requirements and conduct peer reviews.
  • Ensure documentation is accessible and understandable for both technical and non-technical stakeholders.

MINIMUM KNOWLEDGE, SKILLS AND ABILITIES:

  • 3+ years of experience with Data Warehouse architecture design and/or data engineering.
  • 2+ years of experience working with the Snowflake cloud data warehouse, including AWS and DBT.
  • Knowledge of or experience with any ETL tool: Informatica, DataStage, Talend, or SSIS.
  • Strong problem-solving and analytical abilities.
  • Ability to design, execute and communicate solutions to different stakeholders.
  • Experience with AWS and Python scripting.
  • Bachelor's or advanced degree in Computer Science, Information Technology, or a related field.

For immediate consideration and interviews, please apply here or reply with a copy of your résumé to

About Brilliant Infotech Inc.