Data Engineer

Overview

Remote
Depends on Experience
Full Time

Skills

Data Warehousing
Snowflake
AWS
DBT
Informatica
Python
SSIS

Job Details

Data Engineer

Location: Remote

Job Description:

Inbound data integrations, e.g., source data replication, ETL, and view creation for dashboards and data feeds.

Outbound data integrations, e.g., flat-file data feeds to S3/SFTP, Data Shares, etc.

The candidate will be responsible for working with Client data architects to design, build, and test data integrations, provide support, and ensure meticulous documentation of the solutions provided. The ideal candidate will have a strong background in enterprise data warehouse architecture, experience with the Snowflake Cloud Data Warehouse, and proficiency in DBT ETL processes.

ESSENTIAL DUTIES AND RESPONSIBILITIES:

  1. Design and deliver end-to-end data integration solutions with the Enterprise Cloud Data Warehouse:
     • Replicate data from a variety of internal and external data sources into the Snowflake Data Warehouse raw layer.
     • Collaborate with cross-functional teams to understand business requirements and translate them into effective data models.
     • Experience with AWS services (S3, EventBridge, CloudWatch, etc.) is required.
     • Experience with the Snowflake Cloud Data Warehouse platform is required, including capabilities such as Snowpipe, Tasks, Streams, Stored Procedures, and data replication.
     • Conform to CLIENT best practices for data auditing, user access, and data security.
  2. Support client project deliverables:
     • Support the implementation and integration of data solutions for clients.
     • Support data mapping activities to align business requirements with data warehouse structures.
  3. Design and implement ETL processes:
     • Experience using DBT ETL is required.
     • Experience using Qlik Replicate and Compose ETL is optional (less preferred).
     • Experience using CI/CD pipelines and code repositories such as Bitbucket is preferred.
     • Develop and implement efficient ETL processes to support data integration from different engagement platforms.
     • Optimize data workflows for performance and scalability.
  4. Demonstrate proficiency in SQL; Python scripting skills are a valuable addition:
     • Write and optimize SQL queries for data extraction, transformation, and loading.
     • Python skills for automation and scripting are desirable.
     • Knowledge of Life Sciences or Healthcare business processes is advantageous.
     • Familiarity with Patient Services and Specialty Pharmacy systems and business processes is a plus.
  5. Exhibit strong documentation abilities:
     • Familiarity with SDLC processes and documentation.
     • Experience using work management and documentation platforms such as Jira and Confluence is a plus.
     • Document requirements and conduct peer reviews.
     • Ensure documentation is accessible and understandable for both technical and non-technical stakeholders.

MINIMUM KNOWLEDGE, SKILLS AND ABILITIES:

  • 3+ years of experience with data warehouse architecture design and/or data engineering.
  • 2+ years of experience working with the Snowflake cloud data warehouse, along with AWS and DBT.
  • Knowledge of or experience with an ETL tool such as Informatica, DataStage, Talend, or SSIS.
  • Strong problem-solving and analytical abilities.
  • Ability to design, execute and communicate solutions to different stakeholders.
  • Experience with AWS and Python scripting.
  • Bachelor's or advanced degree in Computer Science, Information Technology, or a related field.
