Data Engineer

Overview

Remote
USD 54.00 - 68.00 per hour
Contract - W2

Skills

Slowly changing dimensions
Relational databases
Data warehouse
Stored procedures
Data modeling
SAP HANA
Data integrity
Extract, transform, load (ETL)
Data flow
Knowledge sharing
Decision-making
Project planning
Data engineering
SAP ECC
Soft skills
Problem solving
Functional analysis
Data
SQL
Database
Snowflake schema
Debugging
Macros
Testing
Acceptance testing
Cloud computing
ELT
SnapLogic
Design
Analytics
Reporting
Collaboration
Warehouse
Agile
Salesforce.com

Job Details

We're seeking a Data Engineer for our client, a global biotech company headquartered in San Diego, CA. The Data Engineer needs advanced working knowledge of SQL, experience with relational databases and query authoring, familiarity with a variety of databases, and hands-on experience with DBT (data build tool) and the Snowflake data warehouse.

Overview:
** Start date: Immediate
** Duration: 2+ month W2 contract, with an expected 3-month extension
** Location: Remote from the United States; must support core Pacific Time business hours
** Compensation: The expected compensation is $54 - 68/hr W2, plus benefits.
The compensation offered to a successful candidate will depend on several factors, which may include (but are not limited to) the type and length of industry experience, education, etc.

Requirements:

Bachelor's degree with 8+ years of experience working on relational databases, or Master's degree with 3+ years of experience
3-8+ years of experience with SQL and stored procedures, with excellent SQL knowledge
3+ years of experience working on Snowflake, building data warehousing solutions, and handling slowly changing dimensions
3+ years of experience developing and deploying data transformations using DBT, including creating and debugging macros
5+ years of experience supporting end-to-end data model builds and maintenance, including testing/UAT
Experience building, maintaining, and testing data pipelines using cloud ETL/ELT tools, preferably SnapLogic
Prior experience working on SAP HANA

Description:
Develop and maintain scalable data models in Snowflake, ensuring data integrity and reliability.
Design and implement data transformations using DBT to support analytics and reporting requirements.
Collaborate with data engineers and data analysts to understand data needs and translate them into technical solutions.
Optimize Snowflake warehouse configurations and DBT models for performance and cost efficiency.
Troubleshoot and resolve data pipeline issues, ensuring smooth and efficient data flow.
Participate in code reviews and provide feedback to team members to ensure code quality and adherence to best practices.
Stay updated with the latest developments in Snowflake and DBT technologies, and propose and implement innovative solutions.
Document data pipelines, transformations, and processes to facilitate knowledge sharing and maintain data lineage.
Work closely with cross-functional teams to support data-driven decision-making and business objectives.
Contribute to agile project planning and execution related to data engineering tasks and initiatives.

Desired skills:
Prior experience creating data warehouse models for SAP ECC and Salesforce systems is highly preferred.

Soft skills:
Flexibility to adapt to changing situations, handle multiple tasks, and meet tight deadlines
Strong problem-solving, cross-functional analysis, and forward-thinking abilities

Please apply today!