Data Engineer

Overview

On Site
Depends on Experience
Accepts corp to corp applications
Contract - Independent
Contract - W2
Contract - 18 Month(s)

Skills

Agile
Amazon Web Services
SQL
Snowflake Schema
Bitbucket
Amazon S3

Job Details

C2C Opportunity
Job Description:
Tech stack: Snowflake, DBT, and AWS Airflow are crucial.
Responsibilities
Perform hands-on coding and inspire a dedicated team of onshore and offshore developers to architect, build, deploy, and support best-in-class software solutions for internal and external customers.
Leverage technical expertise and the latest tech stack to implement software development best practices. Implement application resiliency, scalability, and performance design.
Collaborate across business units, technology, and product teams to build and deliver business value.
Work actively with Quality Assurance, Release Management, and DevOps teams to ensure all SDLC processes are adhered to.
Develop technical documentation to define the system components and workflows.
Participate in the day-to-day activities of the POD using Agile/Scrum methodology.
Requirements:
Minimum of 5 years of professional software development experience in building ETL using Python and SQL.
Minimum of 3 years of software development experience with Snowflake, DBT, and Airflow on the AWS platform (Glue, S3, and SQL & NoSQL datastores).
Minimum of 2 years of experience building production-grade data pipelines using DBT & Airflow for data platforms.
Minimum of 3 years of experience with source control such as Git, Bitbucket, or TFS. CI/CD experience with TeamCity, Octopus, Jenkins, or Jules.
Minimum of 3 years of financial services/brokerage firm experience required.