Senior Data Warehouse Developer

  • Oaks, PA

Overview

Hybrid
Depends on Experience
Full Time

Skills

Data warehouse
Snowflake
ETL
SSIS
Informatica
SQL

Job Details

Role: Senior Data Warehouse Developer

Location: Oaks, PA

Mode Of Hire: Full Time

Responsibilities:

  • Design, implement, and optimize efficient ETL processes to transform raw data into actionable insights.
  • Develop and maintain robust data warehouse solutions, including the implementation of star and snowflake schemas.
  • Establish and manage reliable data pipelines to ensure timely data availability.
  • Create modular, maintainable, and scalable dbt workflows for advanced data transformations.
  • Leverage dbt testing, documentation, snapshotting, and Change Data Capture (CDC) for incremental data refresh.
  • Implement and manage Type 2 (slowly changing dimension) data modeling techniques for tracking historical data.
  • Develop reusable dbt macros and packages, extending pipelines with Python libraries where needed.
  • Optimize complex SQL queries and leverage Snowflake's performance-enhancing features such as Streams, Time Travel, partitioning, and clustering.
  • Orchestrate data pipelines for both batch and near real-time data refresh scenarios.
  • Write and optimize Snowflake SQL queries and stored procedures for seamless data transformation and integration.
  • Ensure compliance with data governance policies and implement security controls for sensitive data.
  • Work closely with data engineering and analytics teams to align workflows with broader data strategies.
  • Monitor and implement strategies to improve the performance of large-scale data queries and systems.
  • Stay updated on the latest technologies and practices to bring innovation into the data warehousing process.

Skills Required:

  • Bachelor's degree in Computer Science, Information Systems, or a related field.
  • 3-5 years of experience in data warehouse development and with ETL tools (e.g., dbt, SSIS, Informatica, or Azure Data Factory).
  • 1-2 years of experience with dbt and Snowflake.
  • Proficiency in SQL/PL-SQL for data querying and optimization.
  • Strong knowledge of data warehousing concepts and design.
  • Familiarity with Python for enhancing dbt pipelines.
  • Strong analytical, problem-solving, and communication skills.

Additional knowledge and/or experience desired:

  • Familiarity with the financial services industry, including banking, wealth management, or investment platforms.
  • Hands-on experience with ETL tools such as dbt, SSIS, Informatica, or Azure Data Factory.
  • Knowledge of Snowflake, including query writing and data integration.
  • Familiarity with cloud data platforms such as Azure Synapse, Amazon Redshift, or Snowflake.
  • Experience with Agile methodologies.
  • Development experience in the financial services industry is a plus.
  • Experience with CI/CD tools like Jenkins, Docker, or Terraform.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.