Sr Data Engineer Lead - AWS

Overview

Hybrid
$70 - $80
Full Time

Skills

Snowflake

Job Details

Position: Sr Data Engineer Lead - AWS, Cloud Technologies/ Data Engineer

Client : Aviva

Location: Markham, Toronto
Type: Subcon
Mode: Hybrid (Mandatorily need to visit office 3 days a week)

1. Data Engineer:

  • Subcon: CAD 65/HR.

2. Senior Data Engineer Lead:

  • Subcon: CAD 75/HR.

Sr Data Engineer Lead - AWS, Cloud Technologies

This role is part of our Information Technology Enterprise Data Services Group. You will be responsible for leading the architecture, high-level and low-level solution engineering design, analysis, and implementation within a successful and experienced team. You'll be required to apply your depth of knowledge and expertise with both modern and legacy data platforms to develop data ecosystems that meet business requirements and align with the Client's enterprise architecture goals and standards. The Client has embarked on an exciting journey to modernize, craft, and build a next-generation data platform on Snowflake to support the growing data needs of the business and to enable AI and GenAI capabilities that drive business value.

We embrace a culture of challenging the status quo and constantly look for ways to simplify processes, technology, and workflow.


This role reports to the AVP, Data Engineering.

What you'll do

As a Senior Data Engineer Lead, you will be instrumental in shaping and delivering enterprise-scale data solutions. You'll define the technical roadmap, drive data strategy, and lead the design and implementation of robust, scalable data pipelines. This role requires a strong blend of technical leadership, hands-on engineering, and cross-functional collaboration.

Must Have skills:

  1. Snowflake
  2. dbt
  3. AWS


Key Responsibilities

  • Technical Leadership: Define and drive the data engineering strategy, standards, and best practices across the organization.
  • Solution Design: Develop high-level and low-level solution architectures, ensuring alignment with business and technical goals.
  • Data Pipeline Development: Lead the design and implementation of high-performance data pipelines using tools like dbt Core/Cloud, ensuring scalability and maintainability.
  • Data Modeling: Design and review conceptual, logical, and physical data models to support business needs.
  • Code Ownership: Write and maintain clean, reusable code in SQL, Python, Shell, and Terraform.
  • Quality & Governance: Champion data quality, governance, and cataloging practices; create and review test plans to validate data solutions.
  • Issue Resolution: Perform root cause analysis and implement effective solutions for complex data issues.
  • Agile Delivery: Lead agile ceremonies, foster a delivery-focused mindset, and ensure timely execution of data initiatives.
  • Mentorship: Guide and mentor Data Engineers and project team members, elevating team capabilities and engineering excellence.
  • Collaboration: Work closely with architects, designers, QA engineers, and delivery teams to ensure cohesive and customer-centric data products.
  • Documentation: Produce and maintain comprehensive technical documentation to support implementation and knowledge sharing.
  • Talent Development: Contribute to hiring by designing technical challenges, conducting interviews, and supporting onboarding.


What you'll bring

Extensive Experience:

  • 15+ years of professional experience delivering over 10 high-impact data projects from inception through warranty.
  • 5+ years of experience with Snowflake, dbt Core/Cloud, and AWS cloud technologies.
  • 7+ years of experience with coding in multiple programming languages such as Python, Java, etc.

Technical Expertise: Deep knowledge of relational databases (Snowflake, PostgreSQL, Amazon Aurora), big data platforms (Hadoop), and NoSQL databases (e.g., MongoDB).

Data Visualization Proficiency: Skilled in tools such as Snowsight, Streamlit, Qlik, and SAP BusinessObjects to communicate insights effectively.

Advanced Coding Skills: Expert-level proficiency in SQL, Python, Shell, and Terraform, with a strong focus on performance, reusability, and maintainability.

Presentation & Communication: Strong technical and business presentation skills; able to identify and address gaps in data designs and processes with both internal and external stakeholders.

Pipeline & Orchestration Tools: Hands-on experience with orchestration tools like Zena and AWS Managed Airflow.

Resilience & Adaptability: Proven ability to thrive in fast-paced, ambiguous, and high-pressure environments.

Mentorship & Leadership: A track record of mentoring Data Engineers at all levels, fostering a culture of engineering excellence and continuous improvement.

Customer-Centric Mindset: Passion for solving real-world problems using data-driven insights to deliver impactful business outcomes.

Collaborative Approach: Strong interpersonal and communication skills, with the ability to lead teams and influence cross-functional stakeholders.

Domain Knowledge: Familiarity with insurance industry processes and systems is a strong asset.

AI/ML & GenAI Exposure: Experience in operationalizing AI/ML & GenAI models is a plus.

Certifications (having 2 or more is an asset):

  • SnowPro Core
  • SnowPro Advanced: Data Engineer (DEA-C01, DEA-C02)
  • SnowPro Advanced: Architect (ARA-C01)
  • dbt Developer
  • AWS Cloud Practitioner


About Prism IT Corp