W2 contract-to-full-time position
No sponsorship available; candidates requiring sponsorship will not be considered.
==
Role: Data Platform Engineer
Location: 100% Remote (EST-based candidates only)
Job Description:
Our SaaS platform connects the world's leading organizations with qualified suppliers, contractors, and vendors. We bring companies unmatched visibility through cloud-based technology and human insights to manage supply chain risk and compliance, fostering sustainable growth for businesses and their supply chains globally. Our subscription-based SaaS software is used by 85k+ active customers in over 100 countries, spanning a diverse set of industries.
We are seeking a Data Platform Engineer to own and evolve our modern analytics engineering platform. The role is primarily responsible for dbt and Fivetran platform ownership, ensuring reliable, scalable, and well-governed data pipelines that power analytics and downstream use cases. As a team member, you will set technical standards, lead by example, and partner closely with analytics, the BI team, and business stakeholders.
Required skills: advanced knowledge of and hands-on experience with dbt development and configuration; advanced Snowflake expertise, including SnowPro certification; advanced AWS cloud experience; solid CI/CD pipeline experience; and experience with the Fivetran platform. We are looking for candidates who have served as the lead platform engineer applying these skills.
ESSENTIAL DUTIES AND RESPONSIBILITIES:
Own the dbt and Fivetran platforms, including configuration, environment setup, access controls, and ongoing maintenance.
Manage dbt Cloud environments, including job configuration, scheduling, orchestration, and dependency management.
Configure and manage Fivetran connectors, sync schedules, schemas, transformations, and performance tuning.
Build, operate, and optimize end-to-end ELT pipelines from source systems through Snowflake using Fivetran and dbt.
Design and maintain robust orchestration patterns for dbt runs, incremental models, and downstream dependencies.
Ensure pipelines are reliable, scalable, and cost-efficient.
Own and manage data platform infrastructure, using Terraform to provision Snowflake resources.
Implement, maintain, and own CI/CD pipelines using GitHub Actions for dbt deployments, testing, and environment promotion.
Implement and maintain data quality solutions (e.g., freshness checks, anomaly detection); a freshness-check sketch follows this list.
Build and manage monitoring and observability solutions for pipelines, jobs, and data SLAs.
Define and track platform and data reliability metrics, alerting on failures, delays, or data issues.
Own Snowflake administration, including warehouses, roles, permissions, resource monitoring, and cost optimization; a cost-monitoring sketch also follows this list.
Tune performance of dbt models and queries through clustering, warehouse sizing, and query optimization.
Ensure security, governance, and best practices across Snowflake environments.
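
For illustration only, here is a minimal sketch of the kind of freshness check referenced above, written in Python against Snowflake. It assumes a table with an updated_at timestamp column; the table name, warehouse, threshold, and environment variables are hypothetical, not part of the actual stack.

# Minimal freshness-check sketch (hypothetical table, threshold, and credentials).
import os
import snowflake.connector

FRESHNESS_THRESHOLD_MINUTES = 90          # assumed SLA for this table
TABLE = "ANALYTICS.MARTS.FCT_ORDERS"      # hypothetical mart table

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="REPORTING_WH",             # hypothetical warehouse
)

try:
    cur = conn.cursor()
    # Minutes since the most recent record landed in the table.
    cur.execute(
        f"SELECT DATEDIFF('minute', MAX(updated_at), CURRENT_TIMESTAMP()) FROM {TABLE}"
    )
    lag_minutes = cur.fetchone()[0]
    if lag_minutes is None or lag_minutes > FRESHNESS_THRESHOLD_MINUTES:
        # In practice this would page on-call or post to an alerting channel.
        raise RuntimeError(f"{TABLE} is stale: {lag_minutes} minutes since last update")
    print(f"{TABLE} is fresh ({lag_minutes} minutes old)")
finally:
    conn.close()

In production this kind of check would typically run as a scheduled job alongside (or instead of) dbt source freshness tests.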
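
Similarly, a hedged sketch of the cost-monitoring side of Snowflake administration, assuming access to the SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY view; the weekly credit budget and role are illustrative assumptions only.

# Sketch: flag warehouses whose credit consumption over the last 7 days
# exceeds an assumed budget. Requires a role granted ACCOUNT_USAGE access.
import os
import snowflake.connector

WEEKLY_CREDIT_BUDGET = 50.0  # hypothetical per-warehouse budget

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    role="ACCOUNTADMIN",  # or any role with ACCOUNT_USAGE privileges
)

try:
    cur = conn.cursor()
    cur.execute(
        """
        SELECT warehouse_name, SUM(credits_used) AS credits
        FROM snowflake.account_usage.warehouse_metering_history
        WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
        GROUP BY warehouse_name
        ORDER BY credits DESC
        """
    )
    for warehouse_name, credits in cur.fetchall():
        status = "OVER BUDGET" if credits > WEEKLY_CREDIT_BUDGET else "ok"
        print(f"{warehouse_name}: {credits:.2f} credits ({status})")
finally:
    conn.close()
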
PREFERRED QUALIFICATIONS:
Experience supporting analytics-ready dbt models, including staging, intermediate, and mart layers.
Experience implementing data observability or data quality frameworks.
Familiarity with modern metrics layers or semantic models.
Experience supporting analytics and BI.
Strong documentation and stakeholder communication skills.
Aptitude for agile delivery (e.g., JIRA) and backlog management in cross-functional data teams.
IDEAL EDUCATION & TRAINING:
6+ years of experience in data engineering or analytics engineering, with demonstrated senior-level ownership.
Bachelor's or master's degree in Data Engineering, Computer Science, or a related field.
Certifications in dbt (Advanced), Snowflake (SnowPro), GitHub Actions, and AWS Certified Solutions Architect - Professional.