Job Details
Cycle3 IT Staffing is seeking a Senior SAS-to-Databricks Migration Engineer.
The Role
Own end-to-end migrations from SAS (Base/PROC/DI/Grid/VA) to Databricks on Delta Lake. You'll inventory legacy assets, convert code and jobs, harden performance, and lead cutovers, working closely with data, platform, and business teams.
What You'll Do
Discovery & Planning: Inventory SAS jobs, DATA steps, PROC SQL, macros, schedules, libraries; map dependencies and SLAs; define wave plans.
Code Conversion: Rewrite DATA steps as PySpark DataFrame code and PROC SQL as Databricks SQL; replace macros with parameterized notebooks/config.
Pipelines & Orchestration: Build DLT/Jobs pipelines (Bronze/Silver/Gold) with Auto Loader, and Workflows with retries, alerts, and SLA metrics.
Data Modeling & Storage: Stand up Delta Lake tables with OPTIMIZE/Z-ORDER, partitioning strategy, expectations/constraints.
Governance: Implement Unity Catalog (RBAC, tags, masking, lineage), secrets/key management, and auditability.
Performance & Cost: Tune joins, caching, Photon, cluster sizing; track cost with tags and usage dashboards.
Validation & Cutover: Design parity tests (row counts, checksums, KPIs), dual-run, reconcile, and coordinate BI/feeds switchovers.
Enablement: Create runbooks, code patterns, and handoff docs; mentor client teams.
Support technical pre-sales scoping for SAS-to-Databricks migrations.
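To illustrate the macro-conversion work above: SAS macro variables typically become notebook widgets or Jobs parameters on Databricks. The sketch below is a minimal, hypothetical stand-in (the parameter names, defaults, and table name are invented for illustration); plain Python defaults and overrides substitute for dbutils.widgets here so the pattern is runnable anywhere.

```python
# Hypothetical replacement for a SAS macro such as:
#   %macro extract(year=, region=);  ... &year ... &region ...  %mend;
# On Databricks these defaults would usually come from dbutils.widgets
# or Jobs parameters; a plain dict stands in for that mechanism here.
DEFAULTS = {"year": 2024, "region": "ALL", "target_table": "silver.claims"}

def resolve_params(overrides=None):
    """Merge caller overrides onto the defaults, rejecting unknown keys."""
    params = dict(DEFAULTS)
    for key, value in (overrides or {}).items():
        if key not in DEFAULTS:
            raise KeyError(f"unknown parameter: {key}")
        params[key] = value
    return params

def build_query(params):
    """Render the parameterized SQL the old macro used to emit."""
    return (
        f"SELECT * FROM {params['target_table']} "
        f"WHERE year = {params['year']} AND region = '{params['region']}'"
    )
```

Keeping all defaults in one config object (rather than scattered macro variables) is what makes the converted jobs schedulable and testable per environment.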
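The parity-testing step above (row counts, checksums) can be sketched as follows. This is a simplified, stdlib-only illustration assuming two CSV extracts and a hypothetical `file_parity` helper; in a real dual-run validation the same counts and order-independent checksums would be computed as Spark aggregations over the SAS extract and the Delta table.

```python
import csv
import hashlib

def file_parity(path_a, path_b, key_cols):
    """Compare two CSV extracts: row counts plus an order-independent checksum."""
    def stats(path):
        count = 0
        digest = 0
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                count += 1
                # XOR of per-row hashes is insensitive to row order,
                # so the two extracts need not be sorted identically.
                row_key = "|".join(row[c] for c in key_cols)
                digest ^= int(hashlib.sha256(row_key.encode()).hexdigest(), 16)
        return count, digest

    (count_a, sum_a), (count_b, sum_b) = stats(path_a), stats(path_b)
    return {"rows_match": count_a == count_b, "checksum_match": sum_a == sum_b}
```

Checks like this run during the dual-run window; business KPIs are reconciled separately on top of the row-level parity.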
Must-Have Qualifications
8+ years in data engineering; 5+ years hands-on SAS (DATA step, PROC SQL, macros; DI/VA a plus).
4+ years Databricks (PySpark & SQL) with proven SAS-to-Spark/SQL conversions in production.
Strong Delta Lake fundamentals (ACID, OPTIMIZE/VACUUM, schema evolution) and Unity Catalog.
Experience building DLT or Jobs-based pipelines, Workflows, widgets/parameters, and secrets management.
Cloud experience (Azure preferred; AWS/Google Cloud Platform acceptable): storage, networking basics, IAM.
CI/CD with Git; code reviews; testing frameworks (pytest, Great Expectations/Delta Expectations).
Excellent communication with business/SMEs; comfortable running workshops and cutovers.
Nice to Have
SAS Grid/DI Studio/VA migrations; scheduler migrations (Stonebranch/Airflow/ADF Workflows).
Data quality frameworks, observability/logging; Databricks Asset Bundles or dbx.
Domain experience in Healthcare, Financial Services, or Manufacturing.
BI integrations (Tableau/Power BI), CDC patterns, and performance benchmarking.