Role Summary
We are seeking a visionary Senior Data Architect to lead our enterprise-wide data transformation. You will be the primary architect responsible for the strategic migration of legacy data infrastructure to a modern, cloud-native data plane. This role requires a rare blend of deep engineering experience, enterprise architectural foresight, and innovation leadership to build data systems that are not just scalable but future-proof.
Key Responsibilities
1. Strategic Architecture & Migration
- Legacy-to-Modern Roadmap: Design and execute the multi-year strategy to migrate from legacy RDBMS and mainframes to modern Data Lakehouses and Data Meshes.
- Data Plane Evolution: Define the "Modern Data Plane" architecture, ensuring seamless integration between storage (S3/ADLS), compute (Spark/Snowflake/Databricks), and consumption layers.
- Hybrid Cloud Strategy: Architect robust patterns for data movement across hybrid and multi-cloud environments, ensuring minimal downtime and zero data loss.
2. Engineering Excellence & Implementation
- Pattern Development: Create reusable "Blueprints" for ELT/ETL pipelines, Change Data Capture (CDC), and real-time streaming architectures using Kafka or Redpanda.
- Performance Engineering: Optimize cloud spend and query performance by implementing advanced indexing, partitioning, and caching strategies.
- Automated Governance: Architect "Security-as-Code" within the data plane, integrating PII redaction, data masking, and RBAC directly into the automated deployment pipelines.
3. Innovation & Emerging Tech
- AI/ML Readiness: Architect data planes specifically optimized for Generative AI and LLM workloads (e.g., Vector Databases, Feature Stores).
- Self-Service Innovation: Design and lead the implementation of a Data Catalog and self-service portal to empower data scientists and analysts.
- Innovation Lab: Lead Proof-of-Concepts (POCs) for emerging technologies (e.g., Iceberg/Hudi formats, serverless data processing).
Required Skills & Qualifications
- Experience: 10+ years in Data Architecture/Engineering; 5+ years leading large-scale cloud migrations.
- Cloud Platforms: Expert-level proficiency in AWS (Lake Formation/Glue), Azure (Synapse/Fabric), or Google Cloud Platform (BigQuery/Dataflow).
- Legacy Systems: Strong understanding of legacy platforms (Netezza, Teradata, Oracle, mainframe) and proven patterns for decoupling and migrating off them.
- Engineering: Mastery of Python, SQL, and Scala/Java. Deep experience with Spark, Flink, and Airflow.
- Modern Standards: Hands-on experience with Data Mesh, Data Fabric, and open table formats (Iceberg, Delta Lake).
- Certifications (preferred): AWS Solutions Architect Pro, Google Professional Data Engineer, or CDMP.