Your Opportunity
At Schwab, you're empowered to make an impact on your career. Here, innovative thought meets creative problem solving, helping us challenge the status quo and transform the finance industry together. We succeed as One Schwab, collaborating with trust, integrity, and a shared commitment to doing the right thing for our clients and each other.
We believe in the importance of in-office collaboration and fully intend for the selected candidate for this role to work on site in the specified location(s).
In this role, you'll join Schwab's Global Data and Analytics team to help build and evolve a large-scale data intelligence platform on Google Cloud Platform (GCP). You'll work at the intersection of data architecture and AI, engineering resilient pipelines that enable advanced analytics, fraud detection, and machine-learning models at scale. This is a hands-on, end-to-end engineering role where your work directly supports teams protecting clients and strengthening trust across the firm.
Key Responsibilities
- Data Pipeline Architecture and Development: Design, build, and maintain scalable batch and streaming data pipelines using tools such as Dataflow (Apache Beam), Cloud Composer (Airflow), and Pub/Sub to ingest terabytes of transaction and behavioral data.
- Advanced Coding: Write high-performance, production-grade Python and SQL, optimizing existing codebases for efficiency, latency, and cost.
- Data Modeling: Implement complex data models in BigQuery, utilizing partitioning, clustering, and materialized views for optimal performance.
- System Design: Architect robust backend data services and microservices to power analytics and AI platforms.
- Infrastructure as Code: Write and maintain Terraform configurations to provision and manage GCP resources, ensuring reproducible and secure infrastructure.
- Data Quality Engineering: Build automated testing frameworks, data contracts, and anomaly detection directly into pipeline code.
- Performance Tuning: Analyze query execution plans and pipeline bottlenecks to actively reduce latency and cloud costs.
- Incident Resolution: Act as the highest level of escalation for critical data engineering issues, debugging complex failures in distributed systems.
- Technical Leadership: Elevate team coding standards through rigorous code reviews and the creation of solution architecture documents.
- Mentorship: Mentor senior and junior engineers via pair programming and technical design sessions, helping them grow their skills.
- Strategy: Collaborate with stakeholders to define the technical roadmap, selecting the right tools and patterns for long-term success.
What You Have
Required Qualifications
- 8+ years of hands-on software and data engineering experience with a proven track record of shipping complex systems to production.
- 4+ years as a hands-on senior engineer in startups and/or large organizations.
- Bachelor's degree in Computer Science or a related field.
- Strong software engineering foundation, applying best practices (CI/CD, unit testing, modular design) to data pipelines.
- Deep, practical experience with BigQuery, Dataflow, Pub/Sub, Cloud Storage, and IAM.
- Expert-level proficiency in Python and SQL, with the ability to write clean, maintainable, and efficient code.
- Mastery of dimensional modeling, distributed systems, and modern data-stack patterns.
- Extensive experience with workflow orchestration using Apache Airflow or Cloud Composer.
- Strong background in dbt (data build tool) implementation and strategy.
- Proven track record with CI/CD, Terraform (infrastructure as code), and containerization (Docker and Kubernetes).
Preferred Qualifications
- Deep expertise in real-time data processing using Kafka or Pub/Sub.
- Deep understanding of big-data frameworks such as Apache Beam or Spark.
- Experience with other modern data platforms such as Snowflake or Databricks, though GCP is our primary platform.
- Demonstrated business-domain knowledge in fraud analytics.
- Strong written and verbal communication skills to clearly convey ideas and feedback.
- Google Professional Data Engineer certification.
- Master's or advanced degree in Computer Science or a related field.
In addition to the salary range, this role is also eligible for bonus or incentive opportunities.