Overview
On Site
USD 130,000.00 - 155,000.00 per year
Full Time
Skills
Creative Problem Solving
Money Management
Asset Management
Software Asset Management
Supervision
Optimization
DevOps
Collaboration
Partnership
Management
Testing
Migration
ELT
Data Integration
Amazon Web Services
Google Cloud Platform
Google Cloud
Microsoft Azure
GCS
Data Flow
Data Warehouse
Apache Spark
Apache Beam
Apache Flink
Microsoft SSIS
Pentaho
NoSQL
Database
MongoDB
Amazon DynamoDB
Microsoft Visual Studio
PyCharm
Git
Bitbucket
Apache Maven
Jenkins
Nexus
TeamCity
Database Design
RDBMS
Star Schema
Continuous Integration and Development
Bamboo
Docker
GitHub
Continuous Integration
Continuous Delivery
Messaging
RabbitMQ
Apache Kafka
Informatica
Data Validation
DVO
Software Development
Research
Attention To Detail
Customer Focus
Problem Solving
Conflict Resolution
Technical Communication
Computer Science
Mathematics
Data Engineering
SQL
Relational Databases
Microsoft SQL Server
PostgreSQL
Python
Data Processing
Analytics
Cloud Computing
Snowflake Schema
Extract
Transform
Load
Data Modeling
Data Quality
Reporting
Data Visualization
Microsoft Power BI
Tableau
Finance
Analytical Skill
Agile
Documentation
Job Details
Your Opportunity
At Schwab, you're empowered to make an impact on your career. Here, innovative thought meets creative problem solving, helping us "challenge the status quo" and transform the finance industry together. We believe in the importance of in-office collaboration and fully intend for the selected candidate for this role to work on site in the specified location(s).
Schwab Wealth and Asset Management Engineering is a part of the Schwab Technology Services organization supporting Schwab's money management, research, and asset management platforms to help clients manage their wealth.
We are seeking a Data Engineer to build out the cloud-native data platform for Schwab Asset Management (SAM). This role is ideal for a professional with progressive experience in cloud-native data engineering who is ready to take on more responsibility and operate with minimal supervision. You will be integral to enabling and enhancing our data assets and data pipelines and to supporting a data platform on SQL Server, Snowflake, and Google Cloud. This is an exciting opportunity to work in a dynamic, data-driven environment, contributing to the ongoing development and optimization of our data platform.
The role requires hands-on development in a client-driven technology organization, executing regulatory, tactical, and strategic business initiatives focused on developing the data platform and delivering analytics and reporting projects. The ideal candidate is detail oriented and works in an Agile and DevOps model in partnership with the business, actively collaborating with Product Owners, end users, and partners on requirements, design, coding, testing (unit and functional), deployment, and post-release support, as well as on migration activities to evolve the on-premises data stack to the cloud.
The ideal candidate will have:
- 5+ years of working experience and sound knowledge in building data platforms leveraging cloud-native architecture (Google Cloud Platform/AWS), ETL/ELT, and data integration
- 3-5 years of development experience with cloud services (AWS, Google Cloud Platform, Azure) and supporting tools (e.g., GCS, Cloud Dataflow, Airflow (Composer), Cloud Pub/Sub)
- 3-5 years of experience and sound knowledge in developing reliable data pipelines leveraging data warehouses (Snowflake, BigQuery, SQL Server) and data processing frameworks (Apache Spark, Apache Beam, Apache Flink, Informatica, SSIS, Pentaho)
- Knowledge of NoSQL database technologies (e.g. MongoDB, BigTable, DynamoDB)
- Expertise in build and deployment tools (Visual Studio, PyCharm, Git/Bitbucket/Bamboo, Maven, Jenkins, Nexus, TeamCity)
- Experience in database design techniques and philosophies (e.g. RDBMS, Document, Star Schema, Kimball Model)
- Experience leveraging continuous integration/deployment tools (e.g., Bamboo, Docker, containers, GitHub, GitHub Actions) in a CI/CD pipeline
- Experience with SQL, ETL, and other code-based data transformation and delivery technologies
- Experience in messaging and services-based software, preferably on a cloud platform, using RabbitMQ, Kafka, or equivalent
- Experience with the Informatica Developer tool set or Data Validation Option (DVO) is a plus
- Advanced understanding of software development and research tools
- Detail- and results-oriented, with a strong customer focus
- Ability to work as part of a team and independently
- Strong analytical, problem-solving, and technical communication skills
- Ability to prioritize workload to meet tight deadlines
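The pipeline experience described above boils down to the extract-transform-load pattern. As a rough illustration only, the sketch below shows that pattern in plain Python with in-memory stand-ins; all table names, fields, and data are hypothetical and not Schwab's actual schema (a real implementation would read from SQL Server and write to Snowflake or BigQuery):

```python
from datetime import date

# Hypothetical source rows, standing in for a SQL Server extract.
SOURCE_ROWS = [
    {"account_id": "A1", "asset": "AAPL", "quantity": 100, "price": 190.0},
    {"account_id": "A2", "asset": "MSFT", "quantity": 0,   "price": 410.0},
    {"account_id": "A1", "asset": "GOOG", "quantity": 25,  "price": 150.0},
]

def extract():
    """Pull raw rows; in practice this would query the source database."""
    return list(SOURCE_ROWS)

def transform(rows):
    """Drop empty positions and derive market value per holding."""
    out = []
    for r in rows:
        if r["quantity"] <= 0:
            continue  # simple data-quality rule: skip zero/negative positions
        out.append({
            "account_id": r["account_id"],
            "asset": r["asset"],
            "market_value": round(r["quantity"] * r["price"], 2),
            "as_of": date.today().isoformat(),
        })
    return out

def load(rows, warehouse):
    """Append transformed rows; a real loader would write to the warehouse."""
    warehouse.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

In production the same three stages would typically be orchestrated as scheduled tasks (e.g., in Airflow/Composer) rather than a single script.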
What you have
To ensure that we fulfill our promise of "challenging the status quo," this role has specific qualifications that successful candidates should have.
Required Qualifications:
- Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field
- 5+ years of experience in data engineering or similar roles
- Proficiency in SQL and experience with relational databases (e.g., SQL Server, PostgreSQL, Snowflake)
- Solid experience with Python for data processing, ETL development, tooling and analytics implementations
- Familiarity with cloud data platforms (e.g., Snowflake, Google BigQuery)
- Experience building and maintaining ETL pipelines
- Strong experience in data modeling, including designing normalized and denormalized schemas for financial data
- Understanding of financial data concepts
- Knowledge of data quality, validation, and governance best practices
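The data-quality and validation practices listed above often reduce to small, testable rule functions applied per record. A minimal sketch, assuming an illustrative position-record shape (the field names and rules are hypothetical, not Schwab's):

```python
# Hypothetical required fields for an incoming position record.
REQUIRED_FIELDS = ("account_id", "asset", "market_value")

def validate(record):
    """Return a list of rule violations for one record (empty list = valid)."""
    errors = []
    for field in REQUIRED_FIELDS:
        if record.get(field) in (None, ""):
            errors.append(f"missing {field}")
    mv = record.get("market_value")
    if isinstance(mv, (int, float)) and mv < 0:
        errors.append("negative market_value")
    return errors

good = {"account_id": "A1", "asset": "AAPL", "market_value": 19000.0}
bad = {"account_id": "", "asset": "AAPL", "market_value": -5.0}
```

Keeping each rule explicit like this makes validation results easy to log for governance reporting and easy to unit-test in a CI pipeline.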
Preferred Qualifications:
- Deep understanding of data architectures and engineering patterns for data pipelines and reporting environments
- Familiarity with data visualization tools (Power BI, Tableau)
- Exposure to regulatory requirements in finance (e.g., GDPR, N-PORT, 13F/G)
- Analytical and troubleshooting skills to identify and resolve data and platform issues effectively
- Ability to work collaboratively within an agile team environment, supporting cross-functional initiatives and contributing to shared goals
- Strong documentation skills and the ability to communicate technical concepts clearly and effectively to both technical and non-technical stakeholders
In addition to the salary range, this role is also eligible for bonus or incentive opportunities.