Your Opportunity
At Schwab, you're empowered to make an impact on your career. Here, innovative thought meets creative problem solving, helping us challenge the status quo and transform the finance industry together.
Job Duties:
- Build out core data platform (WAM-Ex) capabilities, cloud-native data platform capabilities (BigQuery/Snowflake), and core data capabilities, including orchestration, data security, and data quality, shared across the Company's Wealth and Asset Management organizations.
- Develop data APIs and data delivery services to support critical operational processes, analytical models, and machine learning applications.
- Define and build best practices and standards for federated development on the WAM-Ex data platform.
- Design consistent, connected logical and physical data models across data domains.
- Design a consistent data engineering life cycle for building data assets across WAM-Ex initiatives.
- Analyze and implement best practices in the management of enterprise data, including master data, reference data, metadata, data quality, and lineage.
- Collaborate with other engineers, architects, data scientists, analytics teams, and business product owners to develop software in an Agile development environment.
- Architect, build, and support the operation of cloud and on-premises enterprise data infrastructure and tools.
- Support the selection and integration of data-related tools, frameworks, and applications required to expand platform capabilities.
- Design and develop robust, reusable, and scalable data-driven solutions and data pipeline frameworks to automate ingestion, processing, and delivery of both unstructured batch and real-time streaming data (see the sketch after this list).
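For illustration only, here is a minimal PySpark sketch of the kind of batch ingestion, data-quality, and delivery pipeline these duties describe. The bucket paths, column names, and app name are placeholder assumptions, not details from the posting.

```python
# Minimal sketch of a batch ingest -> quality gate -> curated delivery flow.
# Paths and column names below are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("wamex_ingest_sketch").getOrCreate()

# Read a raw batch drop (placeholder path).
raw = spark.read.json("s3://example-bucket/raw/trades/")

# Basic data-quality gate: drop records missing the key, stamp lineage metadata.
clean = (
    raw.dropna(subset=["trade_id"])
       .withColumn("ingested_at", F.current_timestamp())
)

# Deliver to a curated zone in a columnar format for downstream consumers.
clean.write.mode("append").parquet("s3://example-bucket/curated/trades/")
```

In practice, a reusable pipeline framework like the one the posting describes would parameterize the source, quality rules, and target rather than hard-coding them as this sketch does.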
What you have
Job Requirements: Bachelor's degree in Computer Science, Engineering, or a related field and 60 months of progressive, post-Bachelor's experience in a related occupation. Experience must include 36 months involving the following:
- Cloud infrastructure development using Amazon Web Services (AWS), Azure, or Google Cloud Platform (GCP);
- Big data processing using Apache Spark, PySpark, Python, and SQL;
- Data warehousing using Snowflake, Amazon Redshift, or BigQuery;
- Workflow orchestration tools such as Apache Airflow (see the DAG sketch after this list);
- Cloud-native batch and real-time ETL/ELT pipelines;
- Secure access and role-based access control (RBAC) for data platforms;
- Data quality, lineage, and governance using tools or custom frameworks; and
- CI/CD and DevOps tools including Git, Bitbucket, Jenkins, Bamboo, or GitHub.
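As a hedged illustration of the Airflow orchestration requirement, here is a minimal DAG sketch. The DAG id, task names, schedule, and task bodies are assumptions, and the `schedule` argument assumes Airflow 2.4+.

```python
# Minimal two-task ELT DAG: extract, then load. All names are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull batch from source")  # placeholder for real extraction logic

def load():
    print("load into warehouse")  # placeholder for real load logic

with DAG(
    dag_id="example_elt_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load  # load runs only after extract succeeds
```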
We offer competitive pay and benefits. Starting compensation depends on related experience. Annual bonus and other eligible earnings are not included in base compensation. Benefits include: 401(k) with company match; employee stock purchase plan; paid vacation and volunteering time; a 28-day sabbatical after every 5 years of service for eligible positions; paid parental leave and family building benefits; tuition reimbursement; health, dental, and vision insurance; and a hybrid/remote work schedule for eligible positions (subject to Schwab's internal approach to workplace flexibility).
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
- Dice Id: 90989465
- Position Id: ae075b6c139bc0702710d6319278978d