Job Description Summary
Manages a team of system analysts who identify problems and design solutions involving existing computer systems and networks, relating specifically to data management and/or security.
Job Description
TriState Capital Bank is an independent chartered bank subsidiary of Raymond James. Headquartered in Pittsburgh, PA, TriState Capital Bank provides premier private banking, commercial banking, and treasury management products and services to corporate, institutional, and high-net-worth (HNW) clients.
Summary of the Position:
The Data Technology & Engineering Manager will lead the delivery of all data technology-related solutions within an Agile framework, ensuring alignment with business priorities defined by the Product Owner. This role is responsible for building and leading a cloud-native data platform and engineering practice that delivers trusted, governed, production-grade datasets for analytics, AI, regulatory reporting, and partner integrations. This is a player-coach role: hands-on enough to design and review code and pipelines, while also setting strategy, roadmap, and talent standards.
Primary Functions of the Position:
- Agile Delivery Leadership: Act as the delivery lead for data engineering initiatives, working closely with the Product Owner to refine backlog items, prioritize work, and ensure timely delivery of features that meet business objectives.
- Platform Stewardship: Serve as the guardian of the organization's Azure Data Lake platform, leveraging Data Lake and Blob storage within a medallion architecture to enable efficient data storage and processing.
- Team Coordination and Enablement: Collaborate with cross-functional teams of developers, data engineers, reporting analysts, and data governance specialists to design, build, and continuously improve data pipelines, integration processes, and reporting solutions. Translate governance standards into code and controls (DQ rules, glossary links, lineage harvesting, RBAC/ABAC tagging); provide evidence for certification.
- Master Data Management (MDM) & Distribution: Implement Lean MDM in the Lakehouse for the Customer and Account domains: entity resolution (deterministic and probabilistic), survivorship rules, and auditability. Publish Golden Records through APIM/APIs, via reverse ETL to analytics/reporting platforms, and to feature stores for AI/ML; synchronize with CRM/LOS systems.
- Delivery & Operations: Run Agile delivery: backlog prioritization, release cadence, and "definition of done" anchored in governance gates and production SLAs. Establish DataOps/SRE: end-to-end monitoring, runbooks, on-call rotations, capacity planning, RCA/postmortems, and continuous improvement.
- Self-Service Enablement: Drive initiatives that empower business users through self-service analytics tools such as Power BI Cloud, ensuring data accessibility and usability across the enterprise.
- Continuous Improvement: Promote best practices in data engineering, including automation, performance optimization, and adherence to security and compliance standards.
- Stakeholder Engagement: Act as a liaison between technical teams and business stakeholders, ensuring transparency, managing dependencies, and communicating progress effectively.
Essential Skills and Abilities:
- Must have strong analytical skills, with the ability to assemble and interpret data, create executive summaries, and deliver actionable business insights.
- Deep experience with the Azure data stack (Data Lake Storage, Databricks/Fabric, ADF/Synapse) and enterprise SQL Server tooling (SSIS/SSRS/SSAS).
- Strong programming skills in Python, Scala, and/or SQL; expertise in Delta Lake, schema evolution, and orchestration.
- Proven delivery of governed pipelines, data quality (DQ) frameworks, metadata and lineage (Purview), and Bronze/Silver/Gold certification workflows.
- Experience implementing MDM/Golden Records (match/merge, survivorship, audit fields) and distributing via APIs/APIM and analytics tools.
- CI/CD (GitHub/Azure DevOps), Infrastructure-as-Code (Terraform/Bicep), and DataOps/SRE practices.
- Must be self-motivated with the ability to manage tight deadlines and ever-changing priorities.
- Strong business communication, relationship management and negotiation skills.
- Excellent problem-solving skills, strong attention to detail, and the ability to work well in a team environment.
- Strong business requirements-gathering skills.
Education and Experience Requirements:
- 10-15 years in data engineering/platform roles; 5+ years leading teams as a hands-on manager/architect.
- Financial services or regulated industry background; familiarity with privacy, retention, access controls, and audit requirements.
Education
Bachelor's: Computer and Information Science; Bachelor's: Information Technology
Work Experience
General Experience - 13 months to 3 years; Manager Experience - 7 to 12 months
Workstyle
Resident
At Raymond James, our associates use five guiding behaviors (Develop, Collaborate, Decide, Deliver, Improve) to deliver on the firm's core values of client-first, integrity, independence, and a conservative, long-term view.
We expect our associates at all levels to:
Grow professionally and inspire others to do the same
Work with and through others to achieve desired outcomes
Make prompt, pragmatic choices and act with the client in mind
Take ownership and hold themselves and others accountable for delivering results that matter
Contribute to the continuous evolution of the firm
At Raymond James, as part of our people-first culture, we honor, value, and respect the uniqueness, experiences, and backgrounds of all of our associates. When associates bring their best authentic selves, our organization, clients, and communities thrive. The Company is an equal opportunity employer and makes all employment decisions on the basis of merit and business needs.