Role: Business/Data Analyst (Wealth Management Domain)
Location: Austin, TX
Employment Type: Full-Time, no C2C
Position Overview
We are seeking a highly skilled Data Analyst (Wealth Management) to join our growing team in Austin. This is a discovery- and analysis-driven role for a curious, detail-oriented professional who thrives on understanding complex financial data, translating business needs into clear data logic, and surfacing insights that drive decisions.
The ideal candidate excels at writing sophisticated SQL queries, analyzing and profiling large datasets, defining business logic, and validating data across wealth management systems. You will partner closely with investment teams, operations, technology, and business stakeholders to understand functional requirements and ensure data is accurate, consistent, and fit for purpose. Hands-on experience with Python and Databricks is a plus, but this role is fundamentally about analytical depth and business understanding - not pipeline engineering.
Key Responsibilities
Data Discovery & Profiling
Explore and profile large, complex financial datasets to understand structure, lineage, gaps, and anomalies across custodian, portfolio, and transaction data.
Identify data relationships, patterns, and inconsistencies across source systems to inform data mapping, transformation logic, and business rules.
Conduct deep-dive analysis on wealth management data - including positions, returns, benchmarks, fees, and cash flows - to validate completeness and accuracy.
Document data dictionaries, field definitions, and business logic for use by both technical and non-technical teams.
Investigate data quality issues end-to-end, trace root causes across source systems, and recommend remediation approaches.
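To give a concrete flavor of this profiling work, the sketch below shows the kind of ad hoc checks the role involves, done here in pandas. The dataset and column names (account_id, quantity, market_value) are illustrative only, not an actual feed layout:

```python
import pandas as pd

# Hypothetical positions extract; columns and values are illustrative only.
positions = pd.DataFrame({
    "account_id":   ["A1", "A1", "A2", "A3", "A3"],
    "symbol":       ["AAPL", "MSFT", "AAPL", "GOOG", "GOOG"],
    "quantity":     [100, 50, None, 200, 200],
    "market_value": [19000.0, 21000.0, 3800.0, -150.0, -150.0],
})

# Profile the extract: nulls per column, exact duplicate rows,
# and values outside the expected range.
null_counts = positions.isna().sum()
dupes = positions.duplicated().sum()
negative_mv = positions[positions["market_value"] < 0]

print(null_counts["quantity"])  # positions with a missing quantity
print(dupes)                    # exact duplicate rows to investigate
print(len(negative_mv))         # positions with negative market value
```

In practice each finding (missing quantities, duplicated rows, negative market values) would be traced back to the source system and documented as a data quality issue with a recommended remediation.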
Requirements Analysis & Business Logic
Engage directly with business stakeholders - advisors, portfolio managers, operations, and compliance - to gather, analyze, and document functional data requirements.
Translate business requirements into precise data logic, transformation rules, and acceptance criteria for downstream development and reporting.
Define and formalize calculation logic for KPIs such as AUM, performance returns, fee schedules, and client segmentation.
Review and validate business logic implemented in pipelines, data models, and reports to ensure alignment with requirements.
Act as a bridge between business teams and technology, ensuring data solutions are grounded in real operational needs.
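Formalizing calculation logic means pinning down a precise, reviewable formula. As a sketch, one common convention for a single-period return with an intra-period cash flow is the Simple Dietz method (flow weighted at mid-period); the firm's actual return methodology may differ and would be documented as part of this role:

```python
def simple_dietz_return(bmv: float, emv: float, net_flow: float) -> float:
    """Single-period Simple Dietz return.

    bmv: beginning market value; emv: ending market value;
    net_flow: net contributions minus withdrawals, assumed to
    occur at the midpoint of the period (hence the 0.5 weight).
    """
    return (emv - bmv - net_flow) / (bmv + 0.5 * net_flow)

# Example: $100k start, $112k end, $5k contribution during the period.
r = simple_dietz_return(100_000, 112_000, 5_000)
print(round(r, 4))  # 0.0683
```

Writing the rule down this explicitly is what lets downstream developers implement it once and lets validation confirm the implemented number matches the agreed definition.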
Query Development & Pipeline Validation
Write complex SQL queries - including CTEs, window functions, and aggregations - to analyze datasets, build reusable logic, and support reporting and validation needs.
Validate pipeline outputs by querying source and target systems, reconciling counts, amounts, and key metrics to confirm data integrity.
Develop test cases and validation scripts to verify transformation logic, business rules, and data completeness after pipeline runs.
Use Python and/or Databricks notebooks for ad hoc data analysis, profiling, and validation where scale or complexity requires it.
Collaborate with engineering teams to review transformation logic, flag discrepancies, and verify that implemented pipelines match documented requirements.
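The source-vs-target reconciliation described above can be sketched in pandas. The extracts, key (txn_id), and amounts below are hypothetical; real reconciliations would query the actual source and target systems:

```python
import pandas as pd

# Hypothetical source and target transaction extracts (keys/values illustrative).
source = pd.DataFrame({"txn_id": [1, 2, 3, 4], "amount": [100.0, 250.0, -75.0, 60.0]})
target = pd.DataFrame({"txn_id": [1, 2, 4],    "amount": [100.0, 250.0, 60.5]})

# High-level reconciliation: row counts and control totals.
count_diff = len(source) - len(target)
amount_diff = source["amount"].sum() - target["amount"].sum()

# Row-level reconciliation: rows dropped in the load, and rows
# loaded with a different amount than the source.
merged = source.merge(target, on="txn_id", how="outer",
                      suffixes=("_src", "_tgt"), indicator=True)
dropped = merged[merged["_merge"] == "left_only"]
mismatched = merged[(merged["_merge"] == "both") &
                    (merged["amount_src"] != merged["amount_tgt"])]

print(count_diff)       # rows missing from the target
print(len(dropped))     # which txn_ids were dropped
print(len(mismatched))  # rows loaded with a changed amount
```

Each discrepancy surfaced this way (a dropped row, a changed amount) becomes a test case or defect to review with the engineering team against the documented transformation rules.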
Reporting & Insights
Develop and maintain dashboards, reports, and KPI frameworks to support advisors, portfolio managers, and leadership.
Support client segmentation, performance reporting, AUM analysis, and investment strategy analysis.
Translate complex financial data findings into clear, concise narratives and recommendations for non-technical audiences.
Ensure all reporting outputs comply with financial regulations and internal data governance standards.
Required Skills & Qualifications
Bachelor's or Master's degree in Finance, Data Science, Business Analytics, or related field.
5+ years of experience in a data analyst role within wealth management, asset management, or financial services.
Expert-level SQL skills - complex multi-table joins, CTEs, window functions, subqueries, and analytical query design.
Strong ability to gather and analyze functional requirements from business stakeholders and translate them into data logic and acceptance criteria.
Proven experience with data discovery and profiling - understanding data structures, identifying quality issues, and documenting findings clearly.
Experience validating data pipelines or ETL outputs - reconciling source vs. target data, verifying business logic, and writing test cases.
Solid understanding of wealth management data - custodian feeds, portfolio holdings, performance returns, AUM, fees, and transactions.
Proficiency with Python for data analysis and ad hoc exploration (pandas, numpy); PySpark experience is a plus.
Familiarity with Databricks or similar cloud data platforms for querying and analyzing large datasets.
Understanding of data governance, data quality frameworks, and regulatory compliance in financial services.
Excellent communication and stakeholder management skills - comfortable presenting findings to both technical and business audiences.
Preferred Qualifications
Hands-on experience with PySpark or Databricks (Delta Lake, Spark SQL, notebooks) for large-scale data processing.
Experience building or contributing to data pipelines, ETL processes, or workflow automation in a financial services context.
Exposure to custodian data formats and feeds (Schwab, Pershing, Fidelity, etc.) and reconciliation processes.
Experience with wealth management or portfolio management platforms such as Addepar, Orion, or Black Diamond.
Familiarity with cloud data platforms such as AWS, Azure, or Snowflake.
Knowledge of predictive analytics or basic ML applications in financial services (e.g., client segmentation, risk modeling).
Certifications in data analytics, financial analysis (CFA, CIPM), or cloud platforms are a plus.