Job Details
Role/Title: Data and Analytics Lead
Location: Research Triangle Park, Raleigh, NC
Duration: 12+ Months
Special notes:
- The resource's primary focus will be to build data models for customer proposals.
Job Description:
We're seeking a Master Data & Analytics Lead to design and operationalize a unified data framework that integrates financial, operational, and technical data across multiple systems. This role sits within the Global Value Management (GVM) team, an organization focused on developing proactive, data-driven proposals and value propositions that help customers understand the business impact of our solutions.
You'll build the analytical foundation that powers GVM engagements, linking data insights to customer performance, opportunity sizing, and value realization. The ideal candidate combines strong data architecture skills with quantitative analysis.
Key Responsibilities:
- Data Model and Architecture:
- Define a master data model spanning financial, operational, and technical dimensions, including relationships and dependencies (an illustrative sketch follows this responsibilities list).
- Collaborate with Sales Operations to determine the right platform and integration architecture.
- Source and align data from multiple systems: Customer, Competitor, GVM Engagements, HGInsights, Marketing Ops (6Sense, Leadspace, Gartner, IDC), AlphaSense, and Snowflake.
- Develop a data confidence scoring model (validated, inferred, assumed) and processes for maintenance, expiry, and refresh.
- Analytics and Insight Generation:
- Build relational data sets linking metrics such as $/TB and FTE/TB.
- Produce benchmarks, quartiles, and regression analyses to uncover performance drivers across cost, efficiency, and technical spread.
- Design outputs that highlight best-in-class performance by vertical or environment (Cloud vs On-Prem).
- Create searchable internal indices for GVM use cases (e.g., where similar takeouts or use cases exist).
- Deliver insight models that validate assumptions, expose trends, and inform customer recommendations.
- Customer and Opportunity Modeling:
- Correlate customer data against the master model to assess confidence and identify gaps.
- Use analytics to infer likely ranges for missing data and map customers to best-in-class benchmarks.
- Load validated data into business case models to inform account planning and opportunity prioritization.
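For illustration only, the sketch below shows one possible way to represent a master data model that links financial, operational, and technical dimensions back to a customer and derives simple cross-dimension metrics such as $/TB and FTE/TB. Every entity, field, and metric name here is hypothetical, not a prescribed schema.

```python
from dataclasses import dataclass

# Hypothetical master-data entities spanning the three dimensions.
# All names and fields are illustrative only.

@dataclass
class Customer:
    customer_id: str
    vertical: str          # e.g. "Healthcare", "Telecom"
    environment: str       # e.g. "Cloud" or "On-Prem"

@dataclass
class TechnicalProfile:
    customer_id: str       # relationship back to Customer
    capacity_tb: float     # installed capacity in terabytes
    utilization_pct: float

@dataclass
class OperationalProfile:
    customer_id: str
    ops_fte: float         # operations headcount supporting the environment

@dataclass
class FinancialProfile:
    customer_id: str
    annual_cost_usd: float

def derived_indices(tech: TechnicalProfile,
                    ops: OperationalProfile,
                    fin: FinancialProfile) -> dict:
    """Simple derived metrics that link the dimensions ($/TB, FTE/TB)."""
    return {
        "usd_per_tb": fin.annual_cost_usd / tech.capacity_tb,
        "fte_per_tb": ops.ops_fte / tech.capacity_tb,
    }
```

In practice these relationships would live on whatever platform is chosen with Sales Operations (for example, Snowflake tables); the point is only that each dimension keys back to the customer and the cross-dimension metrics fall out of the joins.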
- What Success Looks Like:
- A reliable, scalable master data framework that informs GVM and customer strategy and serves as a single source of truth.
- Automated confidence scoring and refresh processes.
- Analytical insights that guide opportunity sizing and customer value realization.
- Benchmarking frameworks that inform strategic decisions and account planning.
- A foundation for evidence-based, data-driven customer proposals.
Experience:
- 3+ years of experience in data analytics, data modeling, or quantitative analysis, ideally in a B2B or enterprise technology environment.
- Proficiency in SQL, Python/R, and BI tools (Tableau, Power BI, or similar).
- Experience designing data models and pipelines across multi-source systems (CRM, Marketing Ops, Financial Systems).
- Strong understanding of data governance, normalization, and confidence scoring techniques.
- Proven ability to synthesize large, complex datasets into actionable insights.
- Excellent collaboration skills with cross-functional teams including Sales, Finance, and Operations.
Preferred Experience:
- Background in enterprise data architecture or quantitative strategy consulting.
- Familiarity with Snowflake, Salesforce, and marketing intelligence or industry insights platforms (6Sense, Leadspace, HGInsights, etc.).
- Experience with regression modeling, clustering, and multivariate analytics.
- Understanding of financial modeling, cost analysis, and performance benchmarking.
Master Data Role:
Key things we want to achieve:
- Collation:
- Define a data model of key types (tables) & data features (attributes) we want to collect, which span financial, operational, and technical scope.
- Define the data dependencies and relationships between them.
- Partner with Sales Ops to figure out what platform this should be built on.
- Source the multiple potential data records:
- Customer
- Competitor
- Account team
- GVM (Engagements)
- HGInsights
- Marketing Ops data (6Sense, Leadspace, Gartner, IDC, etc.)
- Past Value and Install Base (Skyline)
- Win/Loss
- AlphaSense
- Other data held in Snowflake
- Map data sources to the master data model
- Define and build a confidence factor rating on the data (customer validated vs market data vs assumption); see the sketch after this Collation list.
- Define a model to maintain and update data sources.
- Set expiry dates on data.
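A minimal sketch of how the confidence factor and expiry rules might be represented follows. The tier names mirror the customer-validated / market-data / assumption distinction above, while the shelf-life values, field names, and source labels are assumptions chosen purely for illustration.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from enum import Enum

class Confidence(Enum):
    VALIDATED = 3   # confirmed directly with the customer
    INFERRED = 2    # derived from market / third-party data
    ASSUMED = 1     # internal assumption pending validation

# Hypothetical shelf life per tier, used to drive refresh cycles.
EXPIRY = {
    Confidence.VALIDATED: timedelta(days=365),
    Confidence.INFERRED: timedelta(days=180),
    Confidence.ASSUMED: timedelta(days=90),
}

@dataclass
class DataPoint:
    source: str             # e.g. "HGInsights", "GVM engagement"
    attribute: str          # e.g. "capacity_tb"
    value: float
    confidence: Confidence
    as_of: date

    def needs_refresh(self, today: date) -> bool:
        """True once the record is past the shelf life for its confidence tier."""
        return today - self.as_of > EXPIRY[self.confidence]

# Example: an inferred market data point recorded in early 2024.
point = DataPoint("HGInsights", "capacity_tb", 1200.0, Confidence.INFERRED, date(2024, 1, 15))
print(point.needs_refresh(date.today()))
```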
- Analytics:
- Build relational tables that index data sources based on their dependencies (simple examples: $/TB, FTE/TB)
- Produce quartiles and ranges of confidence factors based on indexed data sources
- Design insight reports that correlate attributes to performance, for example:
- Overall cost related to technical spread (size, utilization, etc.) and operational costs/performance
- Operational efficiency to cost
- Technical debt to cost and operational cost/performance
- Cloud vs On-Prem
- Locations / FTEs per scope or location
- Design output to show best-in-class operating, technical and financial performance attributes. Allow variation by vertical industry.
- Searchable internal index for GVM team (i.e. where else have we done a PowerFlex takeout?)
- Referenceable index (i.e. $/TB, FTE/TB in a given industry based on what we've seen before)
- Benchmarking to give a range and comparison from what the various data sets are telling us: are the assumptions we have reasonable?
- Regression analysis: trends in the data that we haven't thought about; is there something hidden we're not seeing? (e.g., as one assumption goes up, something else rises as well, or moves in the opposite direction). A small analysis sketch follows below.
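As one possible illustration of the quartile and regression ideas above, the sketch below uses pandas and numpy on a tiny made-up data set; the column names and values are invented solely to show the mechanics and are not real benchmarks.

```python
import numpy as np
import pandas as pd

# Tiny invented data set: one row per past engagement, using the simple
# indices described above. Values are illustrative, not real benchmarks.
df = pd.DataFrame({
    "usd_per_tb":      [310.0, 275.0, 420.0, 198.0, 350.0, 260.0],
    "fte_per_tb":      [0.020, 0.015, 0.034, 0.011, 0.027, 0.018],
    "utilization_pct": [62.0, 71.0, 48.0, 80.0, 55.0, 68.0],
})

# Quartile benchmarks: where does a given customer sit within the range?
print(df["usd_per_tb"].quantile([0.25, 0.5, 0.75]))

# Simple one-variable regression: does utilization help explain cost per TB?
slope, intercept = np.polyfit(df["utilization_pct"], df["usd_per_tb"], deg=1)
print(f"usd_per_tb ~= {slope:.2f} * utilization_pct + {intercept:.2f}")

# Correlation matrix to surface relationships we may not have considered.
print(df.corr())
```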
- Customer Opportunity Sizing, Validation and Testing:
- Map data that we have about the customer (direct or indirect).
- Produce analytics to validate customer data against master data and map it to confidence factors (i.e. whether what we believe we know about size and cost is in the range of the master data sources).
- Complete gaps with assumptions: use analytics to recommend the likely range of values based on master-record data dependencies (e.g. with 5 FTE in Operations, costs will be in range X-Y); see the sketch at the end of this list.
- Produce output analytics to map the customer to best-in-class model.
- Load financials into the as-is business case model.
- Use this to inform account planning: what do we think the opportunity is? Use it as a fishing exercise to identify high-potential opportunities.
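Finally, a small sketch of the gap-filling idea: recommending a likely range from comparable master records and treating the result as an "assumed" data point until it is validated with the customer. The data, function name, and tolerance below are hypothetical.

```python
import pandas as pd

# Invented master records: operations FTE vs. annual operations cost across
# past engagements. Purely illustrative values.
master = pd.DataFrame({
    "ops_fte":             [3, 5, 5, 6, 8, 10],
    "annual_ops_cost_usd": [420_000, 640_000, 700_000, 810_000, 1_050_000, 1_300_000],
})

def likely_cost_range(ops_fte: int, tolerance: int = 1):
    """Recommend a likely cost range for a customer with a known FTE count,
    based on comparable master records; the result would carry 'assumed'
    confidence until validated with the customer."""
    peers = master[(master["ops_fte"] - ops_fte).abs() <= tolerance]
    if peers.empty:
        return None
    return peers["annual_ops_cost_usd"].min(), peers["annual_ops_cost_usd"].max()

# e.g. 5 FTE in Operations -> costs likely fall in this range
print(likely_cost_range(ops_fte=5))
```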
About AgreeYa:
AgreeYa is a global systems integrator delivering a competitive advantage for its customers through software, solutions, and services. Established in 1999, AgreeYa is headquartered in Folsom, California, with a global footprint and a team of more than 1,800 professionals across its offices. AgreeYa works with 550+ organizations ranging from Fortune 100 firms to small and large businesses across industries such as Telecom, Banking, Financial Services & Insurance, Healthcare, Utility & Energy, Technology, Public Sector, Pharma & Biotech, Retail, and others. Please visit us at for more information.
Equal Opportunity:
AgreeYa is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, gender identity, sexual orientation, national origin, disability, veteran status or other protected characteristics. Visit our website at to learn about our Career & Culture.