Job Details
Senior Data Architect - Long Term Project - Burbank, CA (Hybrid)
Title: Senior Data Architect
Location: Burbank, CA (Hybrid)
Duration: 12+ months (long-term project)
Compensation: $100-112.81/hr.
Work Requirements: Authorized to Work in the U.S.
Job Description:
As part of the Studio Economics transformation, we are establishing a modern, governed, and reusable data foundation to power financial forecasting, title economics, sales planning, and AI-driven insights across the Studios.
The Senior Data Architect plays a critical role in designing this foundation, ensuring that data models, pipelines, and integration frameworks are scalable, performant, and aligned with enterprise data governance and platform goals.
Embedded within the Platform Pod, the Senior Data Architect partners with the Platform Owner, Cloud Architect, data engineering teams, and product-aligned pods to ensure the architecture supports both immediate product needs and long-term platform evolution. This role directly enables the delivery of reliable, real-time, and reusable data products across multiple Studio Economics workstreams.
Typical Day in the Role
Design Scalable and Consistent Data Architecture
* Define and maintain canonical data models, entity relationships, and semantic layer specifications that ensure consistent use of data across products and domains.
* Develop and evolve logical and physical data models that support real-time analytics, forecasting, and scenario planning.
* Collaborate with product-aligned pods to design domain-aligned data products that are modular, governed, and discoverable.
Build Reusable, Performant Data Pipelines
* Architect data pipelines that support both batch and near real-time processing using AWS-native services (e.g., Glue, Kinesis, Lambda, Step Functions).
* Guide ingestion, transformation, and enrichment strategies that optimize for resilience, scalability, and lineage traceability.
* Work closely with the Cloud Architect to ensure that infrastructure and orchestration layers meet pipeline and data SLAs.
Embed Governance and Stewardship by Design
* Partner with enterprise data governance teams to implement standardized metadata, lineage, and access controls using tools such as Lake Formation, Informatica, or Snowflake.
* Define data quality rules, validation checkpoints, and anomaly detection processes to support trusted analytics and ML pipelines.
* Contribute to the enterprise data catalog and enable self-service access through secure, well-documented APIs and schemas.
Collaborate Across Platform and Product Pods
* Work with the Platform Owner to define and deliver shared data services and reusable semantic models that support multi-pod alignment.
* Support data scientists and analysts by enabling ML/AI-ready data pipelines and structuring data to accelerate model development and deployment.
* Participate in cross-pod architecture planning to coordinate integration strategies, resolve semantic conflicts, and align on domain boundaries.
Must Have Skills / Requirements
- Deep Data Architecture Experience: 7+ years of experience in data architecture and engineering, with a focus on cloud-native data platforms and modern analytics workflows; experience designing and delivering data architecture for cloud-based platforms, with strong knowledge of AWS (e.g., Glue, Lambda, Step Functions, Lake Formation) and modern tooling (e.g., Snowflake, Databricks, Informatica).
- Hands-On Pipeline Design and Orchestration: 7+ years of experience architecting and optimizing complex batch and streaming pipelines that are performant, resilient, and traceable, using orchestration frameworks that support real-time and ML/AI-ready processing.
- Expertise in Canonical Modeling and Semantic Design: 7+ years of experience designing scalable, reusable canonical and semantic data models and translating them into physical implementations that align with business domains and analytic use cases.
Functional Knowledge / Skills in the following areas:
You'll thrive in this role if you naturally:
- Think in Domains and Products
- You understand that good data architecture starts with clear business semantics, and you design models that reflect the real-world entities behind Studio workflows.
- Bridge the Gap Between Models and Platforms
- You work fluidly across logical design, physical deployment, and infrastructure orchestration, partnering with Cloud Architects and Engineers to bring your models to life.
- Govern Through Enablement
- You ensure compliance, lineage, and quality by embedding governance directly into design, making the right path the easy path for product and engineering teams.
- Build for Reuse and Interoperability
- You optimize for consistency and extensibility, designing assets and APIs that can be adopted across products, use cases, and future data science workflows.
- Promote Transparency and Stewardship
- You advocate for shared ownership of data definitions and practices and help business and technical stakeholders understand the value of consistency and quality.
You're likely a fit for this role if you bring:
- Cross-Functional Collaboration
- Strong ability to work across disciplines, including engineering, analytics, product, and compliance, and to communicate design decisions in a way that drives alignment and adoption.
- Bias for Structure and Clarity
- You drive resolution of semantic conflicts, minimize redundancy, and create architectural clarity that simplifies downstream implementation.
- Excellent collaboration and communication skills, with the ability to facilitate cross-functional alignment and translate architectural decisions across technical and business audiences.
Technology Requirements:
1. Core Technical Qualifications:
a. Deep Data Architecture Experience: designing and delivering data architecture for cloud-based platforms, with strong knowledge of AWS (e.g., Glue, Lambda, Step Functions, Lake Formation) and modern tooling (e.g., Snowflake, Databricks, Informatica).
b. Expertise in Canonical Modeling and Semantic Design: proven ability to design scalable, reusable canonical and semantic data models and translate them into physical implementations that align with business domains and analytic use cases.
c. Hands-On Pipeline Design and Orchestration: experience architecting and optimizing complex batch and streaming pipelines that are performant, resilient, and traceable, using orchestration frameworks that support real-time and ML/AI-ready processing.
d. Governance and Metadata Awareness: familiarity with data governance frameworks and practices, including stewardship, data quality rules, lineage tracking, access controls, cataloging, and enterprise metadata management.
2. Other Qualifications:
a. Ability to partner with platform and cloud engineering teams to ensure infrastructure and orchestration layers support data reliability, scalability, and SLAs.
b. Strong understanding of data product design principles and the ability to develop modular, reusable data services that support multiple products and delivery pods.
c. Experience contributing to or maintaining an enterprise data catalog and enabling self-service access through secure, well-documented APIs.
Our benefits package includes:
- Comprehensive medical benefits
- Competitive pay
- 401(k) retirement plan
- and much more!
About INSPYR Solutions
Technology is our focus and quality is our commitment. As a national expert in delivering flexible technology and talent solutions, we strategically align industry and technical expertise with our clients' business objectives and cultural needs. Our solutions are tailored to each client and include a wide variety of professional services, project, and talent solutions. By always striving for excellence and focusing on the human aspect of our business, we work seamlessly with our talent and clients to match the right solutions to the right opportunities. Learn more about us at inspyrsolutions.com.
INSPYR Solutions provides Equal Employment Opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability, or genetics. In addition to federal law requirements, INSPYR Solutions complies with applicable state and local laws governing nondiscrimination in employment in every location in which the company has facilities.