Overview
Hybrid
USD 85.00 per hour
Full Time
Skills
Recruiting
ApacheBench
IBM WebSphere MQ
Computer Science
TOGAF
Apache Spark
Microsoft Power BI
Data Marts
Data Science
GitHub
Git
Version Control
Computer Networking
Authentication
PySpark
ELT
Computer Cluster Management
Microsoft
Artificial Intelligence
Data Analysis
Productivity
Requirements Analysis
Data Manipulation
Microsoft SQL Server
Database Design
Optimization
Extract
Transform
Load
Data Quality
Talend
RESTful
Microsoft Exchange
Message Queues
Apache ActiveMQ
Communication
Legacy Systems
Data Management
Enterprise Resource Planning
Data Engineering
Real-time
Predictive Modelling
Interfaces
Data Security
Analytical Skill
Management
Data Lake
Storage
ServiceNow
Geospatial Analysis
Scripting
Semantics
Reporting
Analytics
Machine Learning (ML)
Use Cases
Workflow
Python
SQL
Data Cleansing
Advanced Analytics
Databricks
Microsoft Azure
API Management
Data Governance
Meta-data Management
Regulatory Compliance
Privacy
Access Control
Encryption
Data Masking
Collaboration
Scalability
IT Management
Mentorship
Cloud Computing
Data Architecture
Job Details
Date Posted: 10/23/2025
Hiring Organization: Rose International
Position Number: 490413
Industry: Government
Job Title: Senior Data Architect
Job Location: Edmonton, AB, Canada, T5K 2J5
Work Model: Hybrid
Work Model Details: Primarily remote with onsite meetings
Shift: 08:15 to 16:30 Alberta time, Monday through Friday
Employment Type: Temporary
FT/PT: Full-Time
Estimated Duration (In months): 5
Min Hourly Rate($): 85.00
Max Hourly Rate($): 100.00
Must Have Skills/Attributes: Azure, Azure Synapse, Cluster, Data Architecture, Data Management, Databricks, ETL, Power BI, PySpark, Python, SQL
Experience Desired: Using AI for code generation, data analysis, automation, and data engineering workflows (1 yr); Business requirement analysis related to data manipulation/transformation, cleansing, and wrangling (8 yrs); Building scalable ETL pipelines, data quality enforcement, and cloud integration using Talend (2 yrs); Data governance, security, and metadata management within a Databricks-based platform (2 yrs); MQ technologies, implementing message queuing using tools like ActiveMQ and Service Bus (3 yrs); ServiceNow and Azure-based Data Management Platform integrations (1 yr)
Required Minimum Education: Bachelor's Degree
Preferred Certifications/Licenses: The Open Group Architecture Framework (TOGAF)
**C2C is not available**
Job Description
*** Only qualified Senior Data Architect candidates currently located in the Edmonton, Alberta area will be considered, as the position requires an onsite presence ***
Required Education:
A college or bachelor's degree in computer science or a related field of study
Preferred certifications:
Certification in The Open Group Architecture Framework (TOGAF)
Required Skills:
Hands-on experience managing Databricks workspaces, including cluster configuration, user roles, permissions, cluster policies, and applying monitoring and cost optimization for efficient, governed Spark workloads (3 Years)
Experience as a Data Architect in a large enterprise, designing and implementing data architecture strategies and models that align data, technology, and business goals with strategic objectives (8 Years)
Experience designing data solutions for analytics-ready, trusted datasets using tools like Power BI and Synapse, including semantic layers, data marts, and data products for self-service, data science, and reporting (4 Years)
Experience in GitHub/Git for version control, collaborative development, code management, and integration with data engineering workflows (4 Years)
Experience with Azure services (Storage, SQL, Synapse, networking) for scalable, secure solutions, and with authentication (Service Principals, Managed Identities) for secure access in pipelines and integrations (5 Years)
Experience in Python (including PySpark) and SQL, applied to developing, orchestrating, and optimizing enterprise-grade ETL/ELT workflows in a large-scale cloud environment (6 Years)
Experience building scalable data pipelines with Azure Databricks, Delta Lake, Workflows, Jobs, and Notebooks, plus cluster management. Extending solutions to Synapse Analytics and Microsoft Fabric is a plus (3 Years)
Preferred Skills:
Experience using AI for code generation, data analysis, automation, and enhancing productivity in data engineering workflows (1 Year)
Direct, hands-on experience performing business requirement analysis related to data manipulation/transformation, cleansing, and wrangling (8 Years)
Experience and strong technical knowledge of Microsoft SQL Server, including database design, optimization, and administration in enterprise environments (8 Years)
Experience building scalable ETL pipelines, data quality enforcement, and cloud integration using Talend technologies (2 Years)
Experience in data governance, security, and metadata management within a Databricks-based platform (2 Years)
Skilled in building secure, scalable RESTful APIs for data exchange, with robust authentication, error handling, and support for real-time automation (3 Years)
Experience in message queuing technologies, implementing message queuing using tools like ActiveMQ and Service Bus for scalable, asynchronous communication across distributed systems (3 Years)
Experience working with cross-functional teams to create software applications and data products (5 Years)
Experience working with ServiceNow and Azure-based Data Management Platform integrations (1 Year)
Our client's modernization initiatives are shifting from legacy systems to a cloud-native Azure Data Management Platform, operating alongside on-premises geospatial systems. This transformation requires a Data Architect to design, implement, and manage scalable, secure, and integrated data solutions.

Client departments rely on complex data from systems like ServiceNow, ERP platforms, and geospatial tools. The Data Architect will enable seamless ingestion, transformation, and integration of this data using Azure services, including Data Factory, Synapse Analytics, Data Lake Storage, and Purview. Azure Databricks will support advanced data engineering, analytics, and machine learning workflows, and the Data Architect will ensure that data pipelines are optimized for both batch and real-time processing, supporting operational reporting, predictive modeling, and automation.

Downstream systems will consume data via APIs and data services. The Data Architect will design and manage these interfaces using Azure API Management, ensuring secure, governed, and scalable access to data.

Security, governance, and compliance are critical. The Data Architect will implement role-based access controls, encryption, data masking, and metadata management to meet FOIP and other regulatory requirements. As data volumes and complexity grow, the Data Architect will ensure the platform remains extensible, reliable, and future-ready, supporting new data sources, ministries, and analytical capabilities.
Job Duties:
Design and implement scalable, secure, and high-performance data architecture on Microsoft Azure, supporting both cloud-native and hybrid environments
Lead the development of data ingestion, transformation, and integration pipelines using Azure Data Factory, Azure Databricks, and Azure Synapse Analytics
Architect and manage data lakes and structured storage solutions using Azure Data Lake Storage Gen2, ensuring efficient access and governance
Integrate data from diverse source systems, including ServiceNow and geospatial systems, using APIs, connectors, and custom scripts
Develop and maintain robust data models and semantic layers to support operational reporting, analytics, and machine learning use cases
Build and optimize data workflows using Python and SQL for data cleansing, enrichment, and advanced analytics within Azure Databricks
Design and expose secure data services and APIs using Azure API Management for downstream systems
Implement data governance practices, including metadata management, data classification, and lineage tracking
Ensure compliance with privacy and regulatory standards (e.g., FOIP, GDPR) through role-based access controls, encryption, and data masking
Collaborate with cross-functional teams to align data architecture with business requirements, program timelines, and modernization goals
Monitor and troubleshoot data pipelines and integrations, ensuring reliability, scalability, and performance across the platform
Provide technical leadership and mentorship to data engineers and analysts, promoting best practices in cloud data architecture and development
Other duties as needed
Benefits:
For information and details on employment benefits offered with this position, please visit here. Should you have any questions/concerns, please contact our HR Department via our secure website.
California Pay Equity:
For information and details on pay equity laws in California, please visit the State of California Department of Industrial Relations' website here.
Rose International is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, age, sex, sexual orientation, gender (expression or identity), national origin, arrest and conviction records, disability, veteran status or any other characteristic protected by law. Positions located in San Francisco and Los Angeles, California will be administered in accordance with their respective Fair Chance Ordinances.
If you need assistance in completing this application, or during any phase of the application, interview, hiring, or employment process, whether due to a disability or otherwise, please contact our HR Department.
Rose International has an official agreement (ID #132522), effective June 30, 2008, with the U.S. Department of Homeland Security, U.S. Citizenship and Immigration Services, Employment Verification Program (E-Verify). (Posting required by O.C.G.A. § 13-10-91.)
- **Only those lawfully authorized to work in the designated country associated with the position will be considered.**
- **Please note that all Position start dates and duration are estimates and may be reduced or lengthened based upon a client's business needs and requirements.**
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.