Job Details
Job Location: 100% remote (contractor uses their own computer)
Travel Required: No
Overtime Required: No
Position Type: Contract
Start Date: ASAP, as early as Monday 8/18
Duration: 40 hours per week; 10-week engagement ending 10/10, with potential to extend into further projects and maintenance phases
Rate: $65/hr C2C
Job Description:
We are seeking a highly skilled Data Warehouse Engineer or Architect to design, implement, and optimize enterprise-scale data solutions on Azure.
This role combines strong technical expertise with a deep understanding of business needs, ensuring that data is structured, stored, and exposed in ways that drive meaningful insights.
The project goal/deliverable is a data modernization initiative for our client.
We are working with data from their industry-specific ERP, which doesn't have the functionality they'd like in order to meet business requirements and reporting needs.
Key Responsibilities:
Design, develop, and maintain scalable data warehouse architectures aligned with enterprise needs.
Build and manage ETL/ELT processes using Azure Data Factory.
Write and maintain automation scripts using PowerShell and Python.
Develop, optimize, and maintain Azure SQL databases.
Implement and manage CI/CD pipelines for database code deployment.
Leverage AI-powered tools for code analysis, optimization, and large-scale data migration projects.
Collaborate with business stakeholders to understand reporting and analytics needs, mapping requirements to data storage solutions.
Ensure secure, high-performance, and cost-effective data infrastructure.
Partner with BI and analytics teams to expose curated data sets for reports, dashboards, and advanced analytics.
Required Technical Skills:
Proven experience with Azure Data Factory, PowerShell, Python, and Azure SQL.
Strong data warehouse design expertise, including dimensional modeling and best practices.
Experience implementing CI/CD for database code (e.g., Azure DevOps, GitHub Actions).
Demonstrated use of AI-assisted development for code review, quality checks, and migration automation.
Azure Data Factory
Azure Functions
Azure Key Vault
Data Modeling (Star, Snowflake, SCD Type 1 and 2)
Azure SQL Database / SQL Server
T-SQL and Stored Procedures
Tableau
PowerShell
Query Performance Tuning
Source-to-Target Mapping and Documentation
Data Lineage and Metadata Management
Preferred Experience:
Familiarity with Azure Synapse, Databricks, or similar platforms.
Performance tuning for large-scale data workloads.
Exposure to data governance and security best practices.
Soft Skills & Attributes:
Strong business acumen with the ability to translate complex requirements into practical data solutions.
Active, pragmatic problem solver who can troubleshoot quickly and effectively.
Excellent communication skills for cross-functional collaboration.
Organized, detail-oriented, and able to manage multiple priorities in fast-paced environments.
Passion for continuous learning and staying up to date on emerging data technologies.
12+ years of experience in data engineering, database development, or related roles.
Additional soft skills:
High sense of urgency.
Extremely good communication skills and responsiveness.
Resourceful: it is imperative that they can help build a technical project plan, list out tasks, and estimate the lift for each activity and how long it will take.
Collaboration skills: each person on the team will have dependencies on this person.
MUST be good in front of clients and able to give demos to executives.
Building documentation is really important here. There is a lot to keep track of. This is a new environment; nothing is provided.