Overview
Remote
Depends on Experience
Contract - Independent
Contract - W2
Skills
Microsoft SQL
Azure Data Factory
Azure Data Warehouse
Azure Synapse
ETL
Job Details
Remote -- Immediate hire of a Data Engineer
Role: Data Engineer (Remote)
JOB PURPOSE: The Senior Data Engineer (Contractor) plays a critical role in advancing Crash Champions'
enterprise data strategy by designing, developing, and maintaining scalable data solutions that support
business intelligence, analytics, and operational reporting. This role is responsible for building and
optimizing data pipelines, integrating data from diverse sources, and modernizing legacy systems to
ensure the delivery of high-quality, reliable, and accessible data. The Senior Data Engineer will work
closely with the Data Manager, Director of Data Analytics, and key business stakeholders to translate
complex business requirements into efficient technical solutions that drive informed decision-making
across the organization.

ESSENTIAL DUTIES AND RESPONSIBILITIES:
Design and engineer robust data solutions by collaborating with the Data and Reporting teams
to gather business requirements, define use cases, create technical designs, and implement
scalable solutions for data analysis, reporting, and system integration.
Serve as a technical liaison in cross-functional environments, effectively communicating complex
data concepts and delivering actionable insights to business and functional leaders.
Develop complex SQL queries, data transformations, aggregations, stored procedures, and
triggers to support enterprise data needs.
Design and implement technical/data architecture for Operational Data Stores (ODS) and Data
Warehouses using the Kimball methodology.
Build and maintain SSIS ETL packages within SQL Server Data Tools (SSDT), ensuring
performance and scalability.
Manage individual workload and task prioritization using Agile Scrum practices, participating in
sprint planning and daily standups.
Conduct unit, system, and integration testing to validate the accuracy and performance of all
developed code and database objects.
Create and maintain comprehensive documentation for data integration solutions, including
data dictionaries, process/data flow diagrams, and runbooks to support ongoing operations and
future development.

Core Competencies:
Teamwork: Builds strong relationships and collaborates effectively to achieve shared goals;
promotes mutual trust and supports team members.
Accountability: Takes ownership for commitments and outcomes; monitors progress, learns
from setbacks, and continuously seeks improvement.
Results-Driven: Delivers consistent results by executing priorities with urgency and focus;
maintains momentum to achieve high-performance goals.
Sound Judgment: Applies critical thinking and experience to evaluate issues, identify root
causes, and make well-informed decisions.
Customer Focus: Maintains a strong internal and external customer focus; proactively identifies
and addresses needs to improve user experience and service quality.

QUALIFICATIONS:
7+ years of hands-on experience with Microsoft SQL Server; experience with Microsoft Fabric
and Python is a plus.
7+ years of experience with Azure SQL Server, Azure Data Warehouse, and Azure Data Factory;
Azure Synapse experience is a plus.
Proficient in ETL design patterns for data warehousing, including implementation of Kimball
methodology.
Demonstrated experience integrating data from multiple sources (e.g., SQL Server, Excel, Access, flat files, on-prem/cloud systems, RESTful APIs, Smartsheet) using SSIS or other ETL tools.
Skilled in developing complex T-SQL queries, stored procedures, triggers, and handling error
resolution.
Strong analytical, problem-solving, and critical thinking skills with the ability to troubleshoot
complex issues and determine scalable solutions.
High attention to detail with a strong focus on data validation and thorough documentation of
processes, standards, and procedures.
Excellent written and verbal communication skills, with the ability to translate technical
concepts into clear, actionable insights for both technical and non-technical stakeholders.
Effective interpersonal skills; able to build positive relationships across teams and collaborate in
cross-functional environments.
Ability to multi-task and manage competing priorities in a fast-paced, dynamic setting.
Demonstrated stress management capabilities; able to remain focused and perform well under
pressure.
Experience with the full software development lifecycle (SDLC) using tools such as Azure DevOps,
Git, and Smartsheet.
Bachelor's degree in Management Information Systems, Engineering, Mathematics, Economics,
Computer Science, or a related field.

Technical Proficiencies / Tools & Technologies
Candidates should have practical experience with the following tools and platforms:
Microsoft SQL Server 2012+ database administration and development
T-SQL advanced querying, scripting, and performance tuning
MS BI Stack: SSRS (Reporting), SSIS (Integration), SSAS (Analytics)
Microsoft Azure SQL Server, Data Warehouse, Data Factory
JSON/XML for handling semi-structured data
Azure DevOps / Git for source control and deployment pipelines
Smartsheet for data integration and workflow tracking
Azure Synapse (preferred)
Power BI (preferred, not required)
Microsoft Fabric Platform (preferred)
Python for scripting or automation (preferred)
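As an illustration of two listed proficiencies (Python for scripting, JSON for semi-structured data), a minimal sketch of the kind of task involved might look like the following; the payload shape and field names here are hypothetical, not taken from this posting:

```python
import json

def flatten(record, parent_key="", sep="_"):
    """Recursively flatten a nested dict (e.g. a parsed JSON API payload)
    into a single-level dict suitable for loading into a staging table."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            # Descend into nested objects, prefixing child keys with the parent key
            items.update(flatten(value, new_key, sep))
        else:
            items[new_key] = value
    return items

# Hypothetical payload resembling a REST API response
payload = json.loads('{"claim": {"id": 101, "status": "open"}, "shop": "Austin"}')
row = flatten(payload)
print(row)  # {'claim_id': 101, 'claim_status': 'open', 'shop': 'Austin'}
```

In practice this kind of flattening step would feed a bulk load into a staging table before dimensional transforms are applied.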
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.