Overview
Hybrid
USD 75.00 - 80.00 per hour
Full Time
Skills
Recruiting
Insurance
Project Management
Performance Management
Preventive Maintenance
Computer Science
Data Science
Information Technology
IaaS
Python
Extract, Transform, Load
Data Warehouse
Modeling
Mapping
Real-time
GitHub
Jenkins
Oracle
Database
PL/SQL
CA Workload Automation AE
Stored Procedures
Shell Scripting
Rally
Communication
IT Management
ELT
Scalability
Data Engineering
Big Data
Amazon Web Services
Apache Hadoop
Electronic Health Record (EHR)
Apache Spark
Snowflake Schema
Talend
Informatica
Cloud Computing
Lean Methodology
Agile
Quality Management
Automated Testing
Environment Management
Job Details
Date Posted: 11/26/2025
Hiring Organization: Rose International
Position Number: 493910
Industry: Insurance
Job Title: Data Engineer
Job Location: Hartford, CT, USA, 06101
Work Model: Hybrid
Work Model Details: Onsite 3 days per week
Shift: 8AM - 5PM EST
Employment Type: Temporary
FT/PT: Full-Time
Estimated Duration (In months): 7
Min Hourly Rate($): 75.00
Max Hourly Rate($): 80.00
Must Have Skills/Attributes: AWS, Data Engineer, PL/SQL, Python, Spark
Experience Desired: Experience with DataOps tools (GitHub, Jenkins, UDeploy) (5+ yrs); Data Engineering experience (5+ yrs); Developing and operating production workloads in cloud infrastructure (3+ yrs)
Required Minimum Education: Bachelor's Degree
Preferred Certifications/Licenses: AWS services, Snowflake, Python, Spark Certifications
**C2C is not available**
Job Description
***Only qualified Data Engineers located near the Hartford, CT or Charlotte, NC areas will be considered, as the position requires a hybrid presence.***
Required Education:
Bachelor's degree in Computer Science, Data Science, Information Technology, or a related field.
Desired Certification:
AWS services, Snowflake, Python, or Spark Certifications
Required Experience, Knowledge & Skills:
5+ years of data engineering experience
2+ years developing and operating production workloads in cloud infrastructure
Hands-on experience with Snowflake (including SnowSQL, Snowpipe)
Expert-level skills in AWS services, Snowflake, Python, Spark
Proficiency in ETL tools such as Talend and Informatica
Strong knowledge of Data Warehousing (modeling, mapping, batch and real-time pipelines)
Experience with DataOps tools (GitHub, Jenkins, UDeploy)
Familiarity with P&C Commercial Lines business
Knowledge of legacy tech stack: Oracle Database, PL/SQL, Autosys, Hadoop, stored procedures, Shell scripting
Experience using Agile tools like Rally
Excellent written and verbal communication skills to interact effectively with technical and non-technical stakeholders
Key Responsibilities:
Serve as subject matter expert and/or technical lead for large-scale data products.
Drive end-to-end solution delivery across multiple platforms and technologies, leveraging ELT solutions to acquire, integrate, and operationalize data.
Partner with architects and stakeholders to define and implement pipeline and data product architecture, ensuring integrity and scalability.
Communicate risks and trade-offs of technology solutions to senior leaders, translating technical concepts for business audiences.
Build and enhance data pipelines using cloud-based architectures.
Design simplified data models for complex business problems.
Champion Data Engineering best practices across teams, implementing leading big data technologies (AWS, Hadoop/EMR, Spark, Snowflake, Talend, Informatica) in hybrid cloud/on-prem environments.
Operate independently while fostering a collaborative, transformation-focused mindset.
Work effectively in a lean, fast-paced organization, leveraging Scaled Agile principles.
Promote code quality management, FinOps principles, automated testing, and environment management practices to deliver incremental customer value.
- **Only those lawfully authorized to work in the designated country associated with the position will be considered.**
- **Please note that all position start dates and durations are estimates and may be reduced or lengthened based upon a client's business needs and requirements.**
Benefits:
For information and details on employment benefits offered with this position, please visit here. Should you have any questions/concerns, please contact our HR Department via our secure website.
California Pay Equity:
For information and details on pay equity laws in California, please visit the State of California Department of Industrial Relations' website here.
Rose International is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, age, sex, sexual orientation, gender (expression or identity), national origin, arrest and conviction records, disability, veteran status or any other characteristic protected by law. Positions located in San Francisco and Los Angeles, California will be administered in accordance with their respective Fair Chance Ordinances.
If you need assistance in completing this application, or during any phase of the application, interview, hiring, or employment process, whether due to a disability or otherwise, please contact our HR Department.
Rose International has an official agreement (ID #132522), effective June 30, 2008, with the U.S. Department of Homeland Security, U.S. Citizenship and Immigration Services, Employment Verification Program (E-Verify). (Posting required by O.C.G.A. § 13-10-91.)
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.