Overview
On Site
Contract - W2
Skills
AWS Cloud
SQL
Teradata
SQL Server
SAS
Job Details
Title: Ab Initio Engineer
Location: Hartford, CT (Hybrid, Local Only)
Duration: Long Term Contract
Job description:
Work location: Hartford, 06183
Work arrangement (Remote / In Office / Hybrid): Hybrid (must be in office 3 days per week)
Does the resource have to be local? Yes
Must-have skills for this role: 1. Ab Initio, 2. SAS, 3. Databricks
Years of experience required for each skill: 4-6 years; team lead (TL) experience required
Nice-to-have skills: 1. AWS Cloud
Job description
Client & Project: We are seeking new talent to join the team, where you will have the opportunity to collaborate on the project. The client is a company that provides a wide range of insurance products and services to individuals and businesses.
Here's a quick overview of the skillset we're looking for:
- Ab Initio
- Python
- AWS Cloud technology
- SQL (Teradata, SQL Server, and Client)
- Databricks
- SAS
Responsibilities: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. Your day-to-day activities will include creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems. You are expected to act as a subject matter expert, collaborating with and managing the team so it performs effectively. You will be responsible for team decisions, engage with multiple teams, and contribute to key decisions while providing solutions to problems for your immediate team and across multiple teams. Expert proficiency in Ab Initio is required. Expert proficiency in SAS Analytics is recommended. Expert proficiency in AWS Compute, Data Engineering, Databricks Platform, and Teradata SQL is suggested.
- Develop innovative data solutions that enhance data processing efficiency.
- Collaborate with cross-functional teams to identify and implement best practices in data management.
- Monitor and optimize data pipelines for performance and reliability.
- Conduct regular data quality assessments and implement necessary improvements.
- Stay updated on industry trends and emerging technologies to continuously improve data engineering practices.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.