Sr Data Engineer

Overview

On Site
$60 - $70
Contract - W2
Contract - 6 Month(s)

Skills

AS/400 Control Language
Acceptance Testing
Agile
Amazon Redshift
Amazon Web Services
Analytical Skill
Analytics
Apache Hadoop
Data Analysis
Data Engineering
Continuous Delivery
Continuous Integration
Customer Relationship Management (CRM)
Cloud Computing
Common Lisp
Computer Hardware
Applied Mathematics
Big Data
CA Workload Automation AE
Client/server
Database
Design Patterns
DevOps
Elasticsearch
Electronic Health Record (EHR)
Coaching
Computer Science
Data Marts
Data Quality
Data Warehouse
Management
Management Information Systems
Mentorship
Nexus
Oracle
Insurance
Integration Testing
JavaScript
Jenkins
Kanban
Extract, Transform, Load (ETL)
FOCUS
Git
GitHub
IT Management
Informatica
PL/SQL
Production Support
Programming Languages
Python
Quality Assurance
Relational Databases
Functional Management
Sales
SQL
Scrum
Shell Scripting
Snowflake Schema
Stored Procedures
Systems Design

Job Details

Job ID: H#12798 - Sr. Data Engineer

PLEASE NOTE: This is a 6-month contract-to-hire and candidates must meet the Client's full-time conversion policies. Those dependent on work-permit sponsorship now or at any time in the future (i.e., H1B, OPT, CPT, etc.) do not meet the Client's requirements for this opening.

Location: Hartford, CT or Charlotte, NC

Top 3 Must-Haves:

  • 1. Strong knowledge of the legacy tech stack: Oracle Database, PL/SQL, AutoSys, Hadoop, stored procedures, shell scripting, etc.
  • 2. Experience with the Snowflake cloud data platform, including hands-on experience with Snowflake utilities such as SnowSQL and Snowpipe
  • 3. Experience with DevOps tools such as GitHub, Jenkins, UrbanCode Deploy (uDeploy), etc.

What does the structure/makeup of the team currently look like?

An 8-member team with 5 Data Engineers, 1 BDA, and 1 QA, now looking for a Sr. Data Engineer

What will this contractor be accomplishing for your team?

Technical analysis, development, and leading/supporting the team on Production Support and Maintenance deliverables (discretionary and non-discretionary)

What will this contractor be working on / are there specific projects?

Business Insurance (Commercial Lines) Maintenance team

What makes this role stand out / what are some unique selling points?

A dynamic environment with scope for learning multiple toolsets, implementation analysis, and testing scenarios on legacy and cloud assets

A highly skilled team; the role is significant in terms of delivery and support

Interactions with the business and the opportunity to learn end-to-end (E2E), from the source of the data to its consumption

Describe your expectations for training once a worker starts and expected ramp up.

We have a structured Knowledge Transfer plan specific to the role, applications, toolset, lines of business, etc.

What type of development will this contractor be doing? (Back end, front end, full stack?) and what will they be responsible for developing?

Back end, primarily using Oracle and Snowflake

What type of application will they be developing? Will this contractor primarily be working on new development or maintaining the current environment?

Maintaining and supporting the current Commercial Lines (CL) environment and applications, and working on any new enhancements to them

Will this contractor be testing the application/software they are developing?

Yes: integration testing and supporting UAT on the delivery

What programming languages do you want this contractor to have experience with? (How many years of experience with each language)

Experience with JavaScript and Python would be an advantage

Join a fast-paced and talented Agile/Kanban team to unlock data capabilities for The Hartford's Business Insurance Production Support and Maintenance organization. You will have an opportunity to maintain legacy data assets using Oracle and Informatica, while growing your knowledge of emerging technologies as we move aggressively toward cloud-based assets. We use both legacy and modern data technologies and software engineering practices within an Agile delivery framework, and we are passionate about technology and about building well-architected, innovative solutions that drive optimal business value.

The position will design, develop, and maintain large-scale data assets across data warehouses, data marts, and data processes, leveraging a next-generation data and analytics technology stack spanning AWS Cloud, Informatica IDMC, Snowflake, and Big Data; focus on addressing challenges across the legacy tech stack, data freshness issues, speed of delivery, and quality; drive a technology step change by enabling the business while simplifying the data environment and improving speed of delivery; provide technical leadership, mentoring, and coaching to the technical team; implement design patterns across the portfolio delivery teams; deliver high-quality work of high technical complexity; and research and evaluate alternative solutions, recommending the most efficient and cost-effective solution for the systems design.

Role Expectations:

  • Design and develop high-quality, scalable software modules for the next-generation analytics solution suite
  • Prototype high-impact innovations, catering to changing business needs, by learning and leveraging new technologies (AWS Cloud, Big Data, Snowflake, Informatica Cloud)
  • Integrate with Data Quality Services to ensure quality data is published to consumers
  • Possesses functional knowledge and skills reflective of a competent practitioner with the ability to deliver on work of varying technical complexity
  • Consults with functional management in the analysis of short and long-range business requirements and recommends innovations which anticipate the future impact of changing business needs
  • Works closely with client management to identify and specify complex business requirements and processes across diverse development platforms, computing environments (e.g., cloud, host-based, distributed systems, client/server), software, hardware, technologies, and tools
  • Exposure to CI/CD processes and experience working with Git, Jenkins, UrbanCode Deploy, and Nexus repositories
  • Coordinate activities with cross-functional IT unit stakeholders (e.g., database, operations, telecommunications, technical support, etc.)
  • Researches and evaluates alternative solutions and recommends the most efficient and cost-effective solution for the systems design
  • Work within a self-organized scrum development team regarding all design and implementation

Qualifications:

  • Bachelor's degree with 5 or more years of applicable work experience in data analysis, data manipulation, and technical development
  • Desired educational background includes, but is not limited to: Computer Science, Engineering, IT, Management Information Systems, Data Analytics, Applied Mathematics, and Business
  • Candidates with prior Data Engineer/ETL competencies and prior experience successfully enabling data delivery initiatives are desired
  • Knowledge of ETL tools like Informatica & Talend; relational databases like Oracle & Snowflake; Oracle PL/SQL etc.
  • Understanding of current and emerging IT products, services, processes and methodologies.
  • Analytical approach with a strong ability to uncover and resolve problems by delivering innovative approaches and solutions.
  • Technical leadership with respect to design, implementation and overseeing complex technical processes
  • Strong ability to estimate project tasks and to deliver upon committed dates. Ability to develop and maintain systems according to a defined set of standards.
  • Experience with AWS services (e.g., S3, EMR) a plus
  • Experience with cloud data warehouses, automation, and data pipelines (e.g., Snowflake, Redshift) a plus
  • Experience using generative AI solutions to automate data engineering pipelines preferred
  • Ability to work as part of and with high performing teams