ETL Developer - Informatica

New York, NY, US • Posted 2 days ago • Updated 18 hours ago
Contract Independent
Contract Corp To Corp
Contract W2
No Travel Required
On-site
Depends on Experience

Job Details

Skills

  • Informatica
  • ETL
  • ETL Developer
  • Extract
  • Transform
  • Load
  • Conflict Resolution
  • Data Engineering
  • Data Marts
  • Data Warehouse
  • AngularJS
  • Apache Hive
  • Microsoft Azure
  • Microsoft SQL Server
  • Migration
  • IBM DB2
  • Java
  • Leadership
  • Linux
  • Machine Learning (ML)
  • Mentorship
  • Apache Spark
  • C++
  • Cloud Computing
  • Communication
  • Database
  • RDBMS
  • Finance
  • Motivation
  • OLAP
  • Scripting Language
  • Snow Flake Schema
  • Statistical Models
  • Oracle
  • Perl
  • Problem Solving
  • Python
  • SQL
  • Scala
  • Scripting
  • Shell
  • Statistics
  • Strategist
  • Test Scenarios
  • Thought Leadership
  • Training
  • Unix

Summary

Position Description
The position will be responsible for the development of ETL components, providing user access to the data via reports and data extracts, utilizing analysis tools such as OLAP, and coding stored procedures. The candidate will work with multiple database systems (Teradata, Hive, SQL Server, DB2, and Snowflake), both on-prem and in the cloud. The role requires a strong understanding of database concepts, including data warehouses, operational data stores, and data marts. Responsibilities also require in-depth knowledge of ETL concepts and hands-on experience implementing data integrations across multiple database platforms using custom development, scripting languages such as Unix shell, and ETL tools such as Informatica.
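The extract-transform-load flow described above can be sketched minimally. This is an illustrative toy only, not the role's actual Informatica/shell pipeline; the table name, column names, and sample data are invented:

```python
import csv
import io
import sqlite3

# Illustrative only: a toy extract -> transform -> load pass.
# The role uses Informatica and Unix shell; names here are invented.
raw = io.StringIO("id,amount\n1,10.5\n2,oops\n3,4.0\n")

# Extract: read rows from a CSV source.
rows = list(csv.DictReader(raw))

# Transform: keep only rows whose amount parses as a number.
clean = []
for r in rows:
    try:
        clean.append((int(r["id"]), float(r["amount"])))
    except ValueError:
        pass  # a real pipeline would route bad records to a reject file

# Load: insert the cleaned rows into a target table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE fact_amounts (id INTEGER PRIMARY KEY, amount REAL)")
con.executemany("INSERT INTO fact_amounts VALUES (?, ?)", clean)
total = con.execute("SELECT COUNT(*), SUM(amount) FROM fact_amounts").fetchone()
print(total)  # (row count, summed amount) after the bad record is dropped
```

A production version of the same shape would typically add reject-record handling, batch commits, and audit logging around each of the three stages.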

The role requires the candidate to build data engineering pipelines using Spark on the cloud with tools such as Databricks and Snowflake, and to reengineer statistical risk models currently written in C++ for lower latency in the cloud using technologies such as PyTorch. The candidate will design reusable components and utilities, and think beyond conventional architecture to give modelers a seamless experience across the full model lifecycle: design, implementation, training, and deployment to production.


Responsibilities:
Work with Quantitative Strategists/Statistical Modelers to build, enhance, and execute/test scenarios.
Develop, run, and perform inference with machine learning and statistical models on the cloud.
Identify potential improvements to the current design/processes.
Assess risks in design/development upfront.
Plan and coordinate data/process migrations across databases.
Participate in multiple project discussions as a senior member of the team.
Serve as a coach/mentor for junior developers.
Provide thought leadership.
Lead team in new initiatives such as cloud strategy.
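The train-then-infer lifecycle named in the responsibilities can be illustrated with a deliberately tiny stand-in model. This is a one-feature least-squares fit in pure Python, not the PyTorch/Spark stack the role actually uses, and the data points are invented:

```python
# Illustrative only: the "develop, run, infer" lifecycle shown with a
# one-feature least-squares fit (a toy stand-in for the PyTorch models
# named in the role; the data points are invented).
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 4.2, 6.0, 8.1]  # roughly y = 2x

# "Training": closed-form slope and intercept for simple linear regression.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
    (x - mx) ** 2 for x in xs
)
intercept = my - slope * mx

# "Inference": score a new point with the fitted parameters.
def predict(x):
    return slope * x + intercept

print(slope, intercept, predict(5.0))
```

In the role's real stack, the fit step would be a distributed training job and the predict step a deployed endpoint, but the design/implementation/training/deployment stages map one-to-one.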


Required Skills
10+ years of total experience.
Strong Python, Spark, PyTorch, PySpark, Java, and C++ scripting experience.
Knowledge of machine learning, statistics, and model development, training, and inference.
Strong design and problem-solving skills.
Strong understanding of machine learning tools such as MLflow, Databricks, and Snowflake on the cloud.
SQL and database programming skills, including creating views, stored procedures, and triggers, implementing referential integrity, and designing and coding for performance.
Knowledge of and hands-on experience with RDBMS systems (e.g., DB2, Teradata, Oracle, or Snowflake).
Good communication and leadership skills.
Organization, discipline, detail orientation, self-motivation, and focus on delivery.
In-depth knowledge of and hands-on experience with Unix/Linux programming (shell and/or Perl).
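The database programming skills listed above (views, triggers, referential integrity) can be sketched in a few statements. This uses SQLite for a self-contained demo; the role's actual platforms are DB2, Teradata, SQL Server, and Snowflake, and the schema here is invented:

```python
import sqlite3

# Illustrative only: views, triggers, and referential integrity on SQLite
# (the role's platforms are DB2/Teradata/SQL Server/Snowflake; names invented).
con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # enforce referential integrity
con.executescript("""
CREATE TABLE dept  (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE emp   (id INTEGER PRIMARY KEY,
                    dept_id INTEGER REFERENCES dept(id),
                    name TEXT);
CREATE TABLE audit (msg TEXT);

-- View: a reporting-friendly join.
CREATE VIEW emp_report AS
  SELECT e.name AS employee, d.name AS department
  FROM emp e JOIN dept d ON e.dept_id = d.id;

-- Trigger: log every insert into emp.
CREATE TRIGGER emp_audit AFTER INSERT ON emp
BEGIN
  INSERT INTO audit VALUES ('hired ' || NEW.name);
END;
""")
con.execute("INSERT INTO dept VALUES (1, 'Risk')")
con.execute("INSERT INTO emp VALUES (1, 1, 'Ada')")

report = con.execute("SELECT * FROM emp_report").fetchall()
log = con.execute("SELECT msg FROM audit").fetchall()

# The foreign key rejects an orphan row once enforcement is on.
try:
    con.execute("INSERT INTO emp VALUES (2, 99, 'Bob')")
    fk_enforced = False
except sqlite3.IntegrityError:
    fk_enforced = True
print(report, log, fk_enforced)
```

On the commercial RDBMSs named in the posting, foreign keys are enforced by default and trigger syntax differs slightly, but the three concepts carry over directly.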


Desired Skills
Experience in the Financial Industry
Good understanding of Data engineering principles and risk model development.
ETL experience with Informatica
Experience with cloud databases (e.g., Azure, Snowflake).
Experience with Scala/Spark/C++/PyTorch
Experience with AngularJS
Experience in KDB

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 90824428
  • Position Id: 8936189
  • Posted 2 days ago

Company Info

About Zealogics

Welcome to Zealogics


Zealogics LLC provides a broad range of IT consulting, systems implementation, and application outsourcing services through an optimized global delivery model. Zealogics has also built deep knowledge of traditional product engineering across mechanical, electronics, and software platforms, enabling clients to navigate their digital transformation.



Zealogics' value engineering techniques, automation frameworks, and reference models are refined through engagements with Fortune 500 enterprises and OEMs. We combine customer-centric product strategies with a collaborative approach to execution, which helps harmonize processes, identify bottlenecks, and eliminate non-value-adding tasks to deliver world-class products.



We also focus on connecting the right people with the right positions, whether for a full-time career opportunity or a temporary assignment. Our long-term relationships with employees and clients have been built in an environment of integrity and commitment, with a shared goal of mutual success. We hire talented people to solve a wide array of IT and engineering challenges that our clients face day to day. Here at Zealogics, we provide the best services for the best price in town. That's our guarantee.





Careers

Website
