Senior Data Engineer - Local to NY, NJ, CT - W2 Contract

Overview

Hybrid
$60 - $70
Contract - W2
Contract - 12 Month(s)

Skills

Data Engineer
ADF
Azure Data Factory
Data warehouse
Data warehousing
Data modeling
Databricks
API
ENHANCEMENT

Job Details

Job Title: Senior Data Engineer
Location: White Plains, NY

Duration: 12 Months

Hours: 37.5 hrs/week

Hybrid: onsite 3 days/week

W2 contract only

Project Overview

Responsible for managing critical Business-As-Usual (BAU) services that support enterprise-wide data operations. These services include the development, maintenance, and monitoring of data pipelines, integrations, and reporting infrastructure that are essential for ongoing business functions. Key responsibilities include:

  1. Maintaining and troubleshooting ETL workflows (Pentaho, Databricks, ADF)
  2. Supporting daily data loads and ensuring data availability for business reporting
  3. Responding to ad-hoc requests from business users
  4. Coordinating with DBAs and application teams for incident resolution
  5. Performing enhancements to support evolving business data needs

These BAU services are essential for keeping business operations running smoothly and delivering timely insights across multiple departments.

Job Functions and Responsibilities

ETL & Data Integration:

  • Design, develop, and optimize ETL pipelines using Azure Databricks, ADF, and Pentaho to support enterprise data workflows.
  • Implement and maintain data movement, transformation, and integration across multiple systems.
  • Ensure seamless data exchange between cloud, on-prem, and hybrid environments.
  • Work with Globalscape FTP for secure file transfers and automation.

API Development and Integration:

  • Develop, consume, and integrate RESTful and SOAP APIs to facilitate data exchange.
  • Work with API gateways and authentication methods such as OAuth, JWT, certificates, and API keys.
  • Implement and optimize API-based data extractions and real-time data integrations.

Data Quality & Governance:

  • Implement data validation, cleansing, and enrichment techniques.
  • Develop and execute data reconciliation processes to ensure accuracy and completeness.
  • Adhere to data governance policies and security compliance standards.

BAU Support & Performance Optimization:

  • Troubleshoot and resolve ETL failures, data load issues, and performance bottlenecks.
  • Optimize SQL stored procedures and complex queries for better performance.
  • Support ongoing enhancements and provide operational support for existing data pipelines.

Collaboration & Documentation:

  • Work closely with Data Analysts, Business Analysts, and stakeholders to understand data needs.
  • Document ETL processes, data mappings, and workflows for maintainability and knowledge sharing.
  • Provide guidance and best practices to ensure scalability and efficiency of data solutions.

Skills

Required Skills & Experience:

  • 7+ years of experience in ETL development, data integration, and SQL scripting.
  • Strong expertise in Azure Databricks, ADF (Azure Data Factory), and Pentaho.
  • Experience handling secure file transfers using Globalscape FTP.
  • Hands-on experience in developing and consuming APIs (REST/SOAP).
  • Experience working with API security protocols (OAuth, JWT, API keys, etc.).
  • Proficiency in SQL, stored procedures, performance tuning, and query optimization.
  • Understanding of data modeling, data warehousing, and data governance best practices.
  • Hands-on experience with cloud-based data platforms (Azure/AWS) is a plus.
  • Strong problem-solving skills, troubleshooting abilities, and ability to work independently.
  • Excellent communication skills and ability to work in a fast-paced environment.

Preferred Qualifications:

  • Experience working in large-scale enterprise data integration projects.
  • Knowledge of Python and PySpark for big data processing.
  • Familiarity with CI/CD for data pipelines (Azure DevOps, GitHub Actions, etc.).

Education and Certifications

  • Bachelor's or Master's degree in Computer Science, Data Engineering, or a related technical field.

Nice-to-have certifications:

  a) Databricks Certified Data Engineer
  b) Azure Data Engineer Associate

Why Choose Cogent?

Cogent Infotech stands at the forefront of technology consulting and is recognized globally for its award-winning services. With our headquarters in Pittsburgh, PA, USA, we specialize in guiding enterprises through digital transformation, leveraging the power of emerging technologies such as Cloud Computing, Cybersecurity, Data Analytics, and AI. Our mission is to provide innovative workforce solutions that address the complex challenges faced by today's businesses.

As an ISO-certified firm and appraised at CMMI Level 3, our reputation for excellence is well-established. We are proud to collaborate with over 70 Fortune 500 companies and more than 150 Federal & State agencies, delivering cutting-edge technology solutions that drive success.

Cogent is an equal opportunity employer. Cogent will not discriminate against applicants or employees based on race, color, religion, national origin, age, sex, pregnancy (including childbirth or related medical conditions), genetic information, sexual orientation, gender identity, military status, citizenship, or any other class protected by applicable law.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.