Job Title: *Data Engineer (SQL / ETL)*
Location: Jersey City, NJ / Dallas, TX / Columbus, OH
Experience: 14+ years (must)
Mandatory Skills: SQL, ETL, PowerShell, .NET, DataStage, experience with Rhine (or similar metadata-driven orchestration frameworks)
Job Overview:
We are seeking a skilled Data Engineer with strong expertise in SQL, ETL development, and hands-on experience with Rhine (or similar metadata-driven orchestration frameworks). The ideal candidate will play a key role in building scalable data pipelines, managing data transformation workflows, and supporting analytics initiatives across the enterprise.
Mandatory Skills:
SQL Server 2022, Oracle SQL, and JavaScript
Strong in Microsoft SQL Server (functions, common table expressions, views)
Familiarity with data warehouse and Operational Data Store (ODS) concepts
Windows PowerShell and .NET (must)
REST APIs (must)
Familiarity with change control and change management processes
Proven experience in ETL development using tools such as SSIS, Informatica, Talend, DataStage, or custom Python/Scala frameworks
Python, Scala (optional)
Able to run explain plans, with a good understanding of data profiling, indexing, and SQL query tuning
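To illustrate the explain-plan and query-tuning skill called for above, here is a minimal sketch using Python's built-in sqlite3 module as a stand-in for SQL Server or Oracle tooling; the `orders` table and `idx_orders_customer` index are hypothetical:

```python
import sqlite3

# In-memory database with a hypothetical orders table and an index
# on the column we will filter by.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    CREATE INDEX idx_orders_customer ON orders (customer);
""")

# EXPLAIN QUERY PLAN reports how SQLite will execute the query,
# including whether the optimizer chose the index.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = ?",
    ("acme",),
).fetchall()
for row in plan:
    print(row)
```

The last column of each plan row describes the access path (e.g. a SEARCH using `idx_orders_customer` rather than a full table SCAN), which is the kind of evidence used when tuning a slow query.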
Key Responsibilities:
Design, develop, and maintain ETL pipelines using best practices and enterprise data architecture standards.
Write advanced SQL queries for data extraction, transformation, and analysis from structured and semi-structured data sources.
Work with Rhine-based pipelines to enable dynamic, metadata-driven data workflows.
Collaborate with data architects, analysts, and business stakeholders to understand data requirements and implement robust solutions.
Ensure data quality, consistency, and integrity across systems.
Participate in performance tuning, optimization, and documentation of data processes.
Troubleshoot and resolve issues in data pipelines and workflows.
Support deployment and monitoring of data jobs in production environments.
Required Qualifications:
Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field.
Strong hands-on experience with SQL (complex joins, window functions, CTEs, performance tuning).
Proven experience in ETL development using tools like Informatica, Talend, DataStage, or custom Python/Scala frameworks.
Familiarity with or experience in using Rhine for metadata-driven pipeline orchestration.
Working knowledge of data warehousing concepts and dimensional modeling.
Exposure to cloud platforms (AWS, Azure, or Google Cloud Platform) and tools such as Snowflake, Redshift, or BigQuery is a plus.
Experience with version control (e.g., Git) and CI/CD for data jobs.
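As a sketch of the SQL depth listed above (common table expressions plus window functions), the following uses Python's built-in sqlite3; the `orders` table and its rows are hypothetical:

```python
import sqlite3

# In-memory database with a hypothetical orders table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('acme',   '2024-01-05', 100.0),
        ('acme',   '2024-02-10', 250.0),
        ('globex', '2024-01-20',  75.0);
""")

# The CTE computes per-customer totals; the window function numbers
# each customer's orders by date without collapsing the detail rows.
query = """
WITH totals AS (
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
)
SELECT o.customer,
       o.order_date,
       o.amount,
       t.total,
       ROW_NUMBER() OVER (PARTITION BY o.customer
                          ORDER BY o.order_date) AS order_seq
FROM orders o
JOIN totals t ON t.customer = o.customer
ORDER BY o.customer, order_seq;
"""
rows = conn.execute(query).fetchall()
for row in rows:
    print(row)
```

Joining an aggregate CTE back to the detail rows while ranking them with a window function is a typical pattern in the kind of transformation and analysis work this role involves.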
Regards,
Radiantze Inc