ETL DataStage with Teradata

Remote • Posted 3 hours ago • Updated 3 hours ago
Contract W2

Job Details

Skills

  • DataStage
  • Data Integration
  • Stored Procedures
  • Macros
  • Data Processing
  • Collaboration
  • Workflow
  • Data Quality
  • Teradata
  • BTEQ
  • TPT
  • Data Lake
  • Storage
  • Analytics
  • Databricks
  • SQL
  • Scripting
  • Unix
  • Python
  • Data Modeling
  • Agile
  • Version Control
  • Git
  • IBM InfoSphere DataStage
  • IBM InfoSphere
  • Data Warehouse
  • Extract
  • Transform
  • Load
  • Analytical Skill
  • Conflict Resolution
  • Problem Solving
  • Microsoft Azure
  • DevOps
  • Continuous Integration
  • Continuous Delivery

Summary

Job Title: ETL DataStage with Teradata

Location: Remote (CST) (W2 Only)

Job Summary:

We're looking for an experienced DataStage/Teradata developer with expertise in Azure, Databricks, and Lakehouse technologies to join our team. The ideal candidate will design, develop, and implement data integration solutions using Teradata, DataStage, Azure, Databricks, and the Lakehouse platform.

Key Responsibilities:

- Design, develop, and implement ETL processes using DataStage and Teradata utilities (BTEQ, TPT, etc.)

- Develop and maintain complex SQL queries, stored procedures, and macros in Teradata

- Work with Azure Data Lake Storage (ADLS), Azure Synapse Analytics, and Databricks for data processing and storage

- Implement data pipelines using Azure Data Factory, Databricks, and Lakehouse technologies

- Collaborate with cross-functional teams to gather data requirements and implement solutions

- Optimize and enhance existing ETL workflows for improved performance and reliability

- Ensure data quality, integrity, and security
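
Load steps like those described above are commonly scripted. As a minimal Python sketch (the record layout, field names, and rejection rules here are hypothetical, not taken from this posting), a data-quality gate might validate and stage pipe-delimited extract rows before handing them to a BTEQ/TPT load:

```python
import csv
from io import StringIO

def stage_rows(raw: str):
    """Parse pipe-delimited extract rows and apply basic data-quality
    rules before a Teradata load (e.g., via BTEQ or TPT).

    Hypothetical layout: customer_id|name|signup_date.
    Rows with a missing customer_id are rejected; fields are trimmed.
    """
    staged, rejected = [], []
    reader = csv.reader(StringIO(raw), delimiter="|")
    for row in reader:
        # Reject malformed rows or rows missing the key column.
        if len(row) != 3 or not row[0].strip():
            rejected.append(row)
            continue
        customer_id, name, signup_date = (field.strip() for field in row)
        staged.append((customer_id, name, signup_date))
    return staged, rejected

extract = "101|Alice |2024-01-05\n|Bob|2024-02-10\n102|Carol|2024-03-15\n"
staged, rejected = stage_rows(extract)
```

In practice the staged rows would feed a TPT load script or a BTEQ `.IMPORT`, with the rejects routed to an error table for review.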

Required Skills:

- 10+ years of experience as a DataStage/Teradata developer

- Strong understanding of Teradata architecture and utilities (BTEQ, TPT, etc.)

- Experience with Azure Data Lake Storage (ADLS), Azure Synapse Analytics, and Databricks

- Proficiency in SQL, scripting languages (Unix shell, Python, etc.), and data modeling

- Familiarity with Lakehouse architecture and technologies

- Experience with Agile methodologies and version control systems (Git, etc.)

Preferred Skills:

- Experience with IBM InfoSphere DataStage

- Knowledge of data warehousing concepts and ETL methodologies

- Strong analytical and problem-solving skills

- Familiarity with Azure DevOps and CI/CD pipelines

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 91139027
  • Position Id: OOJ - 1683-684-1778086071

Company Info

About Zuplon

We support customer initiatives to create compelling business applications and technology solutions that provide measurable business outcomes.
