Job Title: ETL DataStage with Teradata
Location: Remote (CST) (W2 Only)
Job Summary:
We're looking for an experienced DataStage/Teradata Developer with expertise in Azure, Databricks, and Lakehouse technologies to join our team. The ideal candidate will design, develop, and implement data integration solutions across these platforms.
Key Responsibilities:
- Design, develop, and implement ETL processes using DataStage and Teradata utilities (BTEQ, TPT, etc.)
- Develop and maintain complex SQL queries, stored procedures, and macros in Teradata
- Work with Azure Data Lake Storage (ADLS), Azure Synapse Analytics, and Databricks for data processing and storage
- Implement data pipelines using Azure Data Factory, Databricks, and Lakehouse technologies
- Collaborate with cross-functional teams to gather data requirements and implement solutions
- Optimize and enhance existing ETL workflows for improved performance and reliability
- Ensure data quality, integrity, and security
Required Skills:
- 10+ years of experience as a DataStage/Teradata Developer
- Strong understanding of Teradata architecture and utilities (BTEQ, TPT, etc.)
- Experience with Azure Data Lake Storage (ADLS), Azure Synapse Analytics, and Databricks
- Proficiency in SQL, scripting (Unix shell, Python, etc.), and data modeling
- Familiarity with Lakehouse architecture and technologies
- Experience with Agile methodologies and version control systems (Git, etc.)
Preferred Skills:
- Experience with IBM InfoSphere DataStage
- Knowledge of data warehousing concepts and ETL methodologies
- Strong analytical and problem-solving skills
- Familiarity with Azure DevOps and CI/CD pipelines