Overview
Accepts corp to corp applications
Contract - 12 Month(s)
Skills
Databricks
Snowflake
Git
Apache Kafka
Data Integration
Data Pipelines
Technical Documentation
Microsoft Azure
Attention to Detail
Oracle databases
Analytical Thinking
Communication Skills
Continuous Integration
Databases
Infrastructure Management
Automation
Self Motivation
Cloud Computing
Python (Programming Language)
Workflows
Team Working
SQL Databases
Software Version Control
Oracle Applications
Data Warehousing
Data Ingestion
Extract Transform Load (ETL)
Information Engineering
Data Architecture
Data Modeling
Computer Programming
Data Processing
Collaborative Software
Records Management
Airflow
Raw Data
Job Details
Senior Data Engineer with Snowflake 12+ Needed
Chicago, IL - onsite
TCS
Key Responsibilities
Data Integration
- Implement and maintain data synchronization between on-premises Oracle databases and Snowflake using Kafka and CDC tools.
Support Data Modeling
- Assist in developing and optimizing the data model for Snowflake, ensuring it supports our analytics and reporting requirements.
Data Pipeline Development
- Design, build, and manage data pipelines for the ETL process, using Airflow for orchestration and Python for scripting, to transform raw data into a format suitable for our new Snowflake data model.
Reporting Support
- Collaborate with the data architect to ensure the data within Snowflake is structured to support efficient and insightful reporting.
Technical Documentation
- Create and maintain comprehensive documentation of data pipelines, ETL processes, and data models to ensure best practices are followed and knowledge is shared within the team.
Tools and Skillsets
- Proven data engineering track record of developing and maintaining data pipelines and data integration projects.
Databases
- Strong experience with Oracle, Snowflake, and Databricks.
Data Integration Tools
- Proficiency in using Kafka and CDC tools for data ingestion and synchronization.
Orchestration Tools
- Expertise in Airflow for managing data pipeline workflows.
Programming
- Advanced proficiency in Python and SQL for data processing tasks.
Data Modeling
- Understanding of data modeling principles and experience with data warehousing solutions.
Cloud Platforms
- Knowledge of cloud infrastructure and services, preferably Azure, as it relates to Snowflake and Databricks integration.
Collaboration Tools
- Experience with version control systems (like Git) and collaboration platforms.
CI/CD Implementation
- Utilize CI/CD tools to automate the deployment of data pipelines and infrastructure changes, ensuring high-quality data processing with minimal manual intervention.
Communication
- Excellent communication and teamwork skills, with a detail-oriented mindset.
- Strong analytical skills, with the ability to work independently and solve complex problems.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.