Overview
Work Setting: On Site
Compensation: Depends on Experience
Employment Type: Contract - Independent / Contract - W2
Duration: 6 Months
Skills
Amazon Redshift
Amazon S3
Amazon Web Services
Analytical Skill
Apache Airflow
Apache Spark
Big Data
Change Data Capture
Cloud Computing
Communication
Conflict Resolution
Continuous Delivery
Continuous Integration
Data Engineering
Data Integration
Data Modeling
Data Processing
Data Quality
Debugging
DevOps
Distributed Computing
Docker
Extract, Transform, Load (ETL)
Gap Analysis
IBM DB2
Informatica
Informatica PowerCenter
Informatica PowerExchange
Job Details
Job Title: Senior Data Engineer - Informatica & Big Data
Experience Level: Mid-Senior
Experience Required: 10+ Years (Big Data), 7+ Years (Informatica PowerCenter)
Education Level: Bachelor's Degree (required)
Job Function: Information Technology
Industry: Information Technology and Services
Total Positions: 1
Profile Requirements:
- Resume must be submitted in PDF format
- TCS Rehire Policy:
  - Former TCS full-time employees are not eligible
  - Former TCS contractors must observe a 6-month gap before re-engagement
Job Summary:
We are looking for a Senior Data Engineer with deep expertise in big data technologies, Informatica PowerCenter, and cloud-based data platforms. The ideal candidate is a strong individual contributor who can also lead and mentor teams, drive performance optimization, and deliver robust data integration and transformation solutions.
Mandatory Skills & Experience:
- Big Data & Distributed Computing:
  - 10+ years of hands-on experience with large-scale data processing and distributed systems
- Informatica PowerCenter:
  - 7+ years of experience designing, developing, and optimizing workflows using Informatica PowerCenter
  - Experience with PowerExchange CDC tools
  - Strong debugging, performance tuning, and data quality validation skills
- Programming & Data Engineering:
  - Strong hands-on experience with PySpark, Apache Spark, and Python
- Databases:
  - Experience with both SQL and NoSQL systems, including DB2, PostgreSQL, and Snowflake
- Cloud Platforms:
  - Solid hands-on experience with AWS data services (e.g., S3, Glue, Redshift)
- ETL & Workflow Management:
  - Proficiency in ETL pipeline design, data modeling, and workflow orchestration using tools such as Apache Airflow
Nice-to-Have (Preferred):
- Experience with DevOps practices and CI/CD pipelines
- Familiarity with containerization (Docker, Kubernetes)
- Familiarity with Agile methodology (POD model, ticket analysis, sprint grooming)
Responsibilities:
- Design, develop, configure, and debug ETL workflows using Informatica
- Lead efforts in building scalable data integration and transformation pipelines
- Perform gap analysis, optimize existing jobs, and troubleshoot performance issues
- Collaborate with cross-functional teams to manage new requirements and enhancements
- Maintain and document all ETL mappings, workflows, and architecture
- Handle production support tickets and contribute to root cause analysis and incident resolution
Soft Skills:
- Strong problem-solving and analytical skills
- Proven team leadership and mentoring capabilities
- Excellent written and verbal communication