Apache Flink Engineer

Overview

Remote
Full Time
Contract - W2
Contract - 6 day(s)

Skills

Apache Flink

Job Details

Apache Flink Engineer
Location: Remote, but must reside near one of the following U.S. Bank hub cities: Minneapolis, Atlanta, Chicago, San Francisco, or Dallas
Duration: 12+ Months

Must-Have: Hands-on experience with Apache Flink
Position Overview:
U.S. Bank is seeking an experienced Java/Apache Flink Engineer to support the ongoing modernization of its data movement and transformation systems. This role will focus on migrating legacy applications (e.g., Informatica PowerExchange) to a custom-built Apache Flink framework used primarily for Change Data Capture (CDC) from MongoDB and SQL Server.
You will join a small, high-impact engineering team (currently 2 members) working on several concurrent Flink projects. This is a hands-on engineering role requiring deep knowledge in real-time stream processing and solid software engineering practices.
Key Responsibilities:
  • Develop, enhance, and maintain a custom Flink-based CDC framework
  • Migrate legacy ETL/CDC pipelines to Flink-based data streams
  • Integrate new data sources into existing Flink pipelines
  • Collaborate with an FTE engineer and a contractor on tooling and pipeline modernization
  • Support CI/CD automation and code quality efforts
  • Participate in planning and design discussions for data streaming architecture
Required Skills & Qualifications:
  • Apache Flink: solid experience with Flink SQL and/or the DataStream API
  • Java: professional experience in Java-based data engineering
  • Real-time data pipeline development
  • Cloud experience: Azure preferred
  • CI/CD pipeline experience: Azure DevOps preferred
  • GitHub for version control
  • Strong engineering fundamentals
Preferred (Nice to Have):
  • Apache Kafka: experience working with Kafka topics and stream management
  • Python: strong expertise, especially valuable if Java experience is limited
  • Experience building AI-based solutions or agents (e.g., schema generation or metadata classification using LLMs like Azure OpenAI)
  • Interest or experience with metadata management, schema registry, and Kafka topic lineage tracking
Additional Notes:
  • Must currently reside in one of the listed hub cities (Minneapolis, Atlanta, Chicago, San Francisco, or Dallas); no exceptions
  • This is a remote role, but candidates must be local to one of the hub cities
  • Flink experience is mandatory; candidates without it will not be considered
  • The team is building a Center of Excellence (CoE) around CDC pipelines using Flink; this is an opportunity to contribute to enterprise-scale systems

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.