Overview
$55-60
Accepts corp to corp applications
Contract - W2
Contract - Independent
Contract - 6 month(s)
Skills
API
Documentation
MongoDB
Kafka
Logging
Disaster Recovery
Job Details
Job Title: Flink with DataStream API
Location: Dallas, TX / Hybrid (in-person interview)
Job Type: Contract to Hire
Requirements: Please provide supporting documentation or case studies confirming that you have successfully implemented Flink for previous clients, specifically using the DataStream API. Proof of experience is critical.
Confirmation that you are currently and actively supporting a client using Flink with the DataStream API is required.
Requirements for Resource:
- Proficient in writing and supporting both functional and non-functional aspects of Flink with the DataStream API.
Flink Functional Requirements:
- Expertise in Flink APIs (DataStream, Process functions, etc.).
- Competence in state management (checkpoints and savepoints) with local storage.
- Configuration of connectors such as EventHub, Kafka, and MongoDB.
- Implementation of Flink API Aggregators.
- Handling watermarks for out-of-order events.
- Management of state using Azure Data Lake Storage (ADLS).
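To make the watermark requirement above concrete, here is a minimal, framework-free Python sketch of the bounded-out-of-orderness idea behind Flink's watermark strategies (all class and function names here are hypothetical, not Flink APIs): the watermark trails the maximum event time seen so far by a fixed delay, and any event older than the current watermark is treated as late.

```python
# Illustrative sketch of Flink-style watermarks for out-of-order events.
# Names (Event, BoundedOutOfOrdernessWatermark, is_late) are hypothetical,
# not part of the Flink API.

from dataclasses import dataclass


@dataclass
class Event:
    key: str
    timestamp: int  # event time, e.g. epoch millis


class BoundedOutOfOrdernessWatermark:
    """Watermark = max event time seen so far minus a fixed allowed delay."""

    def __init__(self, max_delay: int):
        self.max_delay = max_delay
        self.max_seen = float("-inf")

    def on_event(self, event: Event) -> float:
        # Watermark only advances; an out-of-order event never moves it back.
        self.max_seen = max(self.max_seen, event.timestamp)
        return self.current_watermark()

    def current_watermark(self) -> float:
        return self.max_seen - self.max_delay


def is_late(event: Event, watermark: float) -> bool:
    # An event is "late" if its event time is behind the watermark.
    return event.timestamp < watermark


wm = BoundedOutOfOrdernessWatermark(max_delay=5)
for e in [Event("a", 10), Event("a", 12), Event("a", 4), Event("a", 20)]:
    w = wm.on_event(e)
    print(e.timestamp, w, is_late(e, w))
# The out-of-order event at ts=4 arrives after the watermark has
# advanced to 7 (12 - 5), so it is flagged as late.
```

In real Flink jobs the equivalent behavior comes from the built-in bounded-out-of-orderness watermark strategy attached to the DataStream source; this sketch only illustrates the lag-and-lateness mechanics.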
Flink Non-Functional Requirements:
- Set up a private Flink cluster within a designated AKS environment.
- Configure both session-based and application-type deployments.
- Define and build nodes and slots. Manage and configure Job/Task Managers.
- Establish necessary connectors, e.g., external storage for the Flink Cluster.
- Configure heap memory and RocksDB for state management.
- Define and set up checkpoints and savepoints for state recovery.
- Enable Auto-Pilot capabilities.
- Integrate network resources, such as Azure EventHub and external databases like MongoDB.
- Implement integration with ArgoCD for job submissions.
- Install LTM agents for logging and Dynatrace agents for monitoring purposes.
- Provide access to the Flink Dashboard.
- Establish High Availability (HA) and Disaster Recovery (DR) configurations.
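As a rough illustration of the state-backend, checkpoint, and HA items above, here is a hedged `flink-conf.yaml`-style fragment for a Flink cluster on AKS. The keys shown are standard Flink configuration options, but the values and the storage path are placeholders only; actual settings would depend on the cluster's Flink version and the client's Azure environment.

```yaml
# Sketch only -- values and the ADLS path are placeholders, not a
# working configuration.
state.backend: rocksdb
state.checkpoints.dir: abfss://<container>@<account>.dfs.core.windows.net/checkpoints
state.savepoints.dir: abfss://<container>@<account>.dfs.core.windows.net/savepoints
execution.checkpointing.interval: 60s

# Kubernetes-based high availability (Flink 1.12+)
high-availability: kubernetes
high-availability.storageDir: abfss://<container>@<account>.dfs.core.windows.net/ha

# Slot and memory sizing for Job/Task Managers (tune per workload)
taskmanager.numberOfTaskSlots: 4
taskmanager.memory.process.size: 4096m
jobmanager.memory.process.size: 2048m
```

In practice these settings would typically be supplied through the Flink Kubernetes Operator or the deployment's config map rather than a hand-edited file.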
Additional Requirements:
- Fully manage the underlying AKS infrastructure that supports Flink and its applications.
- Ensure end-to-end observability of Flink and AKS using Dynatrace.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions, and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.