NiFi Developer

Overview

On Site
Depends on Experience
Contract - W2
Able to Provide Sponsorship

Skills

NiFi
MongoDB
Kafka

Job Details

Job Title: NiFi Developer

Role Type: Contract

Location: Dallas, TX or St Louis, MO

Contract Length: 12+ Months

How to Apply: Please send your resume and contact information to recruiters at 4aitservices dot com

 

We are looking for a strong NiFi developer for a data engineering role, with solid experience in NiFi and MongoDB, including writing MongoDB queries, building NiFi flows, and working with Kafka streaming.

 

Detailed JD:

Responsibilities:

  • Data Flow Design and Implementation:
    • Design and implement NiFi data pipelines for various business needs, including data ingestion, transformation, and loading. 
  • NiFi Configuration and Management:
    • Configure and manage NiFi clusters, including setting up processors, flow controllers, and other components. 
  • Data Pipeline Optimization:
    • Optimize data flows for performance, scalability, and efficiency, addressing bottlenecks and improving processing times. 
  • Troubleshooting and Debugging:
    • Identify and resolve issues within NiFi data pipelines, including data quality issues, performance problems, and security vulnerabilities. 
  • Data Integration:
    • Integrate NiFi with other data platforms and systems, including databases, message queues, and cloud services; MongoDB and SQL experience is a must-have.
  • Data Quality and Security:
    • Ensure data quality and security throughout the data pipeline, implementing appropriate data validation, cleansing, and security measures. 
  • Collaboration and Communication:
    • Collaborate with other data engineers, developers, and business stakeholders to understand requirements, design solutions, and ensure project success. 
  • Documentation:
    • Document NiFi data pipelines, workflows, and procedures for future maintenance and troubleshooting. 

Skills:

  • NiFi Expertise: Strong understanding of NiFi architecture, processors, flow controllers, and expressions. 
  • Data Engineering: Experience with data ingestion, ETL, and data warehousing. 
  • Programming: Proficiency in SQL, MongoDB queries, Splunk queries, Python, and Java.
  • Databases: Knowledge of relational (SQL) databases and MongoDB.
  • Big Data Technologies: Familiarity with big data technologies such as Kafka, Hadoop, and Spark.
  • Cloud Platforms: Experience with cloud platforms like AWS, Azure, or Google Cloud Platform. 
  • Data Modeling: Ability to design and implement data models for analytical and reporting purposes. 
  • Security: Understanding of data security principles and best practices. 
  • Communication and Collaboration: Excellent communication and collaboration skills. 

 
