ETL Data Engineer - Apache NiFi at Charlotte, NC (Hybrid Onsite) - CG

Hybrid in Charlotte, NC, US • Posted 4 hours ago • Updated 4 hours ago
Contract Corp To Corp
Contract Independent
Contract W2
No Travel Required
Hybrid
Depends on Experience

Job Details

Skills

  • ETL Data Engineer
  • ETL Developer
  • Apache NiFi
  • controller services
  • templates
  • NiFi Registry
  • ETL/ELT
  • Oracle
  • SQL Server
  • PostgreSQL
  • Python
  • Git

Summary

ETL Data Engineer - Apache NiFi

Location: Charlotte, NC (Hybrid)

Duration: 12+ Months

Note: Local or nearby candidates only

Summary:

  • We are seeking an experienced ETL Data Engineer to design, build, and maintain scalable data ingestion and transformation pipelines, with a strong focus on Apache NiFi. The ideal candidate will have hands-on experience developing reliable ETL/ELT workflows that integrate data from diverse sources into enterprise data platforms. Experience with AWS-based data services is a plus but not mandatory.

Required Qualifications:

  • 4+ years of experience as a Data Engineer or ETL Developer.
  • Strong hands-on experience with Apache NiFi, including processors, controller services, templates, and NiFi Registry.
  • Solid understanding of ETL/ELT concepts, data integration patterns, and data pipeline design.
  • Proficiency in SQL and experience working with relational databases (e.g., Oracle, SQL Server, PostgreSQL).
  • Experience working with structured and semi-structured data formats (CSV, JSON, Avro, Parquet).
  • Familiarity with Linux environments and basic scripting (Python or Shell preferred).
  • Experience with version control systems such as Git.

Responsibilities:

  • Design, develop, and maintain ETL/ELT pipelines using Apache NiFi for batch and near-real-time data ingestion.
  • Build reusable, modular NiFi flows with proper versioning, parameterization, and robust error handling.
  • Integrate data from multiple sources including relational databases, flat files, APIs, message queues, and streaming platforms.
  • Implement data transformation, enrichment, validation, and routing logic to ensure high data quality and reliability.
  • Monitor, troubleshoot, and optimize ETL workflows for performance, scalability, and fault tolerance.
  • Collaborate with data architects, analytics teams, and downstream consumers to align pipelines with data models and business requirements.
  • Implement logging, monitoring, and alerting for data pipelines.
  • Follow best practices for CI/CD, version control, and secure data handling.
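
The responsibilities above describe classic extract–transform–load steps: ingest records, validate and enrich them, route invalid rows out of the main flow, and load the result. As a rough illustration only (plain Python standard library; the field names and sample data are made up for this sketch and are not tied to NiFi or to any system named in the posting):

```python
# Minimal ETL sketch: extract JSON lines, transform/validate, load as CSV.
# Field names ("id", "amount") and the sample records are illustrative assumptions.
import csv
import io
import json


def transform(record):
    """Validate and enrich one record; return None to reject it."""
    if "id" not in record or record.get("amount") is None:
        return None  # route invalid rows out of the main flow
    return {"id": record["id"], "amount": round(float(record["amount"]), 2)}


def etl(json_lines, out):
    """Run the pipeline: parse each JSON line, transform it, write valid rows as CSV."""
    writer = csv.DictWriter(out, fieldnames=["id", "amount"])
    writer.writeheader()
    loaded = 0
    for line in json_lines:
        row = transform(json.loads(line))
        if row is not None:
            writer.writerow(row)
            loaded += 1
    return loaded


source = ['{"id": 1, "amount": "10.5"}', '{"amount": "5"}']  # second record is invalid
buf = io.StringIO()
count = etl(source, buf)  # count == 1: only the valid record is loaded
```

In a NiFi flow the same separation of concerns would typically be expressed with processors and routing relationships (valid vs. invalid) rather than hand-written code.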

Preferred / Nice-to-Have Skills:

  • Experience with AWS data services such as S3, Glue, Lambda, RDS, Redshift, or EMR.
  • Exposure to streaming platforms such as Kafka or similar messaging systems.
  • Experience with containerization and orchestration tools (Docker, Kubernetes).
  • Knowledge of data lake or data warehouse architectures.
  • Familiarity with Agile delivery models and DevOps practices.

Education:

  • Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field (or equivalent practical experience).

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

  • Dice Id: 10120856
  • Position Id: 8911848
