Title: Data Engineer
Location: Scott AFB, Illinois (5 days onsite)
Duration: Contract to Hire (6 months)
Client: DISA (Defense Information Systems Agency)
Visa Requirements: Active Secret clearance required
Total Experience Required: 10+ Years
Top Must-Have Skills:
Big Data Engineering
Python, Java, or Scala programming
SQL and database technologies (PostgreSQL, MySQL, Oracle)
Hadoop, Spark, Kafka
Cloud platforms (AWS, Azure, or Google Cloud Platform)
Nice-to-Have Skills:
Data visualization (Kibana)
AI/ML exposure
Data governance, quality, and security best practices
Certifications (AWS Data Analytics, Google Data Engineer)
Job Description (Condensed):
This role involves designing, developing, and maintaining data pipelines, ETL processes, and scalable data systems that support DISA enterprise analytics. The engineer will collaborate with analysts and data scientists to deliver high-quality data solutions, implement data models, optimize data workflows, and document processes.
Summary:
Experienced Data Engineer with over 10 years of experience building scalable Big Data pipelines on Hadoop/Spark and programming in Python, Java, or Scala. Skilled in cloud environments and ETL optimization, with hands-on experience in data governance and secure systems for government clients.
Skillset:
Big Data Engineering | Python | Java | Scala | SQL | Spark | Hadoop | Kafka | AWS | Azure | Google Cloud Platform | PostgreSQL | MySQL | Oracle | Data Modeling | ETL | Kibana | Data Governance | AI/ML Concepts
Roles & Responsibilities:
Design and maintain scalable data pipelines and ETL workflows.
Collaborate with cross-functional teams for data integration and analytics.
Develop efficient schemas, models, and storage strategies.
Optimize and troubleshoot data workflows for performance and reliability.
Document data engineering processes and ensure compliance with standards.