We are seeking a highly skilled Software Engineer IV to join the Nuclear IT Data Fabric Team, supporting mission-critical data products that power analytics, data science, and operational decision-making across the nuclear fleet.
This role is ideal for a senior-level engineer who enjoys hands-on development while also influencing architecture, providing technical leadership, and mentoring others.
You will design and build modern cloud-native, streaming, and data engineering solutions using AWS, Terraform, Kafka/Flink, and relational databases, contributing directly to the organization's data modernization strategy.
________________________________________
Key Responsibilities

Data Engineering & Architecture
Design, build, test, and maintain scalable distributed data systems and data products.
Develop and enhance data pipelines that consolidate complex datasets into unified, governed views.
Build and maintain analytics infrastructure used by data analysts, data scientists, and business partners.
Create scalable, maintainable software components in Java and Python.
Streaming & Real-Time Processing
Build and optimize streaming applications using Kafka Streams and/or Apache Flink (Java APIs).
Implement highly reliable streaming topologies for ingestion, transformation, and delivery of real-time datasets.
Cloud Engineering & Infrastructure
Develop and operate workloads using AWS EMR, Redshift, Lake Formation, Glue Catalog, and serverless components.
Use Terraform, including modular patterns, to provision consistent, secure infrastructure.
Build CI/CD pipelines using Concourse and Git/version control workflows.
Data Quality, Observability & Governance
Implement automated data quality checks, error handling, recovery processes, and replayability patterns.
Work with cross-functional teams to ensure adherence to data governance, lineage, and compliance standards.
Collaborate with application developers, DB architects, analysts, and data scientists on data delivery architecture.
Leadership & Collaboration
Mentor junior engineers and contribute to team technical direction.
Work effectively across cloud, database, and platform teams.
Document architectural decisions, data models, pipeline flows, and operational procedures.
________________________________________
Required Skills & Experience

Top 4:
1) Terraform (modules, remote state, CI integration)
2) Java (proficient; experience with JVM performance tuning preferred) and AWS
3) Apache Flink (DataStream API, stateful streaming)
4) Kafka Streams
Concourse
AWS EMR
Amazon Redshift
AWS Lake Formation
AWS Glue Catalog
Hands-on experience with Postgres, MySQL, and DocumentDB
Strong SQL skills and relational modeling experience
Ability to design, build, and maintain streaming or batch data pipelines
Experience constructing ETL/ELT workflows and data quality frameworks
________________________________________
Nice to Have

ECS/EKS
AWS Certifications (Solutions Architect Associate, Developer Associate, or Data Specialty)
Experience designing governed data products or operating within a data mesh/data fabric environment
________________________________________
Education & Experience

Bachelor's degree in Computer Science, Engineering, or a related field