Overview
On Site
Full Time
Skills
Writing
IC
Intelligence Community
HTTP
SFTP
Data Integration
Collaboration
Data Engineering
Continuous Integration
Extract-Transform-Load (ETL)
Management
Apache Spark
Databricks
Workflow
Python
SQL
Data Manipulation
Cloud Computing
Amazon Web Services
Microsoft Azure
Documentation
Systems Architecture
Supply Chain Management
Predictive Analytics
LinkedIn
Artificial Intelligence
Job Details
Overview
BigBear.ai is seeking a Data Engineer/Integrator. In this role, you will collaborate closely with a team of developers to fulfill data integration requirements, writing and maintaining code on an Extract-Transform-Load (ETL) platform to transform data into the formats defined by Intelligence Community Information Technology Enterprise (IC ITE) initiatives.
What you will do
You will interface with external teams and systems, using protocols such as HTTP and SFTP to collect data efficiently. You will also enhance the ETL platform, adding features that shorten timelines for future data integration efforts. Beyond coding tasks, you will develop and maintain software, ensuring seamless integration into a fully functional system, and collaborate with external teams to validate data ingest processes. Finally, you will provide comprehensive documentation covering system architecture, development, and any enhancements made throughout the process.
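For illustration only, the snippet below sketches what a pull-based collection step over HTTP and SFTP might look like. It is not BigBear.ai's actual tooling: the hostnames, paths, and credentials are hypothetical, and it assumes the widely used requests and paramiko Python libraries.

```python
# Minimal sketch of a pull-based ingest step over HTTP and SFTP.
# Hostnames, paths, and credentials below are hypothetical placeholders.
import pathlib
import paramiko
import requests

LANDING_DIR = pathlib.Path("/data/landing")  # hypothetical staging area

def fetch_http(url: str, dest_name: str) -> pathlib.Path:
    """Download one file over HTTP(S) into the landing area."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    dest = LANDING_DIR / dest_name
    dest.write_bytes(resp.content)
    return dest

def fetch_sftp(host: str, username: str, password: str,
               remote_path: str, dest_name: str) -> pathlib.Path:
    """Copy one file from an SFTP endpoint into the landing area."""
    transport = paramiko.Transport((host, 22))
    transport.connect(username=username, password=password)
    try:
        sftp = paramiko.SFTPClient.from_transport(transport)
        dest = LANDING_DIR / dest_name
        sftp.get(remote_path, str(dest))
        return dest
    finally:
        transport.close()
```

In a real pipeline, steps like these would typically be parameterized and scheduled (for example, as tasks in an orchestrator) rather than run ad hoc, so that new sources can be added without new code.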
What you need to have
9+ years of experience in software or data engineering/administration roles, or in a highly related field of work with similar scope and responsibilities
A Bachelor's degree may be substituted for 4 years of experience, and a Master's degree may be substituted for 6 years of experience
Active TS/SCI with CI Polygraph
Specialization in the Databricks platform for building, deploying, and managing data and AI solutions
Experience building ETL pipelines, managing data pipelines, and working with large datasets using tools like Spark, Python, and SQL (a minimal sketch follows after this list)
Experience with technologies like Delta Lake, Delta Live Tables, and Databricks Workflows
Experience collaborating with data scientists
Familiarity with Advana
Strong Python programming skills
Solid SQL knowledge for querying and data manipulation
Cloud platform experience, such as AWS or Azure
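As a rough illustration of the Spark, SQL, and Delta Lake items above, the following PySpark sketch shows a minimal extract-transform-load job that lands cleaned data in a Delta table. It is not BigBear.ai's pipeline; the paths, table names, and columns are hypothetical, and it assumes a Databricks-style runtime where Delta Lake support is preconfigured.

```python
# Minimal extract-transform-load sketch with PySpark and Delta Lake.
# Paths, table names, and columns are hypothetical; assumes a Databricks-style
# runtime (or a local Spark session configured with the delta-spark package).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read raw CSV files from a landing area.
raw = (spark.read
       .option("header", "true")
       .csv("/data/landing/shipments/*.csv"))

# Transform: normalize types and drop malformed rows.
clean = (raw
         .withColumn("event_ts", F.to_timestamp("event_ts"))
         .withColumn("quantity", F.col("quantity").cast("int"))
         .dropna(subset=["shipment_id", "event_ts"]))

# Load: write to a Delta table so downstream jobs get ACID reads.
(clean.write
 .format("delta")
 .mode("overwrite")
 .saveAsTable("ops.shipments_clean"))

# Downstream consumers can then query the table with plain SQL.
daily = spark.sql("""
    SELECT date(event_ts) AS day, sum(quantity) AS total_qty
    FROM ops.shipments_clean
    GROUP BY date(event_ts)
    ORDER BY day
""")
daily.show()
```

Writing the cleaned output as a Delta table gives downstream consumers ACID guarantees and plain-SQL access, which is the usual motivation for the Delta Lake and SQL requirements listed above.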
What we'd like you to have
Experience providing comprehensive documentation covering system architecture, development, and enhancements
About BigBear.ai
BigBear.ai is a leading provider of AI-powered decision intelligence solutions for national security, supply chain management, and digital identity. Customers and partners rely on BigBear.ai's predictive analytics capabilities in highly complex, distributed, mission-based operating environments. Headquartered in McLean, Virginia, BigBear.ai is a public company traded on the NYSE under the symbol BBAI. For more information, visit BigBear.ai and follow the company on LinkedIn: @BigBear.ai and X: @BigBearai.
BigBear.ai is an Equal Opportunity Employer for all protected groups, including protected veterans and individuals with disabilities.