Telecommuting is not available. Travel required: up to 15%.
We're looking for a DevOps Engineer who wants to be responsible for defining technology solutions. You'll integrate different domain applications and data in ways that enhance our clients' current environments and help push them toward next-generation capabilities. This includes helping clients build and support data science environments in AWS, as well as applying big data analytics in a coordinated, cohesive manner to solve business use cases they could not previously address. This position will be based at our Fairfax, VA headquarters.
In this role youll work with experienced professionals with the skills required to unlock the power of governed data discovery. Our collaborative staff provides deep technical and business support for data acquisition, data analysis/data science, and interactive data visualizations.
What you'll be doing:
Develop long-term partnerships with customers and vendors and contribute to the development of their strategic business plans by understanding the future direction of their organization, the industry, and the customer's business.
Facilitate DevOps activities, operating and automating processes across data management environments.
Work with architects, tech leads, and infrastructure teams to plan, implement, and maintain the necessary DevOps processes.
Work closely with internal and external stakeholder groups to work through their test strategy and maintain a comprehensive test data management solution.
Apply standard best practices for change management and architecture design, including assisting with implementation timelines, addressing follow-up questions, and confirming that implementation guidelines are followed.
Maintain metrics that track DevOps activities and support continuous improvement.
Basic Qualifications:
Bachelor's degree in a technical field: Computer Science, Engineering, or a related discipline.
5+ years of experience developing solutions across a range of industries, clients, and technologies.
Experience setting up sandbox environments (e.g., using Docker, Vagrant), automating environment builds, deployment, and maintenance routines, and implementing them on data management systems including Big Data systems (e.g., Hadoop, MongoDB), data warehouses, and reporting/analytic systems.
Experience with test data management, data scrubbing routines, automation of real-time and batch data loads and automated database builds.
Ability to obtain and maintain a U.S. Government security clearance.
Experience driving the Agile software delivery life cycle in a business environment.
Experience with large-scale or Massively Parallel Processing (MPP) database implementations, as well as large-scale structured and unstructured datasets in cloud and non-cloud environments.
Experience with applied data science, including modeling and quantitative analysis using standard statistical software such as SAS or R.
Excellent listening, written, and oral communication skills.
Ability to exercise independent judgment while effectively prioritizing and executing tasks under pressure.
Team player with the ability to work in a fast-paced environment.
ICF (NASDAQ:ICFI) is a global consulting and technology services provider with more than 5,000 professionals focused on making big things possible for our clients. We are business analysts, policy specialists, technologists, researchers, digital strategists, social scientists and creatives. Since 1969, government and commercial clients have worked with ICF to overcome their toughest challenges on issues that matter profoundly to their success. Come engage with us at icf.com.
ICF is an equal opportunity employer that values diversity at all levels. (EOE Minorities/Females/Protected Veterans Status/Disability Status/Sexual Orientation/Gender Identity)