Big Data Engineer

  • JPI,
  • Washington, DC
  • 1 day ago
java, hadoop, spark, aws, python, c++, oracle, sql server, chef, ansible, puppet, jira, hive, impala, storm
Full Time
$120,000 - $180,000
Work from home not available
Travel not required

Job Description

MULTIPLE CLOUD ENGINEER AND ARCHITECT ROLES WITH HOMELAND SECURITY. JPI is seeking a Senior Cloud Architect / Subject Matter Expert to support a big data initiative for a government client. This is a great opportunity to work on an enterprise-wide implementation of bleeding-edge technical solutions and to be part of a high-energy team. The Senior Cloud Architect will leverage in-depth, hands-on experience and expertise across multiple big data, Cloud, and analytics solutions. The successful candidate will have demonstrated experience architecting and implementing enterprise solutions that leverage Cloud and Big Data technology in a Federal environment, and working closely with Government clients. In this role you will balance technical leadership with hands-on development and implementation to initiate, plan, and execute large-scale, highly technical, cross-functional data and analytics initiatives. US CITIZENSHIP REQUIRED FOR CLEARANCE PURPOSES.

Applicants must possess a demonstrated history of working in the information technology and services industry with a wide variety of skill sets, including but not limited to:

  • Cloud Architecture
  • Big Data / Analytics Tools
  • Relational and non-relational/unstructured database solutions
  • IT Security
  • Software Development leveraging Agile methodologies

Responsibilities:

Lead a technical team to architect, design, prototype, implement, and optimize cloud-enabled big data solutions

  • Architect, develop, implement, and test data processing pipelines, and data mining/data science algorithms on a variety of hosted settings
  • Assist customers with translating complex business analytics requirements into technical solutions and recommendations across diverse environments
  • Define and implement data ingestion and transformation methodologies, including between classified and unclassified sources
  • Participate in the design, implementation, and support of Big Data, Analytics, and Cloud solutions across all stages of the development lifecycle
  • Conduct regular peer code reviews to ensure code quality and compliance with industry best practices
  • Design, implement and optimize leading Big Data frameworks (Hadoop, Spark, SAP HANA) across hybrid hosting platforms (AWS, Azure, on-prem)
  • Review security requirements, analyze processes, and define security strategy to implement compliance and controls in line with organizational standards and industry best practices
  • Develop accreditation and security documentation, including Systems Security Plans (SSP) and Authorization to Operate (ATO) packages.
  • Provide thought leadership and innovation to provide recommendations on emerging technologies or optimization/efficiencies across architecture, implementation, hosting, etc.
  • Lead the planning, development, and execution of data onboarding processing capabilities and services for diverse customers and data sets
  • Communicate results and educate others through design and development of insightful visualizations, reports, and presentations

REQUIREMENTS

  • 15+ years of professional experience
  • At least 10 years of progressive experience architecting, developing, and operating modular, efficient, and scalable Cloud solutions
  • Experience architecting, implementing, and operating solutions across multiple Cloud Service Providers (AWS, Azure, Google), including a strong understanding of Cloud and distributed systems considerations (e.g., load balancing, scaling)
  • Fluency and demonstrated expertise across multiple programming languages, such as Python, Java, and C++, and the ability to pick up new languages and technologies quickly
  • Hands-on experience with data warehousing and business intelligence software including Cloudera and Pentaho
  • Extensive experience with Relational (Oracle, SQL Server) and non-relational/unstructured database solutions (HBase, Mongo)
  • Experience with automation and orchestration tools including Chef, Ansible, and Puppet
  • Extensive experience working within a Linux computing environment and use of command line tools including shell/Python scripting
  • Demonstrated success executing within an agile development team and familiarity with common software development tools (e.g., JIRA) and version control systems (e.g., Git)
  • Extensive experience with multiple large-scale, big data frameworks and tools including MapReduce, Hadoop, Spark, Hive, Impala, and Storm
  • Advanced knowledge of application, data and infrastructure architecture disciplines
  • Ability to manage complex engagements and interface with senior level management internally as well as with clients
  • Ability to lead client presentations, communicate complex technical concepts to non-technical audiences, and identify and manage project interdependencies
  • Ability to interact with clients' business and technical stakeholders to provide sound technical solutions
  • Bachelor of Science in Computer Science or a related field
  • Relevant technical certifications (e.g., AWS Certified Solutions Architect) a plus
  • Master's degree a plus

 

Posted By

David Ferraro

2270 Kraft Drive, Suite 1850 Blacksburg, VA, 24060

Dice Id : 10220368
Position Id : 6223493
Originally Posted : 2 months ago