Who we are:
ShorePoint is a fast-growing, industry-recognized and award-winning cybersecurity services firm focused on high-profile, high-threat private and public-sector customers who demand experience and proven security models to protect their data. ShorePoint subscribes to a work hard, play hard mentality and celebrates individual and company successes. We are passionate about our mission and about going above and beyond to deliver for our customers. We are equally passionate about an environment that supports creativity, accountability, diversity, inclusion and a focus on giving back to our community.
The Perks:
As recognized members of the Cyber Elite, we work together in partnership to defend our nation's critical infrastructure while building meaningful and exciting career development opportunities in a culture tailored to each individual's technical and professional growth. We are committed to the belief that our team members do their best work when they are happy and well cared for. In support of this philosophy, we offer a comprehensive benefits package, including coverage through major health care carriers. Highlighted benefits include: 18 days of PTO, 11 holidays, 80% of insurance premiums covered, 401k, continued education, certification maintenance and reimbursement, and more.
Who we're looking for:
We are seeking a Data Pipeline Technical Lead with hands-on technical leadership experience in designing, implementing and operating mission-critical data pipeline infrastructure for cybersecurity programs. The ideal candidate excels at facilitating complex technical discussions, breaking down ambiguous requirements into actionable work and guiding a large, diverse engineering team toward successful delivery. The Data Pipeline Technical Lead role operates in a fast-paced Agile environment and requires a strong mix of strategic planning and tactical problem-solving. This is a unique opportunity to shape the growth, development and culture of an exciting and fast-growing company in the cybersecurity market.
What you'll be doing:
- Lead architectural design discussions and perform comprehensive design reviews for data pipeline solutions.
- Conduct peer reviews of code commits, configurations and automation scripts across Kafka, DevSecOps and development teams.
- Guide data target analysis and support data mapping/modeling initiatives.
- Perform analysis of alternatives to support complex technical decisions.
- Facilitate epic decomposition and story development within Agile planning cycles.
- Provide technical direction for full end-to-end data pipeline solutions.
- Manage 17-19 direct reports, including Kafka engineers, developers, DevSecOps engineers and SRE staff.
- Facilitate daily standups, sprint ceremonies, backlog refinement sessions and other Agile meetings for a 26+ person team.
- Participate in program-wide technical leads meetings and customer stakeholder sessions.
- Identify and resolve cross-team dependencies and technical blockers.
- Provide mentorship to team leads and senior engineers across multiple technical disciplines.
- Interface regularly with customer stakeholders on technical approaches and program direction.
- Coordinate with other technical teams across the larger program ecosystem.
- Participate in SAFe PI planning cycles and Agile ceremonies.
- Facilitate ad-hoc technical decision meetings and design sessions.
What you need to know:
- Design, optimize and maintain secure, scalable data pipelines in mission-critical environments.
- Apply streaming technologies such as Kafka to support real-time cybersecurity data ingestion and processing (see the sketch after this list).
- Implement containerized and automated deployments using infrastructure-as-code in cloud-native environments.
- Develop and manage data modeling, transformation logic and governance strategies for large-scale cybersecurity datasets.
- Execute Agile and SAFe methodologies across large, cross-functional engineering teams.
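To give a flavor of the hands-on streaming work this role oversees, below is a minimal sketch of a Kafka consumer in Python. It is illustrative only: the broker address, consumer group, topic name and event fields are hypothetical placeholders rather than details of any ShorePoint program, and it assumes the confluent-kafka client library.

```python
# Illustrative sketch only: a minimal Confluent Kafka consumer for real-time
# ingestion of security events. All names below are hypothetical placeholders.
import json
from confluent_kafka import Consumer, KafkaError

consumer = Consumer({
    "bootstrap.servers": "broker1:9092",   # placeholder broker address
    "group.id": "cyber-ingest",            # hypothetical consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["security-events"])    # hypothetical topic name

try:
    while True:
        msg = consumer.poll(1.0)           # wait up to 1s for the next record
        if msg is None:
            continue
        if msg.error():
            # Partition EOF is informational; report other errors and continue.
            if msg.error().code() != KafkaError._PARTITION_EOF:
                print(f"Consumer error: {msg.error()}")
            continue
        event = json.loads(msg.value())
        # Downstream transformation, enrichment and routing would happen here.
        print(event.get("source_ip"), event.get("event_type"))
finally:
    consumer.close()
```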
Must-haves:
- Bachelor's degree in Cybersecurity, Computer Science, Information Systems, Mathematics, Engineering or a related technical field.
- 10+ years of technical experience in data engineering, software development or related technical fields, including 5+ years leading technical teams of 8 or more engineers.
- Proven ability to analyze complex requirements and translate them into clear, actionable tasks and processes through critical thinking.
- Expert-level Python development experience with focus on data pipeline applications.
- Solutions architecture expertise in data engineering and pipeline design.
- Data mapping and modeling proficiency for complex cybersecurity datasets.
- Confluent Kafka platform expertise, including distributed streaming architecture, data governance frameworks, schema evolution strategies and enterprise-grade cluster management.
- Kubernetes/AWS EKS experience for containerized deployments.
- Infrastructure automation using Ansible, Python scripting and shell scripting.
- Experience with containerization technologies including Docker and Docker Compose.
- AWS cloud services experience in enterprise environments.
- Experience with SAFe/Agile methodologies and ceremony facilitation.
- Strong decomposition skills for complex technical epics and requirements.
- Proven ability to mentor senior engineers and technical specialists.
- Experience managing diverse technical disciplines, including infrastructure, development, DevSecOps and SRE.
Beneficial to have the following:
- Experience with the Elastic Stack and related technologies, including Elastic APM.
- Federal contracting or government sector experience.
- Background in cybersecurity data processing or SIEM technologies.
- Experience with large-scale data pipeline architectures.
- Familiarity with CISA frameworks or federal cybersecurity programs.
- Industry-recognized certifications.
Where it's done:
- Remote (Herndon, VA); candidates must live within 50 miles of Washington, DC and attend onsite SAFe PI planning sessions two days per quarter.