Overview
On Site
USD 70.00 - 75.00 per hour
Full Time
Skills
Security Operations
Apache Spark
Database
Data Integrity
JIRA
Management
Data Lake
Extract, Transform, Load (ETL)
Analytics
UPS
Sprint
Documentation
Presentations
Continuous Integration
Continuous Delivery
Computer Science
Data Engineering
Electronic Health Record (EHR)
Step Functions
Amazon S3
Amazon Redshift
Scripting
Unix
Shell Scripting
Relational Databases
Version Control
Agile
Scrum
Communication
Software Design
Machine Learning (ML)
Data Science
Workflow
Data Governance
Regulatory Compliance
Information Security
Cyber Security
Cloud Computing
Python
SQL
Linux
GitLab
DevOps
GitHub
Terraform
Amazon Web Services
Data Analysis
Taxes
Life Insurance
Collaboration
Partnership
Business Transformation
Law
Job Details
Description
***HYBRID SCHEDULE***
The Cybersecurity team is seeking a Data Engineer with strong AWS development experience to help manage and evolve our Cyber Data Lake. This role is critical to supporting vulnerability remediation workflows and ensuring security compliance across the enterprise. You'll work with massive datasets, build scalable data pipelines, and enable actionable insights that drive security operations.
You'll be part of the team that manages the Cyber Data Lake, a centralized platform for security data. Your work will directly support critical workflows related to vulnerability remediation and security compliance. While you won't be scanning for vulnerabilities, you'll be responsible for managing and transforming the data that drives those efforts.
Key Responsibilities:
-Design, build, and maintain scalable, secure, and efficient data pipelines using AWS services such as Glue, Lambda, Step Functions, S3, Redshift, EMR, Data Pipeline, and Spark
-Develop and optimize robust Python-based ETL processes and scripts for ingesting, transforming, and automating large-scale security data
-Write and optimize complex SQL queries and multi-table aggregations in Redshift and other relational databases for ETL and analytics workflows
-Operate in Unix/Linux environments for scripting, automation, and system-level data operations, ensuring data integrity and performance
-Manage and integrate data from 40+ security data sources into the Cyber Data Lake
-Support ticketing workflows by enabling data-driven automation and insights
-Implement monitoring, logging, and alerting for data pipelines to ensure reliability and performance
-Collaborate with cross-functional teams to support vulnerability remediation workflows, gather requirements, and translate them into high-level architecture and design documents
-Communicate technical concepts clearly through documentation, presentations, and stakeholder meetings
-Apply DevOps best practices using GitHub, Terraform, and CloudFormation for infrastructure automation and CI/CD
-Participate in Agile ceremonies (daily stand-ups, sprint planning, retrospectives) in Jira and contribute to iterative delivery of data solutions
Required Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
3+ years of experience in data engineering or a similar role.
Strong hands-on experience with AWS data services (e.g., EMR, Glue, Lambda, Step Functions, S3, Redshift).
Advanced proficiency in Python for scripting and automation.
Solid experience with Unix/Linux shell scripting.
Strong command of SQL and experience with relational databases.
Proficiency with GitHub for version control and collaboration.
Experience with Terraform and/or AWS CloudFormation for infrastructure-as-code.
Experience working in Agile/Scrum environments.
Excellent verbal and written communication skills.
Proven ability to contribute to high-level solution design and architecture discussions.
AWS Certification (e.g., AWS Certified Data Analytics - Specialty, AWS Certified Solutions Architect, or equivalent).
Preferred Qualifications:
Exposure to machine learning pipelines or data science workflows.
Experience with data governance, security, and compliance best practices.
Skills
Security, Information Security, Cyber Security, Cloud, AWS, Python, DevOps, Terraform, SQL, Linux, GitLab
Top Skills Details
Security, Information Security, Cyber Security, Cloud, AWS, Python, DevOps, Terraform, SQL, Linux, GitLab
Additional Skills & Qualifications
Nice to haves:
-Experience with DevOps tools and practices
-Familiarity with GitHub and Terraform
-AWS certifications such as AWS Certified Solutions Architect or AWS Certified Data Analytics - Specialty
Experience Level
Expert Level
Pay and Benefits
The pay range for this position is $70.00 - $75.00/hr.
Eligibility requirements apply to some benefits and may depend on your job classification and length of employment. Benefits are subject to change and may be subject to specific elections, plan, or program terms. If eligible, the benefits available for this temporary role may include the following:
Medical, dental & vision
Critical Illness, Accident, and Hospital
401(k) Retirement Plan - Pre-tax and Roth post-tax contributions available
Life Insurance (Voluntary Life & AD&D for the employee and dependents)
Short and long-term disability
Health Savings Account (HSA)
Transportation benefits
Employee Assistance Program
Time Off/Leave (PTO, Vacation or Sick Leave)
Workplace Type
This is a hybrid position in Reston, VA.
Application Deadline
This position is anticipated to close on Aug 20, 2025.
About TEKsystems:
We're partners in transformation. We help clients activate ideas and solutions to take advantage of a new world of opportunity. We are a team of 80,000 strong, working with over 6,000 clients, including 80% of the Fortune 500, across North America, Europe and Asia. As an industry leader in Full-Stack Technology Services, Talent Services, and real-world application, we work with progressive leaders to drive change. That's the power of true partnership. TEKsystems is an Allegis Group company.
About TEKsystems and TEKsystems Global Services
We're a leading provider of business and technology services. We accelerate business transformation for our customers. Our expertise in strategy, design, execution and operations unlocks business value through a range of solutions. We're a team of 80,000 strong, working with over 6,000 customers, including 80% of the Fortune 500 across North America, Europe and Asia, who partner with us for our scale, full-stack capabilities and speed. We're strategic thinkers, hands-on collaborators, helping customers capitalize on change and master the momentum of technology. We're building tomorrow by delivering business outcomes and making positive impacts in our global communities. TEKsystems and TEKsystems Global Services are Allegis Group companies. Learn more at TEKsystems.com.
The company is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information or any characteristic protected by law.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.