Overview
Hybrid (2 days/week onsite)
USD 70.00 - 84.00 per hour
Contract - W2
Skills
Big Data
Surveillance
ETL (Extract, Transform, Load)
Data Science
Regulatory Compliance
Data Integrity
Cloud Computing
Data Storage
Jenkins
Collaboration
Sprint
Computer Science
Software Engineering
Data Engineering
Python
Scala
Apache Spark
Apache Hive
Distributed Computing
SQL
NoSQL
Database
PostgreSQL
MongoDB
Amazon DynamoDB
Amazon SageMaker
Amazon Web Services
Storage
Continuous Integration
Continuous Delivery
Management
Apache Airflow
Workflow Orchestration
Machine Learning (ML)
Automated Testing
Agile
Scrum
Regulatory Reporting
Analytics
Finance
Data Processing
MEAN Stack
Customer Service
Training And Development
SAP BASIS
Job Details
Software Guidance & Assistance, Inc. (SGA) is searching for a Big Data Engineer (Python/Spark/AWS) for a contract assignment with one of our premier regulatory clients in Rockville, MD.
This role is hybrid (2 days/week onsite).
We are seeking a Big Data Engineer to lead the design, development, and maintenance of scalable data pipelines and infrastructure supporting our market regulation technology applications. You will play a key role in modernizing existing data pipelines, enabling downstream users to experiment with data using no-code/low-code features. You will collaborate closely with product managers, data scientists, and engineers to deliver high-performance systems that power market surveillance and regulatory workflows.
Responsibilities:
- Design and modernize scalable, modular data pipelines from legacy SQL-based workflows.
- Build efficient and reliable ETL processes using Apache Spark and cloud-native tools on AWS.
- Create configurable pipelines that accelerate experimentation and feature selection for data science teams.
- Optimize pipeline performance, reduce storage costs, and improve compute efficiency.
- Ensure compliance with governance standards while maintaining data integrity and lineage.
- Integrate seamlessly with enterprise data platforms, lakes, and cloud-based storage systems.
- Implement and maintain CI/CD pipelines using Jenkins or similar tools.
- Write robust automated tests, including unit, integration, and end-to-end coverage.
- Collaborate within Agile teams, participating in sprint planning, reviews, and retrospectives.
Required Skills:
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- 7+ years of experience in large-scale data engineering or data infrastructure roles.
- Proficient in Python or Scala for building and maintaining data pipelines.
- Deep expertise with Apache Spark, Hive, and distributed computing frameworks.
- Strong knowledge of SQL and NoSQL databases (e.g., PostgreSQL, MongoDB, DynamoDB).
- Experience with feature stores (e.g., Feast, Tecton, SageMaker Feature Store).
- Hands-on experience with AWS services for compute, storage, and orchestration.
- Experience implementing CI/CD practices and managing production deployments.
- Familiarity with Apache Airflow and workflow orchestration tools.
- Experience with machine learning feature engineering and model serving frameworks.
- Comfortable with automated testing and working in test-driven environments.
- Proven ability to thrive in Agile/Scrum teams and contribute to iterative development cycles.
Preferred Skills:
- Familiarity with regulatory reporting, trade analytics, or financial data processing is a strong plus.
SGA is a technology and resource solutions provider driven to stand out. We are a women-owned business. Our mission: to solve big IT problems with a more personal, boutique approach. Each year, we match consultants like you to more than 1,000 engagements. When we say let's work better together, we mean it. You'll join a diverse team built on these core values: customer service, employee development, and quality and integrity in everything we do. Be yourself, love what you do and find your passion at work. Please find us at .
SGA is an Equal Opportunity Employer and does not discriminate on the basis of Race, Color, Sex, Sexual Orientation, Gender Identity, Religion, National Origin, Disability, Veteran Status, Age, Marital Status, Pregnancy, Genetic Information, or Other Legally Protected Status. We are committed to providing access, equal opportunity, and reasonable accommodation for individuals with disabilities in employment, and our services, programs, and activities. Please visit our company to request an accommodation or assistance regarding our policy.