Data Engineer

Rockville, MD, US • Posted 7 hours ago • Updated 7 hours ago
Contract W2
On-site
USD $82.14 - 85.00 per hour

Job Details

Skills

  • Supervision
  • Software Development
  • Extract, Transform, Load (ETL)
  • ELT
  • Surveillance
  • Amazon EC2
  • Analytical Skill
  • Amazon SageMaker
  • Domino Data Lab
  • Data Quality
  • SDK
  • Servers
  • API
  • Data Lake
  • Documentation
  • Confluence
  • Process Improvement
  • Collaboration
  • Innovation
  • Big Data
  • PySpark
  • Apache Hive
  • Scalability
  • Generative Artificial Intelligence (AI)
  • Interfaces
  • LangChain
  • Prompt Engineering
  • Semantics
  • Reasoning
  • Routing
  • Pair Programming
  • Debugging
  • Code Refactoring
  • Cloud Computing
  • Electronic Health Record (EHR)
  • Step Functions
  • Amazon S3
  • Apache Spark
  • File Formats
  • Orchestration
  • Google Cloud Platform
  • Google Cloud
  • Python
  • Data Engineering
  • Functional Programming
  • Collections
  • Concurrent Computing
  • Management
  • SQL
  • Model Context Protocol (MCP)
  • Evaluation
  • Vertex AI
  • Vector Databases
  • Docker
  • Amazon EKS
  • Terraform
  • Continuous Delivery
  • Jenkins
  • GitLab
  • Continuous Integration
  • GitHub
  • Grafana
  • Amazon Web Services
  • Artificial Intelligence
  • Kubernetes
  • Computer Science
  • Data Science
  • Information Systems
  • Training
  • Financial Services
  • Object-Oriented Programming
  • Database
  • Software Engineering
  • Automated Testing
  • Build Automation
  • Configuration Management
  • Technical Communication

Summary

Software Guidance & Assistance, Inc. (SGA) is searching for a Data Engineer for a CONTRACT assignment with one of our premier Regulatory clients in Rockville, MD or Tysons, VA.

The Data Engineer works with moderate supervision across two equally weighted domains: (1) large-scale data pipeline development processing market events in a cloud environment, and (2) design and development of agentic AI systems including LLM-powered regulatory data assistants, MCP servers, and agent harness architectures. This position contributes to overall product quality throughout the software development lifecycle.

Responsibilities:
  • Build and maintain ETL/ELT pipelines using Apache Spark, Hive, and Trino across S3-based data lake environments
  • Develop and optimize SQL for large-scale surveillance datasets including window functions, multi-table joins, and complex aggregations
  • Build and engineer big data systems (EMR-on-EC2, EMR-on-EKS) and develop solutions on analytical platforms (SageMaker, Domino, Dataiku)
  • Participate in data quality monitoring, anomaly detection, and production incident investigation
  • Develop AI agent systems using AWS Bedrock and agent frameworks (Strands Agents SDK, LangChain/LangGraph, or equivalent)
  • Build agent harness architectures combining LLM reasoning with deterministic execution, including skill/RAG-based SQL generation and structured output validation
  • Implement agent memory, context management, and tool integration (MCP servers, API connectors, data catalog lookups) across the data lake
  • Build evaluation frameworks for agent accuracy, covering paraphrase robustness, routing precision, and structural consistency
  • Stay informed of advances in LLM frameworks (LangGraph, Google ADK, AWS Strands) and emerging AI capabilities
  • Write clean, well-tested code; contribute to CI/CD Jenkins pipelines and infrastructure-as-code on AWS
  • Ensure secure handling of RCI and sensitive regulatory data across both data pipelines and agent outputs, with auditable execution traces
  • Adhere to FINRA and team standards for secure development practices and technology policies
  • Partner across teams, communicate technical information at the appropriate level, and maintain documentation on Confluence/Wiki
  • Actively learn from senior team members; contribute to process improvement in line with FINRA's values of collaboration, expertise, innovation, and responsibility
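The evaluation work listed above can be sketched in miniature: given labelled (query, expected route) pairs, per-route routing precision is simply correct routings divided by attempted routings. Everything below is illustrative — `keyword_router` is a hypothetical stand-in for an LLM-based router, and the queries and route names are invented.

```python
from collections import defaultdict

def keyword_router(query: str) -> str:
    # Hypothetical stand-in for an LLM router: route by keyword match.
    q = query.lower()
    if "trade" in q or "order" in q:
        return "market_data"
    if "rule" in q or "regulation" in q:
        return "regulatory_docs"
    return "fallback"

def routing_precision(cases):
    """cases: iterable of (query, expected_route). Returns precision per predicted route."""
    attempted = defaultdict(int)
    correct = defaultdict(int)
    for query, expected in cases:
        predicted = keyword_router(query)
        attempted[predicted] += 1
        if predicted == expected:
            correct[predicted] += 1
    return {route: correct[route] / n for route, n in attempted.items()}

cases = [
    ("show me yesterday's trade volume", "market_data"),
    ("which rule covers spoofing?", "regulatory_docs"),
    ("what's the weather?", "fallback"),
    ("list open orders", "market_data"),
    ("new trade rule updates", "regulatory_docs"),  # ambiguous: router picks market_data
]
print(routing_precision(cases))
```

The same loop extends naturally to paraphrase robustness: run each query and several paraphrases of it through the router and check that the predicted route is stable.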
Required Skills:
Data Engineering & Big Data Technologies
  • Experience building data pipelines using Apache Spark (PySpark preferred) and SQL
  • Experience with SQL query engines (Hive, Trino/Presto, or similar) and cloud data platforms (AWS S3, EMR, Lambda)
  • Understanding of common issues like data skew and strategies to mitigate it, working with large data volumes, and troubleshooting job failures due to resource limitations, bad data, and scalability challenges
  • Real-world experience with debugging and mitigation strategies
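One common mitigation for the data skew mentioned above is key salting: a hot key is split into N sub-keys so its rows spread across partitions, then partial aggregates are recombined. The sketch below shows the two-stage idea in plain Python — the same pattern applies to a skewed PySpark `groupBy` — with invented key names.

```python
import random
from collections import Counter

def salt_key(key: str, hot_keys: set, n_salts: int = 4) -> str:
    """Append a random salt to hot keys so one key's rows spread over n_salts buckets."""
    if key in hot_keys:
        return f"{key}#{random.randrange(n_salts)}"
    return key

def unsalt(salted: str) -> str:
    """Strip the salt to recombine partial aggregates."""
    return salted.split("#", 1)[0]

random.seed(0)
rows = ["AAPL"] * 1000 + ["XYZ"] * 10   # AAPL is the skewed (hot) key

# Stage 1: aggregate on salted keys (work now spreads across 4 buckets for AAPL).
salted_counts = Counter(salt_key(k, hot_keys={"AAPL"}) for k in rows)

# Stage 2: strip salts and recombine the partial aggregates.
final_counts = Counter()
for salted, count in salted_counts.items():
    final_counts[unsalt(salted)] += count
print(final_counts)  # totals match the unsalted aggregation
```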
Generative AI & Agentic Systems
  • Practical experience building LLM-powered agent systems that use tools and produce structured outputs (not just chatbot interfaces)
  • Hands-on experience with at least one agent framework: LangChain, LangGraph, AWS Strands, or equivalent
  • Working knowledge of prompt engineering, RAG architectures, and context/memory management
  • Experience with foundation model APIs (Anthropic Claude, Amazon Nova, OpenAI, or similar)
  • Memory Architecture: Understanding of agent memory tiers (working memory, episodic memory, semantic memory) and strategies for context persistence, pruning, and retrieval across sessions
  • Agent Harness Design: Familiarity with harness patterns that wrap LLM reasoning with deterministic guardrails, tool routing, verification loops, and graceful degradation
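A harness of the kind described above reduces to a small loop: the LLM proposes, a deterministic validator checks, and after a bounded number of retries the harness degrades gracefully. Everything here is a stand-in sketch — `fake_llm` substitutes for a real Bedrock or framework call, and the JSON-schema check is a minimal example of structured output validation.

```python
import json

def fake_llm(prompt: str, attempt: int) -> str:
    # Stand-in for a real model call; returns malformed output on the first attempt.
    if attempt == 0:
        return "SELECT * FROM trades"  # raw text, not the required JSON envelope
    return json.dumps({"sql": "SELECT symbol, COUNT(*) FROM trades GROUP BY symbol"})

def validate(raw: str):
    """Deterministic guardrail: output must be JSON with a string 'sql' field."""
    try:
        obj = json.loads(raw)
    except json.JSONDecodeError:
        return None
    if isinstance(obj, dict) and isinstance(obj.get("sql"), str):
        return obj
    return None

def run_harness(prompt: str, max_attempts: int = 3) -> dict:
    for attempt in range(max_attempts):
        candidate = validate(fake_llm(prompt, attempt))
        if candidate is not None:
            return {"status": "ok", "output": candidate, "attempts": attempt + 1}
    # Graceful degradation: a typed failure instead of passing bad output downstream.
    return {"status": "degraded", "output": None, "attempts": max_attempts}

result = run_harness('count trades per symbol as JSON {"sql": ...}')
print(result["status"], result["attempts"])
```

In a production harness the validator would also verify the generated SQL against the catalog (a compile-style check) before execution.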
AI Tool Proficiency
  • Hands-on experience with AI development tools (GitHub Copilot, Q Developer, ChatGPT, Claude, etc.)
  • Experience with spec-driven development: using structured specifications to guide AI code generation, review, and validation
  • Ability to leverage AI pair programming for code suggestions, debugging, refactoring, and automated test generation
Cloud Technologies
  • Experience with AWS services like S3, EMR, EMR on EKS, Lambda, Bedrock, Step Functions, etc.
  • Hands-on experience using S3 with Spark (e.g., dealing with file formats, consistency issues)
  • Familiarity with AWS Bedrock for foundation model invocation, knowledge bases, guardrails, and agent orchestration
  • Exposure to Google Cloud Vertex AI (model garden, grounding, agent builder) or equivalent managed AI platforms
  • Familiarity with AWS monitoring and logging tools (CloudWatch, CloudTrail) for production workloads
Programming - Python
  • Proficiency in Python for data engineering and automation
  • Ability to write clean, modular, and performant code
  • Experience with functional programming concepts (e.g., immutability, higher-order functions)
  • Strong understanding of collections, concurrency, and memory management
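As a small illustration of the functional style called for above — immutability and higher-order functions rather than mutating loops — the snippet below builds a per-symbol aggregate from immutable records with `filter` and `reduce`. The record shape is invented for the example.

```python
from functools import reduce
from typing import NamedTuple

class Trade(NamedTuple):  # immutable record
    symbol: str
    qty: int

trades = (Trade("AAPL", 100), Trade("MSFT", 50), Trade("AAPL", -30), Trade("MSFT", 25))

def add_trade(totals: dict, t: Trade) -> dict:
    # Return a new dict rather than mutating the accumulator in place.
    return {**totals, t.symbol: totals.get(t.symbol, 0) + t.qty}

buys_only = filter(lambda t: t.qty > 0, trades)  # higher-order: filter
net_buys = reduce(add_trade, buys_only, {})      # higher-order: reduce
print(net_buys)
```

Copy-on-write accumulation like this is O(n²) in the worst case; it is shown for style, and a `Counter` would be the pragmatic choice at scale.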
SQL Skills (Window Functions, Joins, Complex Queries)
  • Proficiency with SQL window functions, multi-table joins, and aggregations
  • Ability to write and optimize complex SQL queries
  • Experience handling edge cases like NULLs, duplicates, and ordering
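The SQL skills above — window functions, partitioned ranking, NULL handling — can be exercised even against SQLite (3.25+ ships window-function support), as in this self-contained sketch; the table and column names are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trades (symbol TEXT, ts INTEGER, price REAL);
    INSERT INTO trades VALUES
        ('AAPL', 1, 100.0), ('AAPL', 2, 101.5), ('AAPL', 3, NULL),
        ('MSFT', 1, 300.0), ('MSFT', 2, 299.0);
""")

# Latest non-NULL price per symbol: filter NULLs first, then rank within each partition.
rows = conn.execute("""
    SELECT symbol, price FROM (
        SELECT symbol, price,
               ROW_NUMBER() OVER (PARTITION BY symbol ORDER BY ts DESC) AS rn
        FROM trades
        WHERE price IS NOT NULL
    ) WHERE rn = 1
    ORDER BY symbol
""").fetchall()
print(rows)
```

Note the edge case the posting calls out: without the `price IS NOT NULL` filter, AAPL's latest row would rank first and return NULL.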
Good to Have
  • AWS Bedrock AgentCore (memory, identity, tool gateway)
  • Model Context Protocol (MCP) server development and integration
  • Agent evaluation harnesses and agentic patterns (draft-verification, compile-style generation)
  • Fine-tuning foundation models for domain-specific tasks (LoRA, PEFT, or managed fine-tuning via Bedrock/Vertex AI)
  • Local model execution with Ollama, vLLM, or similar for development and experimentation
  • Vector databases (FAISS, Pinecone, OpenSearch)
  • Docker, Kubernetes, and Amazon EKS for containerized workloads
  • Infrastructure as Code (Terraform, CloudFormation)
  • Experience with CI/CD pipelines (Jenkins, GitLab CI, GitHub Actions, ArgoCD)
  • Experience with monitoring and observability tools (Prometheus, Grafana, ELK stack)
  • AWS certifications (AI Practitioner, Solutions Architect, or Kubernetes certifications like CKA/CKAD)
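For the vector-database items in the list above, the core operation is nearest-neighbour search over embeddings. A brute-force cosine-similarity version in plain Python — the operation FAISS, Pinecone, and OpenSearch optimize at scale — looks like this; the toy 3-dimensional "embeddings" and document ids are invented.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy index: document id -> embedding vector.
index = {
    "rule_4511": [0.9, 0.1, 0.0],
    "rule_2010": [0.8, 0.2, 0.1],
    "faq_login": [0.0, 0.1, 0.9],
}

def search(query_vec, k=2):
    """Return the k document ids most similar to query_vec."""
    ranked = sorted(index, key=lambda doc: cosine(index[doc], query_vec), reverse=True)
    return ranked[:k]

print(search([1.0, 0.0, 0.0]))
```

Real vector stores replace the linear scan with approximate indexes (e.g. HNSW or IVF) so the same query stays fast over millions of vectors.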
Education / Experience Requirements
  • Bachelor's degree in Computer Science, Data Science, Information Systems, or related discipline with at least two (2) years of related experience; or equivalent training and/or work experience; past Financial Services industry experience preferred
  • Demonstrated technical expertise in Object-Oriented and database technologies/concepts, resulting in the deployment of enterprise-quality solutions
  • Extensive knowledge of industry leading software engineering approaches including Test Automation, Build Automation and Configuration Management frameworks
  • Strong written and verbal technical communication skills
  • Demonstrated ability to develop effective working relationships that improved the quality of work products
  • Ability to maintain focus and develop proficiency in new skills rapidly
  • Ability to work in a fast-paced environment
SGA is a technology and resource solutions provider driven to stand out. We are a women-owned business. Our mission: to solve big IT problems with a more personal, boutique approach. Each year, we match consultants like you to more than 1,000 engagements. When we say "let's work better together," we mean it. You'll join a diverse team built on these core values: customer service, employee development, and quality and integrity in everything we do. Be yourself, love what you do, and find your passion at work.

SGA is an Equal Opportunity Employer and does not discriminate on the basis of Race, Color, Sex, Sexual Orientation, Gender Identity, Religion, National Origin, Disability, Veteran Status, Age, Marital Status, Pregnancy, Genetic Information, or Other Legally Protected Status. We are committed to providing access, equal opportunity, and reasonable accommodation for individuals with disabilities in employment and in our services, programs, and activities. Please contact us to request an accommodation or assistance regarding our policy.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: sgainc
  • Position Id: 26-01183

Company Info

About Software Guidance & Assistance

Founded in 1981, SGA is a technology and resource solutions provider with a national footprint and headquartered in the shadow of Wall Street. We’re a certified women-owned business. We provide contingent staffing, direct placement, and professional and managed services to transform businesses and evolve careers. We’re small enough to tailor our services to each client and big enough to deliver for some of the world’s largest employers. Our professionals are experts in areas such as IT, finance, accounting, risk, and clinical.

SGA provides contingent staffing, direct placement, and professional and managed services nationwide for Fortune 500 companies, mid-size businesses and select startups.

Our core skillsets include all areas of technology – business & data analysis, cyber & network security, database administration, development & architecture, infrastructure, program & project management, quality assurance & testing. We also deliver talent across professional business functions such as finance, accounting, risk, and clinical.

Our Professional & Managed Services team delivers IT projects through onshore, offshore and hybrid delivery models. We develop software products, modernize applications, add features, and integrate and maintain systems. Our scope covers, among others, complex application suites, data management and visualizations, machine learning and mobile applications.
