AI Data Engineer (Cloud & Automation)

  • Washington, DC
  • Posted 9 hours ago | Updated 9 hours ago

Overview

Remote (quarterly travel to Washington, DC)
USD 49.00 - 57.00 per hour
Contract - W2
Contract - Independent

Skills

IT Service Management
Cloud Computing
Customer Engagement
Critical Thinking
Step Functions
ELT
Azure SQL
Customer Relationship Management (CRM)
Analytics
Reporting
Real-time
Batch Processing
LangChain
Unstructured Data
Data Quality
Metadata Management
Extract, Transform, Load (ETL)
Business Intelligence
Interfaces
Natural Language
Stored Procedures
Query Optimization
Regulatory Compliance
Encryption
Virtual Private Cloud
RBAC
Firewall
Agile
Sprint
Computer Science
Open Source
Microsoft SSIS
Python
Bash
Shell
Amazon Web Services
Amazon S3
Remote Desktop Services
Amazon RDS
Microsoft SQL Server
Electronic Health Record (EHR)
Amazon DynamoDB
Apache Flume
Apache Kafka
Apache Solr
RESTful
JIRA
GitHub
Microsoft Azure
DevOps
Jenkins
Software Development Methodology
Continuous Integration
Continuous Delivery
Performance Tuning
SQL
Apache Spark
Data Engineering
Generative Artificial Intelligence (AI)
Lifecycle Management
Artificial Intelligence
Workflow
Communication
Presentations
Security Clearance

Job Details

AI Data Engineer (Cloud & Automation)
REMOTE (able to travel to DC quarterly)

Type: Contract
Rate: $49.00 - $57.00 per hour
Location: Remote with quarterly travel to DC office


Job Description:

ALTA IT Services is seeking a GenAI Data Automation Engineer to design and implement innovative, AI-driven automation solutions across AWS and Azure hybrid environments. You will be responsible for building intelligent, scalable data pipelines and automations that integrate cloud services, enterprise tools, and Generative AI to support mission-critical analytics, reporting, and customer engagement platforms. The ideal candidate is mission-focused and delivery-oriented, and applies critical thinking to create innovative solutions and resolve technical issues.

In this role, you will:
  • Design and maintain data pipelines in AWS using S3, RDS/SQL Server, Glue, Lambda, EMR, DynamoDB, and Step Functions.
  • Develop ETL/ELT processes to move data between multiple data systems, including DynamoDB → SQL Server (AWS) and between AWS and Azure SQL systems.
  • Integrate Amazon Connect and NICE inContact CRM data into the enterprise data pipeline for analytics and operational reporting.
  • Engineer and enhance ingestion pipelines with Apache Spark, Flume, and Kafka for real-time and batch processing into Apache Solr and AWS OpenSearch platforms.
  • Leverage Generative AI services and frameworks (AWS Bedrock, Amazon Q, Azure OpenAI, Hugging Face, LangChain) to:
      • Create automated processes for vector generation and embedding from unstructured data to support Generative AI models.
      • Automate data quality checks, metadata tagging, and lineage tracking.
      • Enhance ingestion/ETL with LLM-assisted transformation and anomaly detection.
      • Build conversational BI interfaces that allow natural language access to Solr and SQL data.
      • Develop AI-powered copilots for pipeline monitoring and automated troubleshooting.
  • Implement SQL Server stored procedures, indexing, query optimization, profiling, and execution plan tuning to maximize performance.
  • Apply CI/CD best practices using GitHub, Jenkins, or Azure DevOps for both data pipelines and GenAI model integration.
  • Ensure security and compliance through IAM, KMS encryption, VPC isolation, RBAC, and firewalls.
  • Support Agile DevOps processes with sprint-based delivery of pipeline and AI-enabled features.

Required Qualifications:
  • BS in Computer Science or related field with 2+ years of data engineering and automation experience.
  • Hands-on experience with LLMs and Generative AI frameworks such as AWS Bedrock, Azure OpenAI, or open-source platforms.
  • Hands-on experience with SQL, SSIS, Python, Spark, Bash, PowerShell, and the AWS/Azure CLIs.
  • Experience with AWS services like S3, RDS/SQL Server, Glue, Lambda, EMR, DynamoDB.
  • Familiarity with Apache Flume, Kafka, Solr for large-scale data ingestion and search.
  • Experience with integrating REST API calls in data pipelines and workflows.
  • Familiarity with JIRA and GitHub, Azure DevOps, or Jenkins for SDLC and CI/CD automation.
  • Strong troubleshooting and performance optimization skills in SQL, Spark, or other data engineering solutions.
  • Experience operationalizing Generative AI (GenAI Ops) pipelines, including model deployment, monitoring, retraining, and lifecycle management for LLMs and AI-enabled data workflows.
  • Good communication and presentation skills.
  • Citizenship and ability to obtain a Public Trust clearance.


System One, and its subsidiaries including Joul, ALTA IT Services, and Mountain Ltd., are leaders in delivering outsourced services and workforce solutions across North America. We help clients get work done more efficiently and economically, without compromising quality. System One not only serves as a valued partner for our clients, but we offer eligible employees health and welfare benefits coverage options including medical, dental, vision, spending accounts, life insurance, voluntary plans, as well as participation in a 401(k) plan.

System One is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, age, national origin, disability, family care or medical leave status, genetic information, veteran status, marital status, or any other characteristic protected by applicable federal, state, or local law.


Ref: #850-Rockville (ALTA IT)