Lead GenAI Engineer, Remote, $75/hr on W2

  • Posted 12 hours ago | Updated 5 hours ago

Overview

Remote
$70+/hr
Contract - W2
Contract - 5 months

Skills

AWS Lambda
Amazon RDS
Amazon SQS
Amazon Web Services
Debugging
Data Extraction
Generative Artificial Intelligence (AI)
Knowledge Base
Machine Learning Operations (MLOps)
Node.js
Mortgage
Optical Character Recognition (OCR)
GitHub
Prototyping
Python
AWS Step Functions
PostgreSQL
AWS Bedrock
Large Language Models (LLMs)

Job Details

Must-have skills: AWS Bedrock, LLMs, RAG, Python or Node.js

Description:

Join a specialized GenAI team delivering scalable, production-grade solutions to automate mortgage document review, data extraction, and due diligence. You will build, optimize, and maintain workflows leveraging AWS Bedrock (BDA, Blueprints, Knowledge Base, Agents), LLMs, RAG, chunking strategies, and a full suite of AWS serverless services (SQS, Lambda, Aurora RDS, EventBridge, Step Functions, Textract). You'll contribute across engineering, performance optimization, troubleshooting, deployment, and process improvement throughout the workflow lifecycle - all within an existing, cloud-based architecture.

You'll collaborate closely with the client's existing AI POD, product owners, and technical leaders to execute high-impact projects end to end, from proof of concept through production deployment. You'll apply current best practices in enterprise AI for reliability, scalability, security, and compliance.

Responsibilities:

  • Deliver production-grade, end-to-end automation workflows for AI-driven mortgage document OCR, data extraction, and due diligence, using AWS Bedrock, AWS Textract, and supporting cloud services
  • Develop, integrate, and optimize GenAI/RAG solutions using Bedrock Prompt Engineering & Management, BDA/Blueprint/Knowledge Base, chunking strategies, and LLM customization
  • Engineer and orchestrate scalable cloud pipelines using Python and/or Node.js: Lambda functions (and Layers), Step Functions, SQS, EventBridge, Aurora RDS, etc.
  • Implement massive-scale document processing (target: up to 2 million docs/hour) - focusing on high reliability, low latency, and robust error handling
  • Design and deploy efficient, observable, and testable workflows - instrument monitoring, error tracking, and metrics for system health and business KPIs
  • Collaborate tightly with product owners, technical leads, and peer developers to accelerate new workflow onboarding, parallelize delivery, and maintain momentum while safeguarding quality
  • Apply best practices for code quality, infrastructure-as-code, security, compliance, and data privacy
  • Use GitHub Copilot, ChatGPT Sandbox, and other approved AI tools to improve productivity and accelerate secure code development
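For candidates unfamiliar with the term, the "chunking strategies" referenced above refer to splitting long documents into overlapping segments before embedding them for RAG retrieval. A minimal illustrative sketch (not part of the client's codebase; chunk sizes and overlap are hypothetical defaults) might look like:

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into fixed-size chunks with overlap, a common RAG ingestion strategy.

    Overlap preserves context across chunk boundaries so that a fact split
    between two chunks is still retrievable from at least one of them.
    """
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap
    # Step forward by (chunk_size - overlap) so consecutive chunks share `overlap` chars.
    return [text[i:i + chunk_size] for i in range(0, max(len(text), 1), step)]
```

Production systems typically chunk on semantic boundaries (paragraphs, sections) rather than raw character counts, but the fixed-size-with-overlap pattern is the usual baseline.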

Must-Have Skills & Experience:

  • Production experience with AWS Bedrock, including:
      • Bedrock Data Automation (BDA)
      • BDA Blueprint
      • Bedrock Knowledge Base
      • Bedrock Agents
      • Bedrock Prompt Engineering and Management
  • Hands-on expertise with LLMs, RAG architectures, and document chunking for retrieval/extraction
  • Strong Python or Node.js development skills for cloud serverless pipelines
  • Proven track record building, deploying, and maintaining high-throughput, enterprise-grade solutions using:
      • AWS Lambda functions and Lambda Layers
      • Step Functions
      • SQS
      • EventBridge
      • Aurora RDS Serverless (PostgreSQL)
      • Amazon Textract (OCR/Document AI)
  • Direct experience delivering real production solutions (not just prototypes/POCs) in high-volume, high-velocity workflows
  • Strong knowledge of cloud architecture patterns, security, performance optimization, observability, and cost control for serverless, distributed workloads
  • Excellent problem-solving, debugging, and incident triage skills
  • Effective collaboration and communication in a remote, fast-moving, cross-functional environment
  • Adaptability and resilience to maintain velocity and quality under tight timelines and shifting priorities

Nice-to-Have Skills:

  • Knowledge of mortgage and/or real estate document workflows, compliance, or data models.
  • Experience optimizing or parallelizing workflow orchestration for massive-scale document processing.
  • Familiarity with AI/ML Ops, monitoring tools, and enterprise governance and compliance in cloud environments.
  • Advanced prompt engineering for GenAI/LLMs in document automation scenarios.
  • Experience leveraging GitHub Copilot or comparable AI-powered coding tools in secure, enterprise settings.