Data Quality Engineer - W2 (NO C2C/1099) - Local to Richmond, VA - Posted by Tauqeer

Overview

On Site
Hybrid
Depends on Experience
Contract - W2
Contract - 12 Month(s)
No Travel Required

Skills

API
Amazon S3
Amazon SageMaker
Amazon Web Services
Amazon RDS
Data Architecture
Data Governance
Data Quality
ELT
Node.js
Orchestration
Process Automation
React.js
Scripting
automation
Snowflake
Python
Cloud Computing
Cloud Security

Job Details

Required Skills

Technical Requirements

AWS Services: Glue, S3, Lambda, Step Functions, Lake Formation, CloudFormation, IAM.

Snowflake: Consumption layer for analytics and reporting.

Python: Core language for ETL scripts, automation, and API development.

Data Architecture: Design for scalability, governance, and compliance.

APIs: Mandatory for exposing data services.

Talent Profile

A strong Cloud Data Engineer with:

Expertise in AWS-native data solutions.

Experience in data quality governance, ETL orchestration, and automation.

Proficiency in Python and API development.

Ability to build and operate modern data platforms with compliance and monitoring.
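As a rough illustration of the AWS-plus-Python stack listed above, the sketch below uses boto3 to start and monitor a run of a Glue ETL job; the job name and region are hypothetical placeholders, not details from this posting.

```python
import time
import boto3

glue = boto3.client("glue", region_name="us-east-1")  # region is a placeholder

# Start a run of a hypothetical Glue ETL job and wait for it to finish.
run = glue.start_job_run(JobName="example-dq-etl-job")
run_id = run["JobRunId"]

while True:
    state = glue.get_job_run(JobName="example-dq-etl-job", RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)

print(f"Glue job finished with state: {state}")
```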

Job Description

Apex Systems is seeking a highly skilled Senior Full Stack Developer with advanced expertise in building AWS/React/NodeJS micro frontend applications and APIs. This role involves implementing technology solutions that enhance business decision-making and processes across all business units, in alignment with the agency's architectural roadmap and enterprise goals.

The Full Stack Developer will engage in a broad range of tasks related to the development, deployment, and maintenance of cloud-based infrastructure and applications. Responsibilities include enabling collaboration, automation, and efficiency in the development of cloud-based micro frontend applications, thereby facilitating the rapid delivery of high-quality software and services to customers.

This role is critical to implementing technology solutions that enhance business decision-making, data governance, and operational efficiency across all business units, in alignment with the client's architectural roadmap, enterprise goals, and data policy framework. This position will support the technology build-out of data capabilities aligned with the client's Data Policy, including but not limited to:

Data Quality Dashboards for monitoring and remediation

Data Classification and Retention mechanisms

User-facing portals for secure login and interaction

Workflow-enabled processes for data lifecycle management

Integration with AWS-native services such as SageMaker Studio, Glue, and Athena (see the brief sketch below).
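As an illustrative sketch only (the bucket, database, and table names below are hypothetical), a data quality dashboard backend might pull results from Athena via boto3 like this; AWS credentials and region are assumed to be configured in the environment.

```python
import boto3

athena = boto3.client("athena")

# Run a query against a hypothetical data-quality results table.
resp = athena.start_query_execution(
    QueryString="SELECT rule_name, failed_rows FROM dq_results WHERE run_date = current_date",
    QueryExecutionContext={"Database": "data_quality"},
    ResultConfiguration={"OutputLocation": "s3://example-dq-bucket/athena-results/"},
)
print("Query execution id:", resp["QueryExecutionId"])
# Results can then be retrieved with get_query_execution / get_query_results.
```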

This position requires technical proficiency in AWS cloud development with data engineering and full stack skills, including Glue, Athena, RDS, Snowflake integrations, ELT/ETL pipelines, Lambda, API Gateway, Node.js, React.js, and Python; adherence to software engineering best practices; an appreciation of data management principles; and alignment with the agency's technical direction and data policies. Additionally, the role involves collaborating with teams to leverage common components and technology solutions and to ensure successful adoption and implementation.
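For example, a minimal Python Lambda handler behind API Gateway (proxy integration) for exposing a data service might look like the sketch below; the query parameter and payload shape are purely illustrative, not part of this posting.

```python
import json

def lambda_handler(event, context):
    """Minimal handler for an API Gateway Lambda proxy integration.

    Returns a JSON payload describing a hypothetical data quality check.
    """
    dataset = (event.get("queryStringParameters") or {}).get("dataset", "unknown")
    body = {"dataset": dataset, "status": "ok", "failed_rules": 0}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }
```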

Additional responsibilities include:

Collaborate with stakeholders to understand business requirements and convert them into actionable solution components aligned with the client's Enterprise Solution architecture patterns, procedures, and policies.

Design and develop scalable, resilient data service components and engineering solutions using cloud-native services and technologies.

Develop data lakes and change data capture (CDC) services to create cohesive applications within cloud and hybrid infrastructures.

Create comprehensive solution documentation detailing the design, testing, and production support processes and procedures.

Set up and maintain continuous integration and continuous deployment (CI/CD) pipelines to automate updates and feature deployments.

Design and develop cloud-native micro frontend applications and data service APIs using AWS, React, and NodeJS.

Build data quality monitoring dashboards and tools to support data stewardship and policy compliance.

Implement data classification and retention logic using metadata tagging, lifecycle policies, and automation (see the tagging and lifecycle sketch after this list of responsibilities).

Develop secure user portals with role-based access control (RBAC), Single Sign-On (SSO), and integration with AWS Cognito.

Enable workflow automation for data intake, validation, approval, and archival processes.

Create and maintain solution documentation, including architecture diagrams, data flow, testing, and support procedures.

Ensure solution completeness through thorough testing and quality assurance, and verify compliance with security policies and best practices.

Manage configuration drift and ensure consistency across different environments, using tools like AWS CodeBuild/CodeDeploy and AWS CloudFormation.

Apply DevOps principles to streamline the software development lifecycle (SDLC) and enhance automation.

Deploy solutions to various environments, establish monitoring processes, and ensure ongoing operational stability and application resiliency through system stress testing and feature enhancements.

Collaborate with cross-functional teams to understand change requirements, provide technical expertise, and ensure solutions align with business needs.
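As a sketch of the classification and retention responsibility above, metadata tagging plus an S3 lifecycle rule might be wired up with boto3 as follows; the bucket, key, tag values, and retention period are all hypothetical examples.

```python
import boto3

s3 = boto3.client("s3")
bucket, key = "example-data-bucket", "raw/customers.parquet"  # hypothetical names

# Tag an object with its classification so retention rules can key off the tag.
s3.put_object_tagging(
    Bucket=bucket,
    Key=key,
    Tagging={"TagSet": [{"Key": "classification", "Value": "confidential"}]},
)

# Expire confidential objects after an assumed 7-year retention period.
s3.put_bucket_lifecycle_configuration(
    Bucket=bucket,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "confidential-retention",
                "Filter": {"Tag": {"Key": "classification", "Value": "confidential"}},
                "Status": "Enabled",
                "Expiration": {"Days": 2555},
            }
        ]
    },
)
```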

Qualifications include:

Expertise in developing AWS full stack applications: Translate business requirements into fully tested application workflows by designing, implementing, and maintaining technology assets using approved AWS tools and plugins for Python.

Proficiency in data services orchestration: Strong proficiency in cloud platforms, leveraging native cloud services for data orchestration solutions.

API Development: Expertise in designing, developing, and maintaining APIs specifically tailored for cloud environments including AWS API Gateway.

Cloud Networking and Security: Understanding of cloud networking concepts, including VPCs, subnets, and security groups, and of cloud security best practices, including identity and access management (IAM). Secure applications through role-based authorization, single sign-on, and Amazon Cognito for user and access management across data services APIs and products.

Serverless Computing: Experience with serverless computing concepts, event-driven architecture, and serverless platforms.

Infrastructure as Code (IaC): Automate the provisioning and management of cloud infrastructure using tools like AWS CloudFormation and Terraform.

Scripting and Automation: Ability to write scripts and automate tasks using scripting languages including NodeJS and Python.

Experience with data quality frameworks, data validation, and anomaly detection (see the brief validation sketch after this list of qualifications).

Familiarity with data classification models, metadata management, and retention policies.

Ability to write scripts and automation using Python and NodeJS.

Monitoring and Logging: Experience in setting up alerts, dashboards, and logs for cloud infrastructure and applications.

DevOps Practices: Strong understanding of DevOps practices, including developing and optimizing CI/CD pipelines, version control, and collaboration tools such as AWS CodePipeline and GitHub.

End-User Training and Support: Provide support and training to end-users for improved solution literacy and tool usage in a cloud ecosystem.

Business Process Automation: Analyze business process life cycles to identify opportunities for automation and simplification, implementing appropriate solutions.

Collaboration and Communication: Collaborate with product and business owners to deliver robust cloud-based solutions, demonstrating excellent communication skills (both written and verbal).

Compliance: Adhere to all policies and procedures of the client.
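To illustrate the data validation and anomaly detection qualification above, here is a minimal, standard-library-only Python sketch that flags an unusual daily row count; the function name, threshold, and sample numbers are invented for the example.

```python
import statistics

def validate_batch(row_counts: list[int], latest: int, threshold: float = 3.0) -> bool:
    """Return True if the latest daily row count is within `threshold`
    standard deviations of the historical mean, False otherwise."""
    mean = statistics.mean(row_counts)
    stdev = statistics.stdev(row_counts) or 1.0  # avoid division by zero
    return abs(latest - mean) / stdev <= threshold

history = [10_120, 10_340, 9_980, 10_205, 10_150]
print(validate_batch(history, latest=4_500))  # False: likely an incomplete load
```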

Additional Considerations:

AWS Data Engineer Associate and Solutions Architect Professional certifications are desired.

Familiarity with data governance frameworks such as DAMA-DMBOK or NIST data management standards.
