Data Engineer (100% Remote)
• Posted 2 hours ago • Updated 1 hour ago

Verito Solutions
Job Details
Skills
- Skype
- Mentorship
- Data Architecture
- Emerging Technologies
- ELT
- Data Flow
- SQL
- Apache Hadoop
- Management
- Data Quality
- Analytical Skill
- Process Improvement
- Scalability
- Productivity
- Security Clearance
- Computer Science
- Data Science
- Statistics
- Software Engineering
- Databricks
- Python
- TypeScript
- Cloud Computing
- Amazon Redshift
- Apache Spark
- Automated Testing
- Testing
- RESTful
- GraphQL
- Cloud Architecture
- Network Security
- Orchestration
- Docker
- Kubernetes
- Terraform
- Git
- GitHub
- Continuous Integration
- Continuous Delivery
- Performance Monitoring
- Grafana
- Agile
- Communication
- NoSQL
- Database
- PostgreSQL
- MySQL
- MongoDB
- Apache Kafka
- Amazon Kinesis
- RabbitMQ
- Redis
- Caching
- Authentication
- Authorization
- OAuth
- SAML
- Active Directory
- Amazon Web Services
- Extract, Transform, Load (ETL)
- Workflow
- React.js
- JavaScript
- Node.js
- Flask
- Access Control
- Identity Management
- Data Governance
- Health Care
- Specification Gathering
- Collaboration
- System Integration
- System Testing
- Technical Support
- Training
- Documentation
Summary
- Lead and mentor data engineers and related roles, promoting a culture of technical excellence and collaboration.
- Identify and own enterprise-wide data architecture solutions, ensuring alignment with business growth and emerging technologies.
- Design, build, and evolve scalable data architectures and pipelines with a focus on simplicity, performance, and reliability.
- Develop and optimize ETL/ELT processes, data flows, and infrastructure leveraging AWS, SQL, and platforms such as Databricks, Redshift, Hadoop, and Airflow.
- Assemble and manage large, complex datasets to meet functional and non-functional business requirements.
- Implement best practices in data quality, profiling, anomaly detection, and performance monitoring.
- Deliver analytical tools and data products that provide actionable insights into business performance and customer behavior.
- Partner with cross-functional teams, including engineers, analysts, data scientists, product, and government stakeholders, to address data challenges.
- Drive process improvements to enhance scalability, automation, and developer productivity.
- Promote engineering best practices through testing, CI/CD, Infrastructure as Code, and code reviews.
Requirements
Minimum Requirements:
All candidates must pass a public trust clearance through the U.S. Federal Government. This requires candidates either to be U.S. citizens, or to have lived within the United States for at least 3 out of the previous 5 years and to hold a valid, non-expired passport from their country of birth along with appropriate visa/work-permit documentation.
- Bachelor's degree in Computer Science, Software Engineering, Data Science, Statistics, or a related technical field.
- Minimum 10 years of relevant experience in software engineering.
- Minimum 2 years working on a large-scale Databricks implementation.
- Proficiency in at least one of the following languages: Python, TypeScript, or JavaScript.
- Proven experience working on large-scale system architectures and petabyte-level data systems.
- Experience with cloud-native data tools and architectures (e.g., Redshift, Glue, Airflow, Apache Spark).
- Proficiency in automated testing frameworks (PyTest, Playwright, or Jest) and testing best practices.
- Experience developing, testing, and securing RESTful and GraphQL APIs.
- Proven track record with AWS cloud architecture, including networking, security, and service orchestration.
- Experience with containerization and deployment using Docker, and infrastructure automation with Kubernetes and Terraform/Terragrunt.
- Proficiency with Git, Git-based workflows, and release pipelines using GitHub Actions and CI/CD platforms.
- Knowledge of performance monitoring tools such as Grafana, Prometheus, and Sentry.
- Comfortable working in a tightly integrated Agile team (10 or fewer people).
- Strong written and verbal communication skills, including the ability to explain technical concepts to non-technical stakeholders.
Desired Qualifications:
- Deep knowledge of relational and NoSQL databases (PostgreSQL, MySQL, MongoDB).
- Knowledge of event-driven architectures and systems such as Kafka, Kinesis, or RabbitMQ.
- Familiarity with Redis for caching or message queuing.
- Experience connecting data domains securely.
- Experience with authentication/authorization frameworks such as OAuth, SAML, Okta, Active Directory, and AWS IAM (ABAC).
- Experience exploring or building ETL pipelines and data ingestion workflows.
- Experience with modern frameworks such as React.js, Next.js, Node.js, and Flask.
- Strong grasp of access control, identity management, and federated data governance.
- CMS and Healthcare Expertise: In-depth knowledge of CMS regulations and experience with complex healthcare projects, particularly data infrastructure projects or similar.
Job Description:
Location: Marysville, Ohio
Key Responsibilities:
Develop and modify Yaskawa robot programs based on project specs.
Diagnose and troubleshoot robotic issues.
Collaborate with engineering and production for system integration.
Perform system testing and validation.
Provide technical support and training to team members.
Maintain documentation for robotic systems and programming changes.
- Dice Id: 91170457
- Position Id: 2026-19337
Company Info
About Verito Solutions
At Verito Solutions, our core mission is to be an essential partner in our clients’ success. With a strong vision to become a global leader in delivering innovative and value-driven technology solutions, we are committed to exceeding expectations at every step. Our team is fueled by passion, expertise, and an unwavering determination to provide cutting-edge solutions tailored to the evolving needs of businesses.
We understand the challenges organizations face in today’s fast-paced digital landscape. That’s why we focus on delivering technology solutions that not only enhance efficiency but also save our clients valuable time, money, and effort. Whether it’s optimizing workflows, strengthening cybersecurity, or driving digital transformation, Verito Solutions is dedicated to empowering businesses with seamless, scalable, and future-ready technology.

