Requirement for Visualization Engineer ---- Remote
Remote in Los Altos, CA, US • Posted 10 hours ago • Updated 45 minutes ago

Kairos
Job Details
Skills
- Cloud Storage
- Data Transfer Process
- Internet
- Meta-data Management
- Digital Asset Management
- Frontend Development
- Usability
- Uploading
- Communication
- Analytical Skill
- Adaptability
- Attention To Detail
- Management
- Multitasking
- Cloud Computing
- Amazon S3
- Amazon EC2
- Flask
- Relational Databases
- PostgreSQL
- MySQL
- Data Visualization
- Plotly
- Orchestration
- Collaboration
- Training
- Onboarding
- Mentorship
- Document Review
- Visualization
- Python
- SQL
- React.js
- Dashboard
- Workflow
- Amazon Web Services
- Data Engineering
- Machine Learning (ML)
- Problem Solving
- Conflict Resolution
Summary
Hi,
Please let me know if you're comfortable with the position detailed below. This position is an urgent hire.
Location: Los Altos, CA ---- Remote
Duration: 15-month contract
Need candidates on 1099 only
Data Platform & Visualization Engineer
Python-focused data platform engineer with experience building reliable ingestion pipelines, SQL-backed metrics systems, backend APIs, and web-based dashboards for ML and experimentation workflows in AWS environments.
The company collects a large amount of vehicle data on monthly trips, and potentially from simulations, but currently handles very little of it.
Previously the data was mostly numeric vehicle telemetry; they are now adding large volumes of camera and LIDAR data, so the scale of the data is growing quickly.
Looking for someone junior to help with the existing ingestion system that collects the data and uploads it to S3.
Bandwidth is very limited (100 Mbps) because they go to a remote location, and
they are trying to upload 500 GB of data per night.
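At 100 Mbps, moving 500 GB takes well over eleven hours even at full line rate, which is why the nightly window is so tight. A quick back-of-the-envelope check:

```python
# Rough transfer-time estimate for the nightly upload window.
# Assumes the full 100 Mbps link is available with no protocol overhead.

def transfer_hours(size_gb: float, link_mbps: float) -> float:
    """Hours needed to move size_gb gigabytes over a link_mbps link."""
    bits = size_gb * 8e9              # 1 GB = 8e9 bits (decimal units)
    seconds = bits / (link_mbps * 1e6)
    return seconds / 3600

print(round(transfer_hours(500, 100), 1))  # ~11.1 hours at full line rate
```

In practice, protocol overhead and shared use of the link would push this even higher, so prioritizing which data goes out first matters.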
They want some sort of orchestration logic for sequencing and prioritizing certain
steps.
Once the data is up there, they are looking for someone with front-end experience to
navigate, filter, and create metadata.
This will require some back-end experience as well to handle the data.
Current Process:
It's not very smooth, but they make it work.
If they cannot finish uploading the data on the spot, they have to wait hours and
then come back to the office, where there is a fast connection, and upload it.
Hoping to streamline and optimize the process.
Clarification:
There is a queuing system where they have a bunch of data stored locally and
need to trickle it out to the cloud storage.
That system needs to be refined and maintained.
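A minimal sketch of the kind of queuing and prioritization logic described above, assuming per-file priorities and a nightly size budget (the file names, priorities, and budget here are illustrative, not from the actual system):

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class UploadTask:
    priority: int                      # lower number = upload sooner
    path: str = field(compare=False)   # local file path
    size_gb: float = field(compare=False)

class UploadQueue:
    """Orders pending uploads so high-priority data goes out first."""
    def __init__(self):
        self._heap = []

    def add(self, task: UploadTask):
        heapq.heappush(self._heap, task)

    def next_batch(self, budget_gb: float):
        """Pop tasks in priority order until the nightly budget is spent."""
        batch, used = [], 0.0
        while self._heap and used + self._heap[0].size_gb <= budget_gb:
            task = heapq.heappop(self._heap)
            batch.append(task)
            used += task.size_gb
        return batch

q = UploadQueue()
q.add(UploadTask(2, "camera/run1.mp4", 300.0))
q.add(UploadTask(1, "telemetry/run1.csv", 0.5))
q.add(UploadTask(3, "lidar/run1.bin", 450.0))
print([t.path for t in q.next_batch(500.0)])
# ['telemetry/run1.csv', 'camera/run1.mp4'] -- lidar waits for the next night
```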
Data Sources
Data is primarily from actual vehicles.
The system is deployed in California.
Data Transfer Process
Currently, data transfer involves manually taking out a hard drive, pressing buttons, and
uploading.
TRI wants more sophisticated Lambda functions to be triggered once the data is
present.
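An S3-triggered handler of the sort described might look like the sketch below. The bucket name, key layout, and what happens to the extracted metadata are assumptions for illustration; a real handler would write to the metadata store rather than just returning.

```python
import json
import urllib.parse

def lambda_handler(event, context):
    """Invoked by S3 ObjectCreated notifications; records basic metadata
    for each newly uploaded object."""
    records = []
    for rec in event.get("Records", []):
        bucket = rec["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(rec["s3"]["object"]["key"])
        size = rec["s3"]["object"]["size"]
        # In the real pipeline this would write to the metadata store;
        # here we just collect what was seen.
        records.append({"bucket": bucket, "key": key, "size_bytes": size})
    return {"statusCode": 200, "body": json.dumps(records)}

# Simulate the event shape S3 delivers to Lambda:
fake_event = {"Records": [{"s3": {
    "bucket": {"name": "trip-data"},
    "object": {"key": "lidar/run1.bin", "size": 1024}}}]}
print(lambda_handler(fake_event, None)["statusCode"])  # 200
```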
Location
The HQ is in Los Altos.
Data is collected approximately three hours from Northern California, near Willows,
California.
The industrial system and internet bottleneck are located there.
Role Summary
The speaker wants a data engineer to optimize the workflow of getting data from
vehicles/simulations into AWS.
The engineer should leverage AWS tooling for further processing, including metadata
and lambda functions.
The role involves pre- and post-processing data and tagging it.
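Tagging could be as simple as deriving labels from object keys; a hypothetical sketch (the directory-to-sensor mapping is invented for illustration):

```python
from pathlib import PurePosixPath

# Illustrative mapping from top-level directory names to sensor tags.
SENSOR_TAGS = {"camera": "camera", "lidar": "lidar", "telemetry": "telemetry"}

def tag_object(key: str) -> dict:
    """Derive simple tags for an uploaded object from its S3 key."""
    parts = PurePosixPath(key).parts
    sensor = SENSOR_TAGS.get(parts[0], "unknown")
    return {"key": key, "sensor": sensor, "run": parts[1].split(".")[0]}

print(tag_object("lidar/run1.bin"))
# {'key': 'lidar/run1.bin', 'sensor': 'lidar', 'run': 'run1'}
```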
The speaker wants a dashboard to manage data, as they currently use Notepad.
This is described as digital asset management.
The role is more back-end heavy, but with some light front-end development for
usability.
SQL expectations are high.
Data Schema
The team has some schema defined for each of the datasets they are imagining.
They currently use Google Sheets, but it is a "little bit janky when it comes to like sorting
and retrieving the results."
Uploading is also a little janky.
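Moving the schema out of Google Sheets into a relational table makes the sorting and retrieval it struggles with trivial. A sketch using SQLite as a stand-in for Postgres/MySQL (the table and column names are assumptions, not the team's actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE datasets (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        sensor TEXT NOT NULL,
        collected_on DATE,
        size_gb REAL
    )""")
conn.executemany(
    "INSERT INTO datasets (name, sensor, collected_on, size_gb) VALUES (?, ?, ?, ?)",
    [("run1", "lidar", "2025-09-01", 450.0),
     ("run1", "camera", "2025-09-01", 300.0),
     ("run2", "telemetry", "2025-10-01", 0.5)])

# The sorting/filtering that is "janky" in Sheets is one query here:
rows = conn.execute(
    "SELECT name, sensor FROM datasets WHERE size_gb > 1 ORDER BY size_gb DESC"
).fetchall()
print(rows)  # [('run1', 'lidar'), ('run1', 'camera')]
```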
Strong collaboration and communication, ownership mindset, analytical problem-solving, adaptability, attention to detail, proactive initiative, and effective time management.
Ability to multitask effectively and efficiently.
Nice to have: experience with ML experiment tracking, data provenance systems, workflow orchestration frameworks, advanced data visualization, LLM-based automation, and cloud-deployed Python services.
Python, SQL, AWS (S3, EC2), FastAPI/Flask, React, relational databases (Postgres/MySQL), data visualization libraries (Plotly/Vega), custom orchestration pipelines, and LLM-based annotation tooling.
Data Platform & Visualization Engineer Typical Day:
Build and maintain Python data pipelines, backend APIs, and React dashboards; monitor and troubleshoot ingestion and metrics workflows; collaborate with researchers; ensure data reliability, traceability, and actionable visual insights.
Training will be a blend of hands-on onboarding, SME mentoring, and documentation review, focused on understanding existing pipelines, dashboards, AWS infrastructure, metrics systems, and LLM integration, with ongoing learning as the platform evolves.
No certifications are required. Relevant AWS, data engineering, or Python certifications are nice-to-have and can make a candidate stand out, but hands-on experience is far more important: prioritize candidates with demonstrated skills in Python data pipelines, SQL-backed metrics systems, backend APIs, React dashboards, and AWS workflows, even if they hold no formal certifications. Focus on practical experience, problem-solving ability, and project ownership.
Laxman Andoli | Lead TAG | Kairos Technologies Inc
M : | O: Ext 302 | E:
- Dice Id: 90879415
- Position Id: KTL - 7228-6230-1770247781
- Posted 10 hours ago
Company Info
Kairos is a Dallas-based IT services firm operating with a mission to help organizations across the world effectively use cloud, mobile and social technologies for their strategic advantage.
Founded in 2003, we have successfully helped several organizations in building the technology backbone that would help them compete effectively in tough economic environments. We have been able to do this by forging strategic partnerships with leading technology providers, constantly building the skill set and competencies of our consulting team and by upholding the very best practices in customer engagement and delivery.
Kairos partners with the leading technology providers in the cloud, mobile and social space. We work hard at nurturing and growing relationships with our partners.
We are also constantly seeking partnerships that would enable us to provide cutting-edge and best-in-class solutions to our customers. We understand that we are in the business of talent. We are constantly on the lookout for smart individuals with a passion for technology, and we provide them with the right mentorship and opportunities toward a stellar and rewarding career. We make sure that our consultants are not just technically equipped, but also have the right attitude toward ensuring that our engagements are successful.
And finally, our success entirely depends upon the success of our customers. We constantly strive to provide the best solutions in the most effective manner possible. Our entire process is built keeping our customers in mind and we constantly adopt and deploy the best practices and models in customer engagement and delivery. We are headquartered in Dallas, Texas and have a global delivery center in Hyderabad, India. We serve customers across United States, Europe and Asia.

