Data Engineer - W2

Overview

Hybrid
$60 - $70
Contract - W2
Contract - Independent
Contract - 6 Month(s)

Skills

Cloud Computing
Apache Airflow
Conflict Resolution
Documentation
Docker
Database Design
Communication
Apache Spark
Extract, Transform, Load (ETL)
ELT
File Systems
Attention To Detail
AWS Lambda
Google Cloud Platform
Apache Kafka
Microsoft Azure
Database
Git
Computer Science
Amazon Web Services

Job Details

Job Description - Data Engineer

Client: Intuitive Surgical

Location: Santa Clara, CA with a hybrid schedule

Interview process: 2 video interviews

4-9 month project

Job Description:

We are seeking a skilled Data Application Engineer to design, build, and maintain data-driven applications and pipelines that enable seamless data integration, transformation, and delivery across systems.

The ideal candidate will have a strong foundation in software engineering, database technologies, and cloud data platforms, with a focus on building scalable, robust, and efficient data applications.

Key Responsibilities:

  • Develop Data Applications: Build and maintain data-centric applications, tools, and APIs to enable real-time and batch data processing.
  • Data Integration: Design and implement data ingestion pipelines, integrating data from various sources such as databases, APIs, and file systems.
  • Data Transformation: Create reusable ETL/ELT pipelines to process and transform raw data into consumable formats using tools like Snowflake, DBT, or Python.
  • Collaboration: Work closely with analysts and stakeholders to understand requirements and translate them into scalable solutions.
  • Documentation: Maintain comprehensive documentation for data applications, workflows, and processes.

Required Skills and Qualifications:

  • Education: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
  • Programming: Proficiency in Python, C#, and ASP.NET (Core).
  • Databases: Strong understanding of SQL and database design, and experience with relational databases (e.g., Snowflake, SQL Server).
  • Data Tools: Hands-on experience with ETL/ELT tools and frameworks such as Apache Airflow (DBT nice to have).
  • Cloud Platforms: Familiarity with cloud platforms such as AWS, Azure, or Google Cloud, and their data services (e.g., S3, AWS Lambda).
  • Data Pipelines: Experience with real-time data processing tools (e.g., Kafka, Spark) and batch data processing.
  • APIs: Experience designing and integrating RESTful APIs for data access and application communication.
  • Version Control: Knowledge of version control systems like Git for code management.
  • Problem-Solving: Strong analytical and problem-solving skills, with the ability to troubleshoot complex data issues.

Preferred Skills:

  • Knowledge of containerization tools like Docker and orchestration platforms like Kubernetes.
  • Experience with BI tools like Tableau, Power BI, or Looker.
Soft Skills:

  • Excellent communication and collaboration skills to work effectively in cross-functional teams.
  • Ability to prioritize tasks and manage projects in a fast-paced environment.
  • Strong attention to detail and commitment to delivering high-quality results.


About Tek Inspirations LLC