Data Engineer

$110,000 - $130,000

Full Time

  • No Travel Required

Skills

  • Apache Kafka
  • Data engineering
  • Data architecture
  • Data modeling
  • ELT
  • ETL
  • React.js
  • SaaS
  • Python
  • Data integration

Job Description

We are KCS – a cloud and data solutions company based in Milpitas, CA. We’re a passionate team of 500+ across the US, UK & South Africa who build and drive business forward with data. We are currently seeking an experienced, dynamic Data Engineer to join our team. If you love writing data pipelines, building data warehouses, and monitoring the processes that run them, and if you are a team player ready to support and mentor members of both our on-shore and off-shore (India) teams, you are a good fit for this role and we want to talk to you!

Location: REMOTE

Responsibilities:

    • Develops and maintains scalable data pipelines.
    • Collaborates with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
    • Implements processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.
    • Writes unit/integration tests, contributes to engineering wiki, and documents work.
    • Performs the data analysis required to troubleshoot data-related issues and assists in their resolution.
    • Works closely with a team of frontend and backend engineers, product managers, and analysts.
    • Designs data integrations and data quality framework.
    • Performs SQL performance analysis and tuning.
    • Designs and evaluates open source and vendor tools for data lineage.
    • Works closely with business & engineering teams to develop strategy for long term data architecture.
    • Provides operational support for the data pipeline.
    • Works closely with the off-shore team (India) to drive development.
    • Builds out new API integrations to support continuing increases in data volume and complexity.

Requirements:

  • 5 to 7 years of total professional experience
  • 4+ years of Python experience developing data pipelines and data integrations
  • 2+ years of data pipeline experience working with Spark or similar framework
  • 3+ years of SQL experience (NoSQL experience is a plus)
  • 3+ years of experience with schema design and dimensional data modeling
  • 1+ years of experience using an ETL tool to build ETL/ELT pipelines
  • 2+ years of experience working with REST APIs is a big plus
  • Kafka experience is a big plus
  • React.js and Node.js experience is a big plus
  • CI/CD experience is a big plus
  • Experience working with unstructured or semi-structured datasets
  • Ability to manage and communicate data warehouse plans to internal clients
  • Experience designing, building, and maintaining data processing systems
  • eCommerce industry experience or Logistics domain experience is a big plus
  • Data integration experience with Google Analytics, Google AdWords, Bing, LinkedIn, Zoho, and other such SaaS providers is a big plus
  • Experience with scheduling and orchestration tools such as Airflow is a plus
  • Understanding of data security strategies, network protocols, and encryption technologies is a big plus.
  • Experience with or knowledge of Agile Software Development methodologies
  • Excellent problem solving and troubleshooting skills
  • Must be willing to learn new technologies, especially open-source tools