Job Details
Snowflake AI Data Engineer
Long Term Contract / Direct Hire
Sunnyvale, CA (Onsite)
Data Engineer - Snowflake + AI: The team engineers high-quality, scalable, and resilient distributed systems in the cloud that power data exploration, analytics, reporting, and production models. Its core systems are diverse and sit at an unusual intersection of high data volumes and systems distributed across cloud and on-premises infrastructure.
This role involves building solutions that integrate open source software with the internal ecosystem. The individual in this role will drive development of new components and features from concept to release: designing, building, testing, and shipping at a regular cadence. They will work closely with internal customers to understand their requirements and workflows, and propose new features and ecosystem changes to streamline their experience of using the solutions on the platform.
This is a challenging software engineering role, where a large part of an engineer's time is spent writing code and designing/developing applications in the cloud, with the remainder spent tuning and debugging the codebase, supporting production applications, and supporting application end users. This role requires in-depth knowledge of innovative technologies and cloud data platforms, along with the ability to independently learn new technologies and contribute to the success of various initiatives.
Minimum Qualifications
Knowledge of BI concepts and implementation experience in the cloud with databases such as Snowflake or BigQuery
Programming experience building high-quality software in at least one of the following programming languages: Python, Scala, or Java
Experience developing highly optimized SQL, stored procedures, and semantic processes for distributed data applications
Bachelor's degree in Computer Science or equivalent experience
Preferred Qualifications
Strong experience building enterprise-level data applications on distributed systems
Hands-on experience designing and developing cloud-based applications spanning compute services, database services, RESTful APIs, ETL pipelines, queues, and notification services
Experience in cloud data warehousing platforms like Snowflake is highly valued
Experience developing Big Data applications using Java, Spark, or Kafka is a huge plus
Understanding of the fundamentals of object-oriented design, data structures, algorithm design, and problem solving
Cloud technology experience on platforms such as AWS, Microsoft Azure, or Google Cloud
Data Visualization Tools: experience with tools such as Streamlit, Superset, Tableau, Business Objects, and Looker
Data Insights and KPIs: working experience generating and visualizing data insights, metrics, and KPIs; experience applying basic ML models for anomaly detection, forecasting, and GenAI