Specialist Architect - Data Engineering/ML

Overview

Remote
USD 134,000.00 - 187,400.00 per year
Full Time

Skills

Machine Learning (ML)
Cloud Architecture
Technical Sales
Professional Services
Estimating
Use Cases
Writing
Go-To-Market Strategy
Collaboration
Product Management
Marketing
Sales
Presentations
Data Engineering
Cloud Computing
Amazon Web Services
Microsoft Azure
Google Cloud Platform (GCP)
Databricks
Apache Spark
Scripting
SQL
Python
Java
Scala
Pandas
PyTorch
TensorFlow
scikit-learn
Computer Science
Mathematics
SAS
Statistics
Apache NiFi
Data Science
Extract, Transform, Load (ETL)
RDBMS
Data Warehouse
Enterprise Software
Retail
Manufacturing
Data Security
Innovation
Snowflake

Job Details

Where Data Does More. Join the Snowflake team.

We are looking for people with a strong background in data engineering and cloud architecture to join our Professional Services team as the technical seller for exciting new offerings and capabilities for our customers! This role is part of a specialty team within Professional Services that helps our PS sellers position and scope complex engagements requiring deep expertise in specific technology areas. You will work directly with customers to understand and scope data engineering use cases using Snowflake's features and its extensive partner ecosystem. The role is also strategic in scaling technical sales knowledge across the Professional Services sales team by packaging sales plays and delivering enablement. You will be responsible for understanding Snowflake's data engineering features in order to design high-level solutions and proposals based on customer requirements.

AS A DATA ENGINEERING PRE-SALES ARCHITECT AT SNOWFLAKE, YOU WILL:
  • Be a technical expert on all aspects of Snowflake in relation to data engineering workloads.
  • Work with Professional Services Practice Directors and Managers on sales pursuits to:
    • Understand customer requirements
    • Present high level architecture solutions using Snowflake
    • Scope and present project plans and effort estimates to deliver data engineering solutions
  • Consult with customers in sales workshops, using SQL, Python, Java, and/or Scala where needed, to understand their data engineering workloads and use cases.
  • Understand best practices related to Snowflake data engineering capabilities.
  • Maintain a deep understanding of competitive and complementary technologies and vendors within the data engineering space, and how to position Snowflake in relation to them.
  • Understand our partner ecosystem relationships in order to properly position partners in the scoping and delivery of data engineering use cases.
  • Provide guidance on how to resolve customer-specific technical challenges.
  • Assist in writing Statements of Work.
  • Identify selling patterns within the data engineering space, creating go-to-market options for our PS Sellers to leverage.
  • Enable PS Sellers with technical knowledge to scale our ability to sell data engineering solutions.
  • Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake's products and marketing.

OUR IDEAL DATA ENGINEERING PRE-SALES ARCHITECT WILL HAVE:
  • Minimum of 5 years of experience working with customers in a pre-sales or post-sales technical role.
  • Outstanding skills presenting to both technical and executive audiences, whether impromptu on a whiteboard or using presentations and demos.
  • Thorough understanding of the complete data engineering lifecycle.
  • Experience with and understanding of at least one public cloud platform (AWS, Azure, or Google Cloud Platform).
  • Experience with Databricks/Apache Spark.
  • Hands-on scripting experience with SQL and at least one of the following: Python, Java, or Scala.
  • Experience with libraries such as Pandas, PyTorch, TensorFlow, scikit-learn, or similar.
  • University degree in data science, computer science, engineering, mathematics, or a related field, or equivalent experience.

BONUS POINTS FOR HAVING:
  • Experience with Snowflake Snowpark
  • Experience with SAS (Statistical Analysis System)
  • Experience with Apache NiFi
  • Experience with dbt
  • Experience in Data Science
  • Experience implementing data pipelines using ETL tools
  • Experience working with RDBMS data warehouses
  • Proven success in enterprise software
  • Vertical expertise in a core industry such as FSI, Retail, or Manufacturing

Every Snowflake employee is expected to follow the company's confidentiality and security standards for handling sensitive data. Snowflake employees must abide by the company's data security plan as an essential part of their duties. It is every employee's duty to keep customer information secure and confidential.

Snowflake is growing fast, and we're scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake.

How do you want to make your impact?

For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com