There is only one Data Cloud. Snowflake’s founders started from scratch and designed a data platform built for the cloud that is effective, affordable, and accessible to all data users. But they didn’t stop there. They engineered Snowflake to power the Data Cloud, where thousands of organizations unlock the value of their data with near-unlimited scale, concurrency, and performance. This is our vision: a world with endless insights to tackle the challenges and opportunities of today and reveal the possibilities of tomorrow.
To support our explosive growth, we have built an expansive Partner Ecosystem that includes technology vendors such as DataRobot, Dataiku, Tableau, Informatica, Fivetran, and many more. These technology partners play a key role in extending native Snowflake functionality and building innovative joint solutions for our customers across many use cases, such as Business Intelligence, Data Integration, Data Science, and Data Governance.
We are looking for a Partner Sales Engineer who is passionate about Data Science to work with our Data Science technology partners such as DataRobot, Dataiku, H2O.ai, and Zepl to help them build cutting edge integrations and solutions for our joint customers. In this role, you will work directly with the partner management team and technology partners to understand the needs of our customers, strategize on product integrations with partners, provide technical guidance, evangelize the joint solutions, and ultimately be the trusted advisor for the partners.
As a Snowflake Partner Sales Engineer you must share our passion for reinventing the Data Cloud, thrive in a dynamic environment, and have the flexibility and willingness to jump in and get things done. You are equally comfortable in both a business and technical context, interacting with executives and talking shop with technical audiences.
- Act as a Snowflake technical leader and trusted advisor for our partners
- Be a subject matter expert in Data Science, Machine Learning (ML), and Artificial Intelligence (AI) with a solid understanding of the ecosystem
- Shape and build the data science strategy with our partners such as DataRobot, Dataiku, H2O.ai, and Zepl
- Provide architectural and technical guidance to partners and promote successful integrations with Snowflake
- Understand the partners' business models and current capabilities while developing new strategies and integrations
- Collaborate with the partner management team to provide technical guidance and support their strategic initiatives
- Engage with the Product team and influence product roadmap based on customer needs and strategic initiatives
- Build technical assets such as white papers, blog posts, best practices documents, virtual hands-on labs, and quickstarts to accelerate the adoption of joint solutions
- Evangelize joint solutions and product integrations by speaking at conferences, webinars, and other marketing events
- Present Snowflake technology and vision to executives and technical contributors at prospects and customers
- Immerse yourself in the ever-evolving industry, maintaining a deep understanding of competitive and complementary technologies and vendors and how to position Snowflake in relation to them
- Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing
- Build assets to enable and train Snowflake Sales Engineers on partner integrations and solutions
- 5-6 years of industry experience, with a minimum of 2 years in a pre-sales environment
- Deep understanding of, and experience with, the various tools and methodologies that make up the Machine Learning ecosystem
- Real-world experience in data preparation, data exploration/understanding, feature engineering, model development, model deployment, and model management
- Up-to-date knowledge of the important players and key trends in the data science ecosystem, including but not limited to frameworks such as TensorFlow, Apache MXNet, and Keras; libraries such as pandas and scikit-learn; tools such as Jupyter and Microsoft VS Code; and Machine Learning Ops (MLOps)
- Experience programming and debugging in at least one of Python, Scala, Java, or R
- Broad experience with large-scale Database and/or Data Warehouse technologies, including but not limited to Amazon Redshift, Google BigQuery, Oracle Database, Teradata, Netezza, MongoDB, Cassandra, Elasticsearch, Couchbase, PostgreSQL, MySQL, SQL Server, Hadoop, EMR, Hive, and Apache Spark
- Deep understanding of cloud architectures involving SaaS solutions on any of the major cloud providers (Amazon Web Services, Google Cloud, or Microsoft Azure) is preferred