Snowflake Events April 2025: Your Burning Questions, Answered!
During the Snowflake events in Vilnius and Riga on April 8–10, 2025, attendees had the opportunity to pose questions via Slido. While not all questions received responses live, we’ve compiled them below. A special thanks to our Snowflake speaker, Olli Ek, for providing the answers.
Event presentations and customer success stories with Snowflake – PAYSTRAX and TELE 2 – are available here.
Security topics – if authorities come from the country where the server is located, can they take over company and customer data?
Answer: Snowflake, like any other company, complies with local laws. In the case of a government request for data, the law dictates what we do. However, Snowflake offers “Tri-Secret Secure”, which gives our customers the ability to use their own encryption keys, and data is always encrypted with those keys. We never store those keys permanently in our service, so even in the extremely rare case of such a request, data encrypted with your keys cannot be opened by anyone other than the holder of the encryption keys (that being the customer).
Does Snowflake Horizon allow us to view stored procedures in lineage?
Answer: This is on our roadmap; we’re currently developing several capabilities in this area.
How are schema changes to shared datasets in Snowflake communicated to subscribers? Are there automatic notifications or protocols for adaptation?
Answer: There are multiple ways this can be set up, depending on whether it’s a direct share, a listing, a Marketplace product, etc. We’re happy to have a deeper-dive discussion to understand your needs and explore this further.
As a starting point, this is a good place to check for the basics: here.
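As a hedged illustration of one consumer-side approach (database, schema, and table names below are hypothetical): since a consumer can query a shared database’s INFORMATION_SCHEMA, you can snapshot its column metadata and flag anything the provider has since added or retyped.

```sql
-- One-time snapshot of the shared database's column metadata (hypothetical names).
CREATE TABLE IF NOT EXISTS schema_snapshot AS
SELECT table_name, column_name, data_type
FROM shared_db.INFORMATION_SCHEMA.COLUMNS
WHERE table_schema = 'PUBLIC';

-- Columns added or retyped by the provider since the snapshot was taken.
SELECT c.table_name, c.column_name, c.data_type
FROM shared_db.INFORMATION_SCHEMA.COLUMNS c
LEFT JOIN schema_snapshot s
  ON  s.table_name  = c.table_name
  AND s.column_name = c.column_name
  AND s.data_type   = c.data_type
WHERE c.table_schema = 'PUBLIC'
  AND s.column_name IS NULL;
```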
Does Snowflake run or use Apache Spark in any capacity behind the scenes in the platform?
Answer: Snowflake does not currently run Spark on its service, but it has a set of Python libraries under our “Snowpark” stack (Spark-like features) which let you build pipelines very similar to those you would build with Spark (using Python, Scala, Java, etc.). See more here.
Also, we have a connector to Spark here.
And – something we cannot disclose publicly yet – there are interesting things on the roadmap around this topic, and we can likely tell more after the Summit in June.
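For a flavor of what a Snowpark pipeline looks like, here is a minimal sketch (table and column names are hypothetical) of a Spark-style filter/aggregate wrapped in a Python stored procedure that runs entirely on Snowflake compute:

```sql
-- Snowpark pipeline packaged as a Python stored procedure (hypothetical names).
CREATE OR REPLACE PROCEDURE daily_revenue()
RETURNS TABLE()
LANGUAGE PYTHON
RUNTIME_VERSION = '3.9'
PACKAGES = ('snowflake-snowpark-python')
HANDLER = 'run'
AS
$$
from snowflake.snowpark.functions import col, sum as sum_

def run(session):
    # Lazy DataFrame API similar to Spark; the work is pushed down to Snowflake.
    return (session.table("orders")
            .filter(col("status") == "PAID")
            .group_by("order_date")
            .agg(sum_("amount").alias("revenue")))
$$;

CALL daily_revenue();
```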
If one Snowflake connects to another Snowflake, will we eventually get an avalanche? :)
Answer: Nope – with Snowflake you’ll get perfect fresh snow that offers smooth rides.
Does Snowflake provide connectors to various SAP systems, allowing us to bypass SAP BW?
Answer: We have hundreds of customers loading data from SAP systems to Snowflake already today. There are various ways to do it, and we’re happy to talk more; please reach out to the Infotrust/Snowflake teams and we can arrange a meeting on this. As part of our near-term roadmap, we have said publicly that we are working on next-generation connectivity options together with SAP, and announcements on this will most likely follow in the near future.
And does the semantic model translate SAP data structures into human language?
Answer: The semantic model can help with that sort of thing, yes! There is also an SAP connector coming soon (see above), and the rumor is that there will be semantic help there too.
Any specific tips on optimizing Iceberg files that are used by Snowflake tables?
Answer: There is a list of best practices for Iceberg tables used by Snowflake here: https://docs.snowflake.com/en/user-guide/tables-iceberg-best-practices
What’s Snowflake’s view on where DQ (data quality) should come in? Do you already have a framework in place?
Answer: Snowflake Data Quality / Data Metric Functions are our own framework, included in the platform – more is here.
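As a quick taste of the framework, here is a minimal sketch (table and column names are hypothetical) that attaches a built-in Data Metric Function to a table on a schedule and reads the measurements back:

```sql
-- Run attached metrics hourly on this table (hypothetical names).
ALTER TABLE orders SET DATA_METRIC_SCHEDULE = '60 MINUTE';

-- Track how many NULLs appear in a key column.
ALTER TABLE orders
  ADD DATA METRIC FUNCTION SNOWFLAKE.CORE.NULL_COUNT ON (customer_id);

-- Measurements land in a built-in results view.
SELECT *
FROM SNOWFLAKE.LOCAL.DATA_QUALITY_MONITORING_RESULTS
ORDER BY measurement_time DESC;
```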
Will Snowflake offer (or plan to offer) code conversion as a service for code migration? That is, instead of converting source code, use a Snowflake-native alternative?
Answer: SnowConvert can automatically convert code from another system to Snowflake, and on top of that, together with Infotrust & Snowflake services, we offer more custom code-conversion services that can be adjusted to your needs.
Do you have any recommendations on how to conveniently manage the Snowflake-Postgres connector, as we intend to onboard more than 100 data sources? When will Datavolo functionalities be integrated into Snowflake? Will there be any additional costs? If the initial snapshot through the Snowflake-Postgres connector fails, can the insert task still be triggered?
Answer: We’re looking into this, and your Account Team will contact you once we hear back from our specialists.
What data historization options are available? Does Snowflake have anything similar to Teradata’s temporal tables, where the platform manages data historization automatically?
Answer: No, there isn’t a temporal feature in Snowflake. What customers typically do instead: Snowflake is very efficient with insert-only load patterns. So if you have a large fact table, you would have one column called something like UPDATE_DT with a timestamp, and your jobs just INSERT to the table and do not try to MERGE/UPDATE. Then any SELECT queries on the table use a pattern like SELECT * FROM mytable QUALIFY ROW_NUMBER() OVER (PARTITION BY primarykeycolumns ORDER BY update_dt DESC) = 1.
Also, especially for short-term (90 days or less) use cases, the time travel feature can mimic transaction-time functionality.
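For illustration, here is the pattern above as a runnable sketch (table and column names are hypothetical): loads only ever INSERT, a view exposes the latest version of each key, and time travel covers short-term “as of” queries.

```sql
-- Insert-only history table (hypothetical names).
CREATE TABLE customer_hist (
    customer_id NUMBER,
    name        STRING,
    update_dt   TIMESTAMP_LTZ DEFAULT CURRENT_TIMESTAMP()
);

-- Loads never UPDATE/MERGE; every change is a new row.
INSERT INTO customer_hist (customer_id, name) VALUES (1, 'Acme Oy');

-- Current state: the newest row per key.
CREATE OR REPLACE VIEW customer_current AS
SELECT *
FROM customer_hist
QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY update_dt DESC) = 1;

-- Time travel (within the retention window) mimics transaction time:
SELECT * FROM customer_hist AT (OFFSET => -3600);  -- state one hour ago
```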
Where and how can I test the Snowflake platform?
Answer: You can open your free Snowflake trial account HERE.
***
ZERO TO SNOWFLAKE WORKSHOPS for beginners at the end of April – PLEASE REGISTER and JOIN ONSITE (free entrance):
Any questions? Contact Lina Kriukeliene, l.kriukeliene@theinfotrust.com.