SnowPro Core Exam: Topic 1, Question #734 Discussion

Actual exam question from Snowflake's SnowPro Core
Question #: 734
Topic #: 1

What are benefits of using Snowpark with Snowflake? (Choose two.)

  • A. Snowpark uses a Spark engine to generate optimized SQL query plans.
  • B. Snowpark automatically sets up Spark within Snowflake virtual warehouses.
  • C. Snowpark does not require that a separate cluster be running outside of Snowflake.
  • D. Snowpark allows users to run existing Spark code on virtual warehouses without the need to reconfigure the code.
  • E. Snowpark executes as much work as possible in the source databases for all operations including User-Defined Functions (UDFs).
Suggested Answer: CE

Comments

JRayan
4 days, 11 hours ago
Selected Answer: CE
https://docs.snowflake.com/en/developer-guide/snowpark/index
"Support for pushdown for all operations, including Snowflake UDFs. This means Snowpark pushes down all data transformation and heavy lifting to the Snowflake data cloud, enabling you to efficiently work with data of any size. No requirement for a separate cluster outside of Snowflake for computations. All of the computations are done within Snowflake. Scale and compute management are handled by Snowflake."
upvoted 1 times
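To make the two quoted benefits concrete, here is a minimal Snowpark Python sketch; the connection parameters and the ORDERS table are hypothetical placeholders, not from the question. It shows that transformations are built lazily as SQL and executed inside a Snowflake virtual warehouse, with no external cluster involved:

```python
# Minimal Snowpark Python sketch. Connection parameters and the
# ORDERS table are hypothetical placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}
# All compute happens in this Snowflake warehouse; no Spark or other
# external cluster is set up or required (option C).
session = Session.builder.configs(connection_parameters).create()

# Transformations are lazy: Snowpark builds a SQL query instead of
# pulling rows back to the client.
orders = session.table("ORDERS")
totals = (
    orders.filter(col("STATUS") == "SHIPPED")
          .group_by("CUSTOMER_ID")
          .agg(sum_(col("AMOUNT")).alias("TOTAL"))
)

# Inspect the SQL statement that will be pushed down to Snowflake.
print(totals.queries["queries"][-1])

# Only an action such as collect() executes it, inside the warehouse.
rows = totals.collect()
```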
0e504b5
6 months, 1 week ago
Selected Answer: CE
https://www.snowflake.com/en/data-cloud/snowpark/spark-to-snowpark/
https://www.snowflake.com/en/data-cloud/snowpark/
https://docs.snowflake.com/en/developer-guide/snowpark/index
https://medium.com/snowflake/pyspark-versus-snowpark-for-ml-in-terms-of-mindset-and-approach-8be4bdafa547#:~:text=Snowpark%20pushes%20all%20of%20its,leverage%20the%20power%20of%20Snowflake.
https://www.snowflake.com/blog/snowpark-designing-performant-processing-python-java-scala/
https://docs.snowflake.com/en/user-guide/warehouses-snowpark-optimized
upvoted 3 times
Rana1986
7 months, 2 weeks ago
CE, as per the documentation. Pushdown means pushing as much of the code as possible down to the source database.
upvoted 2 times
[Removed]
9 months, 3 weeks ago
Selected Answer: CE
Option D is not mentioned anywhere in the documentation; you need to migrate the job code.
upvoted 3 times
[Removed]
10 months ago
Selected Answer: CE
D is not a correct answer. The documentation does not say this; Snowpark has its own API that you need to use, so you cannot run Spark code directly. It has to be customized.
upvoted 2 times
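A short sketch of the porting point made above; the EMPLOYEES table and connection values are hypothetical. The session setup, import paths, and API details differ between PySpark and Snowpark, so existing Spark jobs must be adapted rather than run as-is (which is why option D fails):

```python
# PySpark version (requires a running Spark cluster):
#   from pyspark.sql import SparkSession
#   from pyspark.sql.functions import col
#   spark = SparkSession.builder.getOrCreate()
#   df = spark.table("EMPLOYEES").filter(col("salary") > 100000)

# Snowpark version: different package, different session builder,
# and it runs inside Snowflake rather than on a Spark cluster.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col

# Hypothetical placeholder credentials.
connection_parameters = {"account": "<account>", "user": "<user>", "password": "<password>"}
session = Session.builder.configs(connection_parameters).create()
df = session.table("EMPLOYEES").filter(col("SALARY") > 100000)
```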
JG1984
11 months, 3 weeks ago
Selected Answer: CD
Option E: In general, Snowpark will try to execute as much work as possible in the source database, but there are some cases where it will need to transfer data to the server. The specific cases depend on the operations you are performing and the data you are accessing. Say you want to join two tables in Snowflake: if the two tables are in the same database, Snowpark can execute the join operation in the source database. However, if the two tables are in different databases, Snowpark will need to transfer the data from one database to the other before it can execute the join operation. See the sketch below.
upvoted 3 times
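A minimal sketch of the same-database join scenario described above, assuming an existing Snowpark `session`; CUSTOMERS and ORDERS are hypothetical tables. Snowpark compiles the join into a single SQL statement executed entirely inside Snowflake:

```python
customers = session.table("CUSTOMERS")
orders = session.table("ORDERS")

# The join becomes a single SQL JOIN executed inside Snowflake;
# no rows are shipped to the client to perform it.
joined = customers.join(orders, customers["ID"] == orders["CUSTOMER_ID"])

# explain() prints the generated SQL and its execution plan, which
# shows the whole operation pushed down to Snowflake.
joined.explain()
```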
ukpino
1 year, 1 month ago
Selected Answer: CE
https://docs.snowflake.com/en/developer-guide/snowpark/index
Benefits When Compared with the Spark Connector: in comparison to using the Snowflake Connector for Spark, developing with Snowpark includes the following benefits:
  • Support for interacting with data within Snowflake using libraries and patterns purpose built for different languages without compromising on performance or functionality.
  • Support for authoring Snowpark code using local tools such as Jupyter, VS Code, or IntelliJ.
  • Support for pushdown for all operations, including Snowflake UDFs. This means Snowpark pushes down all data transformation and heavy lifting to the Snowflake data cloud, enabling you to efficiently work with data of any size.
  • No requirement for a separate cluster outside of Snowflake for computations. All of the computations are done within Snowflake. Scale and compute management are handled by Snowflake.
upvoted 3 times
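A minimal sketch of the "pushdown for all operations, including UDFs" benefit quoted above, assuming an active Snowpark `session`; the SALES table and the 1.2 multiplier are hypothetical. The decorated Python function is registered in Snowflake and executes server-side, next to the data:

```python
from snowflake.snowpark.functions import udf, col
from snowflake.snowpark.types import FloatType

@udf(return_type=FloatType(), input_types=[FloatType()])
def with_tax(amount: float) -> float:
    return amount * 1.2  # illustrative rate only

# The UDF call is embedded in the generated SQL, so the computation
# runs inside the warehouse; rows are not pulled to the client.
sales = session.table("SALES")
sales.select(with_tax(col("AMOUNT")).alias("GROSS")).show()
```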
MultiCloudIronMan
1 year, 1 month ago
Selected Answer: CD
Correct
upvoted 2 times
MultiCloudIronMan
1 month, 3 weeks ago
CE not CD
upvoted 1 times