A Data Engineer is working on a continuous data pipeline that receives data from Amazon Kinesis Data Firehose and loads it into a staging table that will later be used in the data transformation process. The average file size is 300-500 MB.
The Engineer needs to ensure that Snowpipe is performant while minimizing costs.
How can this be achieved?
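Context for the scenario: Snowpipe ingestion cost and latency are sensitive to file size, and Snowflake's general guidance is to load files considerably smaller than the 300-500 MB described here (roughly 100-250 MB compressed). With Kinesis Data Firehose, delivered object size is controlled by buffering hints: Firehose flushes a file when either the size or the time threshold is reached, whichever comes first. A minimal sketch of such a configuration, assuming the standard Firehose `BufferingHints` shape; the values shown are illustrative, not prescriptive:

```python
# Sketch: Firehose buffering hints that keep delivered files well under
# the 300-500 MB range, so Snowpipe can ingest smaller files more often.
# Whichever threshold is reached first (size or interval) triggers delivery.
buffering_hints = {
    "SizeInMBs": 128,          # flush once ~128 MB has buffered
    "IntervalInSeconds": 300,  # or flush at least every 5 minutes
}

def max_delivered_file_mb(hints):
    """Upper bound on a delivered object's size implied by the size hint."""
    return hints["SizeInMBs"]
```

This dictionary mirrors the structure passed to a Firehose delivery-stream destination configuration; the exact limits and defaults should be checked against the current AWS documentation.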