You plan to implement an Azure Data Lake Storage Gen2 container that will contain CSV files. The size of the files will vary based on the number of events that occur per hour.
File sizes range from 4 KB to 5 GB.
You need to ensure that the files stored in the container are optimized for batch processing.
What should you do?
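For batch processing on Azure Data Lake Storage Gen2, the general guidance is to prefer fewer, larger files (roughly 256 MB up to a few GB each) rather than many small ones, so the usual remedy for this scenario is to merge (compact) the small CSV files. Below is a minimal sketch, assuming a Spark environment and hypothetical storage account, container, and folder names, of how such compaction could be done; it is an illustration of the technique, not the exam's official answer text.

    # Minimal sketch (assumed Spark environment; account/container/paths are hypothetical).
    # Reads many small CSV files and rewrites them as a smaller number of larger files,
    # which is the typical way to optimize ADLS Gen2 data for batch processing.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("compact-small-csvs").getOrCreate()

    source_path = "abfss://events@mydatalake.dfs.core.windows.net/raw/"        # hypothetical
    target_path = "abfss://events@mydatalake.dfs.core.windows.net/compacted/"  # hypothetical

    # Read all small CSVs from the source folder.
    df = spark.read.option("header", "true").csv(source_path)

    # Repartition so each output file lands near the recommended size
    # (roughly 256 MB to 1 GB per file for batch/analytics workloads).
    df.repartition(16).write.mode("overwrite").option("header", "true").csv(target_path)

The partition count (16 here) would be tuned to the total data volume so that each output file falls in the recommended size range.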