You plan to implement an Azure Data Lake Storage Gen2 container that will contain CSV files. The size of the files will vary based on the number of events that occur per hour.
File sizes range from 4 KB to 5 GB.
You need to ensure that the files stored in the container are optimized for batch processing.
What should you do?
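A common recommendation for this scenario is to merge the many small files into fewer, larger ones, since batch engines read a small number of large files far more efficiently than thousands of tiny ones. The sketch below is a minimal, hypothetical illustration of that compaction step in Python: `merge_csv_files`, the output naming scheme, and the `target_bytes` threshold are all assumptions for demonstration, not an official Azure API or a documented size limit.

```python
import csv
import os


def merge_csv_files(paths, out_dir, target_bytes=256 * 1024 * 1024):
    """Concatenate many small CSV files (sharing one header) into
    fewer, larger files of roughly target_bytes each.

    target_bytes is an illustrative default, not an Azure-mandated
    value; tune it for your batch engine.
    """
    merged, batch, size = [], [], 0
    for p in paths:
        sz = os.path.getsize(p)
        # Flush the current batch once adding this file would overshoot.
        if batch and size + sz > target_bytes:
            merged.append(_write_batch(batch, out_dir, len(merged)))
            batch, size = [], 0
        batch.append(p)
        size += sz
    if batch:
        merged.append(_write_batch(batch, out_dir, len(merged)))
    return merged


def _write_batch(paths, out_dir, index):
    """Write one merged CSV, keeping only the first file's header row."""
    out_path = os.path.join(out_dir, f"merged_{index:05d}.csv")
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        header_written = False
        for p in paths:
            with open(p, newline="") as f:
                reader = csv.reader(f)
                header = next(reader)
                if not header_written:
                    writer.writerow(header)
                    header_written = True
                writer.writerows(reader)
    return out_path
```

In practice this compaction would run as a scheduled job (for example in Azure Data Factory or Spark) before the batch workload reads the container, so downstream readers see a handful of large files instead of one tiny file per hour.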