Hi,
I'd like your feedback on the kind of performance people are getting when distributing big applications with a vast number of files. For example, I'm currently deploying Autodesk Softimage 2015, which weighs 2,308 MB and has a total of 17,690 files spread across 1,370 folders.
This deployment takes more than 4 hours to install, while copying the installer locally and running it by hand takes a fraction of that time. I've noticed the client doesn't download anything for hours while the status in Software Center stays at 0%. The ccmcache folder doesn't even grow in size during all this time.
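To confirm the cache really isn't growing, I've been polling its size with a quick script (a minimal sketch; the path assumes a default client install, and the one-minute interval is arbitrary):

```python
# Poll the ccmcache folder size to confirm nothing is being downloaded.
import os
import time

CACHE = r"C:\Windows\ccmcache"  # default SCCM client cache location

def folder_size(path):
    """Total size in bytes of all files under path."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            try:
                total += os.path.getsize(os.path.join(root, name))
            except OSError:
                pass  # a file may vanish mid-scan
    return total

while True:
    size_mb = folder_size(CACHE) / (1024 * 1024)
    print(f"{time.strftime('%H:%M:%S')}  ccmcache = {size_mb:,.1f} MB")
    time.sleep(60)  # one sample per minute
```

The size stays flat for hours before the download finally starts.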
We have a 20 Gb/s link between the client and the distribution point, and both the client's and the DP's hardware resources are more than sufficient.
Please note that in the end the deployment works, and for smaller apps with a reasonable number of files the performance is perfect. Basically, what is SCCM doing while the download status is stuck at 0% for hours? Is HTTPS the right protocol for transferring such a large number of files? Is the client validating the data during this time? How can we optimize this so I don't end up having to point users to a share and install apps with a batch file?
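My working theory is per-file overhead: with 17,690 files, even a small fixed cost per file dwarfs the raw copy time. Here's a back-of-envelope check (the 0.8 s per-file figure is purely an assumed number for illustration, not a measurement):

```python
# Back-of-envelope: raw transfer time vs. accumulated per-file overhead.
files = 17_690
total_mb = 2_308
link_mb_per_s = 2_500                      # 20 Gb/s link is roughly 2,500 MB/s

raw_transfer_s = total_mb / link_mb_per_s  # under 1 second of raw copy
overhead_s = files * 0.8                   # assumed fixed cost per file

print(f"raw transfer : {raw_transfer_s:.1f} s")
print(f"per-file cost: {overhead_s / 3600:.1f} h")  # about 3.9 hours
```

If that's roughly what's happening, the bandwidth of the link is irrelevant and the file count alone explains the 4 hours.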
Thank you,
-Pierre