66945 Posted October 5, 2015

Hi, almost all of the guides I've read online regarding OS deployment involve building and capturing a 'baseline' OS image and then using task sequences to deploy that captured image. It is much like installing the OS manually, configuring it, then backing it up and restoring it to a new machine (with the added customization ability of task sequences). I took a different approach: I simply deploy the standard (unmodified) Windows image and use task sequences alone to bring the resulting installation to its final, user-ready state. This achieves 99% of what I need, and I don't 'build and capture' anything at all. It also gives me great flexibility, since I never have to re-build, re-capture, and re-distribute a 'baseline' image when changes are required. I just edit the task sequences, click Apply, wait a minute or two for the change to replicate across the organization, and I'm ready to deploy the updated task sequence in minutes. Am I missing something? What are the benefits/use cases for 'building and capturing'?
anyweb Posted October 5, 2015

The benefits are varied, but think of it like this: imagine you are deploying Windows 7 (yes, a lot of people still are). The stock WIM file doesn't contain the software updates released over the last few years, which means you either patch during deployment or after first login; neither is a good experience (think of the time involved). To mitigate that, you can build and capture an image that includes the software updates, so the image is relatively up to date security-wise when it is installed. That's just one example of why people use the build and capture method to create reference images.

cheers
niall
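For context, the 'capture' step ultimately produces a WIM file. Outside of a ConfigMgr build-and-capture task sequence, the same kind of reference image can be captured by hand with DISM from WinPE. A minimal sketch, where all paths and the image name are illustrative, not from the thread:

```shell
:: Run from WinPE after the reference machine has been installed,
:: patched, and generalized with sysprep (paths are examples only).

:: Capture the reference installation volume into a WIM file
Dism /Capture-Image /ImageFile:D:\Captures\Win7-Reference.wim ^
     /CaptureDir:C:\ /Name:"Windows 7 Reference" /Compress:max
```

The resulting WIM can then be imported into ConfigMgr as an operating system image and deployed with an ordinary task sequence.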
66945 Posted October 5, 2015

Hi Niall, is that scenario still valid if I combine offline servicing (of the WIM) with online Windows updates applied during the task sequence? I had always assumed that combination means the OS is fully patched and up to date once the task sequence completes. I also generally configure the ADR to remove superseded updates. I will agree, though, that the purely task-sequence-based method of deployment takes much longer from start to finish, however zero/lite-touch it may be.
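Offline servicing as mentioned above is usually done from the ConfigMgr console (Schedule Updates on the OS image), but the same operation can be performed manually with DISM. A rough sketch, with example paths that are assumptions, not from the thread:

```shell
:: Mount the image, inject downloaded update packages (.msu/.cab),
:: then commit the changes back into the WIM.
:: Example paths; requires an elevated prompt on Windows.
Dism /Mount-Image /ImageFile:C:\Images\install.wim /Index:1 /MountDir:C:\Mount
Dism /Image:C:\Mount /Add-Package /PackagePath:C:\Updates\
Dism /Unmount-Image /MountDir:C:\Mount /Commit
```

Note that offline servicing can only inject certain update types (chiefly CBS-based security updates), which is one reason in-task-sequence updates are still often combined with it.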
anyweb Posted October 5, 2015

You can absolutely do offline servicing and combine it with updates applied in the task sequence; I was only giving one example of how build and capture is beneficial. There are more, such as the size of large apps (fat apps like Office, SAP, SEP) which take time to download and install; if they are already baked into the image, that can cut some time off the deployment. Either way, it's a flexible process, and no one is telling you that you have to use one method over another. Do what works for you!

cheers
niall