MDT 2013 – PowerShell ‘BERSERK’ mode, configure everything with PowerShell!!!


Very recently, Deployment Research’s Johan Arwidmark blogged about configuring your deployment share properties (which are saved in .\Control\Settings.xml) with PowerShell.

Some time ago I started working on some PowerShell scripts myself, mostly aimed at configuring the Bootstrap.ini and CustomSettings.ini and at creating folders within the MDT Workbench.

While I never found the time and energy to finish them, Johan’s blog basically pushed me over the edge, and then it happened… I went BERSERK /cannot compute 😛


The result you can find in the following script, which:

  • Configures the Bootstrap.ini
  • Configures the CustomSettings.ini
  • Modifies “.\Scripts\Litetouch.wsf” to use the _SMSTSPackageName variable
  • Configures the deployment share properties
  • Creates a folder structure within MDT
  • Creates selection profiles within MDT

The script! Updated @ 2014-03-20

Basically this script can be divided into five parts:

1. Constants

The constants I have declared are the variables that hold the values used throughout the script; by modifying these values, you can customize and personalize this script for your own purposes. This script is primarily intended for newly set-up deployment environments, so keep in mind that using this script on an existing environment will overwrite all content in the Bootstrap.ini and CustomSettings.ini!
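To give you an idea, the constants section boils down to something like this (the names and values below are illustrative, not the exact ones from the script):

# Illustrative constants; adjust these values to personalize the script
$DeployServer    = "MDT01"                # server that hosts the deployment share
$DeployShareName = 'DeploymentShare$'     # share name (hidden share in this example)
$DeploySharePath = "D:\DeploymentShare"   # local path of the deployment share
$DeployUser      = "MDT_BA"               # build account referenced in Bootstrap.ini
$DeployDomain    = "CONTOSO"              # domain of the build account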

2. Configuring Bootstrap.ini and CustomSettings.ini

Basically I use a multiline string (a here-string) for both my Bootstrap.ini and CustomSettings.ini, and at certain places within that string I call my variables. This way the variables supply the input, instead of me having to change a server name on every single row of code.
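A minimal sketch of the idea, reusing the illustrative constants from above (the rules shown are just examples):

$BootstrapIni = @"
[Settings]
Priority=Default

[Default]
DeployRoot=\\$($DeployServer)\$($DeployShareName)
UserID=$($DeployUser)
UserDomain=$($DeployDomain)
SkipBDDWelcome=YES
"@

# Overwrite the existing Bootstrap.ini with the generated content
$BootstrapIni | Out-File "$DeploySharePath\Control\Bootstrap.ini" -Encoding ASCII -Force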

3. Configuring the DeploymentShare properties

This part we already know from Johan / Deployment Research; however, I’ve taken the liberty to customize it to my wishes and to fully write out all the settings available in .\Control\Settings.xml.
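In essence it comes down to loading the MDT module, mapping the deployment share as a PSDrive and setting each property. A short sketch (the property values are examples):

# Load the MDT module and map the deployment share
Import-Module "C:\Program Files\Microsoft Deployment Toolkit\bin\MicrosoftDeploymentToolkit.psd1"
New-PSDrive -Name "DS001" -PSProvider MDTProvider -Root $DeploySharePath

# Every setting stored in .\Control\Settings.xml is exposed as an item property
Set-ItemProperty -Path DS001: -Name "SupportX86" -Value "False"
Set-ItemProperty -Path DS001: -Name "Boot.x64.LiteTouchWIMDescription" -Value "Deploy Boot Image (x64)"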

4. Configure a logical folder structure within the MDT Workbench

This part is quite new to me. Thanks to Johan and Roger the Young, I now know that a PSDrive can be browsed from a PowerShell command prompt: just type dir DS001: and you can actually see and browse the MDT Workbench ‘folder structure’.

figure 1.1: Browse PSDrive


If you want to browse any further, just type dir DS001:\Applications or dir “DS001:\Operating Systems” (note that if a folder name contains spaces, you need to use quotes “”).
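Creating the folder structure uses that same PSDrive: each folder is just a New-Item call (the folder names below are my examples):

# Create a logical folder structure in the MDT Workbench
New-Item -Path "DS001:\Operating Systems" -Name "Windows 8.1" -ItemType "folder" -Verbose
New-Item -Path "DS001:\Out-of-Box Drivers" -Name "WinPE x64" -ItemType "folder" -Verbose
New-Item -Path "DS001:\Task Sequences" -Name "Build" -ItemType "folder" -Verbose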

5. Configure selection profiles for Driver and Packages selection

The last thing I wanted to do is set up pre-configured selection profiles, which can be used for injecting certain drivers, targeting packages, creating media, and so on.
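A selection profile is created the same way; its definition is a small piece of XML listing the folders to include (the names here are examples):

# Selection profile that includes only the WinPE x64 driver folder
New-Item -Path "DS001:\Selection Profiles" -Name "Drivers - WinPE x64" `
    -Definition '<SelectionProfile><Include path="Out-of-Box Drivers\WinPE x64" /></SelectionProfile>' `
    -ReadOnly "False" -Verbose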

What has changed since 2014-03-20

The script needed some simplification to avoid having to enter the same values in different places. The script now also creates the entire deployment share for you, so there is no need to create a deployment share with MDT before executing it. Lastly, ZTIDiskpart.wsf is modified to calculate with binary metrics, so a specified 100 GB disk partition will actually end up as a 100 GB partition instead of 98.7 GB.
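Creating the deployment share itself comes down to three steps: create the folder, share it, and register it with MDT. A sketch, again reusing the illustrative constants from earlier:

# Create and share the deployment share folder, then register it with MDT
New-Item -Path $DeploySharePath -ItemType Directory -Force
New-SmbShare -Name $DeployShareName -Path $DeploySharePath -FullAccess "Administrators"
New-PSDrive -Name "DS001" -PSProvider MDTProvider -Root $DeploySharePath `
    -Description "MDT Deployment Share" `
    -NetworkPath "\\$($DeployServer)\$($DeployShareName)" | Add-MDTPersistentDrive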

Closing

Is this the right way to do it? I have no clue. What I do know is that I never want to configure a deployment share manually ever again, so this script definitely helps me. Also, adjusting the values of the declared constants is a fraction of the effort compared to going through “Create folder” wizards for the 50th time.

Special thanks go out to scripting wizard Roger the Young a.k.a. “DrillSergeant” and to Johan Arwidmark, for giving me renewed inspiration to develop this script into what it has become.

PS I: forgive my PowerShell language; words such as constants, variables, properties and values are perhaps not used in the correct way. I’m still trying 🙂

PS II: I can also imagine this script can be simplified drastically, and I don’t mind if you do, as long as you share it back with me, like I shared it here with you 😀 Or as the Germans say: “Wiedersehen macht Freude” (seeing it again brings joy).

Finally, you can download the script here:

ConfigureDeploymentShare.zip

 

MDT 2013 – Configuring your environment for BitLocker deployments with TPM, Windows 8.1 and MDT 2013


In a previous post I explained how to deploy the HP ElitePad 900 with the Microsoft Deployment Toolkit.

For that same project, it was a requirement that this tablet be deployed unattended, securely, and reproducibly.

I defined the following actions that needed to be done:

  1. Extending the AD Schema
  2. Update policy templates (since we were running Server 2008 R2)
  3. Configure ‘BitLocker’ Group Policy settings
  4. Configure CustomSettings.ini
  5. Configure Task Sequence
  6. Configure Unattended.xml
  7. Use a domain account
  8. Perform a test deployment

1. Extending the AD Schema

There was a lot of information on the internet about how to achieve this. The information I found most useful came mostly from Microsoft’s own blogs and was very helpful in getting this configured right the first time.

The blogs that helped me achieve this:

From the link below, a complete documentation guide and four VBS scripts help you prepare the Active Directory domain environment for storing BitLocker information in Active Directory.

Requirements

The basic requirements for having BitLocker write information into Active Directory can be derived from the document “Configuring Active Directory to Back up Windows BitLocker Drive Encryption and Trusted Platform Module Recovery Information.doc”, which can be downloaded from the link I have provided.

2. Update policy templates

Updating the policy templates ensures that the Group Policy Management console has the latest available policy templates. When running a Server 2012 R2 domain controller, these templates are already available, but if you’re running an earlier version of Windows Server (from 2003 SP2 up to 2008 R2), it is recommended to update them.

This can be done by:

  1. Downloading the Administrative Templates (.admx) for Windows 8.1 and Windows Server 2012 R2
  2. Updating the current templates with the new templates

Step 2 is actually quite easy: type in your FQDN followed by “\SYSVOL\Policies”, which brings you to the folder where the policy templates are located. Before you do anything, though, create a backup of the current policy files; it might come in handy if you want to roll back or something goes wrong.

Just paste the new templates into the Policies folder, and the new Server 2012 R2 and Windows 8.1 policies will be available in the Group Policy Manager straight away.
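Scripted, the backup-and-copy could look like this (the domain name and download location are assumptions):

# Back up the current templates, then copy in the Windows 8.1 / Server 2012 R2 ones
$policies = "\\contoso.com\SYSVOL\contoso.com\Policies"
Copy-Item -Path $policies -Destination "C:\Temp\Policies.bak" -Recurse
Copy-Item -Path "C:\Temp\ADMX\*" -Destination $policies -Recurse -Force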

3. Configure ‘BitLocker’ Group Policy Settings

Configuring the required Group Policy settings for BitLocker makes sure that all the necessary information about the computer object being deployed will be stored in Active Directory. In the zip file at the bottom of this page you will find the desired GPO configuration in HTML, needed to store the information in Active Directory. These policies are also perfectly explained in the referenced document above and in the ‘useful links’ section at the bottom of this page. And to get you started, I have provided a screenshot of those policies right here:

figure 1.1: BitLocker GPO configuration


4. Configure CustomSettings.ini

Configuring the CustomSettings.ini. There is enough information in MDT’s own documentation on how to configure the properties for BitLocker, which properties you can configure and what their values are. However, I did some investigation and came up with the following configuration:

figure 1.2: DeploymentShare properties, Rules (customsettings.ini)


As you can see, I have set my priority to Model first and Default second, so all rules stated under the [HP ElitePad 900] section overrule the Default section and apply only to this model. For clarification I often comment my CustomSettings.ini, since the people who are going to work with it may want to understand why a certain setting is set.

codeblock 1.1: customsettings.ini rules

[Settings]
Priority=Model, Default

[HP ElitePad 900]
; Rules below apply only to this model (the section name must match the WMI Model value)
BDEInstallSuppress=NO
BDEWaitForEncryption=FALSE
; 2000 MB BitLocker system partition on S:
BDEDriveLetter=S:
BDEDriveSize=2000
; TPM-based protector; store the recovery information in Active Directory
BDEInstall=TPMKey
BDERecoveryKey=AD
BDEKeyLocation=C:\Windows\BDEKey

5. Configure Task Sequence

When the CustomSettings.ini is configured, the next thing we need to do is make some adjustments to the BitLocker part of the task sequence:

figure 1.3: Task Sequence properties, configuring BitLocker


In the ‘State Restore’ section, click on the “Enable BitLocker” step and check the following:

  • Current Operating System Drive
  • TPM Only
  • Choose where to create the recovery key
  • In Active directory

Alternatively you may check “Wait for BitLocker to complete the drive encryption process on all drives before continuing task sequence execution”.

This means that the task sequence will wait until the entire drive is encrypted, then perform a reboot and continue with the task sequence.

6. Configuring Unattended.xml

Configuring the Unattended.xml has little to nothing to do with BitLocker itself; however, to achieve a fully unattended installation, it is recommended to extend your Windows 8.1 Unattended.xml in the TaskSequenceID folder with the following additions:

codeblock 1.2: Windows 8.1 unattended.xml additions to suppress the Windows 8.1 setup wizard

The following strings make sure the Windows 8.1 setup wizard will not interfere with the process.
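The exact snippet depends on your image; a typical set of additions for the oobeSystem pass boils down to something like this (a sketch to merge into your own Unattend.xml):

<settings pass="oobeSystem">
  <component name="Microsoft-Windows-Shell-Setup" processorArchitecture="amd64" publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS">
    <OOBE>
      <!-- Hide every OOBE screen so setup never waits for user input -->
      <HideEULAPage>true</HideEULAPage>
      <HideLocalAccountScreen>true</HideLocalAccountScreen>
      <HideOEMRegistrationScreen>true</HideOEMRegistrationScreen>
      <HideOnlineAccountScreens>true</HideOnlineAccountScreens>
      <HideWirelessSetupInOOBE>true</HideWirelessSetupInOOBE>
      <ProtectYourPC>1</ProtectYourPC>
    </OOBE>
  </component>
</settings>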

7. Use a domain account

Since we are configuring deployments to work with BitLocker and store the recovery password in Active Directory, we need at least some form of authentication. My experience is that the domain join account used to join the machine to the domain has enough privileges to, first, create the computer object in Active Directory and, second, write the BitLocker recovery key and TPM owner information to that same computer object.

A domain account does not need all kinds of fancy privileges, and it certainly does not need to be a Domain Admin. To see which privileges are required, please visit the following two blogs, which explain it perfectly:

8. Perform a test deployment

The only thing that remained was performing a test deployment, which of course I did, and the results were very satisfying 🙂
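If you want to double-check the result on the client itself, manage-bde gives a quick confirmation from an elevated prompt:

# Show the encryption status and the key protectors of the OS drive
manage-bde -status C:
manage-bde -protectors -get C: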

figure 1.4: trace64.exe – bdd.log


figure 1.5: computer object properties – active directory


figure 1.6: computer object properties – BitLocker recovery


Useful links

These links helped me on my way to achieving this:

Attached you will find the resultant set of policy configured in the Group Policy Manager, a copy of the BDD.log of a successful deployment, the screenshots used in this blog, and a copy of the CustomSettings.ini rules that I have used.

BlogContents.zip

If there are any questions or improvements you’d like to share, please feel free to contribute in the comment section!

Thanks for reading this blog! 😀

PS: forgive me for the Dutch computer object property screenshots; this is just for the moment, until I can retrieve some English-looking panes.

 

HP Softpaq Download Manager – Great tool, but there is room for improvement


Today I needed to gather hardware drivers for a certain range of models that must be supported in my operating system deployment environment, which is based on MDT 2013.

Large hardware vendors like Hewlett-Packard and Dell have great support for drivers and driver packs suitable for enterprise client distribution methods like System Center Configuration Manager or Microsoft Deployment Toolkit.

Dell features a great page on its website from which all hardware drivers for business and enterprise supported hardware can be downloaded:

Dell Driver CAB files for Enterprise Client OS Deployment

Hewlett-Packard offers the software download program “HP SoftPaq Download Manager”:

HP SoftPaq Download Manager

A tool from which all related software, drivers, utilities, BIOS updates and more can be downloaded directly from HP.

figure 1.1: HP Softpaq Download Manager


Along the way, using this tool, I encountered some things that I think would be great improvements for an upcoming release.

Firstly, the possibility to apply a filter on the categories, so that I can choose to see only “Manageability – Driver Pack”, or only drivers.

figure 1.2: HP Softpaq Download Manager – Configuration Options


So instead of this, I would like to see the following added to the configuration options:

figure 1.3: Category Filter


which would then have the following options:

figure 1.4: Category Options


Another thing I encountered: my machine had gone into energy-saving mode and powered off my screen. Upon a mouse click the download was accidentally cancelled, because the mouse cursor had not been moved since I initiated the download and was still resting on the Cancel button.

Of course it was my mistake to leave the cursor right on top of that button; on the other hand, what harm would a little extra wizard pane do, asking if I’m really sure I want to cancel the download?

figure 2.1: Cancel button, and Column filter option


And last but not least, why not throw in another option to download only the latest version of any available software, instead of offering the entire history of versions that can be downloaded for a particular piece of software?

To me it doesn’t make sense that when I’m downloading drivers or BIOS updates, the tool presents me with all the content there is. Just offer it as an option: do you want to see the latest available software, or the entire library?

A quick peek in the HP forums on this topic reveals that this functionality is already available, but only if the HP SoftPaq Download Manager is installed on an HP computer.

So please give this blog a boost to bring this to HP’s attention. Because, as I said, it’s a great tool, but there is always room for improvement! 😀

If it’s not too much trouble, please support this cause in the SoftPaq Download Manager forum section.

Thanks for reading!

Any suggestions, please feel free to post them in the comment section.

 

 

OSD – Live from the field my ‘Best Practices’ for Operating System Deployment


Working with technology often requires maintaining a certain focus, and that focus is mostly on the technical side.

When working with tools such as Microsoft Deployment Toolkit or System Center Configuration Manager (the people who work with such tools are mostly technical people), you are already building, scripting, testing, and so on. What I have learned since I started working with Microsoft Deployment Toolkit almost four years ago, and System Center Configuration Manager almost three years ago, is that sometimes you need to get out of your (technical) zone.

The zone I was working in was trying to find solutions from a technical standpoint, solving everything with scripts, command lines, and so on.

That is why today I’m not writing a technical blog, but one that hopes to bring the logical and more sensible considerations to the table when it comes to operating system deployment. In the presentation I gave for my Microsoft Certified Trainer course, which was about operating system deployment, I collected some ‘best practices’ I’d like to share, based on my own experiences.

For the TL;DR people among us, here are my ‘best practices’:

  1. Does the current state of the company’s infrastructure play a role?
  2. What release cycle do you want to maintain for your reference image
  3. What content is worthwhile putting in your reference image
  4. Copying raw data is much faster than installation time
  5. Microsoft Deployment Toolkit and System Center Configuration Manager -when it comes to Operating System Deployment- are tools for deployment, not the solution to World Peace!

1. Does the current state of the company’s infrastructure play a role?

At one of the projects where I was involved in creating an operating system deployment solution, network capacity was a big constraint on using conventional solutions to distribute the product that was built. The company had an SCCM infrastructure for 22,000 clients, based on a topology with 1 central site, 3 primary sites (one for each large region: EMEA, APAC and AMAC) and approx. 120 distribution points all around the world.

Since we had to support 25 hardware models and Windows 7 Enterprise with 7 additionally embedded language packs and Office language packs, the total size of the product exceeded 32 GB, far more than is supported in SCCM 2007 R2 SP2, which led us to a few challenges:

  • SCCM vs. MDT
  • Offline distribution instead of online, to relieve the network
  • Other limitations in SCCM

SCCM operates very differently from MDT when it comes to distributing content, and in such a large organization you have to deal with other departments that use the same infrastructure and share the same SCCM environment. So you might want to reconsider whether it would be wise to build your reference image with SCCM!

For example, when we first built our reference image on the company’s infrastructure, based on SCCM and Hyper-V, it sometimes took over 14 hours to run the entire task sequence and an additional 10 hours to capture the image (told you the network was a big constraint). So after some time I opted to build the reference image with MDT, since it has less dependency on the production network and can be done in any environment, bringing the build-and-capture time down from 24 hours to 8 hours. A big win.

We decided to use this approach in the future: build the reference image in MDT, then import it into SCCM for distribution. Which, I believe, is also the best practice Microsoft advises.

Then the next obstacle arose: distribution. The network wasn’t capable of staging 22,000 clients, so we had to use offline media, functionality that is present in both MDT and SCCM. And where MDT has, as far as I know, no constraints, SCCM 2007 has the following constraints:

Constraint 1

USB sticks need to be NTFS-formatted on a Windows 7/Server 2008 system, yet the media needs to be created on a Windows XP/Server 2003 console. Ridiculous, right? Try to create standalone media on a Windows 7/Server 2008 system: during the creation of the standalone media, the SCCM 2007 console will format your USB stick right back to FAT32, which can’t handle (image) files larger than 4 GB!

Constraint 2

USB sticks used for standalone media with SCCM 2007 may not exceed 32 GB in size; SCCM simply can’t handle USB sticks larger than that. Just a fact, nothing you can do about it. Or is there? In this case, after consulting the stakeholders of the project, we opted to use ‘dummy’ driver packages: empty folders with only a text file present stating the name of a hardware model, so that we could replace the contents of each folder with the actual driver contents that needed to be there. A least desirable option, since it requires manual operations, and wherever people are involved, mistakes are made. Could we have solved this any other way? Yes, by creating multiple USB products: stick 1 would support hardware from A to M and stick 2 hardware from N to Z. Also not very efficient.

Constraint 3

Task sequence XML files may not exceed a certain size in SCCM 2007, 2012, 2012 SP1 and 2012 R2; again, just a fact, nothing we can do about it. Does this also apply to MDT? Not as far as I know, though this is something that needs to be tested.

On the other hand, if you have task sequences that large, you might want to reconsider whether things can be consolidated or simplified by using task sequence variables or, if you can’t avoid certain steps, by scripting some additional actions in VBScript or PowerShell.

I think I have proven my point by now: sometimes you just need an open and clear view of the situation and have to ask yourself, “is this really the way to go?”

2. What release cycle do you want to maintain for your reference image

Aaahhh, point 2. This one can be a bit shorter (I hope).

Release cycles, release management: all large (mostly enterprise) companies have processes tied to the release side of their IT operation. Windows updates, general application updates, new releases of an operating system build, and so on. Chances are you need to conform to a release calendar, freeze windows, etc.

These things don’t have to be an obstacle. Imagine the release calendar offers quarterly or half-yearly windows to release an operating system update: there is enough time to develop and test the upcoming build, and also enough time to decide which content is worth putting into the reference image. Which brings me to point 3.

3. What content is worthwhile putting in your reference image

A picture says more than a thousand words, so here we go:

figure 1.1: OSD components stack


Building your image is-a like making a pizza (Super Mario accent).

Making a pizza is about having good dough, sauce, cheese and other delicious ingredients. When you have made the pizza to your taste, you put it in the oven, and when the pizza is finished you sprinkle some Italian herbs and other toppings over it, which finishes your pizza off.

In relation to my previous point: which content is worth putting into your reference image? Imagine you have decided to embed Adobe Reader into your image. As we all know, Adobe Reader, just like Adobe Flash, Adobe Shockwave, Microsoft Silverlight and Oracle Java, is a generic application with a rapid release cycle; almost every month an update for one of these applications is released. Would it be wise to put these applications in the image? Not immediately. But hey, if there is any reason why it should be in there, that’s a decision for you to make.

figure 1.2: OSD Concept Build and Deploy

Build & Deploy

So back to my pizza making: the same goes for creating a reference image. The Windows operating system forms the dough. The generic middleware applications everyone uses act as the sauce and cheese. The office suite is the mushrooms. Then the image goes into the oven for a little sysprep and capture 😉 and when the reference pizza is ready, you install a little Adobe Reader over it!

Never mind, I’m crazy about pizza! 😀

4. Copying raw data is much faster than installation time

In relation to points 2 and 3: sometimes it can be convenient to put certain data into your reference image, for example to decrease deployment time. Regardless of whether you install your clients over the network or via USB media, copying raw data is much faster than installing applications from a distribution point or USB media.

Imagine you have a generic business application with a huge front end, a tool that everyone uses in an enterprise environment. I’m just going to shout something crazy: “SAP GUI”, a massive application which, with a little bit of bad luck, exceeds 500 MB of disk space once installed. What do you think is faster: installing this application during the deployment of your target computers, so you can sit and wait until all DLLs and other files are registered and configured on the system, or embedding the application in the image? Exactamundo!!!

5. Microsoft Deployment Toolkit and System Center Configuration Manager –when it comes to Operating System Deployment– are tools for deployment, not the solution to World Peace!

Last but certainly not least, I think even my most important rule, my piece of advice: deployment tools are deployment tools, not the solution to world peace. I once did a project for a customer, building an operating system deployment solution, and he asked me:

“Hey Rens, can we also enable Remote Desktop with MDT?” I said: yes we can, I have the PoSh command right here. “OK, great.” Next up he asked me: “Hey Rens, can we also configure the firewall rule for Remote Desktop?” I reacted: yes we can, I have the PoSh command right here. “OK, great.” Next question: “Hey Rens, can you also set the wallpaper to pink?” I reacted: yes we can, I have the script right here!

Suddenly I was configuring all kinds of things in MDT that had nothing to do with operating system deployment. So I told the customer that these things would be better managed with Group Policy Objects.

The lesson to be learned from this is that you need to use tools for what they are intended, and never let yourself be tempted or lose sight of what you are doing, how it affects the big picture, and what impact it has on the situation, the environment, etc.

Using these tools is not only about having the technical skills; it’s also about using logical sense, gut feeling, intuition and your experience in the field of operating system deployment to make the right and necessary decisions and perform your work to the best of your abilities!

So next time you are stuck on a technical decision or issue, take one step backwards, step out of the technical zone, leave the OSD tunnel and try to look at it from a different point of view: a fresh perspective to do what’s right and make the right decision.

There is no right and no wrong, only what’s best and what’s better!

If you have any ‘best practices’ to share, please feel free to contribute in the comment section!

Thanks for reading 🙂

 

MDT – Unattended.xml, CustomSettings.ini, Task Sequence Variable, which setting takes precedence over which setting?


I sometimes see a question on social.technet that gets me thinking: this could be good material for a blog.

So it was with this one: a question about which specified values take precedence over one another.

As some of you may know, MDT works with, and can be provided with, certain values that are needed as input for an operating system deployment.

Normally certain values can and will be provided during deployment, for example when the MDT wizard pages are displayed to ask for specific information, like “Computer Details”, where the OSDComputerName of the machine can be entered.

figure 1.1: Deployment Wizard – Computer Details


This information can also be specified elsewhere, in three different ways:

  • Unattended.xml
  • CustomSettings.ini
  • Task Sequence Variable

The question however is, which setting takes precedence over which setting? Only one way to find out 🙂

So I have set up my MDT environment like this:

figure 1.2: Deployment Share Properties – Rules


As we can see here, the wizard property “SkipComputerName” is set to NO, which means the Computer Details wizard page will be shown. The value underneath, “OSDComputerName”, already reveals that if this value is specified in the CustomSettings.ini, it will be used as the designated name for this particular machine.
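In plain text, the relevant rules boil down to this (only the values discussed here are shown; the rest of my rules are omitted):

[Settings]
Priority=Default

[Default]
SkipComputerName=NO
OSDComputerName=OSBUILD-CSINI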

However, we can also specify the OSDComputerName property somewhere else, and what happens if it is specified two or three times?

Therefore I have created the following step in my deployment task sequence:

figure 1.3: Task Sequence Variables

osdcomputername_tsvar

a Task Sequence Variable step, which holds the variable name and its value. As we can see, the value is “OSBUILD-TSVAR”, while in my CustomSettings.ini the value for OSDComputerName is “OSBUILD-CSINI”.

Now when I start a deployment, I can see the following in my log file (which can be read perfectly with trace64.exe or trace32.exe, depending on the operating system architecture):

figure 1.4: BDD.log


In the log file we see that at first the value from the CustomSettings.ini is used, while a little later the task sequence variable value specified in the task sequence takes over.

Lastly there is the Unattended.xml, a file we haven’t discussed yet, but I think by now we all know the answer, since it’s already hidden (or visible 😉) in the log file:

figure 1.5: BDD.log


When we look in the log file of a running or completed deployment, we will find values like this everywhere: “Updated C:\MININT\Unattend.xml with BitsPerPel=32 (value was 32)”. This means that any value specified in the CustomSettings.ini or a task sequence variable, or settings stored in a database, will take precedence over a value specified in the Unattended.xml. Right after the deployment has started, the Gather step runs and collects information from all available ‘sources’: the CustomSettings.ini, task sequence variables, and the default Unattended.xml that resides in your “DeploymentShare\Control\TaskSequenceID” folder:

figure 1.6: Running Lite Touch Installation

 


So, in terms of priority:

  1. Unattended.xml takes precedence over nothing
  2. CustomSettings.ini takes precedence over Unattended.xml
  3. The database takes precedence over CustomSettings.ini and Unattended.xml
  4. A task sequence variable takes precedence over the database, CustomSettings.ini and Unattended.xml

And then there is also the possibility to influence values in the CustomSettings.ini with the “Priority” parameter, which I will discuss in a later blog.

I hope this can be an eye-opener for everyone working with these three types of files and configuration settings, and hopefully it will enable you to create a flexible and transparent way to deploy your operating system to multiple computers and different types of hardware. As long as you remember: the greatest common divisor of a setting should always be provided where its impact is smallest.

Meaning that there is no point in modifying the OSDComputerName variable every time you do a new computer deployment. If you are going to deploy 1,000 machines, leave the field empty. If you need to specify a workgroup for a particular task sequence or type of deployment, specify it in the CustomSettings.ini (or in the task sequence itself), whatever suits you best.

I think a nice goal to strive for is to adjust both your task sequence and your CustomSettings.ini as little as possible, and only as necessary.

Thanks for reading, and keep on automating, my padawan learners 😀