
OSD – Live from the field: my ‘Best Practices’ for Operating System Deployment

Published by Rens Hollanders

Working with technology often requires a certain focus, and that focus lies mostly on the technical part.

When working with tools such as Microsoft Deployment Toolkit or System Center Configuration Manager -the people working with such tools are mostly technical people- you are constantly building, scripting, testing and so on. What I have learned since I started working with Microsoft Deployment Toolkit almost 4 years ago, and with System Center Configuration Manager almost 3 years ago, is that sometimes you need to get out of your (technical) zone.

The zone I was working in meant trying to find solutions from a technical standpoint: solving everything with scripts, command lines, and so on.

This is why today I’m not writing a technical blog, but one that hopes to bring the logical and more sensible considerations to the table when it comes to Operating System Deployment. In the presentation I gave for my Microsoft Certified Trainer course, which was about Operating System Deployment, I collected some ‘best practices’ that I’d like to share, based on my own experiences.

For the TL;DR people among us, here are my ‘Best Practices’ in short:

  1. Does the current state of the company’s infrastructure play a role?
  2. What release cycle do you want to maintain for your reference image?
  3. What content is worth putting in your reference image?
  4. Copying raw data is much faster than installation time
  5. Microsoft Deployment Toolkit and System Center Configuration Manager -when it comes to Operating System Deployment- are tools for deployment, not the solution to World Peace!

1. Does the current state of the company’s infrastructure play a role?

At one of the projects where I was involved in creating an Operating System Deployment solution, the network capacity was a big constraint on using conventional solutions to distribute the product that was built. The company had an SCCM infrastructure for 22,000 clients, based on an SCCM topology with 1 central site, 3 primary sites (one for each large region: EMEA, APAC and AMAC) and approx. 120 distribution points all around the world.

Since we had to support 25 hardware models and Windows 7 Enterprise with 7 additional embedded Windows and Office language packs, the total size of the product exceeded 32 GB. Far more than what is supported in SCCM 2007 R2 SP2, which led us to a few challenges:

  • SCCM vs. MDT
  • Offline distribution instead of online distribution, to relieve the already strained network
  • Other limitations in SCCM

Since SCCM operates very differently from MDT when it comes to distributing content, and taking into account that in such a large organization you have to deal with other departments that use the same infrastructure and share the same SCCM environment, you might want to reconsider whether it is wise to build your reference image with SCCM!

For example, when we first built our reference image on the company’s infrastructure, on an SCCM and Hyper-V basis, it sometimes took over 14 hours to run the entire task sequence, and an additional 10 hours to capture the image (told you the network was a big constraint). So after some time I opted to build the reference image with MDT, since it has fewer dependencies on the production network and can be done in any environment, bringing the build and capture time back from 24 hours to 8 hours. Which is a big win.

We decided to use this approach going forward: build the reference image in MDT and then import it into SCCM for distribution. Which I believe is also the best practice Microsoft advises you to follow.

Then the next obstacle arose: distribution. The network wasn’t capable of staging 22,000 clients, so we had to use offline media, a functionality that is present in both MDT and SCCM. And where MDT has, as far as I know, no constraints, SCCM 2007 has the following:

Constraint 1

USB sticks need to be NTFS-formatted on a Windows 7/Server 2008 system, and then the media needs to be created on a Windows XP/Server 2003 console. Ridiculous, right? Try to create standalone media on a Windows 7/Server 2008 system and the SCCM 2007 console will format your USB stick right back to FAT32 during the creation of the standalone media, and FAT32 can’t handle (image) files larger than 4 GB!

Constraint 2

USB sticks used for standalone media with SCCM 2007 may not exceed 32 GB in size; SCCM simply can’t handle USB sticks larger than that. Just a fact, nothing you can do about it. Or is it? In this case, after consulting the stakeholders of the project, we opted to use ‘dummy’ driver packages: empty folders containing only a text file stating the name of a hardware model, so we could replace the contents of such a folder with the actual drivers that needed to be there. The least desirable option, since it requires manual operations, and wherever people are involved, mistakes are made. Could we have solved this any other way? Yes, by creating multiple USB products: stick 1 supporting hardware from A to M and stick 2 supporting hardware from N to Z. Also not very efficient.

Constraint 3

Task sequence XML files may not exceed a certain size in SCCM 2007, 2012, 2012 SP1 and 2012 R2. Again, just a fact, nothing we can do about it. Does this also apply to MDT? Not as far as I know.

This is something that needs to be tested. On the other hand, if you have task sequences that large, you might want to reconsider whether things can be consolidated or simplified by using task sequence variables, or, if you can’t avoid certain steps, by scripting some additional actions in VB or PoSh; see the sketch below.
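To give an idea of what I mean by that, here is a minimal PoSh sketch (the variable name MyAppTier is a made-up example; the Microsoft.SMS.TSEnvironment COM object is what both SCCM and MDT expose to scripts running inside a task sequence):

# Connect to the running task sequence environment (only available while a task sequence runs)
$tsenv = New-Object -ComObject Microsoft.SMS.TSEnvironment

# Read a built-in variable
$computerName = $tsenv.Value("OSDComputerName")

# Set a custom variable ("MyAppTier" is hypothetical) that later steps or groups can use
# in their conditions, so several near-identical steps collapse into one conditional group
$tsenv.Value("MyAppTier") = "Finance"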

I think I have made my point by now: sometimes you just need an open and clear view of the situation and ask yourself: “is this really the way to go?”

2. What release cycle do you want to maintain for your reference image?

Aaahhh, point 2. This one can be a bit shorter (I hope).

Release cycles, release management: all large (mostly Enterprise) companies have processes tied to the releases of their IT operation. Windows updates, general application updates, new releases of an operating system build, etc. Chances are you need to conform to a release calendar, freeze windows and so on.

These things don’t have to be an obstacle. Imagine a release calendar with quarterly or half-yearly windows for releasing an operating system update. That leaves enough time to develop and test the upcoming build, and also enough time to decide which content is worth putting into the reference image. Which brings me to point 3.

3. What content is worth putting in your reference image?

A picture says more than a thousand words, so here we go:

figure 1.1: OSD components stack

Pizza Time!

Building your image, isa like making a pizza (Super Mario accent)

Making a pizza is about having good dough, sauce, cheese and other delicious ingredients. When you have made the pizza to your taste, you put it in the oven, and when the pizza is finished, you sprinkle some Italian herbs and other toppings over it, which finishes your pizza off.

In relation to my previous point: which content is worth putting into your reference image? Imagine you have decided to embed Adobe Reader into your image. As we all know, Adobe Reader, just like Adobe Flash, Adobe Shockwave, Microsoft Silverlight and Oracle Java, is a generic application with a rapid release cycle: almost every month an update for one of these applications is released. Would it be wise to put these applications in the image? Not immediately -but hey, if there is any reason why it should be in there, that’s a decision for you to make.

figure 1.2: OSD Concept Build and Deploy

Build & Deploy

So back to my pizza making: the same goes for creating a reference image. The Windows operating system forms the dough. The generic middleware applications everyone uses act as the sauce and cheese. The office suite is the mushrooms, and then the image goes in the oven for a little sysprep and capture 😉 and when the reference pizza is ready, you install a little Adobe Reader on top!

Never mind, I’m crazy about pizza! 😀

4. Copying raw data is much faster than installation time

In relation to points 2 and 3: sometimes it can be convenient to put certain data into your reference image, for example to decrease deployment time. Regardless of whether you install your clients over the network or via USB media, copying raw data is much faster than installing applications from a distribution point or USB media.

Imagine you have a generic business application with a huge front-end, a tool that everyone uses in an Enterprise environment. I’m just gonna shout something crazy: “SAP GUI”, a massive application which, with a little bit of bad luck, takes up more than 500 MB of disk space when installed. What do you think is faster? Installing this application during the deployment of your target computers, so you can sit and wait until all DLLs and other files are registered and configured on the system, or embedding the application in the image? Exactamundo!!!

5. Microsoft Deployment Toolkit and System Center Configuration Manager –when it comes to Operating System Deployment– are tools for deployment, not the solution to World Peace!

Last but certainly not least -I think even my most important rule- a piece of advice: deployment tools are deployment tools, not the solution to World Peace. I once experienced this with a customer I was doing a project for, building an Operating System Deployment solution, and he asked me:

“Hey Rens, can we also enable Remote Desktop Protocol with MDT?” I said: yes we can, I have the PoSh command right here. “OK, great.” Next up he asked me: “Hey Rens, can we also configure the firewall rule for the Remote Desktop Protocol?” I reacted: yes we can, I have the PoSh command right here. “OK, great.” Next question: “Hey Rens, can you also set the wallpaper to pink?” I reacted: yes we can, I have the script right here!
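To be fair, the first two requests are perfectly legitimate deployment steps. A minimal PoSh sketch of the kind of commands I had in mind (run from an elevated script step during the task sequence; the exact commands used in that project may have differed):

# Enable Remote Desktop connections by clearing the deny flag in the registry
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\Terminal Server' -Name 'fDenyTSConnections' -Value 0

# Allow RDP through the Windows Firewall (this netsh syntax works on Windows 7 and later)
netsh advfirewall firewall set rule group="remote desktop" new enable=Yes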

Suddenly I was configuring all kinds of stuff in MDT that had nothing to do with Operating System Deployment. So I told the customer that these things could be better managed with Group Policy Objects.

The lesson to be learned from this is that you need to use tools for what they are intended, and never let yourself be tempted or lose sight of what you are doing, how much it affects the big picture, or what impact it has on the situation, environment, etc.

Using these tools is not only about having the technical skills; it’s also about using common sense, gut feeling, intuition and your experience in the field of Operating System Deployment to make the right and necessary decisions, and to perform your work to the best of your abilities!

So next time you are stuck with a technical decision or issue, take one step back, step out of the technical zone, leave the OSD tunnel and try to look at it from a different point of view, a fresh perspective, to do what’s right and make the right decision.

There is no right and no wrong, only what’s best and what’s better!

If you have any ‘best practices’ to share, please feel free to contribute in the comment section!

Thanks for reading 🙂

SCCM / MDT: identifying SSDs from your Task Sequence by Windows Performance Index!

Published by Rens Hollanders

Much has been written about detecting SSDs through WMI queries and scripts, in order to apply optimization settings for SSDs when deploying Windows 7/8 to an SSD disk.

And although there is no unique identifier to really determine whether we are dealing with an SSD or a traditional hard drive, I believe I may have found something that works better than all the other available options.

For example, there is the SSD Drive Script, which queries the ‘random read rate’; if that appears to be higher than 8 MB/s, it will mark your drive as an SSD.

Then there is also the WMI query wmic diskdrive get caption, which queries the disk Caption; sometimes a vendor has written ‘SSD’ into the Caption, but that is also too random.
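For reference, a PoSh equivalent of that query looks like this; it only helps when a vendor happens to mention ‘SSD’ in the caption:

# List each physical disk's caption and model; some vendors put "SSD" in there, many don't
Get-WmiObject -Class Win32_DiskDrive | Select-Object Caption, Model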

Other alternatives involve managing and detecting hard drive product names, but then human intervention is always required to keep the list up to date, because hard drives are renewed all the time.

Because the SSD Drive Script sometimes returns an incorrect value, I have found a better and more reliable way to determine the hard drive type and speed: the integrated Windows Performance Index that is present in Windows 7/8. This performance assessment generates a proper random read/write rate, whereas with the ‘SSD Script’ it looks like you have to generate that value yourself, with the disk idle at the time the script is run.

To initiate a Windows Performance Index rating, we need the machine to be rated during deployment or patching.

We do this by adding a ‘Run Command Line’ step with the following command:

C:\Windows\SysNative\WinSAT.exe formal

figure: winsat

Next, the Performance Index will rate the system performance and, when ready, will have the hard drive assessed:

figure: performance index

Obviously we can’t see the rating while a task sequence is executing and the Windows shell is locked, but if we ran a normal WinSAT benchmark, this would be the result. And I have noticed that the value for a benchmarked traditional hard drive will not exceed 6.9, while SSDs score 7.0 and higher.

Now we have something that we can use as a condition to either run or skip SSD optimization tasks in an SCCM / MDT task sequence, by using the following condition on a group of optimization actions:

figure: condition

Because the machine’s hardware has been assessed, a performance number has been assigned to the disk. If the disk is not rated, the value always comes back as zero. In this case we query whether the hard drive performance rating is greater than 6.9; if so, we execute our SSD optimization, and if it is smaller, we skip these steps.
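The rating is exposed through the Win32_WinSAT WMI class, so the condition boils down to a WMI query along these lines (a sketch of the idea; in the actual task sequence condition you only paste the WQL part into a ‘Query WMI’ rule):

# Returns a result (so the condition evaluates to true) only when the disk score exceeds 6.9
Get-WmiObject -Namespace 'root\cimv2' -Query "SELECT DiskScore FROM Win32_WinSAT WHERE DiskScore > 6.9"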

*Alternatively, you can use the same kind of query as a condition on the WinSAT benchmark step itself: if a machine is already rated, the benchmark will be skipped, and if it is not rated, the benchmark will be executed; see the sketch below.
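Since an unrated machine reports a disk score of zero, a sketch of that condition on the WinSAT step itself could look like this:

# Returns a result only when the machine has never been rated, so the WinSAT.exe
# step runs once and is skipped on machines that already have a rating
Get-WmiObject -Namespace 'root\cimv2' -Query "SELECT DiskScore FROM Win32_WinSAT WHERE DiskScore = 0"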

Hope this helps you in your quest to find a reliable condition by which SSDs can be detected, and I hope that in the near future other solutions will be handed to us.