Citrix Certified Administrator

Tuesday, 21. May 2013

Continuing workplace role changes have taken me away from the Microsoft System Center suite.  I'm now working with Citrix XenApp, a technology with which I had little administration or engineering experience prior to this change.

I recently attended Citrix Synergy 2013 in Anaheim, CA and had the opportunity to take a free exam.  Because of my recent experience planning and beginning to implement XenApp 6.5, I took the 1Y0-A20 exam which is titled “Citrix XenApp 6.5 Administration”.  The test wasn’t too difficult considering I’d built several XenApp farms while evaluating 6.5 for migration, and it shouldn’t be too difficult for most who’ve built a farm or two.


Square Foot Gardening

Sunday, 15. April 2012

I recently decided to get back into gardening. I had previously grown vegetables like tomatoes and peppers, as well as various herbs.

This past weekend, I created a 4′ x 12′ x 1′ raised garden bed. I decided to try the “Square Foot Gardening” method, which I had never tried before. This method calls for 6″ of loose, nutrient-rich soil and a raised bed divided into 1′ squares, with each square containing a unique crop. As my raised bed is 12″ deep, I used 6″ of semi-composted tree material to fill in the bottom half, and commercial “Mel’s Mix” for the top half.

My current garden contains:

  • Tomatoes (transplants)
  • Bell Peppers (transplants)
  • Jalapeño Peppers (transplants)
  • Red Chili Peppers (transplants)
  • Carrots (seeds)
  • Radishes (seeds)
  • Green Onions (seeds)
  • Leeks (seeds)
  • Zucchini (transplants)
  • Straightneck Squash (transplants)
  • Rhubarb (bulbs)
  • Garlic (bulbs)
  • Shallots (bulbs)
  • Basil (transplants)
  • Thyme (transplants)
  • Oregano (transplants)
  • Chives (transplants)
  • Cilantro (transplants)
  • Peppermint (transplants)

Some items will be grown in large plastic pots rather than the raised bed due to space constraints.

Initial progress pictures:



ConfigMgr 2012 RC2 – Primary Site is Read-Only

Tuesday, 7. February 2012

In the course of evaluating ConfigMgr 2012 in my lab, I ran into an issue where my Primary site would be stuck in read-only mode. This only occurred when using the Primary site along with a Central Administration Site; a standalone Primary site would not exhibit this issue.

Another blog explained my issue perfectly, but they had no solution (link). In short, I received the following message when launching the ConfigMgr console on the Primary site server:

Your Configuration Manager console is in read-only mode while this site completes tasks related to maintenance mode.

After these tasks are complete, you must reconnect your Configuration Manager console before you can edit or create new objects.



The cause for this issue is a lack of replication between the CAS and the Primary site. When you look in Monitoring -> Database Replication, you will probably see that the “Link is being configured”. Until the link is configured, and the necessary replication from the CAS to the Primary site has occurred, the Primary site will remain read-only.

The solution (at least in my case) was quite simple. Both sites were using unique instances on the same SQL server, and the SQL Server Service Broker (SSB) port was left at the default value of 4022 when installing each site. A conflict arises when both sites attempt to use the same port for the SSB. To avoid the issue, use a different (and unused!) port for each CAS/Primary site.
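If you want to confirm which SSB port each instance is actually listening on, you can query the sys.tcp_endpoints catalog view on each instance. Here's a minimal PowerShell sketch; the instance names are placeholders for your own CAS/Primary instances:

```powershell
# Query each SQL instance for its Service Broker endpoint port.
# Replace the instance names below with your own CAS/Primary instances.
$instances = "SQLSERVER\CASINST", "SQLSERVER\PRIINST"

foreach ($instance in $instances)
{
    $conn = New-Object System.Data.SqlClient.SqlConnection
    $conn.ConnectionString = "Server=$instance;Database=master;Integrated Security=True"
    $conn.Open()

    $cmd = $conn.CreateCommand()
    $cmd.CommandText = "SELECT name, port, state_desc FROM sys.tcp_endpoints WHERE type_desc = 'SERVICE_BROKER'"
    $reader = $cmd.ExecuteReader()

    while ($reader.Read())
    {
        Write-Host ("{0}: endpoint '{1}' on port {2} ({3})" -f $instance, $reader["name"], $reader["port"], $reader["state_desc"])
    }

    $conn.Close()
}
```

If both instances report the same port, you've found the conflict.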

If you’re already experiencing the issue, you may re-run the installer and change this:

  1. Run the installer from the local installation directory (for example: C:\Program Files\Microsoft Configuration Manager\bin\X64\SETUP.EXE).
  2. Choose Perform site maintenance or reset this Site from the Available Setup Options screen.
  3. Choose Modify SQL Server configuration from the Site Maintenance screen.
  4. Adjust the SSB Port setting on the Database Information screen and complete the wizard.

Once you complete the wizard, you will begin to see increased activity on the SQL server. It may take a few minutes, but you will notice the Database Replication status change as the replication setup progresses. You can also monitor the rcmctrl.log file on each site for activity noting that replication is progressing.
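If you want to watch rcmctrl.log scroll by in real time rather than re-opening it, Get-Content can follow the file. The path below assumes a default ConfigMgr installation directory; adjust it for your environment:

```powershell
# Follow rcmctrl.log as the replication configuration progresses.
# Path assumes a default ConfigMgr installation directory.
Get-Content "C:\Program Files\Microsoft Configuration Manager\Logs\rcmctrl.log" -Wait
```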

(Credit goes to an unknown member of the ConfigMgr team at Microsoft who was able to answer questions that I passed through another contact.)

My First *REAL* PowerShell Script

Friday, 13. August 2010

So I’ve done a few one-liners in PowerShell to more efficiently collect certain information about a server, folder, etc., but today I wrote what I consider to be my first real PowerShell script.  It should be no coincidence that today’s workday was the conclusion of a week of PowerShell training from Microsoft.

Anyway, the concept behind the script is simple.  I have a few network shares to which our deployment of Microsoft Configuration Manager sites perform their backups.  I wanted a quick and easy way to be assured that those backups were completing properly.  This script does the following:

  • Loop through each of the main SCCM backup folder shares (one at each major datacenter)
  • Loop through each site backup folder
  • Determine if folder was modified within the last 24 hours
  • Determine if the backup task’s log file was modified within the last 24 hours
  • If above conditions are true, open the log file and ensure that it says that the task completed with no errors
  • If above conditions aren’t true, note the directory of the failed backups.
  • Report backup status to the console.  If backups failed, report SCCM site code.
function Check-SCCMBackups
{
    # $arrBackupFolders contains a list of locations that contain SCCM backups.
    $arrBackupFolders = "\\server1\share1","\\server2\share2","\\server3\share3"
    $dateYesterday = (Get-Date).AddDays(-1)

    # Initialize variables.
    # Variable to hold the list of sites whose backups did not run.
    $strSiteBackupNotRun = ""
    # Variable to hold the list of sites whose backups ran but failed.
    $strSiteBackupFailed = ""
    # Initialized to 1; set to 0 if any backup task did not execute.
    $boolTasksDidRun = 1
    # Initialized to 1; set to 0 if any backup task did not complete successfully.
    $boolTasksRanSuccessfully = 1

    Write-Host "Checking status of SCCM backup folders... " -NoNewline

    # Loop through each value of $arrBackupFolders...
    ForEach ($strUNCPath in $arrBackupFolders)
    {
        # ... and get the directories it contains.
        $objBackupFolders = Get-ChildItem $strUNCPath | Where-Object { $_.PsIsContainer }

        # Loop through each discovered directory...
        ForEach ($objDirectory in $objBackupFolders)
        {
            # ... and check whether the directory was modified within the last day.
            if ($objDirectory.LastWriteTime -ge $dateYesterday)
            {
                # Task executed (we know because the folder write time was updated).
                # Open the log file and determine if the job completed successfully.
                $strLogFilePath = $strUNCPath + "\" + $objDirectory.Name + "\" + $objDirectory.Name + "Backup\smsbkup.log"

                # Check the smsbkup.log file modify time to ensure it was
                # modified within the last 24 hours.
                $objLogFile = Get-Item $strLogFilePath
                If (-not ($objLogFile.LastWriteTime -ge $dateYesterday))
                {
                    # The log file was NOT modified in the last 24 hours
                    # (something else must have written to the directory).
                    # Add the site code (directory name) to the list of sites that did not run and set the flag...
                    $strSiteBackupNotRun = $strSiteBackupNotRun + " " + $objDirectory.Name
                    $boolTasksDidRun = 0
                    # ... then stop processing this object and continue with the loop.
                    continue
                }

                # The line indicating success is four lines from the bottom of the file.
                $strLine = (Get-Content $strLogFilePath)[-4]

                # If the line doesn't match the criteria, the backup did not complete successfully.
                if ($strLine -notmatch "Backup task completed successfully with zero errors")
                {
                    # Task ran but did not complete successfully.
                    # Add the site code (directory name) to the list of failed backups...
                    $strSiteBackupFailed = $strSiteBackupFailed + " " + $objDirectory.Name
                    # ... and set the flag indicating that one or more tasks did not complete successfully.
                    $boolTasksRanSuccessfully = 0
                }
            }
            else
            {
                # Task did not run, as the directory modified date is not within the last day.
                # Add the site code (directory name) to the list of sites that did not run and set the flag.
                $strSiteBackupNotRun = $strSiteBackupNotRun + " " + $objDirectory.Name
                $boolTasksDidRun = 0
            }
        }
    }

    Write-Host "Complete!"

    # Check the status of the boolean flags and present the data.
    if ($boolTasksDidRun -eq 0 -or $boolTasksRanSuccessfully -eq 0)
    {
        # One or more tasks did not execute or failed during execution.
        if ($boolTasksDidRun -eq 0)
        {
            # One or more tasks did not execute; list the site codes (directory names).
            Write-Host "Backup tasks for the following sites did not run: $strSiteBackupNotRun" -ForegroundColor Red -BackgroundColor Black
        }
        if ($boolTasksRanSuccessfully -eq 0)
        {
            # One or more tasks did not complete successfully; list the site codes (directory names).
            Write-Host "Backup tasks for the following sites did not complete successfully: $strSiteBackupFailed" -ForegroundColor Red -BackgroundColor Black
        }
    }
    elseif ($boolTasksDidRun -eq 1 -and $boolTasksRanSuccessfully -eq 1)
    {
        # All tasks completed successfully.
        # (Directory modify dates are within 24 hours and all log files show the last task execution was successful.)
        Write-Host "All SCCM backup tasks executed and ran successfully." -ForegroundColor Green
    }
    Write-Host ""
}

PowerShell Training This Week!

Monday, 9. August 2010

My employer has set up Microsoft training this week. We have arranged for a trainer from Microsoft to come to our largest office and provide four days of training starting tomorrow.

Because of this, I have been digging through some PowerShell resources I had previously found to get ahead of the curve for this training. I’m listing those resources here as a reminder to myself and in the hopes that somebody else may find them useful.

VMware – Determining ownership of a virtual disk using PowerShell

Tuesday, 29. June 2010

So as you may have guessed, I work with VMware a bit in my job. On top of that, I’ve tried to start using PowerShell to automate repetitive tasks that I run into. (In fact, one of the main reasons this page exists is to give me a place to store these findings where I can find them later, and if someone else finds them useful, then great.)

Anyway, the problem I ran into today was that I had a VM on a datastore whose name did not match a VM in my vCenter inventory. How could I tell if these files were in use by a legitimate VM or just wasting space? I could right-click and ‘Edit Settings’ on hundreds of VMs… or I could use PowerShell.

So I started off by connecting to the vCenter server and getting an inventory of VMs:

Connect-VIServer servername
$VM = Get-VM

Then I grabbed a list of disks which matched my criteria:

$Disks = $VM | Get-HardDisk | Where {$_.FileName -like '*web*' }

Then I ran:

$Disks | Get-Member

and saw that there are Name, FileName, and ParentID properties. By doing:

$Disks | Select Name, FileName, ParentID

I now have the parent ID of the VM.

So how do I know which VM the parent ID field references?

$VM | Where {$_.Id -like 'parent id from previous select'}

… which returns ….

Name       PowerState   Num CPUs   Memory (MB)
----       ----------   --------   -----------
WebSrv1    PoweredOn           4          4096

Is there an easier way to do this? Probably, and I already have a couple of ideas. If I can get them working and cleaned up, I’ll post them here.
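One of those ideas, sketched here but not yet tested: PowerCLI hard disk objects expose a Parent property referencing the owning VM, so the whole lookup could collapse into a single pipeline (same example ‘web’ pattern as above):

```powershell
# Match disks by filename and report the owning VM directly,
# without manually cross-referencing ParentID values.
Get-VM | Get-HardDisk | Where {$_.FileName -like '*web*'} |
    Select @{Name='VM'; Expression={$_.Parent.Name}}, FileName
```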

SCCM: Determining collection refresh time using PowerShell

Monday, 21. June 2010

I recently had a need to examine the last refresh time of a large number of SCCM collections. We had a group of collections which are used to define maintenance windows for various servers, and we wanted to ensure that all these collections were updating regularly (and to fix the ones that weren’t).

Normally, I would take such a boring task and look for some way to automate it. I have a co-worker who evangelizes about the merits of PowerShell every chance he gets and we were able to put together a few lines to get the information I was looking for.

So obviously this requires that you have PowerShell installed — PowerShell v2, specifically. We used the SCCM PowerShell module located here, which appears to be the most “complete” unofficial PowerShell module I’ve found so far.

Once your PowerShell environment is configured, connecting to an SCCM server is as easy as:
$SCCMServer = Connect-SCCMServer servername

Next, you can do other things like:
# get all SCCM collections
$AllCollections = Get-SCCMCollection -SccmServer $SCCMServer

# show all collections not updated today
$AllCollections | where {$_.LastRefreshTime -notlike '20100621*'} | select Name
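The string match above works for a quick check, but LastRefreshTime is a WMI datetime string; if you need a real date comparison (for example, "not refreshed in the last day"), .NET can convert it. A sketch, assuming the module hands back the raw WMI value:

```powershell
# Convert the WMI datetime string to a DateTime so we can compare properly.
$stale = $AllCollections | Where {
    [System.Management.ManagementDateTimeConverter]::ToDateTime($_.LastRefreshTime) -lt (Get-Date).AddDays(-1)
} | Select Name
```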

Instead of getting a list of all collections, you could target only certain ones:
$a = Get-SCCMCollection -SccmServer $SCCMServer | where {$_.Name -like 'Test*'} | Select Name

The lines above were enough to save me tons of time I would otherwise have spent manually verifying each collection’s properties. There are plenty of other opportunities in this module, including the ability to gather all sorts of information about advertisements, collections, sites, and packages, and I plan to continue looking for opportunities where these tools can let me work more efficiently.

MCTS: System Center Configuration Manager

Saturday, 19. June 2010

I posted previously about attending a Microsoft authorized training course on Microsoft System Center Configuration Manager (officially shortened to ConfigMgr, but commonly referred to as SCCM). After that course, I had intended to take the exam, but because the product was not one I was involved with regularly (and some life stuff got in the way), I put it off.

This year my work responsibilities have shifted and I’m now managing my employer’s ConfigMgr environment. I’ve spent the time since getting back up to speed on the product, remembering those details from the class and learning others which you can only get through experience with a “real-world” deployment. Because I’ve been so deep in ConfigMgr lately, it only made sense to get the test out of the way, and the fact that Microsoft’s “Second Shot” program is nearing an end prompted me to start studying.

Overall, I would say the biggest study tool I used was my day-to-day work. Aside from that, I also used this book, which I have found to contain a good amount of information.  Microsoft also has information on its page which gives details about the exam, skills measured, and recommended training; that information can be found here.


VMware Certified Professional (vSphere 4)

Sunday, 15. November 2009

So I took advantage of VMware’s VCP upgrade offer and have successfully passed the VMware Certified Professional exam for vSphere 4.

Normally VMware requires that a candidate complete an authorized training course in order to attain VCP certification; however, through the end of the year they are allowing current VCPs on Virtual Infrastructure 3 to upgrade by successfully passing the vSphere 4 exam.  This is great for those of us who may not have the opportunity to attend another class on our employer’s time and money.


VMware – Storage Migration (part 2)

Monday, 8. June 2009

In a previous post, I described a VMware Storage Migration task I was assigned and a quick run-down of the capability of the Storage VMotion feature.  Now, I’ll go into more detail about an awesome little tool that could make the process much easier.

The tool is called ‘SVMotion’ and it is a plug-in to the Virtual Infrastructure client that you may already be using to administer your hosts and guest machines.  Once installed, it adds a ‘Migrate Storage’ option to the right-click menu of your VMs.  When you select it, you will be presented with a window showing a list of datastores and which ones the VM and its disks are on.

From here, you can drag the VM and its disks from their source datastore to the destination datastore.  Once you click OK, you’ll see a new ‘Relocate Virtual Machine Storage’ task that will show the migration’s progress.

There are a couple of things I’ve learned from my use of this plug-in:

     1.  You have to move the actual virtual machine files as part of the migration whether you want to or not.  This is a Storage VMotion requirement.  If you like where the virtual machine files reside, move them with the disk and then move them back afterward.

     2.  This item appears to be a quirk of the plug-in.  Say you have a VM with multiple disks but only want to move the virtual machine files and the first disk.  When you drag the VM to the destination datastore and click OK (having left the remaining disks in their original location), the plug-in attempts to move all VM files to the destination.  If you have the space, it will begin; if you don’t, it will immediately return an error.

Overall, this plug-in is very nice and the price is right (free!).  It’s certainly much more convenient to use the GUI to initiate the Storage VMotion whenever you can.
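If you’d rather skip the GUI entirely, the same relocation can be scripted with the VI Toolkit (PowerShell). A hypothetical sketch, where the VM and datastore names are placeholders for your own:

```powershell
# Storage-migrate a VM to another datastore via the VI Toolkit / PowerCLI.
# 'WebSrv1' and 'NewDatastore' are placeholder names.
Connect-VIServer vcenterserver
Get-VM WebSrv1 | Move-VM -Datastore (Get-Datastore NewDatastore)
```

This moves the VM's files along with its disks, which matches the Storage VMotion requirement noted in item 1 above.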