List All Domain Controllers and roles with PowerShell

New job, new infrastructure, and I needed a way to quickly find out what Active Directory Domain Controllers were present, where they were located, and what roles and functions they all performed.
So I wrote this simple script to do it.

$DCs = Get-ADDomainController -Filter *
$Results = New-Object -TypeName System.Collections.ArrayList
foreach($DC in $DCs){
    # One result object per DC; join the FSMO roles into a single space-separated string
    $ThisResult = [PSCustomObject]@{
        Name                   = $DC.Name
        Site                   = $DC.Site
        IPv4Address            = $DC.IPv4Address
        OperatingSystemVersion = $DC.OperatingSystemVersion
        IsGlobalCatalog        = $DC.IsGlobalCatalog
        IsReadOnly             = $DC.IsReadOnly
        OperationMasterRoles   = ($DC.OperationMasterRoles -join " ")
    }
    $Results.Add($ThisResult) | Out-Null
}
$Results = $Results | Sort-Object -Property Site
$Results | Format-Table -AutoSize

You could also export the $Results object to CSV via Export-CSV.
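For example (the output file name here is just an illustration):

```powershell
# Append this to the script above to save the same data to a CSV file
$Results | Export-Csv -Path .\DomainControllers.csv -NoTypeInformation
```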

Posted in PowerShell

Using the Windows Volume Shadow Copy Service (VSS)

Having just written an article about how to get items back from a volume shadow copy, I thought I should make some notes about how VSS works, how to configure it, and how to actually get VSS to create you some shadow copies! This is also useful because, on a file server, it enables users to recover their own accidentally deleted or overwritten files and folders via the Previous Versions feature.

If you’ve never come across VSS before, TechNet has this to say:

Shadow Copies of Shared Folders provides point-in-time copies of files that are located on shared resources, such as a file server. With Shadow Copies of Shared Folders, users can view shared files and folders as they existed at points of time in the past. Accessing previous versions of files, or shadow copies, is useful because users can:

  • Recover files that were accidentally deleted. If you accidentally delete a file, you can open a previous version and copy it to a safe location.
  • Recover from accidentally overwriting a file. If you accidentally overwrite a file, you can recover a previous version of the file.
  • Compare versions of a file while working. You can use previous versions when you want to check what has changed between two versions of a file.

It’s probably something you want, assuming you have a bit of space somewhere to hold them. Note that due to how they work, a shadow copy only uses the amount of space necessary to hold the changes made to the volume. When you view the shadow copy, you’ll see the complete volume as it was when the shadow copy was taken, but behind the scenes the space used by that shadow copy is not that of a complete copy of the volume at that point in time.

You can store your shadow copies on a different volume. This can be a good idea, as otherwise the free space on the data volume can seem to mysteriously vanish. However, the shadow copy volume may need to be as large as, or larger than, your data volume: it has to hold a copy of every changed data block from the data volume, and how much space that takes depends on the rate of change and on how often you create shadow copies. You might also want to read the article Designing a Shadow Copy Strategy, which is old but still relevant.

Configure VSS for a file server

I’m doing this on Windows Server 2008 R2, just because that’s what my test VM is running. This stuff works back as far as Server 2003 though (but of course you’re not still using that in 2016…).

My server has a 40GB D drive for data and I’ve added a 40GB V drive for VSS use. As this is a VM, I’ve done both of these as thin provisioned disks from VMware.

I’m running all the commands from an administrator command prompt, which vssadmin requires. I’m using Windows ISO files for the data. There is currently no shadow storage configured, and the drives have been freshly formatted:

C:\>vssadmin list shadowstorage
vssadmin 1.1 - Volume Shadow Copy Service administrative command-line tool
(C) Copyright 2001-2005 Microsoft Corp.

No items found that satisfy the query.

C:\>dir d: /a
 Volume in drive D is Data
 Volume Serial Number is 1863-3C01

 Directory of D:\

28/01/2016  12:29              $RECYCLE.BIN
28/01/2016  11:25              System Volume Information
               0 File(s)              0 bytes
               2 Dir(s)  42,851,102,720 bytes free

C:\>dir v: /a
 Volume in drive V is VSS
 Volume Serial Number is 7468-48AD

 Directory of V:\

28/01/2016  12:29              $RECYCLE.BIN
28/01/2016  11:24              System Volume Information
               0 File(s)              0 bytes
               2 Dir(s)  42,851,102,720 bytes free

So now we’ll configure V: to be the shadow storage for D:, and to use the entire V: drive for that purpose:

C:\>vssadmin add shadowstorage /for=d: /on=v: /maxsize=unbounded
vssadmin 1.1 - Volume Shadow Copy Service administrative command-line tool
(C) Copyright 2001-2005 Microsoft Corp.

Successfully added the shadow copy storage association

Now we’ll copy some data onto D:, and then check the space usage on D: and V: again:

C:\>dir D: /a
 Volume in drive D is Data
 Volume Serial Number is 1863-3C01

 Directory of D:\

28/01/2016  12:29              $RECYCLE.BIN
28/01/2016  11:25              System Volume Information
28/02/2011  12:41     3,181,234,176 Windows 7 Enterprise x64 English SP1 (X17-27625).iso
28/02/2011  12:38     2,433,157,120 Windows 7 Enterprise x86 English SP1 (X17-27617).iso
               2 File(s)  5,614,391,296 bytes
               2 Dir(s)  37,236,707,328 bytes free

C:\>dir v: /a
 Volume in drive V is VSS
 Volume Serial Number is 7468-48AD

 Directory of V:\

28/01/2016  12:29              $RECYCLE.BIN
28/01/2016  11:24              System Volume Information
               0 File(s)              0 bytes
               2 Dir(s)  42,851,102,720 bytes free

As expected, no space has been used on V: yet.
Now let’s create a shadow copy of D:

C:\>vssadmin create shadow /for=d:
vssadmin 1.1 - Volume Shadow Copy Service administrative command-line tool
(C) Copyright 2001-2005 Microsoft Corp.

Successfully created shadow copy for 'd:\'
    Shadow Copy ID: {bbec09d9-3ed2-40b0-943d-5b459976fb80}
    Shadow Copy Volume Name: \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1

And we can see more detail about the shadow copy:

C:\>vssadmin list shadows
vssadmin 1.1 - Volume Shadow Copy Service administrative command-line tool
(C) Copyright 2001-2005 Microsoft Corp.

Contents of shadow copy set ID: {2d629a37-10e0-4fb4-bf45-e0702de26f50}
   Contained 1 shadow copies at creation time: 28/01/2016 12:40:16
      Shadow Copy ID: {bbec09d9-3ed2-40b0-943d-5b459976fb80}
         Original Volume: (D:)\\?\Volume{cadd2f53-ba6b-11e5-93f2-005056a200aa}\
         Shadow Copy Volume: \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1
         Originating Machine: VSSTEST.rcmtech.co.uk
         Service Machine: VSSTEST.rcmtech.co.uk
         Provider: 'Microsoft Software Shadow Copy provider 1.0'
         Type: ClientAccessible
         Attributes: Persistent, Client-accessible, No auto release, No writers, Differential

Now the space usage on V: is:

C:\>dir V: /a
 Volume in drive V is VSS
 Volume Serial Number is 7468-48AD

 Directory of V:\

28/01/2016  12:29              $RECYCLE.BIN
28/01/2016  12:40              System Volume Information
               0 File(s)              0 bytes
               2 Dir(s)  40,737,103,872 bytes free

So the space has dropped by about 2GB. VSS stores its data inside the System Volume Information folder, which by default only the local SYSTEM account has access to. So that we can see what’s going on, I’ve given myself access to this folder – but you shouldn’t normally mess with it.

C:\>dir "v:\System Volume Information" /a
 Volume in drive V is VSS
 Volume Serial Number is 7468-48AD

 Directory of v:\System Volume Information

28/01/2016  12:40              .
28/01/2016  12:40              ..
28/01/2016  11:24            20,480 tracking.log
28/01/2016  12:40            65,536 {3808876b-c176-4e48-b7ae-04046e6cc752}
28/01/2016  12:40     2,113,929,216 {db3a43f3-c5b1-11e5-88bc-005056a200aa}{3808876b-c176-4e48-b7ae-04046e6cc752}
               3 File(s)  2,114,015,232 bytes
               2 Dir(s)  40,737,103,872 bytes free

Note how the timestamps of the two files with GUIDs as their names match the time the shadow copy was taken. Note however that the GUIDs don’t match those reported via the vssadmin command…!
So let’s add some more files to the data drive. Clearly the free space drops on D:, but there are no changes to V: as we’ve not changed any blocks covered by our shadow copy:

C:\>dir d: /a
 Volume in drive D is Data
 Volume Serial Number is 1863-3C01

 Directory of D:\

28/01/2016  12:29              $RECYCLE.BIN
28/01/2016  12:40              System Volume Information
28/02/2011  12:41     3,181,234,176 Windows 7 Enterprise x64 English SP1 (X17-27625).iso
28/02/2011  12:38     2,433,157,120 Windows 7 Enterprise x86 English SP1 (X17-27617).iso
29/04/2014  10:34     3,234,070,528 Win_Ent_8.1_32BIT_English-Custom.ISO
24/04/2014  14:47     4,274,061,312 Win_Ent_8.1_64BIT_English-Custom.ISO
               4 File(s) 13,122,523,136 bytes
               2 Dir(s)  29,728,509,952 bytes free

C:\>dir v: /a
 Volume in drive V is VSS
 Volume Serial Number is 7468-48AD

 Directory of V:\

28/01/2016  12:29              $RECYCLE.BIN
28/01/2016  12:40              System Volume Information
               0 File(s)              0 bytes
               2 Dir(s)  40,737,103,872 bytes free

Let’s see how that changes if we take another shadow copy:

C:\>vssadmin create shadow /for=d:
vssadmin 1.1 - Volume Shadow Copy Service administrative command-line tool
(C) Copyright 2001-2005 Microsoft Corp.

Successfully created shadow copy for 'd:\'
    Shadow Copy ID: {9f1f3eef-7e0d-4f55-ab72-3ed5ed2871cb}
    Shadow Copy Volume Name: \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy2

C:\>dir v: /a
 Volume in drive V is VSS
 Volume Serial Number is 7468-48AD

 Directory of V:\

28/01/2016  12:29              $RECYCLE.BIN
28/01/2016  12:40              System Volume Information
               0 File(s)              0 bytes
               2 Dir(s)  40,736,202,752 bytes free

No real change there, only a slight drop. What’s happened in the System Volume Information folder?

C:\>dir "v:\System Volume Information" /a
 Volume in drive V is VSS
 Volume Serial Number is 7468-48AD

 Directory of v:\System Volume Information

28/01/2016  12:52              .
28/01/2016  12:52              ..
28/01/2016  11:24            20,480 tracking.log
28/01/2016  12:40            65,536 {3808876b-c176-4e48-b7ae-04046e6cc752}
28/01/2016  12:52           901,120 {db3a43f3-c5b1-11e5-88bc-005056a200aa}{3808876b-c176-4e48-b7ae-04046e6cc752}
28/01/2016  12:52     2,113,929,216 {db3a4401-c5b1-11e5-88bc-005056a200aa}{3808876b-c176-4e48-b7ae-04046e6cc752}
               4 File(s)  2,114,916,352 bytes
               2 Dir(s)  40,736,202,752 bytes free

There’s a new small-ish file created when we took the last shadow copy, and the larger file has also been updated.
Now we’ll see how VSS handles some data drive changes. Let’s delete some of the files from D: and then copy some new ones onto it, and see how the free space on the two drives looks:

C:\>dir d: /a
 Volume in drive D is Data
 Volume Serial Number is 1863-3C01

 Directory of D:\

28/01/2016  12:29              $RECYCLE.BIN
13/08/2009  10:08     2,996,799,488 Server 2008 R2 x64 (X15-59754).iso
02/03/2011  14:41     3,166,720,000 Server 2008 R2 x64 SP1 (X17-22580).iso
28/01/2016  12:40              System Volume Information
29/04/2014  10:34     3,234,070,528 Win_Ent_8.1_32BIT_English-Custom.ISO
24/04/2014  14:47     4,274,061,312 Win_Ent_8.1_64BIT_English-Custom.ISO
               4 File(s) 13,671,651,328 bytes
               2 Dir(s)  29,179,383,808 bytes free

C:\>dir "v:\System Volume Information" /a
 Volume in drive V is VSS
 Volume Serial Number is 7468-48AD

 Directory of v:\System Volume Information

28/01/2016  12:52              .
28/01/2016  12:52              ..
28/01/2016  11:24            20,480 tracking.log
28/01/2016  12:40            65,536 {3808876b-c176-4e48-b7ae-04046e6cc752}
28/01/2016  12:52           901,120 {db3a43f3-c5b1-11e5-88bc-005056a200aa}{3808876b-c176-4e48-b7ae-04046e6cc752}
28/01/2016  12:59     3,523,215,360 {db3a4401-c5b1-11e5-88bc-005056a200aa}{3808876b-c176-4e48-b7ae-04046e6cc752}
               4 File(s)  3,524,202,496 bytes
               2 Dir(s)  39,326,916,608 bytes free

There were no changes when the two Windows 7 ISOs were deleted, but as you can see, the large file in V:\System Volume Information has grown quite a bit after the two new Windows Server 2008 R2 ISOs were added.
Whilst copying the new ISOs, I had Performance Monitor (perfmon) running to monitor disk reads and writes on D:, and disk writes on V:. The activity was interesting. As expected we have plenty of write activity to the data drive:
data disk writes

There’s also quite a bit of read activity on the data disk:
data disk reads

Which corresponds to write activity on the VSS disk:
vss disk writes

So we can actually “see” VSS reading the blocks about to be overwritten on the data drive and writing them onto the VSS drive to preserve them for the shadow copies.

In order to access the data in the shadow copies, you can go via the GUI – just right-click the drive or a folder or file and select Properties – Previous Versions. This also works remotely if the drive or folder is shared.

previous versions

Alternatively, you can create a symbolic link to the shadow copy.
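For example, to expose the first shadow copy from earlier at C:\VSS\1, from an administrator command prompt (the trailing backslash on the device path is required; remove the link afterwards with rd C:\VSS\1):

```
C:\>mklink /d C:\VSS\1 \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1\
```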

More about how VSS works

VSS operates at the block level, and uses a “copy on first write” principle to keep the data safe: when the blocks used on the data volume that are included in a shadow copy are about to be overwritten, it copies them to the shadow copy volume to enable the shadow copy of the data volume to remain intact.

Note that if you lose the data drive or the volume shadow copy drive, you will lose access to your shadow copies. Shadow copies are thus not a replacement for a proper backup regime, they instead complement it.

Also, if the data drive goes offline, but comes back online again, the GUI access to previous versions seems to break. The GUI just shows There are no previous versions available. The vssadmin command does still show the shadow copies though. To fix this, restart the Server service (aka lanmanserver).
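Restarting the Server service is one line from an elevated PowerShell prompt:

```powershell
# "Server" service = LanmanServer; -Force also restarts any running dependent services
Restart-Service -Name LanmanServer -Force
```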

VSS manages the space usage on the shadow copy volume by removing older shadow copies to make room for new ones as necessary. Note that many backup applications make use of VSS, and thus temporarily create shadow copies. These are then removed once the backup operation is finished, but can cause some/all of “your” shadow copies to be lost due to their temporary space usage. If the shadow copy volume is too small to hold a temporary shadow copy, the backup will fail.
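The space cap on an existing shadow storage association can be changed at any time with vssadmin resize shadowstorage; for example, to limit D:’s shadow storage on V: to 20GB (the figure is just an illustration):

```
C:\>vssadmin resize shadowstorage /for=d: /on=v: /maxsize=20GB
```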

Because VSS monitors changes at the block level (not the file level), disk defragmentation software can cause your shadow copies to be lost. VSS sees all the block changes that the defrag has to make, and thus might have to remove some or all of your shadow copies to keep track of those changes. Some defrag software, e.g. PerfectDisk, has a VSS compatibility setting to help with this. It is most problematic if your cluster size is less than 16KB, because 16KB is the size that VSS uses internally, leaving it unable to tell whether the defrag IO is different from normal IO.
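If you’re building a new data volume and know VSS will be used on it, formatting it with a cluster size of 16KB or larger sidesteps the defrag problem. Cluster size can only be set at format time, and formatting destroys the volume’s contents, so this is only an option before the data goes on, e.g.:

```
C:\>format D: /FS:NTFS /A:16K /Q
```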

Posted in Security, Storage, Windows

Restore malware-encrypted files from VSS snapshots

There have sadly been a few cases recently where a user has unwittingly run CryptoWall (or CryptoLocker, TeslaCrypt) on their PC and then encrypted a big chunk of one of my shared network drives. A nice quick way to get the files back is by using the VSS shadow copies of the volume.

This won’t usually work locally on your own PC, as one of the things these crypto malware variants tend to do is remove all shadow copies, but if your data is on a separate file server they can’t usually remove the shadow copies from that.

The first part is manual: PowerShell still doesn’t have any VSS integration as far as I can tell, and it’s not worth the hassle of parsing the output of vssadmin – it’s much easier to do this bit by hand.

I suggest you do all this remotely via PowerShell remoting. As such I’ll be prefixing some of the commands with cmd /c as they only exist in that environment (or PowerShell’s default aliases make them do different things to how they work in the regular command prompt).

Steps to recovery

  1. Enter-PSSession -ComputerName FS01 -Credential (Get-Credential -UserName campus\rc-admin -Message "Gimme")
    You may not need the -Credential bit, just depends if you’re running PowerShell as a user that has admin rights on the file server or not.
    Your PowerShell prompt prefix should change to show that you’re operating on a remote machine:
    [fs01]:
  2. vssadmin list shadows
    This will give you a list of all the shadow copies. You probably want the most recently created one from before your files got encrypted. Look for the “creation time” in the second line of each batch of text; the drive letter the shadow copy is for is shown in brackets in the “Original Volume” line. Make a note of the “Shadow Copy Volume” line too, as you need it in the next step but one.
    e.g.:
    Contents of shadow copy set ID: {a8c93c0b-a6c8-4ad9-ab3c-699b36f69915}
    Contained 1 shadow copies at creation time: 26/01/2016 07:00:16
    Shadow Copy ID: {8bc3ce53-831f-41e6-ad75-68384cb74bfb}
    Original Volume: (D:)\\?\Volume{0dac1ea6-f36f-4240-b4c6-a3e8f579ef44}\
    Shadow Copy Volume: \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy67
    Originating Machine: FS01.rcmtech.co.uk
    Service Machine: FS01.rcmtech.co.uk
    Provider: 'Microsoft Software Shadow Copy provider 1.0'
    Type: ClientAccessible
    Attributes: Persistent, Client-accessible, No auto release, No writers, Differential
  3. We’re going to create a symbolic link to the shadow copy, but first we’re going to create a folder to hold the link, to be tidy.
    md C:\VSS
  4. Now create the symbolic link, note that you need to add a backslash to the end of the Shadow Copy Volume path you copied earlier:
    cmd /c mklink /d C:\VSS\67 \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy67\
    Note that with PowerShell 5.0 (half-out at the time of writing) you can now create a symbolic link using New-Item -ItemType SymbolicLink.
  5. Now you can edit the variables and then run the script below.
  6. Once you’ve finished, remove the symbolic link to the VSS shadow copy:
    cmd /c rd C:\VSS\67

PowerShell Script

The script searches the path specified for any encrypted files, these are identified by their encrypted file extension. The last lot of these I had were “.micro”. If your malware variant uses a different random file extension for each file it encrypts, you’ll have a nice coding challenge on your hands!

Once it has identified the encrypted files, it goes through each of them in turn and generates what the original filename would have been, and calculates the path to the same file inside the volume shadow copy. It then copies the original file back, deletes the encrypted file, and once all that is done, runs through and deletes all the “how to get your files unencrypted” instruction files that the malware leaves all over the place.

Note that PowerShell cannot handle “long” filenames, i.e. paths over 260 characters (why, still???). If you have any of these you might want to investigate using RoboCopy, as it can deal with them and has been able to for years (which makes the lack of support in PowerShell even more annoying).
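As a sketch of the RoboCopy alternative for the bulk restore (the paths are illustrative, matching the example shadow copy link and share used in this article): because the encrypted files have a different name, copying from the shadow copy link with /E only restores the missing originals; you’d still need a separate pass to delete the encrypted copies and the ransom notes.

```
rem Preview with /L first to see what would be copied, then run again without it
C:\>robocopy "C:\VSS\67\SharedFolder\Documents\Guidance" "D:\SharedFolder\Documents\Guidance" /E /L
```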

# Before running this script:
# Use: vssadmin list shadows to find the latest unencrypted shadow copy - see the date & time they were created
# Record the Shadow Copy Volume, and use this to create a symbolic link:
# Create a folder to hold the symbolic link: md C:\VSS
# Then use: cmd /c mklink /d C:\VSS\67 \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy67\
# You need to add a trailing backslash to the Shadow Copy Volume name produced by vssadmin.
# Once done, remove the symbolic link by using: cmd /c rd C:\VSS\67

# This is the path on the file server that got encrypted:
$EncryptedPath = "D:\SharedFolder\Documents\Guidance\"
# This is the path to your shadow copy symbolic link:
$VSSPath = "C:\VSS\67\"
# File extension that the encrypted files have:
$Extension = ".micro"
# File name (minus extension) used for the "How to get your stuff unencrypted" files:
$RecoverFileFilter = "help_recover_instructions+ntl"

$FileList = Get-ChildItem -LiteralPath $EncryptedPath -Filter *$Extension -Recurse -Force
$TotalFiles = $FileList.Count
Write-Host ("Found "+$TotalFiles)
$Counter = 0
foreach($EncryptedFile in $FileList){
    $DestFileName = $EncryptedFile.FullName.Replace($Extension,"")
    $VSSFileName = $DestFileName.Replace("D:\",$VSSPath)
    try{
        # Use LiteralPath to prevent problems with paths containing special characters, e.g. square brackets
        Copy-Item -LiteralPath $VSSFileName -Destination $DestFileName -ErrorAction Stop
        Remove-Item -LiteralPath $EncryptedFile.FullName -Force
    }
    catch{
        $Error[0]
    }
    Write-Progress -Activity "Fixing" -Status $DestFileName -PercentComplete ($Counter/$TotalFiles*100)
    $Counter++
}
Write-Progress -Activity "Fixing" -Completed
Write-Host "Done recovering files. Now cleaning up."

$RecoveryFileList = Get-ChildItem -LiteralPath $EncryptedPath -Filter *$RecoverFileFilter* -Recurse
foreach($RecoveryFile in $RecoveryFileList){
    try{
        Remove-Item -LiteralPath $RecoveryFile.FullName -Force -ErrorAction Stop
    }
    catch{
        $Error[0]
    }
}

Have fun with it! ;-)

Posted in PowerShell

PowerShell script to get VMs and their datastores from an offline VMware ESXi host via vCenter

I recently had the local storage controller fail in one of my ESXi hosts. The host carried on running, and the VMs on it carried on running, but it wasn’t very happy. It was unresponsive via the vSphere client, and there was a good chance that it wouldn’t boot up once it was shutdown. I wasn’t able to vMotion the VMs off it – they’d get part way through and then fail after a while.

My recovery plan was as follows: Shut down all the VMs running on the failing host, remove the host (and thus all its VMs) from vSphere, browse to the VMs on their various datastores and re-add them onto the remaining hosts by double-clicking their .vmx files.

Which was all fairly straightforward except that I still have a LOT of legacy SAN datastores, and the VMs could be on any of them. So I wrote this script to interrogate vCenter and tell me where the VMs live on a particular host. This script does not talk to the host, so will work even if it is powered off. You will need to have VMware PowerCLI installed.

Connect-VIServer -Server vcenter.rcmtech.co.uk
$VMs = Get-VM
# Note that the hostname here needs to be written exactly as the host appears in the vSphere client
$VMHost = Get-VMHost -Name deadhost.rcmtech.co.uk
foreach($VM in $VMs){
    if($VM.VMHost.Name -eq $VMHost.Name){
        # Output the VM name and its hard disks - the disk filenames include the datastore
        $VM.Name,$VM.HardDisks
    }
}

Simple but effective – albeit slow, but PowerCLI seems to be painfully slow anyway.

Posted in PowerShell, vSphere

Test your web access with PowerShell

I had a few issues yesterday with intermittent access to the internet. Some sites worked, others didn’t. The ones that didn’t work weren’t resolving via DNS, and the ones that did work didn’t always display correctly (presumably due to bits of them being sourced from other DNS domains that weren’t resolving).

I wanted a way of checking when the problem had been fixed so wrote this PowerShell script.

In the process of doing this, I wrote a fairly simple way of displaying a table where the text in one column is shown in a different colour.

The script takes its input from a hashtable consisting of a URL to a page/site, and some text to look for on the page that would be displayed if everything is working normally. The script does three tests:

  1. Does the site resolve via DNS?
  2. Did the page load at all?
  3. Was the expected text found on the page?

I’ve not included a ping test because some web servers/VIPs don’t allow pinging, and in any case the result of this test would mostly be the same as “Did the page load at all”. If you want to add that in though it’d be pretty easy. Also, if the DNS test fails the script does not blindly plough on and try and retrieve the web page anyway, so it is moderately efficient.

The downside to using a hashtable is that you can’t have duplicate “names” in there, so if you tried to test two different pages from the same DNS domain it’d fail when building the results hashtable. I decided to use hashtables because I don’t use them very much and wanted to play with them, and because I knew the limitation wouldn’t be a problem based on what I want to test.
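The duplicate-key behaviour is easy to demonstrate in isolation (a minimal sketch with a made-up key):

```powershell
$Results = @{}
$Results.Add("www.example.com", "OK")
# Calling Add again with the same key throws an ArgumentException -
# which is why testing two pages from the same DNS domain would break the script
$Results.Add("www.example.com", "Page Load Fail")
```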

The output looks something like this:
InternetTester

I’ve added some comments so you can work out what’s going on, but it’s pretty straightforward. Here’s the script:

# Create hashtable for sites to be tested
$Sites = @{}

# Stop progress bar appearing during Invoke-WebRequest
$ProgressPreference = "SilentlyContinue"

# Add sites hashtable
# Format is: <URL to be retrieved>,<text to find on that page>
$Sites.Add("http://windows.microsoft.com/en-us/windows-10/upgrade-to-windows-10-faq","Upgrade to Windows 10: FAQ - Windows Help")
$Sites.Add("http://www.google.co.uk/","I'm Feeling Lucky")
$Sites.Add("http://www.bbc.co.uk/news/education-35043243","Universities suffer cyber-attack")
$Sites.Add("http://www.twitter.com/","Welcome to Twitter")
$Sites.Add("http://www.itv.com/","The ITV Hub - The home of ITV")
$Sites.Add("https://netsight.ja.net/Public/HomePage.aspx","Welcome to Netsight")

# Loop forever (Press Ctrl-C to stop the script)
while ($true)
{
    $Results = @{}
    $MaxSiteLength = 0

    foreach($Site in $Sites.GetEnumerator()){
        $x = $Site.Name -match "//.*?/" # regex with "lazy" wildcard to get DNS domain name from URL
        $SiteName = $matches[0] -replace "/",""
        
        # Find the longest site name so results can be displayed neatly in a table
        if($SiteName.Length -gt $MaxSiteLength){
            $MaxSiteLength = $SiteName.Length
        }

        # Clear the variable used to hold the page content
        # If Invoke-WebRequest fails this would otherwise contain the previously tested page
        $Page = $null

        # Check the DNS is resolving correctly
        try{
            $x = Resolve-DnsName -Name $SiteName -DnsOnly -NoHostsFile -QuickTimeout -ErrorAction Stop
            try{
                # DNS is OK, fetch the page
                $Page = Invoke-WebRequest -Uri $Site.Name -TimeoutSec 1 -ErrorAction Stop
                if($Page.Content -like ("*"+$Site.Value+"*")){
                    # Everything is good
                    $Results.Add($SiteName,"OK")
                }else{
                    # The page loaded but the text wasn't found
                    $Results.Add($SiteName,"Page Text Warning")
                }   
            }
            catch{
                $Results.Add($SiteName,"Page Load Fail")
            }
        }
        catch{
            $Results.Add($SiteName,"DNS Fail")
        }
    }

    # Display the results
    Clear-Host
    # Write the table header
    Write-Host ("Site".PadRight($MaxSiteLength+2)+"Result")
    # Sort the sites into alphabetical order - hashtables display randomly otherwise
    foreach($Result in ($Results.GetEnumerator() | Sort-Object -Property Name)){
        # Look for keywords in the result text and set the colour variable appropriately
        switch -Wildcard ($Result.Value)
        {
            "*ok*" {$ForegroundColour = "Green"}
            "*warning*" {$ForegroundColour = "Yellow"}
            "*fail*" {$ForegroundColour = "Red"}
            Default {$ForegroundColour = "Gray"}
        }
        # Display the site name, padded to ensure the results line up neatly
        Write-Host $Result.Name.PadRight($MaxSiteLength+2) -NoNewline
        # Display the result in colour
        Write-Host $Result.Value -ForegroundColor $ForegroundColour
    }
    Get-Date -Format s
    Start-Sleep -Seconds 15
}
Posted in PowerShell

Analyse AppLocker Logs for Exceptions

If you’re planning to roll out AppLocker you might want to run it in Audit mode first, to see where things are being run from.

You might want to store those logs centrally, see my previous post for how to get distributed Windows Event Logs into SQL Server.

So now you’ve got a table full of paths to executables, and you need to process it to produce a list of exceptions (things which would be blocked), remove any duplicates, and collate the data across all users.

Conveniently, I have written a PowerShell script to do just this:

$SQLServer = "sql-event01.rcmtech.co.uk"
$SQLDB = "EventCollection"
$SQLTable = "MicrosoftWindowsAppLocker_EXEandDLL"
# Open connection to SQL DB
$SQLDBConnection = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList "Server=$SQLServer;Database=$SQLDB;Integrated Security=SSPI"
$SQLDBConnection.Open()
# Get data
$SQLCommand = $SQLDBConnection.CreateCommand()
$SQLSelect = "SELECT MIN(TimeCreated) TimeCreated, MIN(MachineName) MachineName, MIN(UserId) UserId, MIN(Id) Id, Message
    FROM [$SQLDB].[dbo].[$SQLTable]
    WHERE (Message NOT LIKE '%%programfiles%') AND (Message NOT LIKE '%%system32%') AND (Message NOT LIKE '%%windir%')
    GROUP BY Message"
$SQLCommand.CommandText = ($SQLSelect)
$SQLReader = $SQLCommand.ExecuteReader()
$SQLResultsTable = New-Object System.Data.DataTable
$SQLResultsTable.Load($SQLReader)
$SQLReader.Close()
$SQLDBConnection.Close()
Write-Host ("SQL query returned "+($SQLResultsTable | Measure-Object -Line).Lines+" results")
#$SQLResultsTable | Export-Csv -Path $OutFile -NoTypeInformation -Force

$Results = New-Object -TypeName System.Collections.ArrayList
foreach($Line in $SQLResultsTable){
    $Location = $Line.Message -replace "USERS\\.*?\\","USERS\xxx\"
    $Location = $Location -replace "USERS\d\$\\.*?\\","USERSx$\xxx\"
    $Location = $Location -replace "was allowed.*",""
    $Results.Add($Location) | Out-Null
}
$Results = $Results | Select-Object -Unique
$Results = $Results | Sort-Object
Write-Host ("After generalisation there are now "+$Results.Count+" results")
$Results | Out-GridView -OutputMode Multiple -Title "Unique Results" | ConvertTo-Csv -NoTypeInformation | clip
Write-Host "Any selected entries have been placed on the clipboard" -ForegroundColor Gray

So what am I doing?

  • Open a connection to the SQL Server, you’ll need to change the variables at the top as appropriate
  • Run a SELECT query that removes duplicate Message entries whilst retaining the first instance of the other columns that go with the remaining Message. Those columns are actually not displayed in the final script output, but it was a fun learning process to work out how to do this, so I left it in!
  • The SELECT query also excludes any paths that are part of the default AppLocker Executable rules:
    • All files located in the Program Files folder
    • All files located in the Windows folder
  • The results of this are then processed by stripping out usernames from paths such as %OSDRIVE%\USERS\a-person\APPDATA
  • I then strip out usernames from users’ personal network drives, which in my case are paths such as \\fileserver\users1$\a-person\personal. This is done with a simple regular expression, as are the next few bits of string replacement.
  • I then strip out all the text from the end, starting “was allowed”. This is needed because the third default AppLocker rule is to allow members of the local administrators group to run anything from anywhere, and some of my users are currently local administrators on their PCs. Thus you can get the same path being reported as both “was allowed to run.” and “was allowed to run but would have been prevented from running if the AppLocker policy were enforced.”.
  • Each processed message line is then placed into a results array.
  • Duplicates are then removed from the array and it is sorted, leaving only unique paths
  • Finally, these are displayed in a GridView.
  • Any selected lines are placed onto the clipboard if you click OK.
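
The three string replacements can be tried in isolation against a couple of made-up messages, to see the generalisation in action (the sample paths, usernames and server name below are hypothetical):

```powershell
# Sample AppLocker-style messages to demonstrate the generalisation
$Samples = @(
    '%OSDRIVE%\USERS\a-person\APPDATA\LOCAL\SOMEAPP\SOMEAPP.EXE was allowed to run.'
    '\\fileserver\users1$\b-person\personal\app.exe was allowed to run but would have been prevented from running if the AppLocker policy were enforced.'
)
foreach($Message in $Samples){
    $Location = $Message -replace "USERS\\.*?\\","USERS\xxx\"
    $Location = $Location -replace "USERS\d\$\\.*?\\","USERSx$\xxx\"
    $Location = $Location -replace "was allowed.*",""
    $Location
}
```

Both messages come out with the username replaced by xxx and the trailing “was allowed…” text removed, so the same path reported for different users collapses into a single entry.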

This seems to work pretty well. My SQL table has around 75,000 rows in it, and I end up with 191 unique “problem” paths being displayed by this script.

Posted in PowerShell, Security, Windows | Tagged , , , , , , , , | Leave a comment

Send Windows Event Logs to SQL Database

I’m currently in the process of planning for an AppLocker rollout to all my PCs (about 7,500 of them) due to an increasing amount of malware. You should probably be doing this too. Anyway, a sensible first step is to identify which paths things are running from, which is pretty easy – you just turn AppLocker on in Audit mode. This makes it write messages into its event log telling you what has been allowed to run, and what would be blocked from running were it in enforce mode rather than audit mode.

You now have your PCs collecting all this useful info in their event logs. Now you need to collate it centrally and process it. “Aha”, I thought, “this will be a great time to try out using the built-in Windows Event Forwarding”. I followed some instructions, and it worked fine on my Windows 8.1 PC, and also on a colleague’s Windows 10 PC. Sadly, it failed on the other 7,498 Windows 7 PCs. After a few days of trying to get it to work on them, I gave up and wrote my own version in PowerShell.

I think my version is better, because it allows you to query events in SQL, which is easier for me than trying to extract sensible information directly out of an event log – especially if the info you’re after is in the Message field. You could of course modify the script and the SQL table to collect whatever fields you want. This is also not restricted to the AppLocker event log; you can collect events from any Windows Event Log.

This is going to be a big post with multiple sections, sorry about that, but it is pretty straightforward.

Overview

  • You need an SQL server to store the events that you’re going to collect from the PCs. Maybe use SQL Server 2014 Express – which is free.
  • The PCs push the events to the SQL server using an SQL bulk copy, which is pretty efficient.
  • The event collection script is written in PowerShell; your PCs should have at least PowerShell 3 and .Net 3.5, ideally PowerShell 4 and .Net 4.5 (at the time of writing).
  • The collection script is launched via a scheduled task, runs as the Network Service account, does not show on the user’s desktop whilst running, and only requires the Domain Computers group to have access to the SQL database. I’ve configured the scheduled task via a Group Policy Preference.
  • The script writes a registry marker when it runs, and on subsequent runs only uploads events that have occurred since its last run. This means you can run it as often as you like and not get duplicate events in your SQL table.
  • The script takes two parameters, the event log name to collect from, and the SQL server to send the events to.

PowerShell Script

param(
    [parameter(Mandatory=$true)][string]$LogName,
    [parameter(Mandatory=$true)][string]$SQLServer
)

# Check event log for events written since this script was last run
# or all events if this is the first run of the script
# and then upload them to SQL Server efficiently

# Create a simplified version of the log name for use elsewhere in the script
$LogNameSimplified = $LogName.Replace("/","_")
$LogNameSimplified = $LogNameSimplified.Replace(" ","")
$LogNameSimplified = $LogNameSimplified.Replace("-","")
Write-Host "SQL table name: $LogNameSimplified"

# Registry key to store last run date & time
$RegKey = "HKCU:\Software\RCMTech\EventCollector"
# SQL Database that holds the table for the events
$SQLDatabase = "EventCollection"

function Get-UserFromSID ($SID){
    # Does what it says on the tin
    $SIDObject = New-Object -TypeName System.Security.Principal.SecurityIdentifier($SID)
    $User = $SIDObject.Translate([System.Security.Principal.NTAccount])
    $User.Value
}

# Initialise LastRun variable, make it old enough that all events will be collected on first run
# Always use ISO 8601 format
[datetime]$LastRunExeDll = "1977-01-01T00:00:00"

if(Test-Path $RegKey){
    # Registry key exists, check LastRun value
    $LastRunValue = (Get-ItemProperty -Path $RegKey -Name $LogNameSimplified -ErrorAction SilentlyContinue).$LogNameSimplified
    if($LastRunValue -ne $null){
        $LastRunExeDll = $LastRunValue
    }
}else{
    # Registry key does not exist, create it; all events will be collected on this first run
    Write-Host "Registry key not present"
    New-Item -Path $RegKey -Force | Out-Null
}

# Get the events logged since LastRun date & time
Write-Host ("Collecting events from "+(Get-Date -Date $LastRunExeDll -Format s))
$Events = Get-WinEvent -FilterHashtable @{logname=$LogName; starttime=$LastRunExeDll} -ErrorAction SilentlyContinue
Write-Host ("Found "+$Events.Count+" events")

if($Events.Count -gt 0){
    # Process event data into a DataTable ready for upload to SQL Server
    # Create DataTable
    $DataTable = New-Object System.Data.DataTable
    $DataTable.TableName = $LogNameSimplified
    # Define Columns
    $Column1 = New-Object system.Data.DataColumn TimeCreated,([datetime])
    $Column2 = New-Object system.Data.DataColumn MachineName,([string])
    $Column3 = New-Object system.Data.DataColumn UserId,([string])
    $Column4 = New-Object system.Data.DataColumn Id,([int])
    $Column5 = New-Object system.Data.DataColumn Message,([string])
    # Add the Columns
    $DataTable.Columns.Add($Column1)
    $DataTable.Columns.Add($Column2)
    $DataTable.Columns.Add($Column3)
    $DataTable.Columns.Add($Column4)
    $DataTable.Columns.Add($Column5)
    # Add event data to DataTable
    foreach($Event in $Events){
        $Row = $DataTable.NewRow()
        $Row.TimeCreated = $Event.TimeCreated 
        $Row.MachineName = $Event.MachineName
        $Row.UserId = Get-UserFromSID -SID $Event.UserId
        $Row.Id = $Event.Id
        $Row.Message = $Event.Message
        $DataTable.Rows.Add($Row)
    }

    # Bulk copy the data into SQL Server
    try{
        $SQLConnection = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList "Data Source=$SQLServer;Integrated Security=SSPI;Database=$SQLDatabase"
        $SQLConnection.Open()
        $SQLBulkCopy = New-Object -TypeName System.Data.SqlClient.SqlBulkCopy -ArgumentList $SQLConnection
        $SQLBulkCopy.DestinationTableName = "dbo.$LogNameSimplified"
        $SQLBulkCopy.BulkCopyTimeout = 60
        $SQLBulkCopy.WriteToServer($DataTable)
        # Create/update the LastRun value - assuming all the above has worked - in ISO 8601 format
        New-ItemProperty -Path $RegKey -Name $LogNameSimplified -Value (Get-Date -Format s) -Force | Out-Null
        Write-Host "Data uploaded to SQL Server"
    }
    catch{
        Write-Host "Problem uploading data to SQL Server"
        Write-Error $error[0]
    }
}

A few points to note on the script:

  • Pass it the name of the event log to collect events from, e.g. Microsoft-Windows-AppLocker/EXE and DLL
  • If the upload to SQL Server fails, the timestamp marker is not written to the registry, and thus the script will try to upload the same events again on its next run, i.e. you will not be missing events in the SQL table if the SQL server is unavailable when the script runs.
  • The log name is simplified to a form that SQL Server is happy with by removing spaces and hyphens, and converting forward slashes to underscores.
  • The script assumes the database name of EventCollection
  • The script requires you to create a table within this database for each log that you want to collect from, the table name needs to be the simplified version of the log name passed to the script, e.g.
    Microsoft-Windows-AppLocker/EXE and DLL becomes
    MicrosoftWindowsAppLocker_EXEandDLL
  • See below for SQL script to create the database and table
  • ALWAYS use ISO 8601 format datetimes with PowerShell! (especially if you live outside the USA)
  • I have created a GP Pref to copy the script onto the C drive of all my PCs
  • Storing the event data in a DataTable is a bit more fiddly than using a simple array of objects, but it makes the bulk copy into SQL Server much easier – you just dump the whole thing across.
  • The UserId returned from the event log is in the form of a SID, which is not terribly useful to me, so I wrote the Get-UserFromSID function to change this into a username.

SQL Database and Table Creation

Here’s the SQL Server code, paste into SQL Management Studio and run it.

USE [master]
GO

CREATE DATABASE [EventCollection]
 CONTAINMENT = NONE
 ON  PRIMARY 
( NAME = N'EventCollection', FILENAME = N'D:\EventCollection.mdf' , SIZE = 5120KB , MAXSIZE = UNLIMITED, FILEGROWTH = 1024KB )
 LOG ON 
( NAME = N'EventCollection_log', FILENAME = N'L:\EventCollection_log.ldf' , FILEGROWTH = 10%)
GO

ALTER DATABASE [EventCollection] SET COMPATIBILITY_LEVEL = 110
GO

IF (1 = FULLTEXTSERVICEPROPERTY('IsFullTextInstalled'))
begin
EXEC [EventCollection].[dbo].[sp_fulltext_database] @action = 'enable'
end
GO

USE [EventCollection]
GO

CREATE TABLE [dbo].[MicrosoftWindowsAppLocker_EXEandDLL](
	[TimeCreated] [datetime] NULL,
	[MachineName] [varchar](50) NULL,
	[UserId] [varchar](50) NULL,
	[Id] [int] NULL,
	[Message] [varchar](500) NULL
) ON [PRIMARY]

GO

You also need to grant the Domain Computers group permission to bulk copy into the table.

From SQL Management Studio:

  1. Go to Security – Logins
  2. Create a new Login for the Domain Computers group, on the User Mapping page tick the EventCollection database.
  3. Go to Databases – EventCollection – Security – Users
  4. Double-click the Domain Computers group, go to the Securables page
  5. Click Search… – All objects of the types… – OK
  6. Tick Tables – OK
  7. Ensure the table is selected in the Securables section, then in the Explicit permission tab tick:
    1. Insert: Grant
    2. Select: Grant
  8. Click OK.
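
If you prefer to script the permissions rather than click through the GUI, the equivalent T-SQL is roughly as follows (substitute your own domain; RCMTech is used as an example here, as elsewhere in this post):

```sql
USE [master]
GO
CREATE LOGIN [RCMTECH\Domain Computers] FROM WINDOWS
GO
USE [EventCollection]
GO
CREATE USER [RCMTECH\Domain Computers] FOR LOGIN [RCMTECH\Domain Computers]
GO
GRANT INSERT, SELECT ON [dbo].[MicrosoftWindowsAppLocker_EXEandDLL] TO [RCMTECH\Domain Computers]
GO
```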

Note that I am not a SQL Server expert, so whilst this all works it may be missing many optimisations. I believe SQL bulk copies do not cause much transaction log activity, but you could always set your database to simple recovery mode anyway.
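
Should you want simple recovery mode, that too is a one-liner in SQL Management Studio:

```sql
ALTER DATABASE [EventCollection] SET RECOVERY SIMPLE
GO
```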

Scheduled Task Configuration

I’ve configured this via a Group Policy Preference. Here’s what I did.

  1. Create a new GPO, or edit an existing one. In the Group Policy Management Editor go to Computer Configuration, Preferences, Control Panel Settings, Scheduled Tasks.
  2. Right-click, New – Scheduled Task (At least Windows 7). Leave the settings at their defaults except as detailed below.
  3. General tab
    1. Action: Replace
    2. Name: Collect AppLocker Events EXE DLL
    3. When running the task, use the following user account: NT AUTHORITY\Network Service
  4. Triggers tab – you can use any trigger you like, personally I’m doing it once a day based on time and day of the week
    1. Click New…
    2. Begin the task: On a schedule
    3. Settings: Weekly
    4. Start: <today’s date> 15:00:00
    5. Recur every: 1 weeks on: Monday Tuesday Wednesday Thursday Friday
    6. Delay task for up to (random delay): 1 hour (stops your SQL server being overwhelmed with all the collections happening at once)
    7. Stop task if it runs longer than: 30 minutes (this is just a safety net in case the script errors badly/hangs)
    8. Enabled needs to be ticked
  5. Actions tab
    1. Click New…
    2. Action: Start a program
    3. Program/script: %WindowsDir%\System32\WindowsPowerShell\v1.0\powershell.exe (note that GPPrefs use their own “environment variables”, hence %WindowsDir% and not %WinDir%. Hit F3 to view & insert GPPref variables)
    4. Add arguments(optional): -ExecutionPolicy Bypass -File “C:\Program Files\RCMTech\CollectEvents.ps1” (see my note earlier about using a GPPref to copy the script locally onto the PCs)
  6. Settings tab
    1. Allow task to be run on demand: ticked (useful for testing, and why not anyway)
    2. Run task as soon as possible after a scheduled start is missed: ticked (in case the PC is switched off when the task is scheduled to run)
  7. Common tab
    1. Remove this item when it is no longer applied: ticked

Note that running a scheduled task as a specific user is no longer possible via GPPref due to a security flaw. Network Service is a good choice in this situation anyway. It causes the connection to the SQL server to be made using the credentials of the computer’s own Active Directory account, e.g. RCMTech\MYPC$, which means you don’t need to give your users access to the database. This is good from a data protection point of view as the resulting database contains personally identifiable information.

Group Policy Preference to copy script onto target machines

  1. Computer Configuration, Preferences, Windows Settings, Files.
  2. Right-click, New – File
  3. General tab
    1. Source File(s): \\rcmtech.co.uk\NETLOGON\LocalScripts\*.*
    2. Destination Folder: %ProgramFilesDir%\RCMTech
    3. Suppress errors on individual file actions: ticked
  4. Common tab
    1. Remove this item when it is no longer needed: ticked
    2. Item-level targeting:
      1. the folder \\rcmtech.co.uk\NETLOGON\LocalScripts exists

All done

Once all the above is in place, you’re good to go. Now you just need to do something with all that event data sat in your SQL database.

Posted in PowerShell, Scripting, Security, Windows | Tagged , , , , , , , , , , , , , , , , , , , , , , , | 1 Comment

Convert Hyper-V VM vhdx to VMware vSphere vmdk

This turned out to be pretty simple, if pretty slow. I was converting from a Hyper-V 2012 R2 VM into vSphere 5.0. The VM itself was also running Windows Server 2012 R2.

There are a few steps and points to note:

  1. Download and install StarWind Converter
  2. Make sure you have plenty of disk space – of the various vmdk formats that StarWind Converter can output, the only one that is compatible with ESXi hosts is a full fat file – so it’ll take up the full amount of space even if the original vhdx file was thin provisioned and consequently smaller.
  3. Make sure you have fast disks and network connections. I had an 850GB VM to convert and even using 10Gb networks and SSD storage it still took hours and hours.
  4. Log on to the Hyper-V VM
    1. Make a note of the drive letter used by the CD/DVD drive
    2. Make a note of what the other drive letters are too. I suggest creating a file called Drive_<driveletter>.txt in the root of each drive to ensure things have stayed the same once the vSphere VM is up and running.
    3. Power off the Hyper-V VM you want to convert to vSphere.
  5. Fire up StarWind Converter and pick the source vhdx file – I pointed it directly at the vhdx file on the Hyper-V host via a UNC path.
  6. Pick the output file format to convert to – you want to choose “VMware ESX server image”
  7. Choose the filename and location for the vmdk file – put this on a separate disk to the source if possible – you want to maximise data throughput and minimise disk & network contention. The naming convention in vSphere datastores is <VM Name>.vmdk for the first disk, <VM Name>_1.vmdk for the second disk etc. so it helps if you use that convention here.
  8. Kick off the conversion and wait for it to finish. This will probably take a while.
  9. Now you’ll have pairs of vmdk files: <VM Name>.vmdk and <VM Name>-flat.vmdk. The former is the descriptor file and will be about 1KB, the latter is the actual data file and will be whatever size your Hyper-V VM thought its disk was.
  10. Now you need to create an empty vSphere VM. Place it on a datastore that has sufficient capacity for your newly converted vmdk files.
    1. It doesn’t matter what size you make the initial harddisk as we’ll remove it later anyway.
    2. Configure the VM with the same amount of RAM and number of CPUs as the Hyper-V VM had.
    3. Configure the NIC type to whatever you want, personally I always use VMXnet3. Set the port group to match the VLAN that the Hyper-V VM was on.
    4. I used the default SAS disk controller.
    5. Go into Options and change the boot setting from BIOS to EFI. The default for new VMs is BIOS, so this is especially important, as otherwise the vSphere VM will not boot.
  11. Once the VM is created, edit the settings and remove the harddisk, choose the option to delete the files from disk.
  12. Open the datastore browser and select the folder where your new VM lives. Upload the converted <VM Name>.vmdk file(s) and the associated <VM Name>-flat.vmdk file(s). This will take a while. Note that the datastore browser hides a lot of files, including the “*-flat.vmdk” files.
  13. Now edit the VM settings, and add a new disk. When prompted, choose the option to add an existing disk and browse to find the <VM Name>.vmdk file. Repeat for any additional _1.vmdk, _2.vmdk etc. disks.
  14. Power the vSphere VM on. Once Windows has started, log on and open Disk Management.
    1. You’ll need to change the CD/DVD drive letter to whatever it was in the Hyper-V VM.
    2. Next bring online any other disks and they should drop in with their correct drive letters.
  15. Now install VMware Tools. If you’ve used the VMXnet3 NIC, it will only appear once the tools are installed, so you can then configure it with the correct IP details before rebooting the server to complete the tools install.
  16. Remove any old Hyper-V hardware from within the VM.
  17. That’s it.
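
Step 4.2’s marker files can be knocked up with a bit of PowerShell run inside the Hyper-V VM before you power it off (a quick sketch; it tags the root of each local fixed disk):

```powershell
# Create a Drive_<letter>.txt marker in the root of each local fixed disk
foreach($Drive in Get-WmiObject -Class Win32_LogicalDisk -Filter "DriveType = 3"){
    $Letter = $Drive.DeviceID.TrimEnd(":")
    New-Item -Path ($Drive.DeviceID+"\Drive_"+$Letter+".txt") -ItemType File -Force | Out-Null
}
```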

If you wanted to change the vmdk type to thin provisioned afterwards you could migrate the VM to a different datastore and change the disk type as part of the migration.

Obviously don’t power the Hyper-V VM back on again once the vSphere one is up and running.

The key points here are:

  • StarWind Converter is very handy.
  • Change the VM boot type from BIOS to EFI or it won’t boot.
  • The process is very slow due to the amount of data (probably) involved. The 850GB VM took me the best part of a (working) day to convert, from powering off the Hyper-V VM to having the vSphere one up and running and back in operation.

Posted in Hyper-V, Storage, vSphere, Windows | Tagged , , , , , , , , , , , , , | Leave a comment

Using Group Policy WMI Filters with examples

WMI Filters exist at the bottom of the Group Policy Management Console and are a way to target Group Policy Objects (GPOs) based on the results of a WMI query.

Other ways of targeting GPOs are by the OU they are linked to (and the members of that OU), or by Active Directory security group.

WMI filters are useful as they can give you much more granularity, and are also dynamic. For example, if you’re applying a GPO to your PCs but only want it to apply to Windows 7, use a WMI filter. If the PC gets upgraded to Windows 10 the GPO will automatically stop applying. The only other way to do this would be to manually keep all your Windows 7 PCs as members of a security group or an OU, and have a way to ensure they were removed if the OS changed. Computers that are members of a security group have to be rebooted for the group membership change to take effect (this is when the computer account logs on to AD).

So, some filters:

  • VMware VMs only
    SELECT Model FROM Win32_ComputerSystem WHERE Model = "VMware Virtual Platform"
  • For the inverse just add NOT after WHERE, e.g. everything except VMware VMs
    SELECT Model FROM Win32_ComputerSystem WHERE NOT Model = "VMware Virtual Platform"
  • Hyper-V VM
    SELECT Model,Manufacturer FROM Win32_ComputerSystem WHERE Model = "Virtual Machine" AND Manufacturer = "Microsoft Corporation"
    I added Manufacturer as it makes it clearer that this is a Hyper-V VM, as opposed to the "Model" property potentially just signifying a generic "Windows has detected that it is not running on physical hardware". It does not mean that; as far as I know, Model = "Virtual Machine" is unique to Hyper-V (e.g. for VirtualBox the Model is "VirtualBox").
  • Physical servers (which in my environment means not VMware or Hyper-V)
    SELECT Model FROM Win32_ComputerSystem WHERE NOT Model LIKE "%Virtual%"
  • Laptops, or PCs & Servers with a UPS
    SELECT * FROM Win32_Battery
  • Windows 7
    SELECT Caption,Primary FROM Win32_OperatingSystem WHERE Caption LIKE 'Microsoft Windows 7%' AND Primary = TRUE
    Note that I'm checking the "Primary" property based on info in the book VBScript, WMI, and ADSI Unleashed by Don Jones (chapter 29, p.482). It is quite possibly not necessary.
  • Windows 7 or Server 2008 (inc. R2)
    SELECT Caption,Primary FROM Win32_OperatingSystem WHERE (Caption LIKE 'Microsoft Windows Server 2008%' AND Primary = TRUE) OR (Caption LIKE 'Microsoft Windows 7%' AND Primary = TRUE)
  • Computers with names beginning Finance
    SELECT Name FROM Win32_ComputerSystem WHERE Name LIKE 'Finance%'
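
Before attaching a filter to a GPO it is worth testing the query on a target machine. The same WQL can be run locally via PowerShell; the filter evaluates as true (and the GPO applies) when the query returns at least one object:

```powershell
# Test a candidate WMI filter query locally; any output means the filter would match this machine
Get-WmiObject -Query "SELECT Model FROM Win32_ComputerSystem WHERE Model = 'VMware Virtual Platform'"
```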
Posted in Windows | Tagged , , , , , , , , , , , , , , , , , , , , | Leave a comment

Testing for open ports with PowerShell

Want to find out if a TCP port is open via a firewall and/or if something is listening on that port? From within a script? You can use PowerShell.

A colleague found this, which uses System.Net.Sockets.TcpClient from the .NET framework via PowerShell. That would work, but then I found Test-NetConnection.

Test-NetConnection allows testing of TCP ports, and is a built-in cmdlet, which for me is preferable to custom code wherever possible. I’d previously used Test-Connection, which is simpler and just uses ICMP echo requests (aka ping).

I’m using the cmdlet as follows:

if(Test-NetConnection -ComputerName 192.168.1.51 -Port 443 -InformationLevel Quiet){
    # Something is listening on that port, do some stuff
} else {
    # Nothing listening on that port, do something else
}

Simple as that.
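
Note that Test-NetConnection needs Windows 8.1/Server 2012 R2 or later. On older systems you can fall back to the TcpClient approach mentioned above, along these lines:

```powershell
# Fallback port test using .NET directly, for systems without Test-NetConnection
$TcpClient = New-Object -TypeName System.Net.Sockets.TcpClient
try{
    $TcpClient.Connect("192.168.1.51",443)
    $PortOpen = $TcpClient.Connected
}catch{
    $PortOpen = $false
}
$TcpClient.Close()
if($PortOpen){
    # Something is listening on that port, do some stuff
}
```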

Posted in Networking, PowerShell | Tagged , , , , , , , , | Leave a comment