CM Scheduled Maintenance Tasks Info
Because of this Reddit post, "Maintenance Task does not finish", I got inspired to look closer at maintenance tasks. There is already a view for seeing what your tasks are set to, and how long they took the last time they ran.
There are also very helpful instructions here, How site maintenance tasks can make your life much easier, including some PowerShell for asking any of the tasks to "run now" instead of waiting for a scheduled time.
I've included the PowerShell code for 'run now', just in case (as sometimes happens for me) the linked post disappears.
Why this might be interesting to know: if a task takes "too long", or, say, a task triggers at midnight, your backup triggers at 1am, and the midnight task gets cancelled and never finishes, then you might want to adjust some of the timings, or the 'DeleteOlderThan' values, for your site.
select
SQLTaskStatus.TaskName,
SQLTaskStatus.BeginTime,
SQLTaskStatus.LatestBeginTime,
SQLTaskStatus.IsEnabled,
SQLTaskStatus.CompletionStatus,
SQLTaskStatus.LastStartTime,
SQLTaskStatus.LastCompletionTime,
DATEDIFF(SECOND, SQLTaskStatus.LastStartTime, SQLTaskStatus.LastCompletionTime ) as 'RunTime in seconds',
CASE WHEN (1&SQLTaskStatus.DaysOfWeek)=1 THEN 'SUNDAY' END as 'Sunday',
CASE WHEN (2&SQLTaskStatus.DaysOfWeek)=2 THEN 'MONDAY' END as 'Monday',
CASE WHEN (4&SQLTaskStatus.DaysOfWeek)=4 THEN 'TUESDAY' END as 'Tuesday',
CASE WHEN (8&SQLTaskStatus.DaysOfWeek)=8 THEN 'WEDNESDAY' END as 'Wednesday',
CASE WHEN (16&SQLTaskStatus.DaysOfWeek)=16 THEN 'THURSDAY' END as 'Thursday',
CASE WHEN (32&SQLTaskStatus.DaysOfWeek)=32 THEN 'FRIDAY' END as 'Friday',
CASE WHEN (64&SQLTaskStatus.DaysOfWeek)=64 THEN 'SATURDAY' END as 'Saturday',
SQLTaskStatus.RunNow,
SQLTaskStatus.SiteCode,
SQLTaskStatus.DeleteOlderThan
from vSMS_SQLTaskStatus AS SQLTaskStatus
order by SQLTaskStatus.TaskName
OPTION(USE HINT('FORCE_LEGACY_CARDINALITY_ESTIMATION'))
PowerShell for triggering a task to 'run now':
- Replace SiteCode with your site code.
- Replace the name with the name of the task you want to run.
- You do have to run it while connected to your Provider.
$SiteCode = 'ABC'
# Get the maintenance task object; its TaskName is needed for the method call
$MT = Get-CMSiteMaintenanceTask -SiteCode $SiteCode -Name 'Delete Aged Scenario Health History'
# Build the parameter dictionary that the RunTaskNow method expects
$MethodParam = New-Object 'System.Collections.Generic.Dictionary[String,Object]'
$MethodParam.Add('SiteCode',$SiteCode)
$MethodParam.Add('TaskName',$MT.TaskName)
# Use the existing provider connection to invoke the static method
$ConfigMgrCon = Get-CMConnectionManager
$ConfigMgrCon.ExecuteMethod('SMS_SQLTaskStatus','RunTaskNow',$MethodParam)
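If you want to confirm the trigger took, you can re-read the same status class the method lives on. This is just a sketch: $ProviderMachine is a placeholder for your SMS Provider's computer name (not part of the snippet above), and it assumes $SiteCode and $MT are still in the session.
# Hypothetical follow-up check: eyeball RunNow / LastStartTime after triggering
Get-WmiObject -ComputerName $ProviderMachine -Namespace "root\sms\site_$SiteCode" -Class SMS_SQLTaskStatus |
    Where-Object { $_.TaskName -eq $MT.TaskName } |
    Select-Object TaskName, RunNow, LastStartTime, LastCompletionTime, CompletionStatus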
CM Inventory per-user browser extensions
At my company, there was recently a need to verify a custom vendor browser extension, specifically for Edge. I found several methods for gathering Chrome extensions (including a clumsy attempt of my own several years ago), but then stumbled across this method, which could be modified to run as a Configuration Item + mof edit.
Source for this routine: https://github.com/vastlimits/uberAgent-Scripts/blob/main/Get-BrowserExtensionInfo/Get-BrowserExtensionInfo.ps1
It looks like that routine was last updated by Helge Klein, who I suspect is this Helge Klein (but I can't be sure): https://helgeklein.com/. Mr. Klein has done other work you may have heard of, like "DelProf2" (deletes inactive user profiles) or "SetACL" (manages permissions, auditing, and ownership information). Check him out if you have a need for something like that.
For browser extensions, if you want to use this with CM, the steps are...
1) Deploy the CI inside a Baseline.
2) Import the .mof and enable inventory.
That's the simple and short explanation. Now for the nitty-gritty details and background story: browser extensions are recorded in the user context.
What this routine does is multi-layered, and solves some (but not all) of the various issues I've felt "could" be encountered with inventorying per-user information.
First, a script inside the CI, running as system (not as the logged-on users), creates a custom WMI namespace called "CustomCMClasses" if it doesn't already exist. You can change that name if you like; I've seen other examples using "ITLocal" as the custom namespace, but for purposes of this blog, we'll assume you won't be modifying it. Then it uses the well-known SIDs for "Everyone" and "Authenticated Users" to open up that namespace, so that those types of logins (aka, everyone and authenticated users) can write entries to classes in that namespace, like, for example... the per-user browser extensions.
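If you're curious what that system-context step looks like, here's a minimal sketch of the idea (not the CI's exact script): create the namespace, then append access-allowed ACEs for the two well-known SIDs. The access mask below is my assumption, not something pulled from the CI; check the actual script before borrowing any of this.
# Minimal sketch: create root\CustomCMClasses and let Everyone / Authenticated Users write to it (run elevated / as system)
$nsName = 'CustomCMClasses'
if (-not (Get-CimInstance -Namespace root -ClassName __Namespace -Filter "Name='$nsName'" -ErrorAction SilentlyContinue)) {
    $ns = ([wmiclass]'root:__Namespace').CreateInstance()
    $ns.Name = $nsName
    $ns.Put() | Out-Null
}
# Read the namespace's current security descriptor
$sd = (Invoke-WmiMethod -Namespace "root\$nsName" -Path '__systemsecurity=@' -Name GetSecurityDescriptor).Descriptor
foreach ($sidString in 'S-1-1-0','S-1-5-11') {   # Everyone, Authenticated Users
    $trustee = ([wmiclass]'Win32_Trustee').CreateInstance()
    $trustee.SIDString = $sidString
    $ace = ([wmiclass]'Win32_Ace').CreateInstance()
    $ace.AccessMask = 0x1F   # assumption: Enable+MethodExecute+FullWrite+PartialWrite+ProviderWrite
    $ace.AceFlags   = 0x2    # CONTAINER_INHERIT_ACE, so classes under the namespace inherit
    $ace.AceType    = 0x0    # ACCESS_ALLOWED
    $ace.Trustee    = $trustee
    $sd.DACL += $ace.PSObject.ImmediateBaseObject
}
# Write the modified descriptor back
Invoke-WmiMethod -Namespace "root\$nsName" -Path '__systemsecurity=@' -Name SetSecurityDescriptor -ArgumentList $sd.PSObject.ImmediateBaseObject | Out-Null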
Second, a script inside the CI runs in user context. It first deletes any records already in that class for that specific user, then repopulates the class with anything found in the per-user browser extensions for Chrome, Edge (Chromium-based), and Firefox. What's nice about that is that on a multi-user device, you will continue to get information for all of the users who log in.
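The delete-then-repopulate pattern, stripped to its bones, looks roughly like this; the class name 'BrowserExtensions' and the property names here are illustrative placeholders, not necessarily what the CI actually uses.
# Illustrative only: clear this user's rows, then re-add whatever was found
$me = [System.Security.Principal.WindowsIdentity]::GetCurrent().Name
$meWql = $me -replace '\\','\\'   # escape the DOMAIN\user backslash for WQL
Get-CimInstance -Namespace 'root\CustomCMClasses' -ClassName 'BrowserExtensions' -Filter "OSUser='$meWql'" -ErrorAction SilentlyContinue | Remove-CimInstance
# ...gather extensions into $found (Chrome/Edge profile JSON, Firefox extensions.json), then:
foreach ($ext in $found) {
    New-CimInstance -Namespace 'root\CustomCMClasses' -ClassName 'BrowserExtensions' -Property @{
        OSUser        = $me
        Browser       = $ext.Browser
        ExtensionID   = $ext.Id
        ExtensionName = $ext.Name
    } | Out-Null
}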
A POTENTIAL drawback: let's surmise that Bob Smith logged on in July, and entries were created for him in WMI then. Since then, he hasn't used this box, or has even left the company. There might be stale entries for Bob being inventoried... potentially for the life of the device. That means you will want to create reports that filter on 'ScriptRunTime' within the last xx days, so you don't pull stale data into reports.
--> Here <-- is the .zip containing the Configuration Item .cab to be imported into your CM Console (rename it before importing). If you successfully import the CAB file, you don't have to do anything with the .renameAsPS1 files in the zip. Those are there *IF* the .cab import fails: you could create your own CI and add each of them as a Rule in the CI, one left to run as system, and the other where you carefully check the box for 'Run scripts by using the logged on user credentials'. Also in the .zip is a BrowserExtension-ImportMe.mof file. Presuming you didn't change the custom class from being called 'CustomCMClasses', you would rename that to just .mof, and import it into your Console: Administration, Client Settings, Default Client Settings, Hardware Inventory.
Once you have the CI, and a Baseline containing the CI, deploy the Baseline to a small collection of devices. On those few devices, interactively do policy refreshes and run the Baseline (from the Control Panel applet). Note that you MIGHT have to run the baseline twice: the first run creates the initial custom class and sets permissions; once that is done, it'll skip over that step next time. Then run the baseline again. Using your favorite WMI browser (wmiexplorer?), look at CustomCMClasses and the class inside, and see if it contains what you expect it to contain. If so, hooray!
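If you'd rather skip the WMI browser, a quick check from a PowerShell prompt on the test box does the same thing (the class name is assumed; adjust to whatever the CI created on yours):
# Dump whatever landed in the custom namespace on this machine
Get-CimInstance -Namespace 'root\CustomCMClasses' -ClassName 'BrowserExtensions' |
    Select-Object OSUser, Browser, ExtensionName, ExtensionVersion | Format-Table -AutoSize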
If you are happy with the results, import the .mof (you'll usually get a view called something like v_gs_browserextensions0) and enable inventory for that. Deploy the baseline to the rest of your environment where you want to get browser extensions. Note, I would NOT have the baseline run frequently; perhaps every 4 days? Or every 7 days? This information isn't mission critical, imo; it's a nice-to-have for those (hopefully few) times when manager-types want to know about browser extensions.
Sample report to get you started (once you have deployed the CI as a Baseline, tested it, and inventory is enabled). Note that the InstallTime returned by the script is in "milliseconds since 1970", so it's a bit annoying to tease out. And 'default' extensions for Chrome/Edge just have an InstallTime of -11644473600000 (which is 1601-01-01, the Windows FILETIME epoch, expressed against 1970), so it doesn't translate to a 'real time'; in the report I just NULL that out.
select
Case when be.ExtensionInstallTime0 = '-11644473600000' then
NULL
else
dateadd(ms, cast(be.ExtensionInstallTime0 as BIGINT)%(3600*24*1000), dateadd(day, cast(be.ExtensionInstallTime0 as BIGINT)/(3600*24*1000), '1970-01-01 00:00:00.0') )
end as 'InstallTime'
,
be.Browser0 as 'Browser',
be.ExtensionFromWebStore0 as 'ExtensionInstalledFromWebStore',
be.ExtensionID0 as 'ExtensionID',
be.ExtensionInstalledByDefault0 as 'ExtensionInstalledByDefault',
be.ExtensionName0 as 'ExtensionName',
be.ExtensionVersion0 as 'ExtensionVersion',
case when be.ExtensionState0 = 1 then 'Active' else 'Disabled' end as 'ExtensionState',
be.OSUser0 as 'User',
be.ProfileDir0 as 'ProfileDir',
be.ProfileGaiaName0 as 'ProfileGaiaName',
be.ProfileName0 as 'Browser ProfileName (if Browsers have multiple profiles)',
be.ProfileUserName0 as 'Browser Profile UserName',
be.ScriptLastRan0 as 'LastTime This information was updated locally'
from v_GS_BrowserExtensions0 be
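As a sanity check on that epoch math, the same conversion is a one-liner in PowerShell (the value below is a made-up InstallTime, milliseconds since 1970):
# Convert a Chromium-style install time to a UTC date
[DateTimeOffset]::FromUnixTimeMilliseconds(1688371260000).UtcDateTime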
===============
Example results (from my mini-lab):
InstallTime | Browser | ExtensionInstalledFromWebStore | ExtensionID | ExtensionInstalledByDefault | ExtensionName | ExtensionVersion | ExtensionState | User | ProfileDir | Browser ProfileName (if Browsers have multiple profiles) | LastTime This information was updated locally |
NULL | Edge | FALSE | jmjflgjpcpepeafmmgdpfkogkghcpiha | TRUE | Edge relevant text changes | 1.1.3 | Active | smsadmin | Default | Profile 1 | 7/6/2023 9:56 |
NULL | Chrome | TRUE | aapbdbdomjkkjkaonfhkkikfgjllcleb | FALSE | Google Translate | 2.0.13 | Active | sherry | Default | Person 1 | 7/6/2023 8:19 |
NULL | Chrome | TRUE | nmmhkkegccagdldgiimedpiccmgmieda | TRUE | Chrome Web Store Payments | 1.0.0.6 | Active | sherry | Default | Person 1 | 7/6/2023 8:19 |
NULL | Chrome | TRUE | ghbmnnjooekpmoecnnnilnnbdlolhkhi | TRUE | Google Docs Offline | 1.63.3 | Active | sherry | Default | Person 1 | 7/6/2023 8:19 |
NULL | Edge | FALSE | jmjflgjpcpepeafmmgdpfkogkghcpiha | TRUE | Edge relevant text changes | 1.1.3 | Active | sherry | Default | Profile 1 | 7/6/2023 8:19 |
7/3/2023 13:41 | Firefox | TRUE | customscrollbars@computerwhiz | NULL | Custom Scrollbars | 4.2.2 | Active | sherry | 9fpff8kr.default-release | default-release | 7/6/2023 8:19 |
CM Inventory per-user installed applications
There are several examples out there of how to inventory per-user installed applications and versions, like Teams or OneDrive. This is another one, from Benjamin Reynolds, and I've tested it a few times to make sure it works like I thought it would.
Overall, the steps are...
1) Deploy the CI inside a Baseline.
2) Import the .mof and enable inventory.
That's the simple and short explanation. Now for the nitty-gritty details and background story: per-user apps are recorded in the user context, i.e., in the HKEY_CURRENT_USER registry keys, which are notoriously difficult to inventory. Yes, there are methods to mount the hives, read inside, record, and inventory. But to me, that means you might be opening up and reading a user profile that hasn't been used in months or years. How relevant is it to know that Bob Smith, who left the company 2 years ago, still has an "old" version of Teams associated with a profile he can't possibly have been using for those 2 years?
What this routine does is multi-layered, and solves some (but not all) of the various issues I've felt "could" be encountered with inventorying per-user information.
First, a script inside the CI, running as system (not as the logged-on users), creates a custom WMI namespace called "CustomCMClasses" if it doesn't already exist. You can change that name if you like; I've seen other examples using "ITLocal" as the custom namespace, but for purposes of this blog, we'll assume you leave it as 'CustomCMClasses'. Then it uses the well-known SIDs for "Everyone" and "Authenticated Users" to open up that namespace, so that those types of logins (aka, everyone and authenticated users) can write entries to classes in that namespace, like, for example... the per-user installed applications. (This is the same system-context step sketched in the browser-extensions section above.)
Second, a script inside the CI runs in user context. It first deletes any records already in that class for that specific user, then repopulates the class with anything found in the per-user uninstall information. What's nice about that is that on a multi-user device, you will continue to get information for all of the users who log in.
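The "per-user uninstall information" lives under the current user's Uninstall key. A minimal sketch of just the gather step (the idea, not Benjamin's actual script):
# Enumerate per-user installed apps from the current user's hive
Get-ItemProperty -Path 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Uninstall\*' -ErrorAction SilentlyContinue |
    Where-Object { $_.DisplayName } |
    Select-Object DisplayName, DisplayVersion, Publisher, InstallDate, InstallLocation, UninstallString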
A POTENTIAL drawback: let's surmise that Bob Smith logged on in January, and entries were created for him. Since then, he hasn't used this box, or has even left the company. There might be stale entries for Bob being inventoried... potentially for the life of the device. That means you will want to create reports that filter on 'ScriptRunTime' within the last xx days (the sample report below does exactly that), so you don't pull stale data into reports.
--> Here <-- is the .zip containing the Configuration Item .cab to be imported into your CM Console (rename it before importing). If you successfully import the CAB file, you don't have to do anything with the .renameAsPS1 files in the zip. Those are there *IF* the .cab import fails: you could create your own CI and add each of them as a Rule in the CI, one left to run as system, and the other where you carefully check the box for 'Run scripts by using the logged on user credentials'. Also in the .zip is an ImportIntoDefaulClientHardwareInventory.RenameToMof file. Presuming you didn't change the custom class from being called 'CustomCMClasses', you would rename that to just .mof, and import it into your Console: Administration, Client Settings, Default Client Settings, Hardware Inventory.
Once you have the CI, and a Baseline containing the CI, deploy the Baseline to a small collection of devices. On those few devices, interactively do policy refreshes and run the Baseline (from the Control Panel applet). Note that you MIGHT have to run the baseline twice: the first run creates the initial custom class and sets permissions; once that is done, it'll skip over that step next time. Then run the baseline again. Then, using your favorite WMI browser (wmiexplorer?), look at the CustomCMClasses namespace and the class inside, and see if it contains what you expect it to contain. If so, hooray!
If you are happy with the results, import the .mof (you'll usually get a view called something like v_gs_userinstalledapps0) and enable inventory for that. Deploy the baseline to the rest of your environment where you want to get user-installed apps. Note, I would NOT have the baseline run frequently; perhaps every 4 days? Or every 7 days? This information isn't mission critical, imo; it's a nice-to-have for those (hopefully few) times when manager-types want to know what versions of Teams are out there (for example).
Sample report to get you started (once you have deployed the CI as a Baseline, tested it, and inventory is enabled):
DECLARE @60DaysAgo datetime = (GetDate()-60)
--This is so that if there are stale values from people who have left the company or are no longer using this machine, we don't see them in the reports
Select s1.netbios_name0 as 'ComputerName'
,uapps.Publisher0 as 'Publisher'
,uapps.DisplayName0 as 'DisplayName'
,uapps.Version0 as 'Version'
,uapps.InstallDate0 as 'Application Install Date If Known'
,uapps.user0 as 'Username associated with this Install'
,uapps.InstallLocation0 as 'Install Location If Known'
,uapps.UninstallString0 as 'UninstallString'
,uapps.ScriptRunTimeUTC0 as 'Date information was gathered'
from v_gs_userinstalledapps0 uapps
join v_r_system s1 on s1.resourceid=uapps.resourceid
where uapps.ScriptRunTimeUTC0 > CAST(@60DaysAgo as DATE)
order by s1.netbios_name0, uapps.publisher0, uapps.DisplayName0
File Inventory via Hardware Inventory
If, like me, you are less than pleased with how file inventory functions (badly named 'Software Inventory' in CM), this is possibly something you'd want to test in your lab (you have a lab, right?), to see whether this script + mof edit would work for those occasional "can't you just inventory a file..." requests we all seem to get.
If you have file inventory on at all (software inventory), for things like *.exe in ProgramFiles, leave it on for that (don't change everything).
But let's say you get a request to "find all .pst files stored anywhere on local drives", because your company is still battling to clean up email archives people saved as local .pst files. Sure, you can add that to file inventory, but as you know, file inventory takes hours and HOURS to run on computers.
This script took between 16 and 90 seconds in my testing.
After you have customized the script for YOUR weird one-off rule(s), tested it interactively completely outside of CM, and are happy that it populates the WMI class with the values you expect, then...
Add the customized-by-you script as a PowerShell script inside a Configuration Item, where "what means compliant" is existential: any value returned.
Add the CI to a Baseline, and deploy the Baseline to your test collection.
After a target has run the baseline, go into your CM Console: Administration, Client Settings, Custom Client Settings, Hardware Inventory, Add..., connect to that target, and in root\cimv2 add the 'cm_CustomFileInventory' class. Monitor your CM server's dataldr.log to confirm it made the views. Then, assuming you left the class enabled, after the target gets the new policy with the new inventory instruction, trigger a hardware inventory. Once inventory arrives at your CM database, look at the newly available view (probably called something like v_gs_cm_customfileinventory0), and there you go.
Only devices which have run the baseline will have something to say, so you can limit which targets report back this custom file inventory by only deploying the Baseline to a specific collection.
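Before wiring up inventory, a quick local check on a test box (using the script's default namespace and class name) shows whether the script populated anything:
# List what the CI script wrote locally; empty output means nothing matched (or the script hasn't run yet)
Get-CimInstance -Namespace 'root\cimv2' -ClassName 'cm_CustomFileInventory' |
    Select-Object FileName, FilePath, FileSizeKB, LastWriteTimeUTC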
<#
.SYNOPSIS
For specific files or types of files, Populate a Custom WMI with that information, for later inventory retrieval
.DESCRIPTION
Query for files, and populate WMI
.NOTES
2023-03-17 Sherry Kissinger
CAUTION! CAUTION! this is NOT meant to be a replacement for File Inventory completely. This routine will populate WMI, and depending upon
the rules YOU might make, you could inadvertently cause WMI Bloat (which can cause problems, and those problems might be difficult to
identify), and then hardware inventory mif size might be too big, resulting in Hardware Inventory failing to report at all.
For example, do NOT query for c:\ , *.*... you are just asking for everything to blow up, and do so badly.
This routine was originally created as a response to "we need to know about any/every pst file on the C: drive". As you know, file inventory,
searching for *.zzz files on all of the C: drive can take 30+ minutes, even if you have an SSD, and relatively few files.
This script, when tested interactively on test devices (ok, it was 2 whole machines in the lab...)
was taking anywhere from 16 to 90 seconds, depending upon the # of files on the drive, size of the drive, etc.
$VerbosePreference options are
'Continue' (show the messages)
'SilentlyContinue' (do not show the messages; this should be the default)
'Stop' (show the message and halt; use for debugging)
'Inquire' (prompt the user whether it is ok to continue)
Example lines for gathering files. These lines would, for example...
1) Find all *.pst files anywhere on the First known drive (this does not include things like OneDrive or redirected Documents folders however)
2) Find any *.exe which happen to be in first known drive (which is usually c:) \WierdAppInTheRoot and subfolders under WierdAppInTheRoot
3) Find fubar.xml, but only if it is specifically in c:\program files\Widgets (or program files x86\widgets), do not even look in subdirectories under that. (-Recurse has been removed from those lines)
$LocalFileSystemDrives = (Get-psdrive -PSProvider FileSystem)
[System.IO.FileSystemInfo[]]$files = Get-ChildItem -Path ($LocalFileSystemDrives.Root)[0] -include ('*.pst') -Recurse -OutBuffer 1000 -ErrorAction SilentlyContinue
[System.IO.FileSystemInfo[]]$files += Get-ChildItem -Path (($LocalFileSystemDrives.Root)[0] + 'WierdAppInTheRoot') -include ('*.exe') -Recurse -OutBuffer 1000 -ErrorAction SilentlyContinue
[System.IO.FileSystemInfo[]]$files += Get-ChildItem -Path ("$env:ProgramFiles\Widgets\*") -include ('fubar.xml') -OutBuffer 1000 -ErrorAction SilentlyContinue | Where-Object {$_.DirectoryName -eq "$env:ProgramFiles\Widgets"}
[System.IO.FileSystemInfo[]]$files += Get-ChildItem -Path ("${env:ProgramFiles(x86)}\Widgets\*") -include ('fubar.xml') -OutBuffer 1000 -ErrorAction SilentlyContinue | Where-Object {$_.DirectoryName -eq "${env:ProgramFiles(x86)}\Widgets"}
Once you have all of the objects you want, then the next section in the script will populate the class with the values, by reading relevant
information from the $files object.
Once you have tested this interactively (NOT as a CI yet), and you are happy with the results you see interactively in wmiexplorer, then
create a CI with this script (modified for your purposes), and deploy to a test number of devices. Add the custom WMI class
to your inventory, and monitor the results.
*if* your environment has more than just '1 drive', and you were tasked with "find pst files on ANY/all local drives", you can use
this sql query to see 'how many' of the if statements you might need to cover the max # of Drives your clients have:
;with cte as (select ld.resourceid, count(*) as 'count' from v_gs_logical_disk ld where ld.DriveType0=3 group by ResourceID)
Select max(cte.count) from cte
for example, in my environment the max # was 5. True, there was literally only ONE box with that many logical disks... and 99.5% of the
environment 'only' had 1 local disk, but about 0.5% had 2 disks... so it "doesn't hurt" to account for your max # of logical disks; they will
only be queried if they actually exist.
#>
Param (
$Namespace = 'root\cimv2',
$Class = 'cm_CustomFileInventory',
$VerbosePreference = 'SilentlyContinue',
$ErrorActionPreference = 'SilentlyContinue',
$ScriptRanDate = [System.DateTime]::UtcNow
)
Function New-WMIClassHC {
if (Get-CimClass -Namespace "$NameSpace" | Where-Object {$_.CimClassName -eq $Class} ) {
Write-Verbose "WMI Class $Class exists"
}
else {
Write-Verbose "Create WMI Class '$Class'"
$NewClass = New-Object System.Management.ManagementClass ("$Namespace", [String]::Empty,$Null);
$NewClass['__CLASS']=$Class
$NewClass.Qualifiers.Add('Static',$true)
$NewClass.Properties.Add('FileName', [System.Management.CimType]::String,$False)
$NewClass.Properties['FileName'].Qualifiers.Add('Key', $true)
$NewClass.Properties.Add('FilePath', [System.Management.CimType]::String,$False)
$NewClass.Properties['FilePath'].Qualifiers.Add('Key', $true)
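    # FileName + FilePath together form the compound key, so two instances with the
    # same name in the same folder overwrite each other rather than duplicating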
$NewClass.Properties.Add('FileVersion', [System.Management.CimType]::String,$False)
$NewClass.Properties.Add('FileSizeKB', [System.Management.CimType]::Uint32,$False)
$NewClass.Properties.Add('LastWriteTimeUTC', [System.Management.CimType]::DateTime,$false)
$NewClass.Properties.Add('ScriptLastRan', [System.Management.CimType]::DateTime, $false)
$NewClass.Put() | Out-Null
}
Write-Verbose "End of Trying to Create an empty $Class in $Namespace to populate later"
}
Write-Verbose "Delete the values in $Class in $Namespace so we can populate it cleanly. If $Class exist, you must have rights to it for this to work."
Remove-CimInstance -Namespace $Namespace -Query "Select * from $Class" -ErrorAction SilentlyContinue
Write-Verbose "Create $Class if it does not exist at all yet (this will only occur once per device)"
New-WMIClassHC
Write-Verbose "Add to the object any additional rules you may want."
Write-Verbose "localFilesystemDrives is used in case you need to 'find a file on any/all local drives'"
Write-Verbose "you may have to check how many local drives your environment has, and have enough lines to address possibilities"
$LocalFileSystemDrives = (Get-psdrive -PSProvider FileSystem | Where-Object {$_.DisplayRoot -notlike "\\*"})
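# Note: with only one filesystem drive, .Root returns a single string (e.g. 'C:\'),
# so [0] below yields the character 'C' instead of a path; the length check rebuilds
# a proper root path for that case.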
$DriveLetter = ($LocalFileSystemDrives).Root[0]
if ( $DriveLetter.length -eq 1) {
$DriveLetter = $DriveLetter+':\'
}
[System.IO.FileSystemInfo[]]$files = Get-ChildItem -Path $DriveLetter -include ('*.pst','*.foo') -Recurse -OutBuffer 1000 -ErrorAction SilentlyContinue
#------------------------
$DriveLetter1 = ($LocalFileSystemDrives).Root[1]
if ( $DriveLetter1.length -eq 1) {
$DriveLetter1 = $DriveLetter1+':\'
}
If ($LocalFileSystemDrives.count -ge 2) {
[System.IO.FileSystemInfo[]]$files += Get-ChildItem -Path $DriveLetter1 -include ('*.pst','*.foo') -Recurse -OutBuffer 1000 -ErrorAction SilentlyContinue
}
#------------------------
$DriveLetter2 = ($LocalFileSystemDrives).Root[2]
if ( $DriveLetter2.length -eq 1) {
$DriveLetter2 = $DriveLetter2+':\'
}
If ($LocalFileSystemDrives.count -ge 3) {
[System.IO.FileSystemInfo[]]$files += Get-ChildItem -Path $DriveLetter2 -include ('*.pst','*.foo') -Recurse -OutBuffer 1000 -ErrorAction SilentlyContinue
}
#-------------------------
$DriveLetter3 = ($LocalFileSystemDrives).Root[3]
if ( $DriveLetter3.length -eq 1) {
$DriveLetter3 = $DriveLetter3+':\'
}
If ($LocalFileSystemDrives.count -ge 4) {
[System.IO.FileSystemInfo[]]$files += Get-ChildItem -Path $DriveLetter3 -include ('*.pst','*.foo') -Recurse -OutBuffer 1000 -ErrorAction SilentlyContinue
}
#-------------------------
$DriveLetter4 = ($LocalFileSystemDrives).Root[4]
if ( $DriveLetter4.length -eq 1) {
$DriveLetter4 = $DriveLetter4+':\'
}
If ($LocalFileSystemDrives.count -ge 5) {
[System.IO.FileSystemInfo[]]$files += Get-ChildItem -Path $DriveLetter4 -include ('*.pst','*.foo') -Recurse -OutBuffer 1000 -ErrorAction SilentlyContinue
}
#-------------------------
#### example where you only want to look in the specific root folder, and not recursively in all subfolders.
[System.IO.FileSystemInfo[]]$files += Get-ChildItem -Path "$env:windir\*" -include ('SomeFileOnlyInWindowsFolder.txt') -OutBuffer 1000 -ErrorAction SilentlyContinue | Where-Object {$_.DirectoryName -in ($Env:windir)}
Write-Verbose "Populate $Class in $Namespace with the file object information as queried"
Foreach ($File in $Files) {
Write-Verbose "This section is to try to get the productversion or fileversion, if the file has that metadata"
if (![string]::IsNullOrEmpty($File.VersionInfo.ProductVersion)) {
$FileVersion = $File.VersionInfo.ProductVersion
} Else
{
if (![string]::IsNullOrEmpty($File.VersionInfo.FileVersion)) {
$FileVersion = $file.VersionInfo.FileVersion
}
else {$FileVersion = ''}
}
$Size = [uint32][math]::Round(((Get-Item $File.FullName).length / 1kb),0)
New-CimInstance -Namespace "$Namespace" -class $Class -argument @{
FileName=$File.Name;
FilePath=$File.DirectoryName;
FileVersion=$FileVersion;
LastWriteTimeUTC=$file.LastWriteTimeUtc;
FileSizeKB=$Size;
ScriptLastRan=$ScriptRanDate
} | Out-Null
}
Write-Host "Compliant"