ConfigMgr 2012: Truncate History Tables

Updated 2019-04-29: with DRS replication in ConfigMgr Current Branch, global data tables might be hit by this _HIST truncate. The SQL below has been modified to exclude global data types, so that your replication (watch rcmctrl.log, for example) doesn't go crazy having to re-initialize any data.  Thanks very much to Umair Khan, Twitter @TheFrankUK, for the assist!

Have you ever noticed, being the extreme ConfigMgr geek that you are, that you have v_gs and v_hs views, which point back to current and history tables in your database?

Have you ever, and I mean EVER, needed to reference anything in the v_hs views?  Ever?  If you have, then perhaps this isn't for you.  If you've never used the data in the history views... why are you keeping it?  Sure, there are Maintenance Tasks you can tweak to help keep that data down, but... there is a quick (not supported) way to clean that up.

Keeping in mind this is NOT SUPPORTED (but it works anyway), so do this at your own risk, etc. etc.  If you mess up, I don't support you.  Microsoft won't support you.  You have a backup of your database, right?

On your Primary Site (even if you have a CAS, you still do this at your primary sites), all of this is done in SQL; the console is not involved at all.

Take the query below, and in SQL Management Studio, just take a look at how much history data you have.  Only you can determine if that's cause for concern, and whether you want to automate cleaning it up using a SQL TRUNCATE process.  At my company, in the 12+ years that people on this team have been supporting SMS, then ConfigMgr, no one ever needed data in the history tables.  So for us this was a lot of space gained that didn't need to be backed up, and it made nightly processing of some of the maintenance tasks that look at history tables finish MUCH faster than they had in months.

John Nelson (aka, Number2) would run the Truncate manually occasionally; but after a while that gets tedious.  :)  So he showed me how to see what is going to be truncated (query #1) and then how to make a Scheduled Job that runs daily, to actually do the Truncate of History tables.

Query #1: This particular query is only to look at what you have.  It does nothing but show you results.  Run this against all of your ConfigMgr sites with a CM_ database, and see if there is history you want to truncate.  If so, you may want to then move on to running SQL #2 (below).

SELECT
    t.NAME AS TableName,
    s.Name AS SchemaName,
    p.rows AS RowCounts,
    SUM(a.total_pages) * 8 AS TotalSpaceKB,
    SUM(a.used_pages) * 8 AS UsedSpaceKB,
    (SUM(a.total_pages) - SUM(a.used_pages)) * 8 AS UnusedSpaceKB
FROM
    sys.tables t
INNER JOIN
    sys.indexes i ON t.OBJECT_ID = i.object_id
INNER JOIN
    sys.partitions p ON i.object_id = p.OBJECT_ID AND i.index_id = p.index_id
INNER JOIN
    sys.allocation_units a ON p.partition_id = a.container_id
LEFT OUTER JOIN
    sys.schemas s ON t.schema_id = s.schema_id
INNER JOIN
    ArticleData AD on AD.ArticleName = T.Name
WHERE
    t.NAME NOT LIKE 'dt%'
    AND t.NAME LIKE '%[_]HIST'
    AND t.is_ms_shipped = 0
    AND i.OBJECT_ID > 255
    AND AD.ReplicationID not in (Select ID from ReplicationData where ReplicationPattern = 'global')
GROUP BY
    t.Name, s.Name, p.Rows
ORDER BY
    rowcounts desc

SQL #2:  This will CREATE a job, with a daily schedule.  Before you run it, change CM_FUN to CM_<your Site Code>; and you may also want to change
                @active_start_date=20140908
prior to running it, to whatever date you want the daily schedule to really start.  Once created, presuming SQL Server Agent is running on that SQL server, for the database CM_<whatever you put in>, it'll truncate your history tables on the schedule defined.

Optional:  After you've run the below, in your SQL Management Studio, under SQL Server Agent, Jobs, if you right-click on the new job "ConfigMgr Truncate History Tables", you can select "Start Job at Step..." to have the job run RIGHT now, to confirm it works.  Once it's done, you can re-run Query #1 above and see that it's clean(er).  Note that as machines report inventory, data will go into the history tables frequently.  You may already have new rows right after you ran the Truncate job, but it should be much less than it was.
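If you'd rather kick the job off from a query window than click through the SSMS UI, msdb's sp_start_job does the same thing as "Start Job at Step..."; a minimal sketch, assuming the job name from the SQL #2 script below:

```sql
-- Run the truncate job immediately, same effect as "Start Job at Step..." in SSMS.
-- Assumes the job was created with the name used in the SQL #2 script.
USE [msdb]
GO
EXEC dbo.sp_start_job @job_name = N'ConfigMgr Truncate History Tables';
```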

Optional:  The next day, or weekly, or monthly, whatever schedule you have internally for checking up on your ConfigMgr infrastructure, run Query #1 above every once in a while; and/or, in SQL Server Agent, Jobs, right-click on the ConfigMgr Truncate History Tables job and select "View History" to see that the job was successful.

USE [msdb]
GO
/****** Object:  Job [ConfigMgr Truncate History Tables]    Script Date: 9/8/2014 2:05:50 PM ******/
BEGIN TRANSACTION
DECLARE @ReturnCode INT
SELECT @ReturnCode = 0
/****** Object:  JobCategory [[Uncategorized (Local)]]    Script Date: 9/8/2014 2:05:51 PM ******/
IF NOT EXISTS (SELECT name FROM msdb.dbo.syscategories WHERE name=N'[Uncategorized (Local)]' AND category_class=1)
BEGIN
EXEC @ReturnCode = msdb.dbo.sp_add_category @class=N'JOB', @type=N'LOCAL', @name=N'[Uncategorized (Local)]'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
END
DECLARE @jobId BINARY(16)
EXEC @ReturnCode =  msdb.dbo.sp_add_job @job_name=N'ConfigMgr Truncate History Tables',
                @enabled=1,
                @notify_level_eventlog=0,
                @notify_level_email=0,
                @notify_level_netsend=0,
                @notify_level_page=0,
                @delete_level=0,
                @description=N'Truncate ConfigMgr database History tables',
                @category_name=N'[Uncategorized (Local)]',
                @owner_login_name=N'NT AUTHORITY\SYSTEM', @job_id = @jobId OUTPUT
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
/****** Object:  Step [Truncate]    Script Date: 9/8/2014 2:05:51 PM ******/
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'Truncate',
                 @step_id=1,
                @cmdexec_success_code=0,
                @on_success_action=1,
                @on_success_step_id=0,
                @on_fail_action=2,
                @on_fail_step_id=0,
                @retry_attempts=0,
                @retry_interval=0,
                @os_run_priority=0, @subsystem=N'TSQL',
                @command=N'USE [CM_FUN]
GO
DECLARE @SQL NVARCHAR(MAX) = N''
''
SELECT
  @SQL = @SQL+N''TRUNCATE TABLE dbo.''+TABLE_NAME+'';
''  
FROM
  INFORMATION_SCHEMA.TABLES x
  INNER JOIN ArticleData A on x.TABLE_Name = A.ArticleName
WHERE
  x.TABLE_SCHEMA = ''dbo''
  AND x.TABLE_NAME LIKE ''%[_]HIST''
  AND A.ReplicationID not in (Select ID from ReplicationData where ReplicationPattern = ''global'')
ORDER BY
  x.TABLE_NAME
exec sp_executesql @SQL
',
                @database_name=N'CM_FUN',
                @flags=0
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_update_job @job_id = @jobId, @start_step_id = 1
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobschedule @job_id=@jobId, @name=N'ConfigMgr Truncate History',
                @enabled=1,
                @freq_type=4,
                @freq_interval=1,
                @freq_subday_type=1,
                @freq_subday_interval=0,
                @freq_relative_interval=0,
                @freq_recurrence_factor=0,
                @active_start_date=20140908,
                @active_end_date=99991231,
                @active_start_time=231100,
                @active_end_time=235959,
                @schedule_uid=N'9936718a-af85-497b-ac0d-d47d91ce99d8'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobserver @job_id = @jobId, @server_name = N'(local)'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
COMMIT TRANSACTION
GOTO EndSave
QuitWithRollback:
    IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION
EndSave:
GO

ConfigMgr MaxExecutionTime Guesses for Updates

There is a situation which MIGHT happen for you.  The default for Cumulative Updates is now, I believe, 60 minutes.  But many updates still default to 10 minutes.  I don't personally think that default should change; however, occasionally there are large updates (think Microsoft Office updates) which might be several hundred MB in size, and might take more than 10 minutes to install.  In your reporting, and when looking at local logs, the CM client says the install "Failed", but all you do is re-scan for updates, and CM says it's installed.  So what gives, you wonder?  Well, this could be a possible reason.  It's not that the install 'failed' per se; after 10 minutes, the CM client stopped 'watching for' the successful install.  It timed out, kind of.  Since I noticed a pattern that "it's usually when those updates are ginormous, they take longer to install", below is a POSSIBLE SQL query to perhaps help you find and adjust the "Max Execution Timeout" on any individual updates.

A couple of prerequisites.  Naturally, the content has to be downloaded.  So if you run this 5 minutes after a "hotfix Tuesday" sync, it might not have much to say, because the content hasn't been downloaded to calculate how big any particular update is.  So you do have to wait until your content is downloaded to track these down.

Also note that I haven't created any kind of PowerShell script to automatically adjust the Max Execution Timeout.  This is just a report; the admin would either posh-script changing each individual update, or use the console: find each update, right-click on it, and in Properties for that update, adjust the Max Execution Timeout up to fit.

Also note these "suggestions" are just that, suggestions.  There is no right or wrong answer for how long Max Execution Timeout should be; leaving it all alone as-is, with no changes from what you have, will still work just fine.  Here's a scenario where following these suggestions would be a big bad horrible idea.  Let's say you allow your devices a service window every night for 4 hours.  Then you follow these suggestions, and for whatever reason there were 8 different Office updates, and you changed them all from 10 minutes to 60 minutes each... for a total of 8 hours estimated time to install.  When a client gets the Software Update deployment, it used to think "OK, these 8 will take me 80 minutes, I can do that in my 4-hour window, let's start!"  It would start installing, and maybe it only gets 3 done... but it does get 3 done.  If you set them to 60 minutes each, the client might decide "wow, 8 hours... I can't do that in my service window... I'll just wait until I have 8+ hours to get this done", and of course it may never install any of them.  So be careful in deciding whether or not this is a potentially BAD idea for your environment.  Or at least be aware of the potential repercussions, so you know what to un-do.

What this SQL does is list updates released in the last 30 days whose content has been downloaded, and compare the MaxExecutionTime set vs. how big the content is.  If, for example, the content size is between 50 and 100 MB but its MaxExecutionTime isn't 20 minutes or more, then maybe you the admin might want to think about setting MaxExecutionTime on that specific update to 20 minutes, so you don't get false "I failed to install" reports which a re-scan will address.

Again... this isn't perfect.  It's just a possible suggestion, if you maybe have seen this behavior in your Software Updates deployments, and were wondering if there was a way to be pro-active about increasing the MaxExecutionTime without waiting for your reports to tell you the next day.

DECLARE @StartDate datetime = DateADD(Day, -30, GETDATE())
DECLARE @EndDate datetime = GetDate()

;with cte as (select ui.MaxExecutionTime/60 [Max ExecutionTime in Minutes], ui.articleid, ui.title, ui.DateLastModified, ui.DatePosted
,ui.IsSuperseded, ui.IsExpired
,(SUM(files.FileSize)/1024)/1 as [Size in KB]
,(SUM(files.FileSize)/1024/1024)/1 as [Size in MB]
from v_updateinfo ui
join v_UpdateContents content on content.CI_ID=ui.CI_ID
join vCI_ContentFiles files on files.Content_ID=content.Content_ID
where severity is not null
and content.ContentProvisioned = 1
and ui.dateposted between @StartDate and @EndDate
and ui.IsExpired = 0
group by ui.MaxExecutionTime, ui.articleid, ui.title, ui.DateLastModified, ui.dateposted, ui.IsSuperseded, ui.IsExpired
)

select
Case when cte.[Size in MB] < 50 and cte.[Max ExecutionTime in Minutes] >= 10 then 0
when cte.[Size in MB] BETWEEN 50 and 100 and cte.[Max ExecutionTime in Minutes] >= 20 then 0
when cte.[Size in MB] between 100 and 150 and cte.[Max ExecutionTime in Minutes] >= 30 then 0
when cte.[Size in MB] between 150 and 200 and cte.[Max ExecutionTime in Minutes] >= 40 then 0
when cte.[Size in MB] between 200 and 250 and cte.[Max ExecutionTime in Minutes] >= 50 then 0
when cte.[Size in MB] between 250 and 300 and cte.[Max ExecutionTime in Minutes] >= 60 then 0
when cte.[Size in MB] > 300 and cte.[Max ExecutionTime in Minutes] >=90 then 0
else 1
End as [Could use MaxExecutionTime Adjustment],
case when cte.[Size in MB] < 50 then '10 minutes'
when cte.[Size in MB] BETWEEN 50 and 100 then '20 minutes'
when cte.[Size in MB] between 100 and 150 then '30 minutes'
when cte.[Size in MB] between 150 and 200 then '40 minutes'
when cte.[Size in MB] between 200 and 250 then '50 minutes'
when cte.[Size in MB] between 250 and 300 then '60 minutes'
when cte.[Size in MB] > 300 then '90 minutes'
end as 'time to set'
, cte.*

from cte
order by [Could use MaxExecutionTime Adjustment] desc, [Time to set] desc

IE 11 Enterprise Mode Tracking - Part 1

Internet Explorer 11 comes with this great new feature, Enterprise Mode, and it saves the day when it comes to most browser-related upgrade issues. At least in my case it did a really good job of mitigating them. Unfortunately for me the documentation didn't do a great job of showing me how to make sense of it all, especially when it came to how I was supposed to collect all the data from the web endpoint in the 'enable' key. (https://msdn.microsoft.com/en-us/library/dn640699.aspx) Basically you can put a URL in the 'enable' registry key which instructs the browser to post some data to that URL whenever a user enables, or disables, Enterprise Mode. That is great! Well wait, what do we need to set up to collect the data again?

There are a few posts out in the interwebs that describe how to do this already.  Some people use a web page to write to a text file, and Microsoft even released a way for us to collect this data with ConfigMgr (https://msdn.microsoft.com/en-us/library/dn833204.aspx). Great, problem solved!  Well, unfortunately ConfigMgr may not be everywhere I need it, or maybe I don't want to collect all this data in my 'already too big for my own good' CM database.  I thought I would share what I have done to make my life a little easier when someone asks "who is using Enterprise Mode, and for which URLs?".  So let's set up an IIS site with a page that collects the data and then writes it into a database for us. Basically we set up IIS, set up the database, then drop an .aspx file into our website to write the data to the database for us. **Small disclaimer here: I am a systems admin and not an application developer. So please test this, use the appropriate data locations, and please change the passwords.  Also, let me know what I messed up!**

First off, we need a web server. For the purpose of this article I set up Server 2012 R2 with SQL Express as a starting point. Next I ran the following commands to set up the IIS role on the server.

DISM /online /enable-feature /featurename:NetFx4 /all
DISM /online /enable-feature /featurename:IIS-WebServerRole
DISM /online /enable-feature /featurename:IIS-ASPNET45 /all
DISM /online /enable-feature /featurename:WCF-HTTP-Activation

Next, let's make a directory for our IIS site and database, then create and configure the IIS site using the commands below.

::Make our web dir
mkdir c:\ie11em
::Make our SQL dir
mkdir c:\sql
::Stop the Default IIS site
c:\windows\system32\inetsrv\appcmd stop site /site.name:"Default Web Site"
::Create our IE11EM IIS site
c:\windows\system32\inetsrv\appcmd.exe add site /name:IE11EM /id:2 /physicalPath:c:\ie11em /bindings:http/*:80:
::Enable windows auth (so we can capture user names)
c:\windows\system32\inetsrv\appcmd set config /section:windowsAuthentication /enabled:true
::Disabled Anonymous Auth
c:\windows\system32\inetsrv\appcmd set config /section:anonymousAuthentication /enabled:false

Now to set up the database, let's open up SQL Management Studio and run the following script.  This will also create the SQL auth account for the webpage to write to the database.  You will need to enable SQL auth on your SQL instance prior to running this: in SQL Mgmt Studio, right-click on your server name and select Properties, then select the Security node and set the Server Authentication option to 'SQL Server and Windows Authentication mode' (aka Mixed Mode).  Then restart your SQL instance for the changes to take effect.
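If you'd rather script that mixed-mode switch than click through the GUI, here's a hedged sketch using the undocumented xp_instance_regwrite extended procedure to flip the instance's LoginMode registry value; this is an assumption-laden shortcut, the Properties dialog steps above are the supported route, and you still have to restart the instance either way:

```sql
-- Sketch only: set the instance to 'SQL Server and Windows Authentication mode'.
-- xp_instance_regwrite is undocumented; use the SSMS Properties dialog if in doubt.
USE [master]
GO
EXEC xp_instance_regwrite
    N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'LoginMode',
    REG_DWORD,
    2;  -- 1 = Windows Authentication only, 2 = Mixed Mode
-- Restart the SQL instance afterward for the change to take effect.
```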

/************************** Start IE11EM.SQL **************************/
USE [master]
GO
/****** Object: Database [IE11EntModeDB] ******/
CREATE DATABASE [IE11EntModeDB]
CONTAINMENT = NONE
ON PRIMARY
( NAME = N'IE11EntModeDB', FILENAME = N'c:\SQL\IE11EntModeDB.mdf' , SIZE = 3072KB , MAXSIZE = UNLIMITED, FILEGROWTH = 1024KB )
LOG ON
( NAME = N'IE11EntModeDB_log', FILENAME = N'c:\SQL\IE11EntModeDB_log.ldf' , SIZE = 1024KB , MAXSIZE = 8GB , FILEGROWTH = 10%)
GO
ALTER DATABASE [IE11EntModeDB] SET COMPATIBILITY_LEVEL = 110
GO

USE [IE11EntModeDB]
GO

/****** Object: User [svc.ie11entmode] ******/
CREATE LOGIN [svc.ie11entmode] WITH PASSWORD = '%YourPasswordHere%';
GO
CREATE USER [svc.ie11entmode] FOR LOGIN [svc.ie11entmode] WITH DEFAULT_SCHEMA=[dbo]
GO
ALTER ROLE [db_datareader] ADD MEMBER [svc.ie11entmode]
GO
ALTER ROLE [db_datawriter] ADD MEMBER [svc.ie11entmode]
GO
/****** Object: Table [dbo].[IEEntMode] ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[IEEntMode](
[key] [int] IDENTITY(1,1) NOT NULL,
[Source] [nvarchar](128) NULL,
[Url] [nvarchar](2048) NULL,
[Mode] [nvarchar](5) NULL,
[User] [nvarchar](64) NULL,
[TimeStamp] [datetime] NOT NULL CONSTRAINT [DF_IEEntMode_TimeStamp] DEFAULT (getdate()),
CONSTRAINT [PK_IEEntMode] PRIMARY KEY CLUSTERED
(
[key] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

GO
/****** Object: View [dbo].[vwIE11EnterpriseModeCountSummary] ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

CREATE VIEW [dbo].[vwIE11EnterpriseModeCountSummary]
AS
SELECT Url, Mode, COUNT(Mode) AS tCount
FROM dbo.IEEntMode
WHERE (Url <> '')
GROUP BY Url, Mode


GO
/****** Object: View [dbo].[vwIE11EnterpriseModeDomainDetail] ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

CREATE VIEW [dbo].[vwIE11EnterpriseModeDomainDetail]
AS
SELECT LEFT(Url, CHARINDEX('/', Url, 9) - 1) AS Domain, Mode
FROM dbo.IEEntMode
WHERE (Url <> '')
GO

USE [master]
GO

ALTER DATABASE [IE11EntModeDB] SET READ_WRITE
GO
/************************** End IE11EM.SQL **************************/

Alright, almost there. The last step is to place the aspx page in our web directory. Save the code below as ie11em.aspx into the c:\ie11em directory created earlier.  Make certain to substitute your server's name in the connection string where you see "%YourServerNameHere%" listed.

/************************** Start ie11em.aspx **************************/

<html><head><title>IEEM Logging</title></head><body>

<%
'Dim objFSO, objTextFile, csvFile

Response.write("'" & Request.ServerVariables("REMOTE_HOST") & "','" & Request.ServerVariables("AUTH_USER") & "','" & Request.Form("URL") & "','" & Request.Form("EnterpriseMode") & "'")

Dim sqlConnection1 As New System.Data.SqlClient.SqlConnection("Server=%YourServerNameHere%; Database=IE11EntModeDB; User Id=svc.ie11entmode; Password=%YourPasswordHere%;")

Dim cmd As New System.Data.SqlClient.SqlCommand
cmd.CommandType = System.Data.CommandType.Text
cmd.CommandText = "INSERT IEEntMode ([Source],[User],[Url],[Mode]) VALUES ('" & Request.ServerVariables("REMOTE_HOST") & "','" & Request.ServerVariables("AUTH_USER") & "','" & Request.Form("URL") & "','" & Request.Form("EnterpriseMode") & "')"
cmd.Connection = sqlConnection1

sqlConnection1.Open()
cmd.ExecuteNonQuery()
sqlConnection1.Close()

cmd = Nothing
sqlConnection1 = Nothing

%>
</body></html>

You can now test your IE 11 data collection site by putting "http://%YourServerNameHere%/ie11em.aspx" in your browser.  If everything is working you will see '192.168.1.24','domain\username','','' appear in the browser, where 192.168.1.24 is the IP of the system you opened the browser on, and domain\username is the user you are logged in with.  Since we are not passing the same variable strings the browser does, you will see blanks in the last 2 values.  Once you plug this URL value into the registry key to enable Enterprise Mode, you will also see the URL passed from the browser, and a binary On or Off value.  (HKLM\Software\Policies\Microsoft\Internet Explorer\Main\EnterpriseMode enable="http://%YourServerNameHere%/ie11em.aspx")

Once you have updated the registry, restarted IE, and enabled Enterprise Mode on a few sites, use the query below to see the data you are collecting.

SELECT [key],[Source],[Url],[Mode],[User] ,[TimeStamp] FROM [dbo].[IEEntMode]
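The two views the IE11EM.SQL script created are there so you don't have to write the rollups yourself; for example, to see which URLs users are toggling most:

```sql
-- Per-URL counts of Enterprise Mode On/Off events (view created by IE11EM.SQL)
SELECT Url, Mode, tCount
FROM dbo.vwIE11EnterpriseModeCountSummary
ORDER BY tCount DESC;

-- Or roll it up by domain with the detail view
SELECT Domain, Mode, COUNT(*) AS tCount
FROM dbo.vwIE11EnterpriseModeDomainDetail
GROUP BY Domain, Mode
ORDER BY tCount DESC;
```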

Next up in part 2:  A few SSRS reports to rollup the data and a runbook to rollout the sitelist to, you guessed it, the website above.

Until next time, cheers!

MMS

MMS MoA Nov 10-11-12 2014

For those of you who attended our last meeting, you might recall our survey about a conference this fall.   Well it's now official:  The Midwest Management Summit.  Watch the Twitter feeds for #MMSMinnesota and #MMS.  And learn about the conference at our site: http://mms.mnscug.org

And follow MMS on Facebook: https://www.facebook.com/MidwestManagementSummitMN

MNSCUG June Meeting Notes

Fred's notes from the Active Directory Best Practices session by Robert Wakefield of NowMicro:

Back up group policies via GPMC or script. Most GP admins do not back up group policy objects.

It's not a bad idea to back up directly from the GPMC.

Use PowerShell to back up GPOs; this can be scheduled. Group policy backup via PS doesn't get links or security, etc., only the object itself.

Use a central store to gather and distribute ADMX files.

Purging Data from a SQL DB, Selectively.

I had this odd request to restore a database from backup, then clear all the data from the database with the exception of a dozen tables.  OK, I can do that, but there are 1,200 tables and I have other work to do too.  And you may notice from the tables that it was the CM database.  So let's use a cursor against the list of tables and truncate some data!

USE CM_SMS

declare @name nvarchar(256)

declare cur CURSOR LOCAL for
    SELECT name
    FROM sys.Tables
    --Keep data in these tables
    where name not IN ('Computer_System_Data','Network_Data','PC_Bios_Data','Operating_System_Data','Processor_Data','PC_Memory_Data','System_Enclosure_DATA','Disk_Data','Video_Controller_Data','WorkstationStatus_DATA','SYSTEM_CONSOLE_USER_DATA','Add_Remove_Programs_64_Data','Add_Remove_Programs_Data','SoftwareFile','System_DISC')

open cur
fetch next from cur into @name
while @@FETCH_STATUS = 0 BEGIN
    BEGIN TRY
        EXECUTE('TRUNCATE TABLE ' + @name)
        print 'Success: ' + @name
    END TRY
    BEGIN CATCH
        print 'Failed: ' + @name
    END CATCH
    fetch next from cur into @name
END
close cur
deallocate cur

Copyright © 2019 - The Twin Cities Systems Management User Group