MIM2016: Using Azure MFA in an Authorization Workflow with PowerShell

While thinking about Azure MFA and its use in MIM for password reset or as an authorization step when requesting a PAM role, I thought to myself: why not use this as an activity in an authorization workflow, for example when requesting a group membership? Sadly, you cannot configure the OOB MFA activities that come with MIM.

So why not do it on my own, using the Azure MFA SDK? And I found out it’s quite simple so far.
This demo approves a member join to a group via Azure MFA with a phone call: you have to answer the call and confirm with # to get into the group. The MobilePhone attribute of your MIM Portal users has to be set to a valid number for this demo to work.

MIMWAL: Time limited group membership (aka simple PAM solution)

Yes, it’s me again, and yes, with MIMWAL again 😉

When talking with people about my Privileged Access Management (PAM) scenario, I often get asked if the dedicated PAM forest is required. The answer is yes; this is by design and also a very important security feature of the solution, as you can never be sure your current forest is not already compromised. You can also harden the PAM forest further, and there are some other benefits.

However, time-limited group memberships can also be useful in a single-forest/domain scenario. So I played around a bit in my demo lab and tried to build a simple PAM-like solution with the help of the Microsoft Workflow Activity Library (MIMWAL).

Description and benefits of my demo scenario:

  • Time-limited group membership
  • Duration of group membership can be modified
  • Can be initiated by users directly or by admins/helpdesk
  • Users get notified when their group membership expires
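The core mechanic behind the first two bullets is simply computing an expiration timestamp when the membership is requested. As a sketch (the function name and the per-request duration parameter are my own illustration, not part of the MIMWAL solution itself):

```powershell
# Hypothetical helper: compute when a requested group membership should expire.
# The requester (or an admin/helpdesk user) can override the default duration.
function Get-MembershipExpirationTime {
    param(
        [datetime]$RequestTime,
        [int]$DurationHours = 8   # default membership duration
    )
    return $RequestTime.AddHours($DurationHours)
}
```

In the real solution such a timestamp would be written to a custom attribute by a workflow activity, and a set transition on that attribute removes the member and sends the expiry notification once the time has passed.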


MIMWAL: Add new users to default groups

I recently started to have a look at the Microsoft Workflow Activity Library (WAL, or MIM/FIM WAL) that was released to the public some time ago.

In my current projects I have used the PowerShell activity a lot to do things that can’t be done with the OOB functions that come with FIM/MIM.

One of those things is doing a one-time member add to default groups for new users. I’ve done this with PowerShell, but you have to make use of the FIMAutomation cmdlets, which do their updates through the FIM/MIM web service, and as everyone knows that is not the fastest way. I could get some performance enhancements using the Lithnet PowerShell module.
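For illustration, such a one-time add could look roughly like this with the Lithnet Resource Management PowerShell module. The service address, account name, and group names are placeholders for your environment, so treat this as a hedged sketch rather than the exact solution:

```powershell
# Sketch: add a new user to a list of default groups via the Lithnet RMA module.
Import-Module LithnetRMA
Set-ResourceManagementClient -BaseAddress 'http://fimservice:5725'

# Look up the new user by account name (placeholder value).
$user = Get-Resource -ObjectType Person -AttributeName AccountName -AttributeValue 'jdoe'

# Add the user to each default group's manually-managed membership.
foreach ($groupName in @('All Employees', 'Intranet Users')) {
    $group = Get-Resource -ObjectType Group -AttributeName DisplayName -AttributeValue $groupName
    $group.ExplicitMember.Add($user.ObjectID)
    Save-Resource $group
}
```

This talks to a live FIM/MIM service, so it only runs where the Lithnet module is installed and the service is reachable.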

So I took a look at how to do that with MIMWAL, and here are the results:

Using AuthZ Workflows on FIM built-in service account changes [Workaround]

As everybody knows, the two FIM built-in accounts “Forefront Identity Manager Service Account” and “Built-in synchronization account” bypass all AuthZ workflows.

So by default you are not able to require approvals, for example, for changes made by these accounts.
In addition you cannot have AuthZ workflows on set transition, only Action workflows are allowed here.

But a customer wanted accounts to be finally deleted 180 days after deactivation.
This action should be approved by a helpdesk administrator, because there are some manual, non-technical tasks to do before it happens.

Hmmm, so with the above restrictions, what to do?

I used the FIM PowerShell Activity a lot in that customer solution, and I remembered that changes made by this activity run in the context of a normal user account (from FIM’s perspective), namely the service account of the FIM web service (svcFIMService in my case).

In order to allow updates to the FIM service by this account via the Export-FIMConfig and Import-FIMConfig cmdlets, I created this account in the portal and granted it permissions on the necessary objects.
If it does not exist, just create the account with the following attributes set:

  • DisplayName
  • AccountName (sAMAccountName from FIM webservice account in AD)
  • Domain
  • ObjectSID (from AD)

(You should create this account manually, as I ran into trouble when I tried to synchronize it to the FIM Portal.)

How to use this:

I created a workflow with the PowerShell activity which sets an attribute I created on the user account, let’s say DoFinalDelete, to true.

I created an MPR which fires this workflow when users transition into my set “Users with disableDate older than 180 days”.
(By the way, this disableDate is also set by a PowerShell workflow activity, as you can imagine.)
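The set’s transition condition boils down to a simple date comparison. As a sketch (the function name is mine; disableDate is the custom attribute mentioned above):

```powershell
# Hypothetical check mirroring the set condition:
# is the account's disableDate older than 180 days?
function Test-FinalDeleteDue {
    param(
        [datetime]$DisableDate,
        [datetime]$Now = (Get-Date),
        [int]$RetentionDays = 180
    )
    return ($Now - $DisableDate).TotalDays -gt $RetentionDays
}
```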

Now I’m able to create an MPR with an AuthZ workflow to approve this change made by the account svcFIMService, and after approval it can trigger all the other MPRs and workflows I want.
So in my scenario I import the DoFinalDelete attribute into the MV and trigger deprovisioning on the objects in the provisioning code of my MV extension using the DeprovisionAll() method, which then triggers all the defined actions on my MAs according to their deprovisioning configurations.

So once again this great piece of code, the FIM PowerShell Activity from Craig Martin and Brian Desmond, is like a Swiss army knife for me. (Thanks, guys!)
You can do nearly everything with PowerShell and only have to maintain one custom activity in the FIM Portal, which makes upgrades and migrations much easier.

FIM 2010: Configuration Deployment (Part 2: Configuration Objects)

As you hopefully read in Part 1, we often have to deal with deploying configuration from one stage to another. Deploying the schema is the easier part, as it mostly goes 1:1 from dev to test and finally production; but I also have to deal with environment-specific differences in configuration objects like Sets, Workflows or MPRs.

In synchronization rules, for example, I often use fixed strings, such as the domain, or construct DNs for provisioning objects to LDAP directories, and many other attributes. I also have some sets which must differ between environments, as in the testing stage more people have permission to do certain things than in production; so I want to manage some of these sets manually and don’t want them to be deployed.

So, as in the schema deployment script, I put all the example scripts together and start the exports in parallel using PowerShell jobs. In addition, after joining and calculating the delta configuration, I do a search and replace of the fixed string values and even delete some objects which I don’t want to deploy. These replacements and deletions are defined in a configuration XML file.

So here is the script I use to deploy my configuration changes from one environment to the other:
(This script is currently designed to run from the target system)

param([switch]$AllLocales)
# use -AllLocales parameter to deploy schema data incl. localizations

#### Configuration Section ####
$sourceFilename="D:\Deploy\Policy_Source.xml"
$sourceServer="fim.source.com"
$destFilename="D:\Deploy\Policy_Target.xml"
$destServer="fim.target.com"

$tempDeltaFile = "D:\Deploy\Policy_DeltaTemp.xml"
$finalDeltaFile = "D:\Deploy\Policy_DeployData.xml"
$undoneFile = "D:\Deploy\undone.xml"

$config=[XML](Get-Content .\DeployReplaceConfig.xml)

#### Cleanup old data files ####
remove-item $sourceFilename -ErrorAction:SilentlyContinue
remove-item $destFilename -ErrorAction:SilentlyContinue
remove-item $tempDeltaFile -ErrorAction:SilentlyContinue
remove-item $finalDeltaFile -ErrorAction:SilentlyContinue
remove-item $undoneFile -ErrorAction:SilentlyContinue

#### Load FIMAutomation cmdlets ####
if(@(get-pssnapin | where-object {$_.Name -eq "FIMAutomation"} ).count -eq 0) {add-pssnapin FIMAutomation}

#### Function Definitions ####
function CommitPortalChanges($deployFile)
{
	$imports = ConvertTo-FIMResource -file $deployFile
	if($imports -eq $null)
	  {
		throw (new-object NullReferenceException -ArgumentList "Changes is null.  Check that the changes file has data.")
	  }
	Write-Host "Importing changes into T&A environment"
	$undoneImports = $imports | Import-FIMConfig
	if($undoneImports -eq $null)
	  {
		Write-Host "Import complete."
	  }
	else
	  {
		Write-Host
		Write-Host "There were " $undoneImports.Count " uncompleted imports."
		$undoneImports | ConvertFrom-FIMResource -file $undoneFile
		Write-Host
		Write-Host "Please see the documentation on how to resolve the issues."
	  }
}

function SyncPortalConfig($sourceFile, $destFile, $tempFile)
{
	$joinrules = @{
		# === Customer-dependent join rules ===
		# Person and Group objects are not configuration and will not be migrated.
		# However, some configuration objects like Sets may refer to these objects.
		# For this reason, we need to know how to join Person objects between
		# systems so that configuration objects have the same semantic meaning.
		Person = "AccountName";
		Group = "DisplayName";

		# === Policy configuration ===
		# Sets, MPRs, Workflow Definitions, and so on are best identified by DisplayName.
		# DisplayName is set as the default join criterion and applied to all object
		# types not listed here.
		#"ma-data" = "Description";

		# === Schema configuration ===
		# This is based on the system names of attributes and objects
		# Notice that BindingDescription is joined using its reference attributes.
		ObjectTypeDescription = "Name";
		AttributeTypeDescription = "Name";
		BindingDescription = "BoundObjectType BoundAttributeType";

		# === Portal configuration ===
		ConstantSpecifier = "BoundObjectType BoundAttributeType ConstantValueKey";
		SearchScopeConfiguration = "DisplayName SearchScopeResultObjectType Order";
		ObjectVisualizationConfiguration = "DisplayName AppliesToCreate AppliesToEdit AppliesToView"
	}

	$destination = ConvertTo-FIMResource -file $destFile
	if($destination -eq $null)
		{ throw (new-object NullReferenceException -ArgumentList "destination Schema is null.  Check that the destination file has data.") }

	Write-Host "Loaded destination file: " $destFile " with " $destination.Count " objects."

	$source = ConvertTo-FIMResource -file $sourceFile
	if($source -eq $null)
		{ throw (new-object NullReferenceException -ArgumentList "source Schema is null.  Check that the source file has data.") }

	Write-Host "Loaded source file: " $sourceFile " with " $source.Count " objects."
	Write-Host
	Write-Host "Executing join between source and destination."
	Write-Host
	$matches = Join-FIMConfig -source $source -target $destination -join $joinrules -defaultJoin DisplayName
	if($matches -eq $null)
		{ throw (new-object NullReferenceException -ArgumentList "Matches is null.  Check that the join succeeded and join criteria is correct for your environment.") }
	Write-Host "Executing compare between matched objects in source and destination."
	$changes = $matches | Compare-FIMConfig
	if($changes -eq $null)
		{ throw (new-object NullReferenceException -ArgumentList "Changes is null.  Check that no errors occurred while generating changes.") }
	Write-Host "Identified " $changes.Count " changes to apply to destination."
	$changes | ConvertFrom-FIMResource -file $tempFile
	Write-Host "Sync complete."
}

$functions = {
	function GetPortalConfig($filename, $serverFQDN, $creds, $AllLocales)
	{
		if(@(get-pssnapin | where-object {$_.Name -eq "FIMAutomation"} ).count -eq 0) {add-pssnapin FIMAutomation}
		$uri="http://" + $serverFQDN + ":5725/ResourceManagementService"
		if ($AllLocales -eq $true)
			{ $policy = Export-FIMConfig -uri $uri -credential $creds -policyConfig -portalConfig -MessageSize 9999999 -AllLocales }
		else
			{ $policy = Export-FIMConfig -uri $uri -credential $creds -policyConfig -portalConfig -MessageSize 9999999 }
		$policy | ConvertFrom-FIMResource -file $filename
	}
}

#### Main Script ####
$creds=Get-Credential -message "Enter credentials for source FIM"
$myargs=@($sourceFileName, $sourceServer, $creds, $AllLocales.IsPresent)
start-job -name "SourceFIM" -init $functions -script { GetPortalConfig $args[0] $args[1] $args[2] $args[3] } -ArgumentList $myargs

$creds=Get-Credential -message "Enter credentials for destination FIM"
$myargs=@($destFileName, $destServer, $creds, $AllLocales.IsPresent)
start-job -name "DestFIM" -init $functions -script { GetPortalConfig $args[0] $args[1] $args[2] $args[3] } -ArgumentList $myargs

write-host "Waiting for Policy Export to complete..."
get-job | wait-job

Write-Host "Exports complete: Starting Policy compare..."
Write-Host
SyncPortalConfig $sourceFilename $destFilename $tempDeltaFile

Write-Host "`nReplace configuration data with destination values"
$portalData=(get-content $tempDeltaFile)

foreach ($ReplaceConfig in $config.DeployConfig.ReplaceData)
{
	$newPortalData=@()
	foreach ($CurrentLine in $portalData)
	{
		$newPortalData+=$CurrentLine.replace($ReplaceConfig.SearchString, $ReplaceConfig.ReplaceString)
	}
	$portalData=$newPortalData
}

$portalData | set-content $finalDeltaFile

Write-Host "`nRemoving excluded Objects from deploy data"
$deployXML=[XML](get-content $finalDeltaFile)

foreach ($ReplaceConfig in $config.DeployConfig.IgnoreData)
{
	$deleteObjects=$deployXML.Results.ImportObject | where { $_.TargetObjectIdentifier -eq "urn:uuid:"+$ReplaceConfig.SearchGUID }
	foreach ($delObj in $deleteObjects)
	{
		write-host "Deleting " $delObj.ObjectType ": " $delObj.AnchorPairs.JoinPair.AttributeValue
		try
		{
			$deployXML.Results.RemoveChild($delObj) | out-null
		}
		catch [System.Management.Automation.MethodInvocationException]
		{
			write-host "Ignoring " $delObj.ObjectType ": " $delObj.AnchorPairs.JoinPair.AttributeValue " not in DeploymentData"
		}
	}
}

$deployXML.Save($finalDeltaFile)
Write-Host "`n`nFinal file for deployment created: " $finalDeltaFile

$input=Read-Host "Do you want commit changes to destination FIM ? (y/n)"
if ($input -eq "y")
{
	CommitPortalChanges $finalDeltaFile
}

This is the configuration XML file. In IgnoreData, the Name attribute only documents which object this is; all searching is done on the GUID, which is the GUID of the object in the target system.

<?xml version="1.0" encoding="utf-8"?>
<DeployConfig>
	<ReplaceData Name="NetbiosDomain">
		<SearchString>SOURCE-DOM</SearchString>
		<ReplaceString>TARGET-DOM</ReplaceString>
	</ReplaceData>
	<ReplaceData Name="DomainSID">
		<SearchString>S-1-5-21-xxxxxxxxx-xxxxxxxxxx-xxxxxxxxx</SearchString>
		<ReplaceString>S-1-5-21-xxxxxxxxxx-xxxxxxxxx-xxxxxxxxx</ReplaceString>
	</ReplaceData>
	<ReplaceData Name="LDAPDomain1">
		<SearchString>DC=source,DC=com</SearchString>
		<ReplaceString>DC=target,DC=com</ReplaceString>
	</ReplaceData>
	<IgnoreData Name="Set: All GroupAdmins">
		<SearchGUID>95833928-23e0-4e3d-bcc0-b824f3a8e123</SearchGUID>
	</IgnoreData>
	<IgnoreData Name="Set: All UserAdmin">
		<SearchGUID>28279e59-14b7-4b89-9d8a-b4c666a65301</SearchGUID>
	</IgnoreData>
</DeployConfig>

Due to the use of PowerShell jobs, if you get any errors or exceptions you can retrieve the output of both background jobs with the following commands:

Receive-Job -Name SourceFIM
Receive-Job -Name DestFIM

The script also creates the undone.xml file like the original join script, so you can use the original ResumeUndoneImport.ps1 script to retry importing the unprocessed changes.
In addition, I have built a little helper script that generates the XML for the objects I don’t want to be deployed and copies it directly to the clipboard, so you can easily add a lot of objects in a short time.

PARAM([string]$DisplayName)
set-variable -name URI -value "http://localhost:5725/resourcemanagementservice" -option constant

Write-Host "- Reading Set Information from Portal"
if(@(get-pssnapin | where-object {$_.Name -eq "FIMAutomation"} ).count -eq 0)
{add-pssnapin FIMAutomation}

$exportObject = export-fimconfig -uri $URI `
                               -onlyBaseResources `
                               -customconfig ("/Set[DisplayName='$DisplayName']")
if($exportObject -eq $null) {throw "Cannot find a set by that name"}
$ResourceID = $exportObject.ResourceManagementObject.ResourceManagementAttributes | `
                Where-Object {$_.AttributeName -eq "ObjectID"}

Write-Host "DisplayName: $DisplayName has ObjectID: " $ResourceID.Value
Write-Host
Write-Host "The following output is added to your clipboard:"
Write-Host
$objectGUID=$ResourceID.Value.Replace("urn:uuid:","")
$output=@()
$output+="`t<IgnoreData Name=`"Set: $DisplayName`">"
$output+="`t`t<SearchGUID>$objectGUID</SearchGUID>"
$output+="`t</IgnoreData>"
$output | clip
$output
Write-Host

Notes:

This script currently has one small issue: objects affected by a replacement are deployed every time you run the script, because the delta calculation sees differences between the environments and the replacement happens only after the join and compare. That’s okay; it still works fine.

I may update the script in the near future to check whether the replacement has already been done in order to avoid this, but I haven’t found a way to do so yet.
