FIM 2010 R2: sync-rule-invalid-xml-attribute-flow and unable to update FIM Service ma/mv data

I spent several hours on a dev-stage FIM 2010 R2 server at a customer site that was throwing the following error on synchronizations, mainly on the FIM MA:




AADConnect: User Writeback: Filtering user objects from the cloud

I recently installed Preview #2 of Azure Active Directory Connect (AADConnect) in my test lab with the user write-back feature enabled.

Sadly, there is currently no way to filter objects created in the cloud so that they are not provisioned to the on-premises directory.

I have already provided that as feedback via Connect, and I assume there will be some out-of-the-box filtering in a future/final release.

As a workaround, you can modify the sync rules on your own as follows:


PowerShell Activity: Issues with GUIDs in Workflow Activities and Sync Rules

I recently faced a problem with GUIDs generated in a PowerShell workflow activity. As you can see in my previous blog posts, I use the FIM PowerShell Workflow Activity a lot (nearly all the time).

Currently I’m working on provisioning user accounts with Exchange mailboxes; in addition, I have to activate/create the Online Archive for users.

I’m following this blog article by Eihab Isaac for the correct attributes to set, except that I want to do all of this with portal sync rules and declarative provisioning.
If you take a look at the article, you can see that you have to provide a new GUID in the msExchArchiveGUID attribute in order to get the archive feature to work.
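As the post title suggests, GUIDs generated in code can bite you in subtle ways. One classic trap when writing a fresh GUID into an AD binary attribute such as msExchArchiveGUID is byte ordering: .NET's Guid.ToByteArray() stores the first three fields little-endian, so the raw bytes don't read like the GUID string. A small Python sketch of that difference (my own illustration, not code from the article):

```python
import uuid

# A fixed GUID so the two byte layouts are easy to compare.
guid = uuid.UUID("12345678-1234-5678-1234-567812345678")

# Straight big-endian bytes, matching how the string reads:
print(guid.bytes.hex())     # 12345678123456781234567812345678

# Mixed-endian layout, equivalent to .NET Guid.ToByteArray():
print(guid.bytes_le.hex())  # 78563412341278561234567812345678
```

If a workflow activity hands the GUID over as a plain string, make sure the consuming side converts it using the byte order the directory expects.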

Custom expression function in sync rule doesn’t return NULL

Some days ago I faced a problem at a customer site with datetime attribute flows.
Users have separation dates as long as they have a limited contract, but when they get an unlimited contract the separation date is removed (NULL in the database).

I got a response from a system owner that dates for users with unlimited contracts were not being cleared in their system.
My first thought was that I might have forgotten to set the “Allow Null on export” flag, but it was set on that attribute, which is exported as a string value.

My sync rule’s custom expression looked like this:

DateTimeFormat(SeparationDate,"yyyy-MM-dd") => ContractEndDate
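To make the pitfall concrete, here is a tiny Python model of the flow logic (purely my own sketch, not FIM code): if the format function turns a missing date into an empty string instead of a real null, the connected system never receives an attribute delete, even with “Allow Null on export” set.

```python
from datetime import date

def format_date_buggy(d):
    # Models a format function that never returns a real null:
    # a missing input silently becomes an empty string.
    return d.strftime("%Y-%m-%d") if d else ""

def format_date_fixed(d):
    # Returns an actual None, so the export can clear the target attribute.
    return d.strftime("%Y-%m-%d") if d else None

print(format_date_buggy(date(2015, 3, 31)))  # 2015-03-31
print(format_date_buggy(None))               # "" -- looks empty, but is not null
print(format_date_fixed(None))               # None -- allows the attribute delete
```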


Remove leading zeros from attribute values with a portal sync rule custom expression

Note to Self.

Today I had the requirement of removing leading zeros from the employeeID attribute.
The special situation is that employeeIDs can range from 1 to 5 characters, like:

00002, 00013, 00204 and so on.

Looking at the functions available in sync rules, my first thought was that this would not be possible, but sometimes things are easier than they look.

Simply replacing the 0 (zero) with spaces, then performing an LTrim, and after that replacing the spaces back with 0 (zero) works very well.

So the portal sync rule custom expression goes like this:

ReplaceString(LTrim(ReplaceString(employeeID,"0"," "))," ","0")
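The same three steps are easy to verify outside FIM; here is the equivalent logic in Python (my own illustration):

```python
def strip_leading_zeros(employee_id: str) -> str:
    # Mirrors the sync rule expression:
    # 1. replace every "0" with a space,
    # 2. trim spaces from the left (LTrim),
    # 3. turn the remaining spaces back into "0".
    return employee_id.replace("0", " ").lstrip(" ").replace(" ", "0")

for value in ["00002", "00013", "00204"]:
    print(value, "->", strip_leading_zeros(value))
# 00002 -> 2, 00013 -> 13, 00204 -> 204
```

Embedded zeros survive because they are turned back in step 3; just note that a value consisting only of zeros would end up as an empty string.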


Sync Engine crash using outbound scope filter in sync rules

Sorry for the break, I will try to post more frequently in the near future.

Sometimes a very small mistake can lead to a bigger problem that you can spend a lot of time on.
This happened to me with FIM sync rules some days ago.

I created a sync rule in the FIM Portal to import some objects from Active Directory. I used the sync rule method without using EREs, so I set up an Outbound Scope Filter, like in the screenshot below.


The outbound scope filter just checks a string and a Boolean attribute to determine the objects to which outbound sync should apply. But exactly there I made my small mistake.
I ran a delta import on the FIM MA followed by a delta sync and got an error that the Sync Engine service had stopped (stopped-server).


No other information was displayed, and the event log only showed some not very useful errors.

Faulting application name: miiserver.exe, version: 4.1.3510.0, time stamp: 0x5307e22e
Faulting module name: ntdll.dll, version: 6.1.7601.18229, time stamp: 0x51fb164a
Exception code: 0xc0000374
Fault offset: 0x00000000000c4102
Faulting process id: 0x2c68
Faulting application start time: 0x01cf86eb70f151bf
Faulting application path: D:\FIM\Synchronization Service\Bin\miiserver.exe
Faulting module path: C:\Windows\SYSTEM32\ntdll.dll
Report Id: b4f78312-01e0-11e4-adbb-3c4a927120f6

Using the preview function on the management agent instead of a normal delta sync threw another error:
Unable to get preview XML from server, the remote procedure call failed.


So I spent some hours investigating this error and tried a lot of things. One of the last things I tried was to remove one of the outbound scope filter rules: I removed the userclass string attribute and tested my sync rule with only the Boolean attribute in the scope filter.
Now I got a sync-rule-scoping-filter-invalid-xml error together with another crash of the Sync Engine.


I took a deeper look at my scope filter and, just for fun, changed the value from “True” to “true” (upper/lowercase).
Another delta import followed by a delta sync, and the sync rule was projected into the metaverse.

So I spent 3 hours on just an upper/lowercase typo, thanks a lot FIM. 😉
I tried to reproduce the error in another environment and also got the sync-rule-scoping-filter-invalid-xml error, but that Sync Engine didn’t crash. It seems to be a combination of maybe the patch level and other factors that led to the complete crash of the service.

So everyone, be careful using compare values on Boolean attributes in scope filters.
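The lowercase requirement makes sense when you remember that the scope filter ends up serialized as XML: the lexical space of xsd:boolean is exactly true, false, 1, and 0, all case-sensitive, so “True” is simply not a valid boolean literal. A strict parser sketched in Python (my illustration, not FIM internals):

```python
def parse_xsd_boolean(text: str) -> bool:
    # xsd:boolean accepts exactly these case-sensitive lexical forms.
    mapping = {"true": True, "1": True, "false": False, "0": False}
    if text not in mapping:
        raise ValueError(f"invalid xsd:boolean literal: {text!r}")
    return mapping[text]

print(parse_xsd_boolean("true"))   # True
print(parse_xsd_boolean("0"))      # False
# parse_xsd_boolean("True") would raise ValueError, just like the
# scope filter rejecting the capitalized value.
```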


FIM 2010: Configuration Deployment (Part 2: Configuration Objects)

As you hopefully read in Part 1, we often have to deal with deploying configuration from one stage to another. Deploying the schema is the easier part, as it mostly goes 1:1 from dev to test and finally production, but I also have to deal with environment-specific differences in configuration objects like Sets, Workflows, or MPRs.

In synchronization rules, for example, I often use fixed strings, like domain names, or construct DNs for provisioning objects to LDAP directories, among many other attributes. I also have some sets which must differ between environments, as in the testing stage more people have permission to do certain things than in production, so I want to manage some of these sets manually and don’t want them to be deployed.

So, like in the schema deployment script, I put all the example scripts together and start the exports in parallel using PowerShell jobs. In addition, after joining and calculating the delta configuration, I do a search-and-replace of the fixed string values and even delete some objects which I don’t want to deploy. These replacements and deletions are defined in a configuration XML file.

So here is the script I use to deploy my configuration changes from one environment to the other:
(This script is currently designed to run from the target system)

# use -AllLocales parameter to deploy schema data incl. localizations
param(
	[string]$sourceServer,
	[string]$destServer,
	[switch]$AllLocales
)

#### Configuration Section ####

# Adjust the file paths to your environment
$sourceFileName = "D:\Deploy\Policy_SourceData.xml"
$destFileName   = "D:\Deploy\Policy_DestData.xml"
$tempDeltaFile  = "D:\Deploy\Policy_DeltaTemp.xml"
$finalDeltaFile = "D:\Deploy\Policy_DeployData.xml"
$undoneFile     = "D:\Deploy\undone.xml"

$config = [XML](Get-Content .\DeployReplaceConfig.xml)

#### Cleanup old data files ####
Remove-Item $sourceFileName -ErrorAction:SilentlyContinue
Remove-Item $destFileName -ErrorAction:SilentlyContinue
Remove-Item $tempDeltaFile -ErrorAction:SilentlyContinue
Remove-Item $finalDeltaFile -ErrorAction:SilentlyContinue
Remove-Item $undoneFile -ErrorAction:SilentlyContinue

#### Load FIMAutomation cmdlets ####
if(@(Get-PSSnapin | Where-Object {$_.Name -eq "FIMAutomation"}).count -eq 0) { Add-PSSnapin FIMAutomation }

#### Function Definitions ####
function CommitPortalChanges($deployFile)
{
	$imports = ConvertTo-FIMResource -file $deployFile
	if($imports -eq $null)
		{ throw (new-object NullReferenceException -ArgumentList "Changes is null.  Check that the changes file has data.") }
	Write-Host "Importing changes into T&A environment"
	$undoneImports = $imports | Import-FIMConfig
	if($undoneImports -eq $null)
		{ Write-Host "Import complete." }
	else
	{
		Write-Host "There were" $undoneImports.Count "uncompleted imports."
		$undoneImports | ConvertFrom-FIMResource -file $undoneFile
		Write-Host "Please see the documentation on how to resolve the issues."
	}
}

function SyncPortalConfig($sourceFile, $destFile, $tempFile)
{
	$joinrules = @{
		# === Customer-dependent join rules ===
		# Person and Group objects are not configuration and will not be migrated.
		# However, some configuration objects like Sets may refer to these objects.
		# For this reason, we need to know how to join Person objects between
		# systems so that configuration objects have the same semantic meaning.
		Person = "AccountName";
		Group = "DisplayName";

		# === Policy configuration ===
		# Sets, MPRs, Workflow Definitions, and so on are best identified by DisplayName.
		# DisplayName is set as the default join criteria and applied to all object
		# types not listed here.
		#"ma-data" = "Description";

		# === Schema configuration ===
		# This is based on the system names of attributes and objects.
		# Notice that BindingDescription is joined using its reference attributes.
		ObjectTypeDescription = "Name";
		AttributeTypeDescription = "Name";
		BindingDescription = "BoundObjectType BoundAttributeType";

		# === Portal configuration ===
		ConstantSpecifier = "BoundObjectType BoundAttributeType ConstantValueKey";
		SearchScopeConfiguration = "DisplayName SearchScopeResultObjectType Order";
		ObjectVisualizationConfiguration = "DisplayName AppliesToCreate AppliesToEdit AppliesToView"
	}

	$destination = ConvertTo-FIMResource -file $destFile
	if($destination -eq $null)
		{ throw (new-object NullReferenceException -ArgumentList "destination Schema is null.  Check that the destination file has data.") }

	Write-Host "Loaded destination file:" $destFile "with" $destination.Count "objects."

	$source = ConvertTo-FIMResource -file $sourceFile
	if($source -eq $null)
		{ throw (new-object NullReferenceException -ArgumentList "source Schema is null.  Check that the source file has data.") }

	Write-Host "Loaded source file:" $sourceFile "with" $source.Count "objects."
	Write-Host "Executing join between source and destination."
	$matches = Join-FIMConfig -source $source -target $destination -join $joinrules -defaultJoin DisplayName
	if($matches -eq $null)
		{ throw (new-object NullReferenceException -ArgumentList "Matches is null.  Check that the join succeeded and join criteria is correct for your environment.") }
	Write-Host "Executing compare between matched objects in source and destination."
	$changes = $matches | Compare-FIMConfig
	if($changes -eq $null)
		{ throw (new-object NullReferenceException -ArgumentList "Changes is null.  Check that no errors occurred while generating changes.") }
	Write-Host "Identified" $changes.Count "changes to apply to destination."
	$changes | ConvertFrom-FIMResource -file $tempFile
	Write-Host "Sync complete."
}

$functions = {
	function GetPortalConfig($filename, $serverFQDN, $creds, $AllLocales)
	{
		if(@(Get-PSSnapin | Where-Object {$_.Name -eq "FIMAutomation"}).count -eq 0) { Add-PSSnapin FIMAutomation }
		$uri = "http://" + $serverFQDN + ":5725/ResourceManagementService"
		if ($AllLocales -eq $true)
			{ $policy = Export-FIMConfig -uri $uri -credential $creds -policyConfig -portalConfig -MessageSize 9999999 -AllLocales }
		else
			{ $policy = Export-FIMConfig -uri $uri -credential $creds -policyConfig -portalConfig -MessageSize 9999999 }
		$policy | ConvertFrom-FIMResource -file $filename
	}
}

#### Main Script ####
$creds = Get-Credential -Message "Enter credentials for source FIM"
$myargs = @($sourceFileName, $sourceServer, $creds, $AllLocales.IsPresent)
Start-Job -Name "SourceFIM" -InitializationScript $functions -ScriptBlock { GetPortalConfig $args[0] $args[1] $args[2] $args[3] } -ArgumentList $myargs

$creds = Get-Credential -Message "Enter credentials for destination FIM"
$myargs = @($destFileName, $destServer, $creds, $AllLocales.IsPresent)
Start-Job -Name "DestFIM" -InitializationScript $functions -ScriptBlock { GetPortalConfig $args[0] $args[1] $args[2] $args[3] } -ArgumentList $myargs

Write-Host "Waiting for Policy Export to complete..."
Get-Job | Wait-Job

Write-Host "Exports complete: Starting Policy compare..."
SyncPortalConfig $sourceFileName $destFileName $tempDeltaFile

Write-Host "`nReplace configuration data with destination values"
$portalData = Get-Content $tempDeltaFile

foreach ($ReplaceConfig in $config.DeployConfig.ReplaceData)
{
	# Replace environment-specific values (domains, DNs, ...) defined in the config XML
	$portalData = $portalData | ForEach-Object { $_.Replace($ReplaceConfig.SearchString, $ReplaceConfig.ReplaceString) }
}

$portalData | Set-Content $finalDeltaFile

Write-Host "`nRemoving excluded objects from deploy data"
$deployXML = [XML](Get-Content $finalDeltaFile)

foreach ($ReplaceConfig in $config.DeployConfig.IgnoreData)
{
	$deleteObjects = $deployXML.Results.ImportObject | Where-Object { $_.TargetObjectIdentifier -eq ("urn:uuid:" + $ReplaceConfig.SearchGUID) }
	foreach ($delObj in $deleteObjects)
	{
		Write-Host "Deleting" $delObj.ObjectType ":" $delObj.AnchorPairs.JoinPair.AttributeValue
		try
			{ $deployXML.Results.RemoveChild($delObj) | Out-Null }
		catch [System.Management.Automation.MethodInvocationException]
			{ Write-Host "Ignoring" $delObj.ObjectType ":" $delObj.AnchorPairs.JoinPair.AttributeValue "not in DeploymentData" }
	}
}
# Persist the cleaned XML back to the deployment file
$deployXML.Save($finalDeltaFile)

Write-Host "`n`nFinal file for deployment created:" $finalDeltaFile

$answer = Read-Host "Do you want to commit changes to destination FIM? (y/n)"
if ($answer -eq "y")
	{ CommitPortalChanges $finalDeltaFile }

This is the configuration XML file. In IgnoreData the Name attribute is only for documenting what the object is; all matching is done on the GUID, which is the GUID of the object in the target system.

<?xml version="1.0" encoding="utf-8"?>
<DeployConfig>
	<ReplaceData Name="NetbiosDomain" SearchString="..." ReplaceString="..." />
	<ReplaceData Name="DomainSID" SearchString="..." ReplaceString="..." />
	<ReplaceData Name="LDAPDomain1" SearchString="..." ReplaceString="..." />
	<IgnoreData Name="Set: All GroupAdmins" SearchGUID="..." />
	<IgnoreData Name="Set: All UserAdmin" SearchGUID="..." />
</DeployConfig>

Due to the implementation with PowerShell jobs, if you get any errors or exceptions you can retrieve the output of both background jobs with the following commands:

Receive-Job -Name SourceFIM
Receive-Job -Name DestFIM

The script also creates the undone.xml file like the original join script, so you can use the original ResumeUndoneImport.ps1 script to retry importing the unprocessed changes.
In addition, I have built a little helper script that generates the XML for the objects I don’t want to be deployed and copies it directly to the clipboard, so you can easily add a lot of objects in a short time.

param([string]$DisplayName)

Set-Variable -Name URI -Value "http://localhost:5725/resourcemanagementservice" -Option Constant

Write-Host "- Reading Set Information from Portal"
if(@(Get-PSSnapin | Where-Object {$_.Name -eq "FIMAutomation"}).count -eq 0)
	{ Add-PSSnapin FIMAutomation }

$exportObject = Export-FIMConfig -uri $URI `
                               -OnlyBaseResources `
                               -CustomConfig ("/Set[DisplayName='$DisplayName']")
if($exportObject -eq $null) { throw "Cannot find a set by that name" }
$ResourceID = $exportObject.ResourceManagementObject.ResourceManagementAttributes | `
                Where-Object {$_.AttributeName -eq "ObjectID"}

# Build the IgnoreData element for the deployment configuration file.
# The ObjectID value has the form "urn:uuid:<GUID>"; the config file stores only the GUID.
$output = '<IgnoreData Name="Set: ' + $DisplayName + '" SearchGUID="' + $ResourceID.Value.Replace("urn:uuid:", "") + '" />'

Write-Host "DisplayName: $DisplayName has ObjectID: " $ResourceID.Value
Write-Host "The following output is added to your clipboard:"
Write-Host $output
$output | clip


This script currently has one small issue: the replacement objects are deployed every time you run the script, because the delta calculation sees differences between the environments and the replacement is done only after the sync and compare. But that’s OK, it still works fine.

I will update the script in the near future to check whether the replacement has already been done, to avoid this, but I haven’t found a way to do so yet.
