Channel: Identity Management Category

Tips & Tricks: Novell IDM JDBC Driver Filter Gotcha



Have you ever had a JDBC driver that wouldn't find changes in the database while using a triggerless Publisher channel?

The obvious things to check are the views/tables in the database the driver monitors, to make sure the changes appear there correctly. You should also check the class objects and attributes in the driver filter to make sure they are set to Synchronize for the Publisher channel, and confirm that the Publisher channel is enabled and configured properly in the driver settings.

Unfortunately once you have checked these items you still may have the issue.

Having run into this issue a few times I'd recommend taking another look at the Driver Filter.

Basically, a JDBC driver with a triggerless connection polls the database tables or views configured in the driver settings at the scheduled interval, issuing a "SELECT *" statement for each tracked table or view. If a class in the filter has "Track member of template" set to "No", the driver does not issue the SELECT statement for the table or view associated with that class. Thus, in the filter, the class object setting for "Track member of template" should be set to "Yes" if you want those objects monitored on the Publisher channel.
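
To illustrate the mechanism, the poll cycle can be sketched as follows. The table names, and the idea of modeling the filter as a simple map, are hypothetical stand-ins for the driver's internals, not the actual JDBC driver code:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class TriggerlessPollSketch {
    public static void main(String[] args) {
        // Class name (table/view) -> whether "Track member of template"
        // is set to "Yes" in the driver filter. Names are made up.
        Map<String, Boolean> trackMemberOfTemplate = new LinkedHashMap<>();
        trackMemberOfTemplate.put("usr_mst", true);   // polled each interval
        trackMemberOfTemplate.put("grp_mst", false);  // silently skipped

        // On each scheduled interval the driver issues SELECT * only for
        // the classes whose template setting is "Yes".
        for (Map.Entry<String, Boolean> e : trackMemberOfTemplate.entrySet()) {
            if (e.getValue()) {
                System.out.println("SELECT * FROM " + e.getKey());
            }
        }
    }
}
```

With "grp_mst" set to "No", no SELECT ever runs against it, which is exactly why changes in that table never reach the Publisher channel.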

Another way to find this issue is to review the driver logs. Each time the driver checks for changes in the database, the SELECT statements are recorded (provided that logging is turned on and set to an appropriate level). If no SELECT statement appears for an expected table/view, then the template setting is the likely cause.

And since this is a filter setting the fix is easy:


1. Open the driver filter in Designer
2. Select the desired class in the filter
3. Set the "Track member of template" setting to "Yes"
4. Save the filter changes
5. Deploy the filter changes to eDirectory
6. Restart the driver to initialize the changes

Questions, comments or concerns? Feel free to reach out to us below or at IDMWORKS


Registry Hacking to Remove Unneeded Oracle Services



Something we haven't seen blogged about much is uninstalling services. This is not something I would recommend for just any service you don't like. It's important to consider the application that installed the service and, when possible, use the native application's uninstaller. However, when you are dealing with an Oracle application, for example, that doesn't really have an uninstaller and only rarely installs services, it's a good idea to know how to hack your way to removing the offending applet that is no longer in use (and we mean "hacking" in the nicest possible way).

To make this relevant to the Oracle IAM stack, this process is supported for uninstalling the OIM AD password synch agent and uninstalling the OID application service.

On the "unsupported" side, this would be considered a registry hack.

Open the Registry Editor on your Windows machine (search for or run regedit.exe, available on almost all Windows machines).

 

In the left pane (the tree navigator) go to the following key:

  • HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services
  • Find the key with the name of the service you wish to uninstall
  • For Oracle it is often clearly named ('oracle application service' or some such)
  • Right-click on the key (still in the left pane) and click "Delete"
  • You will be asked to confirm; just click "Yes".

 



The Truth about Indexing in OID



Oracle's OID docs are pretty vague around indexing.  In reality, there are two options:

    1. When creating an attribute, check the "Indexed" box

 

    2. Create the index later (after you figure out OID needs it for something!)



In order to do #2, you should follow this procedure:

    1. Navigate to $MW_HOME/<domain>/ldap/bin
    2. Run: catalog connect="OIDDB" add="true" attribute="<the attribute name that you want to index>" debug="true" verbose="true"



If you try to check the box (as in #1) after you have used the attribute, the ODSM interface will check the box, and make you think the attribute has been indexed (but it really hasn't!)
 


All Java is Not Created Equal


 

The what:

Java & CA Identity Manager V12.5


The issue:

I got hung up with Java 6.29, a version that should have worked with CA IDM V12.5, but on Windows the IDM install was completing yet the JBoss startup hung on Analytics every time.

The symptoms:

On Windows, CA IDM did not load thoroughly enough to do a clean uninstall. In the boot log there were abundant errors specific to 'unzipping jar files' but not much more to go on.

I replaced JBoss with a new download from a known good load to no avail.

I thought perhaps the Java version was incorrect. I uninstalled and reinstalled Java 6.27 but this only appeared to cause more problems.

The Solution:

In an effort to avoid wiping the box to change the Java install, I renamed the folder containing 6.27 to 6.29 and reset the Java home variable.
The IDM console started up correctly, utilizing the lower version of Java from the higher-named container.
The product release notes did say Java 6.X, but this does not always guarantee compatibility.
 


Mixed Environment Novell/Net IQ IDM 4 Upgrades



Over the course of time, software requires an upgrade.  Whether the upgrade is due to a need for new functionality, support requirements, or simply a business desire to stay current, the need to perform upgrades is unavoidable.

Novell has introduced a very, very simple method of installing their latest version, IDM 4.  In addition to adding several new features to the Novell IDM environment, there is a very handy little feature that allows IDM 4 to coexist in an environment that includes the previous version, IDM 3.7.  This means that the upgrade can be done one server at a time to minimize downtime, implementation time and overall headaches.

Even in a mixed environment, most of the new features of IDM 4 are available; however, there are important differences between having a mixed environment with multiple IDM versions and having an environment that is only IDM 4. The key difference is the Novell User Application.  The Novell User Application requires a corresponding driver in the driver set; however, you cannot have an IDM 4 User Application driver in the mixed environment.  The IDM 4 User Application driver requires the use of "packages", a new feature in IDM 4.

At issue is that, in a mixed environment, packages are not supported.  Packages are only supported in a full IDM 4 environment.  If you want to run a mixed environment you must retain the previous version of Novell User Application.  If your upgrade plan is to eventually phase out or upgrade all of the old IDM version to IDM 4 you will be able to use the new User Application once all of the eDirectory instances have been successfully upgraded.

The Novell upgrade documentation is packed full of useful information about how to perform these types of upgrades and all of the features that are supported through this approach, but watch out for potential missteps.
 


Using the Sailpoint Provisioning Integration Module (PIM) for Unsupported Connections


Sailpoint comes with a method to integrate and interact with other vendors' Identity Management applications through a supported integration module called the Provisioning Integration Module (PIM).

The supported vendor systems are:

    • Oracle Identity Manager (OIM)
    • Sun (now Oracle) Java System Identity Manager (SIM)
    • IBM Tivoli Identity Manager (TIM)
    • NetIQ Novell Identity Manager
    • BMC ESS
    • BMC Remedy Access
For Sailpoint to work with currently unsupported systems (as of this writing, at least), such as Microsoft Active Directory, LDAP, SQL Server, etc., the BMC Provisioning Module is the best choice. These connector integrations should thus be configured as read-write connectors. Additionally, all individual connectors come with configuration PDF files, so please read these before attempting any of the following.
 
Steps involved in utilizing the BMC Provisioning Module include the following:
    1. SailPoint IdentityIQ Connector Manager should be installed and configured
    2. The Connector Gateway should be configured and running as a system service. This service should always be running (don't forget that!)
      • Configure Init.xml contained in the ConnectorGateway folder
      • Execute the install.bat file, which creates a service (as shown below)

<SM>
  <!-- Connector Manager/Agents hostname or IP address -->
  <hostname>host name of sailpoint connector</hostname>
  <!-- Connector Manager/Agents port number -->
  <port>Port no</port>
  <!-- Use "AS400" for AS400 systems and "MAINFRAME" for mainframes; leave empty in all other cases. -->
  <platform></platform>
</SM>

<Server>
  <!-- Connector Gateway port number -->
  <port>5700</port>
  <!-- Delay (in seconds) between two retry attempts while connecting to Connector Manager/Agents -->
  <sm_connect_retry>3</sm_connect_retry>
</Server>

 

    3. Install and configure the individual connectors:
      • (Active Directory, LDAP, Lotus Notes, etc.) using the SailPoint IdentityIQ Connector Manager installed in the first step.
      • (Provisioning Manager) MSCS (Managed System Configuration Set) should be configured for the connectors (Active Directory, LDAP, Lotus Notes, etc.). This name will be used in the Sailpoint Application setup.



Once the Provisioning Manager is installed, create a new application and choose the one ending with "Full" in the list (for example, to add a PM for Active Directory, choose "Active Directory Full" from the list).

That's it!  Good luck and sound off below if you have any questions!
 

 


Oracle Identity Manager Basics: Creating a Custom Adapter in OIM 11g



The purpose of this entry is to explain how to create a custom adapter in OIM.  The adapter will write to an external file; this functionality could easily be changed to modify a database or some other process.  We will use a jar to do the actual heavy lifting, so almost anything you can do with java you can do through the adapter.  These steps continue where the Custom Resource Object guide left off, using the resource object created there.

Overview:

    1. Create a java class that has one method which accepts 4 arguments (username, firstname, lastname, favColor) and appends that info to a text file as a CSV.  The file path should be passed as an argument to the constructor.
    2. Copy the jar to the required OIM location
    3. Create the adapter
    4. Add new Process Task
    5. Add as follow-on to Provisioning task


 The Steps:

1. Create the java class

We want to create a java class that has one method which takes 4 arguments and writes them to a file.  The constructor is passed the filename, as well as the value true to set the writer to append instead of overwrite.

import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

public class doWork {

    private String path;
    private boolean appendVar;

    public doWork(String filename, boolean appendVal) {
        // Constructor enables overwriting if appendVal = false, otherwise appends
        path = filename;
        appendVar = appendVal;
    }

    public void writeCSV(String arg1, String arg2, String arg3, String arg4) throws IOException {
        FileWriter writer = new FileWriter(path, appendVar);
        PrintWriter add_line = new PrintWriter(writer);
        // CSV requires commas; other formats may require changing the delimiter
        add_line.println(arg1 + "," + arg2 + "," + arg3 + "," + arg4);
        add_line.close();
    }
}

doWork is the constructor; it requires a filename and (to append) true

writeCSV(String userName, String firstName, String lastName, String favColor) does the work and writes the strings to the file

Compile the code and make sure it works by running the jar.

 

2. Copy the jar to the required OIM location

You need to manually copy the jar over to the location where OIM can see it.  In this case, that location will be D:\Oracle\Oracle_IDM1\server\JavaTasks


3. Create the adapter 


Go to the adapter factory, create a new adapter (Name: WriteToFile, Type: Process Task, Desc: anything)

In the Variable List tab, create 5 variables, one for the filename and the other 4 for the 4 variables, all of which will be Strings that Resolve at runtime.

Add an Adapter Task (Name: Write It)

Select the jar file from the API Source menu. (JavaTaskJar.AlexProg.jar)

Select the class that has the method we want to use (doWork)

Constructor: (0) public doWork(java.lang.String, boolean)

Methods: (0) public void doWork.writeCSV(java.lang.String, java.lang.String, java.lang.String, java.lang.String) throws java.io.IOException

Save. Then go down to the bottom portion.  Under Constructor, if you look at the code in doWork.java, the constructor requires a filename.  So select Input (Map to: Adapter Variables, Name: filename).

Since the output of the function is void, we don’t need to map anything there but we could output status messages so that other tasks are performed depending on the success of the method.  Map the method Input variables to their respective Adapter Variables.

Save and Build
 

4.  Add new Process Task
 

In Process Management, Process Definition, we are going to add a task to the previously created Rubber Ball resource.  Add Task (Name: WriteFile, Conditional, Required for Completion, Allow Cancellation while Pending, Allow Multiple Instances)

In the Integration tab, select the WriteToFile adapter (adpWRITETOFILE).  You need to map the 4 variables to their respective process definitions.  The filename variable needs to be mapped to a literal with D:\Oracle\test.csv as its value.  For production, you would want to map this to an IT resource so that multiple tasks could use the same location; if it changed, you would only change the location in one place instead of needing to edit each task.
 

5.  Add as a follow-on to the Provisioning task
 

We need to assign the newly created task to activate whenever the resource is provisioned.  As a result, we are going to add the WriteFile task to be triggered by Provision It.

In Provision It, under the Responses tab, select the OK Response which signifies that the provision operation completed properly.  Now, hit Assign at the bottom and select WriteFile.  This indicates that when the OK Response is triggered, then WriteFile will be triggered as well.  This area is how you can create multiple tasks dependent on the response of the previous task.  Say we wanted to run a different method if it failed, we could assign that other method as a task to generate for the UNKNOWN response.  Save Provision It.

Now, if you already have the resource provisioned to a user, you can go into resource history and run the task.  The file should be created wherever you specified (the literal in our case was D:\Oracle\test.csv).  If you do not have the resource provisioned to a user yet, provision it and the file should be created.  Your custom adapter should now be working properly.  If you run the task multiple times or with different users, it should add a new line every time the task is run.

The java code can easily be modified to run SQL queries, add a user to a database, etc. which is where the versatility of this process comes in handy.

 


New Tech 101: OAuth 2.0


When new technology standards appear in industry publications, we quickly receive inquiries from clients as to the applicability of those standards to their use cases.  So let's investigate the growing use of OAuth 2.0 in the general industry, both where it began and where it appears to be heading at the current time.


Short History of OAuth

OAuth 1.0 was originally developed to answer the problem of integrating a user's identity information from a resource provider with another provider's offering.  Social media providers like Twitter drove the development and deployment of OAuth 1.0.  Eventually it was recognized that the 1.0 standard contained limitations from an enterprise point of view.  There also existed implementation flaws such as complex signature generation requirements on top of potential security issues.

Providers such as Facebook began looking at the possibilities of fixing the complexities in the 1.0 standard.  This development led to what would eventually become the 2.0 standard version 1, released in April of 2010.  The changes and additions destined for 2.0 were focused around creating a simpler token sharing method as well as allowing applications (both mobile and desktop) to share tokens outside of web browsers.  As development progressed, the differences between the capabilities of OAuth 1.0 and the plans for 2.0 diverged enough that backwards compatibility between 2.0 and 1.0 became infeasible.

As of the writing of this post, OAuth 2.0 is still considered a draft standard.  With increasing industry buy-in from companies such as Microsoft and Google, the adoption of OAuth 2.0 has become widespread.  There are implementations of OAuth 2.0 in many languages including: Java, Cocoa, Ruby, PHP and Python amongst many others.


The Basics: At its core, OAuth is a scheme for verifying a user's identity, and authorizing access to resources, by way of a third party.  The implementation, from a conceptual point of view, is not much more complicated than that basic idea.

The Concept: A user wants to login to an application, perhaps your company's webmail.  You want to verify that the user is who they say they are.  You direct them to get a token from their social media account that they can then provide to your webmail.  The user will login to their social media account and receive a token that will give the application access to a set of protected resources that the user authorizes.  Those resources can be as simple as the user's personal contact info or even the detailed information of all of their contacts in the social media site.  The user then goes back to the application and passes the token to the application.  The application takes the token and presents it to the social media site.  When the social media site verifies the token, the application now has access to whatever the protected resources may be.



Example: A user wants to import their Facebook contacts into Yahoo! Mail.  They click a button in Yahoo! in the Import section to select Facebook.  The user is then directed to login to their Facebook account.  After successful login, Facebook asks the user if they wish for Yahoo! to have access to their account information.  If the user clicks yes, then Facebook will allow access to the Facebook contacts for that user to Yahoo! Mail.

The conceptual idea and example are illustrations of the first type of OAuth token - an Access Token.

Types of Tokens:

There are two types of tokens: Access and Refresh.  An access token permits exactly that, access to a protected resource for a defined period of time.  A refresh token allows a user to request a new access token after the first token expires.

Ideally, a refresh token is issued to a provider who will need continued access to a protected resource.  So if a provider needs one-time access to a resource, preferably they would only request an access token while providers who need repeated access to the protected resource would have a refresh token and then use that token to obtain access tokens when need be.

Example: Users of social media sites such as Facebook, can often select different apps to do various activities i.e. play games, read horoscopes, etc.  When they choose to initially "install" these apps into their user experience, they have to be logged into their Facebook account.  During that install, they are asked whether they wish to allow the application to access the personal information contained in their account.  Due to the fact that these apps are installed and considered to need repeated access, they receive a refresh token so that when their access token expires, they can request a new one without querying the user again.  When a user goes into their Facebook settings and removes permission to the app, they are telling Facebook to revoke the refresh token.

The basic paradigm is that if you need one time access to a protected resource, request an access token.  If you need repeated access to a protected resource, request a refresh token and then use that token to request access tokens as needed.
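
As a concrete sketch of that paradigm, the refresh-token exchange in the draft standard is just a form-encoded POST with grant_type=refresh_token. The token value below is a placeholder, and printing the body (rather than POSTing it to a provider's token endpoint over TLS) is an illustrative simplification:

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

public class RefreshTokenRequest {
    // Builds the form-encoded body for a refresh-token grant as described
    // in the OAuth 2.0 draft; a real client would POST this to the
    // provider's token endpoint and receive a new access token back.
    static String buildBody(String refreshToken) throws UnsupportedEncodingException {
        return "grant_type=refresh_token"
             + "&refresh_token=" + URLEncoder.encode(refreshToken, "UTF-8");
    }

    public static void main(String[] args) throws Exception {
        // Placeholder token value for illustration only.
        System.out.println(buildBody("tGzv3JOkF0XG5Qx2TlKWIA"));
    }
}
```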



OAuth 2.0 and Mobile

As OAuth 2.0 was driven heavily by social media providers like Facebook, Twitter and Flickr, the development of this standard included the possibility of the use of OAuth outside of the strict web browser environment.  The ability to use OAuth tokens in desktop and mobile applications greatly improved the accessibility of OAuth to developers who had not been exposed to OAuth in the web browser environment.  With the backing of large social media providers and the desire of developers to engage those providers, OAuth has started to become an accepted standard in the mobile application environment.  OAuth 2.0 allows mobile applications to request tokens just as web applications can.  This feature allows applications to handle the token request without necessarily spawning a browser window.  The ability to use tokens in an application displays the beginning of true Single Sign On beyond one vendor.  Users are able to only memorize one user/password combination to an authoritative identity source and then provide access to other resources which wish to verify the user's identity.  The possibilities moving forward in this space certainly excite the imagination.

Note: The specification described above uses the OAuth Working Group v2-30 draft of the OAuth 2.0 standard.  This version of the draft was issued July 15, 2012 and expires January 16, 2013.

Draft located at: http://tools.ietf.org/html/draft-ietf-oauth-v2-30
 

 



How to Create Custom Challenge Questions in Oracle Identity Manager 11g


When you move from OIM 9.x to OIM 11g, you are going to encounter a learning curve - some things are just different. There are big things like SOA approvals and the redesigned web app, and then there are minor things like setting custom challenge questions. In OIM 9.x, if you wanted to add a custom challenge question, it was a simple matter of modifying the lookup called "Lookup.WebClient.Questions". However, if you try to do this in OIM 11g, you'll start seeing some strange errors when you login through the UI or if you try to use the API to set challenge questions, such as "Caused by: java.util.MissingResourceException: Can't find resource for bundle java.util.PropertyResourceBundle, key global.Lookup.WebClient.Questions.What-is-your-favorite-color?".

The solution is pretty simple, even if it's not totally obvious. It turns out that if you add custom challenge questions to OIM 11g in the Lookup.WebClient.Questions lookup, you have to add corresponding properties for localization support. The properties file is called customResources_lang.properties, located in Oracle_IDM1/server/customResources (replace lang with your language identifier, for example customResources_en.properties for English).

Here's how an entry might look:

global.Lookup.WebClient.Questions.What-is-your-favorite-color?=What is your favorite color?

Once you add a property for each new question, simply restart OIM and you're good to go.
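
The mechanism behind this is plain java.util resource-bundle resolution, which is why a missing key surfaces as the MissingResourceException quoted above. The sketch below inlines the bundle contents instead of reading the real customResources_en.properties file, purely for illustration:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.util.PropertyResourceBundle;

public class ChallengeQuestionLookup {
    public static void main(String[] args) throws IOException {
        // Same entry as the properties file example above, inlined here.
        String props =
            "global.Lookup.WebClient.Questions.What-is-your-favorite-color?" +
            "=What is your favorite color?\n";
        PropertyResourceBundle bundle = new PropertyResourceBundle(
                new ByteArrayInputStream(props.getBytes("ISO-8859-1")));
        // Resolving a key that exists prints the localized question text.
        System.out.println(bundle.getString(
            "global.Lookup.WebClient.Questions.What-is-your-favorite-color?"));
        // bundle.getString("some.unknown.key") would instead throw
        // java.util.MissingResourceException, matching the error in the UI.
    }
}
```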
 

 


 

The Path Less Taken - When and Why to Futz with your Identity Management


 

Recently we had a customer with an existing provisioning application that was quite a few revisions back. They also wanted to add Role Management functionality to their environment. Additionally, their IDM infrastructure was hosted by another company, which was causing a wrinkle in their plans.


So the questions became:


Path I – Purchase and implement an IDM Provisioning tool and provision directly to hosted Directory, Mainframe and other applications.

Path II - Purchase and implement an IDM Provisioning tool and send provisioning requests to the hosted IDM (in a hub and spoke model).

Path III – Integrate Role Management within their environment and eventually connect to one of the two paths above.

Path IV – Do nothing

 

So let's take a look at our options:

 
 
Path I - Kill the hosted IDM gracefully: 
    • Install an IDM provisioning infrastructure
    • Establish HR as the Authoritative Source connectivity to the hosted IDM
    • Implement connectors to hosted directory, Mainframe, etc.
    • Use migration tools to migrate users, roles and workflows to the IDM
    • Establish Role Management application to IDM using a connector/bridge for an integrated infrastructure
    • Repeat for all provisioning end points
    • Shutdown hosted IDM and assume IDM responsibilities inhouse

Pros:  Get rid of reliance on a hosted organization with a spotty track record.  Keep quality control in-house.  Speed up the current processes and increase audit control exponentially.

Cons: Costs include product, resource, and project costs; operational costs go way up.

 

Path II - Hub & Spoke our IDM:
    • Install an IDM provisioning infrastructure
    • Establish HR Authoritative Source connectivity to IDM
    • Implement a connector/bridge to hosted IDM provisioning engine (Hub & Spoke)
    • Work with hosted IDM to correct current process deficiencies (i.e. fix what is broken)
    • Establish Role Management application to IDM connector/bridge for integrated infrastructure

Pros:  Shift some, but not all, quality control in-house.  Speed up the current processes in some, but not all, aspects. Increase audit control and capabilities exponentially.

Cons: Keep reliance on the hosted solution for some, but not all, processes. Costs include product, resource, and project costs; operational costs go up.

Path III - Role Manage the Host
    • Establish Role Management application to hosted IDM
    • Work with hosted IDM to correct current process deficiencies (i.e. fix what is broken)

Pros:  Increase audit control and capabilities exponentially.

Cons: Keep reliance on the hosted solution. Costs include product, resource, and project costs; operational costs go up.

Path IV - Do Nothing, Live with it
    • Sort of defeats the purpose but then again doesn't require a project budget

Pros:  No project, product or operational costs

Cons: Keep reliance on hosted solution. Costs include effort to correct spotty hosted solution, ongoing weak processes and any related audit findings due to ineffectual hosted solution.

I am not going to state what this client selected as with most roadmaps things change very quickly.  So I ask you reader, what do you think they should do and why?  I am sure the pure techies will say Path I and the business folk will land somewhere in the middle.  But it's the why that makes us most curious.  So please feel free to make your opinion known in the comments section below.


Integrating NetIQ (Novell) Identity Manager (IDM) with Aveksa Compliance Manager (ACM)


Steps to Entitlement Provisioning:

    1. Install the “Role Based Provisioning Module” which installs the Webservice to connect IDM and ACM 
    2. Create the “User application” and “Role and Provisioning” drivers (these drivers can be configured to hold the entitlements and role information) 
    3. Enable the Novell plugin in the plugin folder on the Aveksa server; this creates a fulfillment workflow handler with the IDM connection information for provisioning.



_______________________________

If Entitlements are enabled on the IDM, create the Application and Entitlement collector.

Configure the fulfillment handler generated under the request tab for the application (this will add the entitlements to users on the request process.)

_______________________________

If Role Based Provisioning is being used then create the Application and Role collector.

Configure the fulfillment handler generated under the request tab for the application (this will add the entitlements to users on the request process).

Entitlements can then be added to roles in RBPM and provisioned using ACM.

_______________________________

If Entitlements are not implemented in IDM but are represented as Groups and custom attributes in eDirectory:

    • Design and develop a java method that connects to LDAP and makes changes to groups and attributes
    • Develop a custom fulfillment workflow that embeds the java method developed
    • Create entitlement collectors for groups and attributes and assign them under an application
    • Configure this custom fulfillment within the request process for the application created.


The request process uses the fulfillment workflow which in turn uses the java method and thus provisions the changes to the IDM (the drivers on IDM should be configured to sync the groups to connected systems).
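
One building block of such a custom java method can be sketched as follows: constructing the LDAP modification that adds a user to an eDirectory group. The DNs are hypothetical, and a real fulfillment handler would open a DirContext against the directory and call modifyAttributes() instead of just printing:

```java
import javax.naming.directory.BasicAttribute;
import javax.naming.directory.DirContext;
import javax.naming.directory.ModificationItem;

public class GroupFulfillmentSketch {
    // Builds the modification that adds a user DN to a group's member attribute.
    static ModificationItem addMember(String userDn) {
        return new ModificationItem(DirContext.ADD_ATTRIBUTE,
                new BasicAttribute("member", userDn));
    }

    public static void main(String[] args) {
        ModificationItem mod = addMember("cn=jdoe,ou=users,o=acme");
        // A connected handler would then apply it, e.g.:
        // ctx.modifyAttributes("cn=finance,ou=groups,o=acme",
        //         new ModificationItem[] { mod });
        System.out.println(mod.getAttribute().getID());
    }
}
```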
 


Resolving an “Export Failed” Error in OIM Deployment Manager


The Problem

Recently I experienced a strange issue when using the OIM 11g Deployment Manager export feature. After selecting the artifacts to export and specifying a file name and location for the XML file, an unfortunately timed "Export Failed" dialog box appeared. Usually the first place you look when this happens is the OIM logs on the server to figure out what's wrong.  On this occasion, everything looked normal in the logs and there were no error messages or any signs of trouble.

It turns out the problem was local to the machine I was using, not the server.  I checked the JRE console locally and noticed "java.security.AccessControlException: access denied" errors.

Conveniently enough Oracle has a documented solution for this problem:

The Solution

Modify the java.policy in JRE_HOME/lib/security and replace it with the following:
grant{ permission java.security.AllPermission; };

Save the file, then close your browser and try again. Deployment Manager should work like a charm!

 


Notes on the Oracle Identity Governance Suite 11g Essentials Exam (1Z1-459)


Oracle Identity Governance Suite 11g Essentials Exam

Here are some notes on the Oracle Identity Governance Suite 11g Essentials Exam (1Z1-459).

There were 139 questions with a 2 1/2 hour time limit.  I took about 90-95 minutes before I called it good enough.  I'm going to go topic by topic through anything I think you should focus on.  To start, they asked far more detailed, nitpicky questions than I expected.  There were a lot of pick-three-answer questions, and they asked a lot about specific APIs to use, filenames to edit, etc.  Make sure to use the strike-through functionality in the test to mark off answers you know are incorrect; it's very helpful with the pick-multiple-answer questions.  With that, let's go into the topics.

 

Topic 1: OIG Fundamentals

There were a surprising number of questions for this topic (5-10) about the goals of OIG, OIA, OIM, etc.  Know the benefits of both OIA and the suite as a whole.  Know what benefits OIG provides for SOX auditing!

 

Topic 2: OIG System Architecture

This is kind of connected to topic 1 in my head.  You need to know the benefits of the ICF, what needs to be set up for HA, and how to use sandboxes.

 

Topic 3: Branding and UI

This was an area I didn't think was too important when studying, but they wound up asking several questions on it.  Specifically: say you want to add a field to a given page, what part of the page would that be under (<pageformat>, <pagelayoutformat>, etc.; those aren't the right tags, but it's something like that)?  There were multiple questions about how to do this, so bone up on it if you haven't looked at it already.

 

Topic 4: Catalog

A bunch of questions here: specifically how to populate the catalog, how to modify it, which admins can edit it, and what admin accounts are needed to administer it.

 

Topic 5: Approval Workflows and Requests Configuration

You need to know how approvals are set up in OIM, where you would change the workflow to add reviews, and which policies would be used in a given situation.  Know the request dataset info as well.

 

Topic 6: Security

There were a couple of questions on OES, and several questions on delegated administration where I was able to eliminate answers until the correct one was easy to see.

 

Topic 7: Bulk Load and Postprocessing

Know what input sources you can use and what types of data you can import.  There were a couple of post-processing questions I had not covered in my Evernote, such as what the goal of post-processing is.

 

Topic 8: Reconciliation and Postprocessing

There were a number of questions on this topic (10-20) about how to configure recon, what fields are necessary, and the requirements for trusted recon.  Know re-evaluation and ad-hoc linking.

 

Topic 9: Provisioning, RBAC, and Access Policies

Provisioning to disconnected resources had several questions.  Know what RBAC is, and know access policies inside and out.  There were a bunch of questions about what you would configure to achieve a given result.

 

Topic 10: Connectors

They wanted detailed knowledge about which API or SPI is used for which connector/plugin, etc.  You need to know the ICF and how it interacts with both OIM and OPAM.

 

Topic 11: Event Handlers, Notifications, Reports and Scheduled tasks

Event handlers had in-depth questions about which classes are used, etc.  Notifications didn't have much, maybe a question or two.  Reports focused more on how BI Publisher integrates with WebLogic and the DB, as well as how to create the schema (i.e., with RCU).  Scheduled tasks had many (10-20) questions: how they are used, set up, etc.

 

Topic 12:  Identity Analytics

You need to know identity seeding, role mining, a lot of SoD, the certification process in detail, a lot of closed-loop remediation, and in-depth information about audit policies.

 

Topic 13: OPAM

Need to know the basics of check-in/check-out (know what each does), how OPAM interacts with OIM, how OPAM integrates with OUD, and OPAM requests (who can request, how it's set up).  There was one question on break-glass access (something like what the account would do), and a couple of questions on risk-based certification and closed-loop remediation.

 

Topic 14: Deployment

There were some in-depth questions about which prerequisites are needed for different packages (specifically OIA onto an OIM install).  Need to know the OIM/OIA/OPAM integration, plus a couple of LDAP sync questions: how it's set up, what it does, etc.

 

Overall, like I said, I was surprised how in depth they went at some points (connectors, event handlers, scheduled tasks, recon: which class would you use, which plugin, etc.).

 

 

Questions, comments or concerns? Feel free to reach out to us below or at IDMWORKS

Oracle Identity Manager 11gR1 Tuning Considerations


Oracle Identity Manager 11gR1 Tuning Considerations

Oracle Identity Manager (OIM) is a powerful tool for organizations to manage user accounts across many systems. Because it is such a key piece of the IT/IS puzzle, it is critical to get high performance out of it. While Oracle provides some baseline sizing recommendations and other resource tuning steps, some of the documentation is incomplete and at times contradictory. This guide should help clear up confusion regarding OIM performance tuning.

Sizing:

One of the most important improvements you can make for performance is allocating enough resources on your OIM servers. Based on Oracle's OIM 11g Sizing Guide, you need 6GB of memory (minimum) on each host in your OIM cluster, and ideally a Java heap size of 2GB. On the database (DB) side, the baseline requirements are 6GB of memory and 500GB of allocated disk space. In both cases, a single dual-core CPU is recommended.

From these baseline numbers, you can use the sizing guide as well as pre-production testing to determine the right numbers and configurations for your particular installation. Simply put: if you don't allocate at least these minimum resources, performance will suffer greatly, even in a small-scale Development or Proof of Concept (POC) environment. Aside from proper sizing, the tuning recommendations can be broken down into two functional categories: 1) OIM Application Tuning and 2) Database Tuning.

(see "Don't Shortchange Your IAM Dev Environment")

Database Tuning:

Because OIM relies heavily on the Oracle Database for most of its functionality, you will often see huge performance gains just by tweaking initialization parameters and altering various settings. Every time a user is created or provisioned a new resource, dozens of operations take place within the tables of the functional database. This dependency means the user-facing performance of OIM is directly tied to the performance of the database itself. While it is always recommended to check the database's own tuning reports, Oracle provides recommendations, which can be found at this location:

http://docs.oracle.com/cd/E14571_01/doc.1111/e14308/tuningfordb.htm.

They are listed below for quick reference. Always check the OEM documentation for the "latest and greatest" information.

db_block_size                    8192
memory_target                    6 GB
sga_max_size                     10 GB
pga_aggregate_target             2 GB (minimum)
db_keep_cache_size               800 MB
log_buffer                       15 MB
cursor_sharing                   FORCE
open_cursors                     2000
session_cached_cursors           800
query_rewrite_integrity          TRUSTED
db_file_multiblock_read_count    16
db_writer_processes              2
processes                        based on connection pool settings
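As a sketch, the tunable parameters above can be turned into ALTER SYSTEM statements for review before applying them in SQL*Plus. Note that db_block_size is fixed at database creation and `processes` needs a site-specific value, so both are omitted here; the values are the documented baselines, not definitive settings for your environment:

```python
# Sketch: generate ALTER SYSTEM statements for the OIM-recommended
# initialization parameters listed above. Review and adjust per your
# own sizing exercise before running them.
recommended = {
    "memory_target": "6G",
    "sga_max_size": "10G",
    "pga_aggregate_target": "2G",
    "db_keep_cache_size": "800M",
    "log_buffer": "15M",
    "cursor_sharing": "FORCE",
    "open_cursors": "2000",
    "session_cached_cursors": "800",
    "query_rewrite_integrity": "TRUSTED",
    "db_file_multiblock_read_count": "16",
    "db_writer_processes": "2",
}

def alter_statements(params):
    # SCOPE=SPFILE because several of these only take effect after a restart
    return ["ALTER SYSTEM SET %s=%s SCOPE=SPFILE;" % (k, v)
            for k, v in params.items()]

for stmt in alter_statements(recommended):
    print(stmt)
```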

 

In addition to the DB initialization parameters, Oracle also provides recommendations designed to enhance bulk operations such as reconciliation tasks. The complete document can be found in Oracle Metalink note ID 1484808.1, but we have found the instructions below particularly helpful:

  • Create a dedicated tablespace for the ORCHESTRATION LOB in the ORCHPROCESS table
  • Enable efficient allocation of LOB chunks
  • Remove USR and PCQ from the KEEP pool and use the default buffer pool instead

The full list of recommendations and steps can be found in the aforementioned Metalink note.

Application Tuning:

Oracle recommends enabling the OIM caching feature to enhance system performance. With caching enabled, OIM stores various components in memory so it doesn't have to query the DB each time the information is needed. The performance gain here is obvious: fewer trips to the DB means less waiting.

Oracle provides a quick guide for enabling the cache:

http://docs.oracle.com/cd/E21764_01/doc.1111/e14308/tuningappcache.htm

Aside from caching, we also recommend eliminating any unused access policies or automatic group memberships. This reduces the processing OIM has to do when creating and/or modifying objects (users, groups, etc.).

Finally, use the Orchestration Process Clean-up Task to regularly sanitize the ORCHPROCESS table. This table can grow considerably, very quickly. The task cleans up entries that are no longer needed (i.e., completed orchestrations). Running it once a day (during off-peak hours or a maintenance window) is highly recommended.

 

Questions, comments or concerns? Feel free to reach out to us below or at IDMWORKS

Quick Tip: OIM Challenge Question Limitations


Oracle Identity Manager (OIM) - Challenge Question Limitation

Recently, while working on a challenge question issue in OIM, a customer defined a question that was 64 characters long (total length), and when users attempted to use the question, an error message was displayed. Because this information is stored in a database table, there is a limit on the character length of the questions and answers. Unfortunately, this limit is not well documented in Oracle's readily available documentation.

Investigating the OIM Table Schema

At a glance, one might think you can use up to 100 characters for questions and 256 for answers; however, this is NOT the case.

Since the values of the questions and answers are encrypted, the resulting character strings exceed the length of the original data.

Maximum Character Length Discovered

In the end, we found OIM's maximum lengths for both challenge questions and answers:

  • Challenge Questions cannot exceed 58 characters
  • Challenge Question Answers are limited to 150 characters
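A minimal sketch of a pre-validation check based on the limits above; the constants come from our testing and the function name is illustrative, not part of any OIM API:

```python
# Sketch: validate challenge questions/answers against the observed OIM
# limits before submitting them, to avoid the runtime error. Limits are
# the values found in our testing, not officially documented numbers.
MAX_QUESTION_LEN = 58   # observed limit for challenge questions
MAX_ANSWER_LEN = 150    # observed limit for challenge question answers

def validate_challenge(question, answer):
    """Return a list of human-readable problems; empty list means OK."""
    errors = []
    if len(question) > MAX_QUESTION_LEN:
        errors.append("question exceeds %d characters" % MAX_QUESTION_LEN)
    if len(answer) > MAX_ANSWER_LEN:
        errors.append("answer exceeds %d characters" % MAX_ANSWER_LEN)
    return errors

print(validate_challenge("What was your first pet's name?", "Rex"))
print(validate_challenge("x" * 64, "ok"))
```

Running a check like this in whatever front end collects the questions avoids hitting the database-level failure at all.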

 

Questions, comments or concerns? Feel free to reach out to us below or at IDMWORKS


Using Custom Java in BI Publisher Reports


Using Custom Java in BI Publisher Reports

BI Publisher is a powerful tool for reporting. As the out-of-the-box reporting solution for Oracle Identity Management products, BI Publisher provides a ton of useful capabilities. It is sometimes necessary to create custom reports tailored to specific needs, and there is a great tool for adding functionality: your own custom Java code. Using this method, you can do some cool things like complicated logic, transformations, and more.

Here's how it's done.

Create Custom Java Class

Start by creating your Java class (using Eclipse, for example) and packaging it as a JAR file. The functionality can be as simple or complex as needed.

For example, IDMWORKS recently deployed a custom report whose Java code accessed an LDAP field for inclusion in the report. Our customized code queried an LDAP server and returned the resulting value as a string.

Copy JAR File To Weblogic Domain Lib

In order to leverage the custom JAR file in BI Publisher reports, it is necessary to add the JAR file to the Weblogic server class path.

The easiest way to do this is to copy the JAR file to the $DOMAIN_HOME/lib directory. This will automatically add the JAR file to your server classpath (you will need to restart for this addition to take effect).

Create Report Template 

Next, you will want to create a report template using the template builder for Microsoft Word. When you identify the field which should include your custom Java method, you need to insert a custom field with the following:

<?namespace:javans=http://www.oracle.com/XSL/Transform/java/com.packagename.CustomClass?> <?javans:getSomeFieldFromLDAP(ParamInput)?>

  • Note 1 - You can define the custom namespace in a field at the top of the RTF template to reference it in more than one place.
  • Note 2 - The input parameter for your custom method can refer to a field name from the result set. You could apply a transformation to a field pulled from the database.

Now simply create your report in BI Publisher, upload your template, and choose your data source, and that's all there is to it. This is a really powerful and flexible tool for your custom reports, and as you can see, it's pretty easy to use.
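Assembling those two field strings by hand is error-prone, so a small helper (hypothetical, using the example class and method names from above) can generate them:

```python
# Sketch: build the BI Publisher RTF field strings that declare a Java
# extension namespace and invoke a method from it. The class, method,
# and parameter names below are hypothetical placeholders.
def namespace_field(prefix, fqcn):
    """Field declaring 'prefix' as a namespace for the Java class fqcn."""
    return ("<?namespace:%s=http://www.oracle.com/XSL/Transform/java/%s?>"
            % (prefix, fqcn))

def call_field(prefix, method, param):
    """Field invoking method(param) through the declared namespace."""
    return "<?%s:%s(%s)?>" % (prefix, method, param)

print(namespace_field("javans", "com.packagename.CustomClass"))
print(call_field("javans", "getSomeFieldFromLDAP", "ParamInput"))
```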

 

Happy coding!

 

Questions, comments or concerns? Feel free to reach out to us below or at IDMWORKS

Quick Tip: Changing Java Heap Size for OIM


Quick Tip: Changing Java Heap Size for OIM

A common tuning step you'll see for improving OIM performance is increasing the Java heap size for the WebLogic servers. The problem is, it's hard to find a consensus among the various Oracle documents. Hopefully this blog post clears up some of the confusion.

Recommendation

IDMWORKS recommends the following steps for changing the Java heap size:

1.  Go to the $DOMAIN_HOME/bin directory

2.  Edit the file called "setDomainEnv.sh" (setDomainEnv.cmd on Windows)

3.  Find the line that checks the USER_MEM_ARGS variable (around line 350; line 259 on Windows)

Example:

# If the USER_MEM_ARGS environment variable is set, use it to override ALL MEM_ARGS values

if [ "${USER_MEM_ARGS}" != "" ] ; then
    MEM_ARGS="${USER_MEM_ARGS}"
    export MEM_ARGS
fi

4.   Now simply add a line before this block of code which sets USER_MEM_ARGS:

USER_MEM_ARGS="-Xms2048m -Xmx2048m -XX:MaxPermSize=512m"
export USER_MEM_ARGS

Windows:
set USER_MEM_ARGS=-Xms2048m -Xmx2048m -XX:MaxPermSize=512m

 

It's as simple as that.

 

Now save the file and then restart all of your WebLogic servers.

 

NOTE:

During start-up, you should see the Java Memory Arguments in the output to verify the settings are correct.

  • 2048MB (2GB) is recommended and it is the value we usually use
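That start-up verification can also be scripted. The sketch below pulls the heap flags out of a memory-argument string and normalizes them to MB; the parsing logic is illustrative, not an Oracle-provided tool:

```python
import re

# Sketch: sanity-check a USER_MEM_ARGS string by extracting the heap
# settings, mirroring what you should verify in the server start-up output.
def heap_settings(mem_args):
    """Return {'Xms': MB, 'Xmx': MB} for the flags present in mem_args."""
    found = {}
    for flag in ("Xms", "Xmx"):
        m = re.search(r"-%s(\d+)([mMgG])" % flag, mem_args)
        if m:
            size, unit = int(m.group(1)), m.group(2).lower()
            found[flag] = size * (1024 if unit == "g" else 1)  # in MB
    return found

settings = heap_settings("-Xms2048m -Xmx2048m -XX:MaxPermSize=512m")
print(settings)  # {'Xms': 2048, 'Xmx': 2048}
```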

 

Questions, comments or concerns? Feel free to reach out to us below or at IDMWORKS

DIP, Oracle, and Strong Encryption


DIP, Oracle, and Strong Encryption

In attempting to connect DIP to a third-party (Tivoli) LDAP, I received an error complaining of an "RSA premaster secret error" as well as a warning, "Simple bind failed."  After some exploration, I discovered that the SSL key Tivoli used was too large (> 64 bits).

Out-of-the-Box

Oracle does NOT support "strong encryption" in its Java installations due to US export restrictions.  As a result, the "Unrestricted JCE Policy Files" need to be installed.

You can download those files on Oracle's site after agreeing to the terms (including the US Export Restrictions):

http://www.oracle.com/technetwork/java/javase/downloads/jce-6-download-429243.html  

Then, copy the JARs from the downloaded/extracted ZIP into the JRE ".../lib/security" folder.  

Case Specific

In my case specifically, I was installing on AIX and needed to retrieve the IBM-specific JARs from IBM:

https://www14.software.ibm.com/webapp/iwm/web/reg/download.do?source=jcesdk&lang=en_US&S_PKG=142ww&cp=UTF-8

Admittedly

Even though I've long known that US export restrictions cover strong encryption products, this is the first time they have ever required me to go through additional package installation steps.

Hopefully this advice (and my suffering) will save you some time and hassle if you encounter the same problem.

 

Note:

As of 2009, non-military cryptography exports from the U.S. are controlled by the Department of Commerce's Bureau of Industry and Security.

 

Questions, comments or concerns? Feel free to reach out to us below or at IDMWORKS

Creating Self-Signed SSL Cert for Oracle Internet Directory using WLST


Creating Self-Signed SSL Certificate for Oracle Internet Directory (OID) using WLST

While creating a self-signed certificate for OID v11.1.1.5 using WLST, I realized that many commands in Oracle's documentation were either outdated or incorrect.

Below is a quick run-through of the commands needed to create the certificate and then update OID to make use of it. Feel free to independently examine the original commands in the OID Administration Guide, but I've found that the modifications below are required to successfully configure SSL in OID.

These steps originate from the Fusion Middleware Administration Guide for OID, section 26.3, "Configuring SSL by Using WLST":

see - http://docs.oracle.com/cd/E21764_01/oid.1111/e10029/ssl.htm#CBHGAFAC

 

The same three steps as listed in the guide are necessary:

1. Create the Wallet

2. Configure SSL Parameters

3. Restart OID and Update Registration

 

1. Create the Wallet

 

• To create the wallet, connect to WLST:

MIDDLEWARE_HOME/oracle_common/common/bin/wlst.sh

• At the WLST prompt:

connect('weblogic_user','weblogic_pw','adminserver_host:weblogic_port')

i.e. connect('weblogic','password1','localhost:7001')

 

• Next, we have to go to the custom MBean tree where the wallet and cert will be created:

custom()

cd('oracle.as.oid')

 

Assumption - Oracle_Instance=MW_HOME/asinst_1 and the OID instance is oid1

 

Current OID wallets can be seen using the following:

 

listWallets('asinst_1','oid1','oid')

 

• Create a new Wallet:

createWallet('asinst_1','oid1','oid','WALLET_NAME','WALLET_PASSWORD')

 

i.e. createWallet('asinst_1','oid1','oid','NewWallet','abracadabra')

 

• Add a Self-Signed Certificate to the newly created wallet:

addSelfSignedCertificate('asinst_1','oid1','oid','WALLET_NAME','WALLET_PASSWORD','cn=INSTANCE_HOST_NAME','key_size')

 

i.e. addSelfSignedCertificate('asinst_1','oid1','oid','NewWallet','abracadabra','cn=www.test.com','1024')

 

2. Configure SSL Parameters

 

The next step in WLST is to set up SSL for oid1. First, though, we need to create an SSL settings file.

 

• In another window, create a file somewhere within your install. Creating it in the "config" folder on the OID server instance you're setting up will keep it handy.

i.e. /path/to/Middleware/asinst_1/config/OID/oid1/

 

• I generally recommend naming it something meaningful like:

"ssl_settings.prop"

 

Using my examples, you could create the following:

 

/path/to/Middleware/asinst_1/config/OID/oid1/ssl_settings.prop

 

Containing the following variables:

 

• KeyStore=WALLET_NAME

• AuthenticationType=auth-type

• SSLVersions=version

• Ciphers=cipher

• SSLEnabled=true

NOTE - All of the possible variable values are located at: 

http://docs.oracle.com/cd/E21764_01/core.1111/e10105/sslconfig.htm#CBDEHJGD 

 

The "actual" values I would use:

 

• KeyStore=NewWallet

• AuthenticationType=Server

• SSLVersions=nzos_Version_3_0

• Ciphers=SSL_RSA_WITH_RC4_128_MD5

• SSLEnabled=true
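A minimal sketch, assuming the layout described above: generating ssl_settings.prop from a dict keeps the key names consistent across environments (the path in the comment is the example path from this post, and the values are the ones used in this walkthrough):

```python
# Sketch: produce the ssl_settings.prop content consumed by configureSSL().
# Keys/values match the example above; adjust them for your own instance.
ssl_settings = {
    "KeyStore": "NewWallet",
    "AuthenticationType": "Server",
    "SSLVersions": "nzos_Version_3_0",
    "Ciphers": "SSL_RSA_WITH_RC4_128_MD5",
    "SSLEnabled": "true",
}

def to_properties(settings):
    """Render a dict as key=value lines, one per line."""
    return "\n".join("%s=%s" % (k, v) for k, v in settings.items()) + "\n"

content = to_properties(ssl_settings)
print(content)
# In practice you would write this content to your settings file, e.g.:
# /path/to/Middleware/asinst_1/config/OID/oid1/ssl_settings.prop
```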

 

• Now that the file is created, go back to the WLST window and configure the SSL settings:

configureSSL('asinst_1','oid1','oid','sslport1','/path/to/Middleware/asinst_1/config/OID/oid1/ssl_settings.prop') 

 

At this point, WebLogic has the SSL settings loaded. We still need to load those settings into the actual OID server using opmnctl.

 

 

3. Restart OID

 

• First, restart the oid1 server:

opmnctl stopproc 'ias-component=oid1'

opmnctl startproc 'ias-component=oid1'

 

• Update the component registration for OID:

opmnctl updatecomponentregistration componenttype=OID componentname=oid1 -Sport 3131 -Port 3060

 

• componenttype and componentname should be self-explanatory

• Sport is the SSL port

• Port is the non-SSL port

I have found that both "-Sport" and "-Port" are required for the command to run properly. 

 

Everything should now be set up correctly, and you should be able to log in to your OID server over LDAPS.

 

 

Questions, comments or concerns? Feel free to reach out to us below or at IDMWORKS

 

 

Identity Vault - Unable to connect to Linux/Unix Remote Loader Driver


Identity Vault - Unable to connect to Linux/Unix Remote Loader Driver

A checklist for resolving the error "(-9006) Detail from driver"

 

The Linux/Unix driver uses embedded Remote Loader technology to communicate with the Identity Vault, bi-directionally synchronizing changes between the Identity Vault and the connected system.

Newcomers to Identity Manager commonly come face-to-face with the above error.

 

The checklist below will save hours upon hours of undesired debugging effort.

 

1.     Ensure the Remote Loader is installed/initiated on the Linux/Unix system

Type “rdxml” in the command line.

If the command is NOT recognized, install the Remote Loader

http://download.novell.com/index.jsp

2.     Start the Remote Loader

Starting the Linux/Unix driver (from iManager or designer) is one thing, starting the Remote Loader driver is another:

On a terminal, type “rdxml -config <path_to_config_file> -sp <remote_loader_password> <driver_password>” to set the Remote Loader and driver object passwords

Then type “rdxml -config <path_to_config_file>” to start the driver

You can check the status with “/etc/init.d/rdxml status” to confirm that the remote loader driver is running

If upon starting the driver you get the error message below, resolve it by running “nxdrv-config” from the command line and following the prompts to set the passwords:

remote loader password and driver object password must be set

3.     Ensure that the Remote Loader and driver object passwords you specify for the Remote Loader are the same as those in the Linux/Unix driver configuration.

If you make any changes to the Linux/Unix driver configuration from Designer, don’t forget to deploy those changes

4.     Examine the Status Log and DSTRACE output of the Linux/Unix driver and the Remote Loader Driver.

You may want to raise the trace levels (to 3, for instance) to get more detailed debugging information

You can edit the trace level for the Linux/Unix driver from the setting at "/etc/nxdrv.conf"

For the Remote Loader driver, you can edit the trace level in the created configuration file

5.     If you have chosen to use an SSL connection in the Linux/Unix driver configuration (recommended), make sure you configure the Remote Loader for SSL as well

If you don’t, you’ll get the following error in the Remote Loader driver’s trace file:

Unable to establish client connect; make sure certificates match

6.     Ensure the connection parameters (hostname, port, KMO, etc.) in the Remote Loader configuration file you created match the Remote Loader authentication settings in the Linux/Unix driver configuration
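Step 6 lends itself to a quick scripted check. The sketch below parses simple key=value lines and flags mismatches against the values the driver expects; the sample config text and key names are illustrative placeholders, not the exact rdxml configuration syntax:

```python
# Sketch: compare connection parameters found in a Remote Loader config
# file against the values configured on the Linux/Unix driver. The sample
# text and key names are illustrative, not real rdxml config syntax.
def parse_config(text):
    """Parse key=value lines, ignoring blanks and '#' comments."""
    params = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            params[key.strip().lower()] = value.strip()
    return params

def mismatches(config, expected):
    """Return the sorted list of keys whose values differ or are missing."""
    return sorted(k for k, v in expected.items() if config.get(k) != v)

sample_config = """# illustrative remote loader config
hostname=idm.example.com
port=8090
kmo=SSL CertificateDNS
"""

driver_expected = {
    "hostname": "idm.example.com",
    "port": "8090",
    "kmo": "SSL CertificateDNS",
}

print(mismatches(parse_config(sample_config), driver_expected))
```

An empty list means the two sides agree; any key it prints is a parameter to reconcile before restarting the Remote Loader.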

 

Questions, comments or concerns? Feel free to reach out to us below or at IDMWORKS

 

 
