
Wednesday, 20 July 2011

(Some) Internals of Oracle Identity Manager Access Policies


Introduction
Many enterprises are considering (or have already deployed) an identity management solution, either for effective IT automation to reduce costs or for compliance purposes. Oracle Identity Manager is part of Oracle's Identity and Access Management (IAM) solution, and it provides functionality such as automatic user provisioning and compliance reporting.
In my personal opinion, Oracle Identity Manager (OIM) is a wonderful product from Oracle. Many people do not understand the basic concepts behind how OIM works; worse, they blame the vendor's product for their own failure to understand those concepts.
If you are planning to work with Oracle Identity Manager, get ready to learn a lot of new things. OIM requires knowledge in several areas, and you should be familiar with the following:
  • LDAP Directory – especially Oracle Internet Directory or Oracle Directory Server (formerly Sun/Iplanet Directory)
  • Basic understanding of XML
  • Programming in Java
  • Concepts of Microsoft Active Directory and Microsoft Exchange (if you are planning to integrate them)
  • Most importantly, the self-initiative and interest to research the things you can't find on Google.
Oracle Identity Manager stores all user information, metadata, audit data, and everything else in a database (similar to Oracle Internet Directory – OID). Two database platforms are supported for OIM:
  • Oracle Database Server
  • Microsoft SQL Server
The second major component of OIM is the connectors. OIM connectors provide the functionality for connecting to the various systems across an enterprise. The good thing about OIM is that many connectors are available, and Oracle is standardizing the connector components so that all connectors have the same look and feel. So, once you understand a few connectors, it becomes easier to work with the rest.
The latest OIM connectors can be found here, and you can download them as well.
OIM Connector Certification (supported systems for OIM for user provisioning) can be found here.
OIM Connector documentation can be found here.

Basic OIM Concepts

Before we talk about Access Policies, we need to understand a few other OIM concepts. OIM has various objects that work together to achieve the necessary functionality. Ideally, OIM should manage the complete lifecycle of user accounts in an enterprise automatically, with no manual intervention during the creation, modification, and deletion phases.
When a user is created in OIM, a corresponding entry is created in the USR table. The USR table has many fields delivered out of the box (OOTB). For some enterprises this may not be sufficient, so additional fields can be defined as UDFs (User Defined Fields).
In OIM, almost everything revolves around the user account (which is what you would expect from identity provisioning software such as OIM). The user account is the central piece of data.
In OIM, users are provisioned to, or de-provisioned from, Resources. A Resource represents a target system, such as Oracle Internet Directory or Active Directory.

What are OIM Access Policies?

Three types of objects are required to perform automatic provisioning based on policies. When Access Policies are used for auto-provisioning, this is called "policy-based provisioning". The objects required for policy-based provisioning are:
  • Rules
  • Groups
  • Access Policies
Rules are used to place users into specific OIM Groups. Once a user is a member of a group, Access Policies can perform policy-based provisioning for that user. That is why we need to understand the dependencies between Rules, Groups, and Access Policies.
Rules are evaluated whenever a user attribute is updated (such as a password or email address change). We can also use the OIM API updateUser() function to trigger a re-evaluation of the rules.
In the Design Console, you can use the "Policy History" form to view the access policies and resources related to a user.
Starting with OIM 9.1.0.2 (and also in Fusion Middleware Identity and Access Management 11.1.1.x), a scheduled task called "Evaluate User Policies" is delivered OOTB. This task is useful if you want to provision users by evaluating all the rules, automatically adding or removing group memberships, and finally provisioning or de-provisioning resources through access policies.

Some Internals of How It Works

The POL table holds the details of the Access Policies in the OIM database. There are other tables related to OIM Access Policies as well; some of the interesting ones are listed here (a query sketch follows the list):
  • POP – data about parent table in Access Policies
  • POC – data about child policies in Access Policies
  • POG – mapping between access policies and OIM groups (based on pol_key and ugp_key)
  • POF – Field Values in Access Policies
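For example, to see which OIM groups an access policy is attached to, you can join POL and POG with the group table UGP. The sketch below is just an illustration: the name columns POL_NAME and UGP_NAME are my assumptions based on a typical OIM 9.x schema (only pol_key and ugp_key are confirmed above), and any such query should be read-only.

-- List each access policy and the OIM groups it is attached to.
-- POL_NAME and UGP_NAME are assumed column names; pol_key/ugp_key are the
-- keys mentioned above. For investigation purposes only (read-only).
SELECT pol.pol_name, ugp.ugp_name
FROM   pol, pog, ugp
WHERE  pog.pol_key = pol.pol_key
AND    pog.ugp_key = ugp.ugp_key
ORDER BY pol.pol_name;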
In the USR table there is a field called "USR_POLICY_UPDATE". I think its values can be null or 1. This field is used as the evaluation criterion when the "Evaluate User Policies" task runs: it determines whether the access policies for that user will be re-evaluated the next time.
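If that is the case, a quick way to see which users are currently flagged for re-evaluation is something like the sketch below (assuming the flag is stored as the literal value 1, which is my reading of it rather than documented fact):

-- Users whose access policies will be re-evaluated the next time the
-- "Evaluate User Policies" scheduled task runs (assumes the flag value is 1).
SELECT usr_login, usr_policy_update
FROM   usr
WHERE  usr_policy_update = '1';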
The User Policy Profile tables, UPP and UPD, are important user-related tables that store the access policy details for a user. They are normally referenced when the "Policy History" form is used for a user in the OIM Design Console.
There are two other tables, UPH and UHD, which are the history tables for the corresponding User Policy Profile tables UPP and UPD.
The OIU table has two columns, OIU_POLICY_BASED and OIU_POLICY_REVOKE. Based on my understanding, these indicate whether the resource instance was provisioned by an access policy and whether the policy's "Revoke if no longer applies" setting is in effect.
Process form tables (the UD_ tables) contain a POL_KEY column populated with the key of the access policy; this POL_KEY column exists in the OIM child tables as well.
Updating these underlying tables directly is not recommended and is not supported by Oracle. They are, however, useful to look at when you investigate scenarios such as why a user was not revoked automatically, or why a user was not provisioned to a resource automatically – for example, with a read-only query like the one below.
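As a rough illustration only: the usr_key join column on OIU and the sample login JDOE are my assumptions, while the two flag columns are the ones described above.

-- Show, for one user, which resource instances carry an access policy key
-- (pol_key / oiu_policy_based) and whether they are marked for automatic
-- revocation (oiu_policy_revoke). Read-only investigation query.
SELECT u.usr_login,
       o.pol_key,
       o.oiu_policy_based,
       o.oiu_policy_revoke
FROM   oiu o, usr u
WHERE  o.usr_key = u.usr_key
AND    u.usr_login = 'JDOE';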

A Sample Implementation

Here is a scenario that illustrates the use of access policies for automatic provisioning of resources in OIM. Consider an enterprise moving to OIM that has a set of rules governing how user accounts are created, modified, and deleted. I will use just two rules as an example (in the real world there can be 100 or even 200+ rules):
  1. All users in the HR Department will be members of the AD group "HR Department"
  2. All users in "IT Operations" should have a Unix account on the server "exadata-200"
For the first case, you can define an OIM Rule that places users whose department is "HR Department" into an OIM group called "Group_HR_Department". Whenever a user is a member of that OIM group, an access policy can add the user to the "HR Department" AD group automatically.
For the second case, a Rule again checks the department and places the user in a group, and an access policy then provisions a user account on "exadata-200" automatically.

Closing note

Access Policies are just one of the many features of OIM. Implementing OIM is easy if you understand these underlying basic concepts. Understanding the target systems is also useful when investigating issues during the implementation.
As in every project, collecting the requirements is important; in OIM implementations it is really important, and documenting those requirements is even more so. Sufficient testing is another key consideration for OIM implementation projects. I will cover the logistics of an OIM implementation in another post.
As the saying goes, "the more you know, the more you know what you don't know". This is true for OIM (and for so many other things in IT). There are still some things I don't know about OIM Access Policies; I am just working with what I know now, and still learning. :)
Okay, I hope that is it for this post. We will meet in another post with more interesting details about OIM. Until then

Wednesday, 27 April 2011

Part I – LDAP Directory for the Cloud – Which one do you recommend?


I am planning to take the CISSP exam sometime this year (possibly in May – I believe it really needs more preparation time). I have just completed the Access Control chapter; I am using the Shon Harris All-in-One guide for my preparation. Whether I end up taking the exam or not, the knowledge I gain along the way is worth it. Believe me, "Access Control" is not an easy chapter for me (even though I have worked in that domain for the last few years) – there is a lot of terminology to understand for the CISSP exam. I still have nine more domains to complete before I start on other books (Access Control is just one of them). It looks like this needs a lot more preparation than I thought.
Anyway, I don't want to talk about Access Control here. This post is about a webcast by Mark Wilcox from Oracle a couple of weeks ago: a presentation on "Choosing the Right Directory for the Cloud". You can find the recording here.
Definition of Cloud Computing
Let's try to understand the general definition of cloud computing first. According to "The NIST Definition of Cloud Computing" (Version 15), cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.
According to the definition, there should be a shared pool of configurable computing resources. In this context, we are talking about an LDAP directory as a software service that can be configured and accessed as one of those shared resources. In the webcast, Mark talked mostly about OID and ODS (see below).
Directory as a Service
Now, let's try to understand how LDAP directories can be offered as a service in the cloud.
There are many LDAP directory offerings from various vendors, such as the ones below:
  • Oracle Directory Server (ODS) – formerly iPlanet or Sun LDAP
  • Oracle Internet Directory (OID)
  • Microsoft Active Directory (AD)
  • IBM Tivoli Directory Server (ITDS)
  • Novell's eDirectory
  • OpenLDAP
I want to cover the well-known directories out there when talking about LDAP directories for the cloud. We serve many customers, and each one has its own preference of LDAP directory, so we cannot ignore the other popular products.
When we talk about an LDAP directory for the cloud, we mean an LDAP instance that cloud applications use for authentication (and, in some cases, for authorization as well).
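To make that authentication use case concrete: at its simplest, the cloud application binds to the hosted directory and looks up the end user's entry. A rough sketch with the OpenLDAP command-line tools is shown below; the host name, bind DN, password, and base DN are placeholders I made up, not values from any real deployment.

# Simple bind as the application account, then search for the end user's entry
$ ldapsearch -x -H "ldap://ldap.example.com:389" \
      -D "cn=appuser,ou=services,dc=example,dc=com" -w welcome1 \
      -b "ou=people,dc=example,dc=com" "(uid=jdoe)" uid mail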
What do I think?
If you are working with Oracle products such as Oracle EBS and you need to integrate with an LDAP directory, then I believe Oracle Internet Directory (OID) has more advantages than the others in the list. Oracle certifies most of its products – including the Identity Management products for EBS – with OID as the recommended LDAP directory, so integration is straightforward and supported by the vendor. Another reason is that OID stores its data in an Oracle database, so you can take advantage of the database security features.
ODS (formerly Sun Java System Directory Server, and before that iPlanet Directory Server) is a great product in itself; I have been working with it for a long time now. Its data is stored in operating system files (internally it uses a database structure), and it follows the LDAP v3 protocol standard.
I don't want to be Oracle-centric in my approach (both of the directory servers mentioned above are from Oracle). Mark Wilcox is from Oracle, so naturally he talked mostly about these two directories.
So, how can we provide an LDAP directory as a service in the cloud? And more importantly, what factors do we need to consider while providing this service? Let's talk about these questions, and about the other directories, in coming posts.
Until then
Vijay Chinnasamy

Monday, 14 February 2011

Provisioning to two Active Directory Domains with Oracle Identity Manager – Connector Cloning – Part I


In many large enterprises, two (or more) Active Directory domains are used, for example one for India users and one for North America users (assuming the company has two major locations). This requires two AD connector instances in OIM for provisioning and reconciliation purposes. The OIM Connector Guide for Active Directory User Management provides the following description of creating copies of the connector to provision into multiple target systems; however, detailed step-by-step instructions are not available in the connector documentation.

From the Oracle Connector Documentation (Oracle Identity Manager Connector Guide for Microsoft Active Directory User Management – Release 9.1.1 – E11197-11 – Page 186):
Section: 4.15.1
To create a copy of the connector:
  1. Create copies of the IT resource, resource object, process form, provisioning process, scheduled tasks, and lookup definitions that hold attribute mappings.
  2. Create a copy of the Lookup.AD.Configuration lookup definition. In the copy that you create, change the values of the following entries to match the details of the process form copy that you create.
      1. ROUserID
      2. ROUserManager
      3. ROFormName
      4. ROUserGUID
  3. Map the new process tasks to the copy of the Lookup.AD.Configuration lookup definition.
Initially I was not sure how to set up the cloning. I had two Active Directory domains; when users are created in OIM, access policies identify which domain they should be provisioned to, but I had to set up two AD connectors for those domains.
Based on my investigation, the following AD connector-specific objects are involved:
  1. Copy of the IT Resource
  2. Copy of the RO
  3. Copy of the Process form
  4. Copy of the Provisioning Process
  5. Copy of the Scheduled Tasks
  6. Copy of the Lookup Definitions
  7. Copy of the Reconciliation Rule
First, you export the relevant objects as an XML file, rename the objects by manually editing the XML, and then re-import them. One recommendation: run the XML file through "xmllint --format" on Linux; that makes the file much more readable and easier to edit (thanks to Oracle Support for the xmllint tip).
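A rough sketch of that step from a Linux shell is shown below. The file names and the "Domain2" suffix are placeholders I made up; the object names being replaced are the AD connector objects listed in the table further below, and in practice you should review every substitution in the XML rather than relying on a blind search-and-replace.

# Pretty-print the Deployment Manager export so it is easier to edit
$ xmllint --format ADConnectorExport.xml > ADConnectorExport_formatted.xml

# Rename the connector objects for the second AD domain (review the result!)
$ sed -e 's/AD IT Resource/AD IT Resource Domain2/g' \
      -e 's/AD User/AD User Domain2/g' \
      ADConnectorExport_formatted.xml > ADConnector_Domain2.xml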
Here are the steps for cloning a connector – based on my personal experience:
  1. Identify all the connector objects used by the Active Directory connector (mostly the ones in the table below – I am still not sure whether I have covered all the objects, so please let me know if I missed any).
  2. Export these objects using the Deployment Manager export utility. This creates an XML file.
  3. Once you have the XML file, identify and replace the values for the objects in it. This is the main reason you should be familiar with the AD connector objects.
  4. Import the edited XML file into the OIM system. I faced errors during the import; I will write about those errors in the next post.
AD Connector Objects (Object Type – Object Name for the AD connector):
  1. IT Resource – AD IT Resource
  2. Resource Object – AD User
  3. Process Form – UD_ADUSER, UD_ADUSRC
  4. Provisioning Process – AD User
  5. Scheduled Tasks – Target Recon
  6. Lookup Definitions – Many…
  7. Child Tables – UD_ADUSER*
In my current OIM system, the default (installed) connector is configured for the first AD domain and the cloned connector for the second AD domain. I found that a little confusing, so I raised a question about it and received the information below from Oracle Support. I hope it is useful.
The best approach is to clone the connector twice, once for each domain, and leave the original installed objects unused. That way, when you upgrade to a newer connector version on top of the existing one, you update the original unused template objects and then clone the changes onto the two domain-specific sets of objects.
The second method – keeping the installed AD connector for one domain and cloning it only for the second AD domain – also works.
I liked the approach of cloning the connector twice. You may prefer the other approach; it is up to you to decide.
I will write a continuation of this post later. Until then

Monday, 1 November 2010

Something I learned about Oracle Database 11g RMAN restore


Last weekend (Saturday night), I needed to restore a development database from an old backup. I had never done an RMAN restore before last Saturday. As the saying goes, "necessity is the mother of invention" – though this is not really an invention (RMAN has been around for a long time), for me it meant finally learning how an RMAN restore works.
Our DBA was not available on Saturday, and I needed to test a few things on the development system, which required restoring a backup taken a couple of months earlier. So I followed the steps below to restore the database using RMAN.
This may be basic stuff that every DBA knows, but it is not something I do every day, so it was new to me.
First, I ran the "shutdown immediate" command to shut down my development database. Then I followed these steps to restore the database from an older backup taken by RMAN. The database was running on a Red Hat Enterprise Linux machine and the database version was 11.1.1.6.0.
$ rman

RMAN> list backup;
List of Backup Sets
===================

…….
I got the TAG details from here.
……
RMAN> restore datafile '/u02/oradata/OIMTST/system01.dbf' from tag = 'BEFORERECON';

RMAN> restore datafile '/u02/oradata/OIMTST/sysaux01.dbf' from tag = 'BEFORERECON';
….
RMAN> restore datafile '/u02/oradata/OIMTST/undotbs01.dbf' from tag = 'BEFORERECON';

RMAN> restore datafile '/u02/oradata/OIMTST/users01.dbf' from tag = 'BEFORERECON';

RMAN> restore datafile '/u02/oradata/OIMTST/oimtst4_tspace_01.dbf' from tag = 'BEFORERECON';

RMAN> list backup of controlfile;

RMAN> restore controlfile to '/u02/oradata/OIMTST/control01a.ctl' from tag = 'TAG20100820T112653';

RMAN> quit
Recovery Manager complete.

$
Copying the Control Files:
============================
cd /u02/oradata/OIMTST  # The control files are located here.
cp control01a.ctl control01.ctl
cp control01a.ctl control02.ctl
cp control01a.ctl control03.ctl

$ sqlplus / as sysdba….
SQL> startup
ORACLE instance started.
Total System Global Area 1073131520 bytes
Fixed Size                  2151248 bytes
Variable Size             264244400 bytes
Database Buffers          801112064 bytes
Redo Buffers                5623808 bytes
Database mounted.
ORA-01589: must use RESETLOGS or NORESETLOGS option for database open

SQL>
SQL> alter database open resetlogs;

Database altered.
SQL>
Hurray! It was a success!
This was my first restore using RMAN. I knew the concepts, but I had never actually restored a database like this before, so I thought I would share the experience.
We will meet in another post. Until then

Wednesday, 29 September 2010

Checking on Oracle Fusion Applications


Sun blogger Vijay Tatkar wrote in his blog about the eight technology innovations Larry Ellison mentioned during his Oracle OpenWorld keynote last week. Nearly half of them are Sun hardware related (such as Exadata, Exalogic, and the UltraSPARC T3). Here is the list:

  1. Fusion Apps
  2. Unbreakable Linux Kernel
  3. Solaris Express 11
  4. UltraSPARC T3 chip
  5. MySQL 5.5
  6. Exadata
  7. Exalogic
  8. Java 7 and 8
I have always been interested in knowing more about Fusion Apps, mainly out of curiosity. Oracle Fusion Applications were formally introduced during Oracle OpenWorld last week (OpenWorld 2010). According to Oracle, this is one of its major innovations – the next big thing. As you are aware, the Fusion Middleware products have already been released; now it is time to talk about the Fusion Applications.
As you may already know, I started my IT career as a web developer in a small web hosting company, writing Perl CGI code and hosting it on Linux servers running the Apache web server and MySQL database. I got bored with that job (or I wanted a change, I am not sure!) and moved into Unix system administration, working as a Sun Solaris admin and HP-UX admin for some time. Then I worked in PeopleSoft system administration for nearly 7 years, and for nearly the past year I have been working on an Oracle Identity Management project (part of the Oracle Fusion Middleware products).
So, the question is "now what?" – and how can we develop the knowledge for Fusion Apps administration?
I am not sure when Fusion Apps will be deployed full-fledged in place of other ERP applications like PeopleSoft. I don't think it will happen soon, but maybe in a few years Oracle will bring in customers who are doing a brand-new implementation of an ERP application.
You know what, Fusion Middleware is to Fusion Apps what PeopleTools is to the PeopleSoft applications. PeopleTools is the abstraction layer on top of which all the PeopleSoft applications run. PeopleTools was originally built in C and C++ and eventually evolved toward Java, though I still feel some parts of it are C++ code. Fusion Middleware is mostly Java and J2EE, and I believe Fusion Apps will be much more of a J2EE application suite than PeopleSoft is. I need to spend a bit more time implementing Fusion Middleware and an application on top of it; as of now, I have only worked on the Identity Management product suite and a little bit of Oracle Portal technologies.
For an IT Infrastructure Administrator like me (who mainly works on Oracle Server Technologies), I think understanding the Fusion Middleware Stack will be important.
Talk to you later. Until then