
kualigan n. Slang a rough lawless young Kuali developer.
[perhaps variant of Houlihan, Irish surname]
kualiganism n

Blog of an rSmart Java Developer. Full of code examples, solutions, best practices, et al.

Sunday, April 10, 2011

Setting up CAS on KC/Rice

Screencast

Just a screencast to show where to get the files and how to set it up.

Instructions


1 Download Source from rSmart

% svn co  https://svn.rsmart.com/svn/kuali/rice/rsmart_rice_core/trunk rsmart_rice_core

2 Copy Example Config

% cp web/src/main/config/example-config/rice-config.xml $HOME/kuali/main/dev/

3 Copy Contents of LDAP Example Config

% cat ldap/src/main/config/example-config.xml

You should see something that looks like:
<!--
Copyright 2008-2009 The Kuali Foundation

Licensed under the Educational Community License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.opensource.org/licenses/ecl2.php

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<config>
<param name="cas.url">http://localhost:8080/${cas.context.name}</param>
<param name="cas.require.https">false</param>
<param name="cas.validate.password">true</param>
<param name="cas.rice.server.name">${appserver.url}</param>

<param name="filter.login.class">org.jasig.cas.client.authentication.AuthenticationFilter</param>
<param name="filter.login.casServerLoginUrl">${cas.url}/login</param>
<param name="filter.login.serverName">${appserver.url}</param>
<param name="filtermapping.login.1">/</param>

<param name="filter.validation.class">org.jasig.cas.client.validation.Cas20ProxyReceivingTicketValidationFilter</param>
<param name="filter.validation.casServerUrlPrefix">${cas.url}</param>
<param name="filter.validation.serverName">${appserver.url}</param>
<param name="filtermapping.validation.2">/</param>

<param name="filter.caswrapper.class">org.jasig.cas.client.util.HttpServletRequestWrapperFilter</param>
<param name="filtermapping.caswrapper.3">/</param>

<param name="rice.ldap.username">uid=user,ou=Ldap Users,dc=localhost</param>
<param name="rice.ldap.password">[secret]</param>
<param name="rice.ldap.url">ldaps://localhost:636</param>
<param name="rice.ldap.base">ou=People,dc=localhost</param>
<param name="rice.additionalSpringFiles">org/kuali/rice/kim/config/KIMLdapSpringBeans.xml</param>
</config>

When you finish, your config should look like:
<!--
Copyright 2008-2009 The Kuali Foundation

Licensed under the Educational Community License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.opensource.org/licenses/ecl2.php

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<config>

<!-- Please fill in a value for this parameter! -->
<param name="http.port">8080</param>
<param name="application.host">http://yourserver</param>
<param name="application.url">${application.host}:${http.port}/${app.context.name}</param>

<param name="plugin.dir">/usr/local/rice/plugins</param>

<!-- set some datasource defaults -->
<param name="datasource.username">rice</param>
<param name="datasource.password">*** password ***</param>

<!-- MySQL example -->
<param name="datasource.ojb.platform">MySQL</param>
<param name="datasource.platform">org.kuali.rice.core.database.platform.MySQLDatabasePlatform</param>
<param name="datasource.url">jdbc:mysql://localhost:3306/${datasource.username}</param>
<param name="datasource.driver.name">com.mysql.jdbc.Driver</param>
<param name="datasource.pool.validationQuery">select 1</param>

<!-- Oracle example
<param name="datasource.ojb.platform">Oracle9i</param>
<param name="datasource.platform">org.kuali.rice.core.database.platform.OracleDatabasePlatform</param>
<param name="datasource.url">jdbc:oracle:thin:@localhost:1521:XE</param>
<param name="datasource.driver.name">oracle.jdbc.driver.OracleDriver</param>
<param name="datasource.pool.validationQuery">select 1 from dual</param>
-->

<param name="attachment.dir.location">/usr/local/rice/kew_attachments</param>
<param name="data.xml.root.location">/usr/local/rice/kew/xml</param>

<!-- log4j settings -->
<param name="log4j.settings.path">/usr/local/rice/log4j.properties</param>
<param name="log4j.settings.reloadInterval">5</param>

<!-- Keystore Configuration -->
<param name="keystore.file">/usr/local/rice/rice.keystore</param>
<param name="keystore.alias">*** key alias ***</param>
<param name="keystore.password">*** password ***</param>

<param name="mail.relay.server">localhost</param>
<param name="mailing.list.batch">mailing.list.batch</param>
<param name="encryption.key">*** encryption key ***</param>

<param name="cas.url">http://localhost:8080/${cas.context.name}</param>
<param name="cas.require.https">false</param>
<param name="cas.validate.password">true</param>
<param name="cas.rice.server.name">${appserver.url}</param>

<param name="filter.login.class">org.jasig.cas.client.authentication.AuthenticationFilter</param>
<param name="filter.login.casServerLoginUrl">${cas.url}/login</param>
<param name="filter.login.serverName">${appserver.url}</param>
<param name="filtermapping.login.1">/</param>

<param name="filter.validation.class">org.jasig.cas.client.validation.Cas20ProxyReceivingTicketValidationFilter</param>
<param name="filter.validation.casServerUrlPrefix">${cas.url}</param>
<param name="filter.validation.serverName">${appserver.url}</param>
<param name="filtermapping.validation.2">/</param>

<param name="filter.caswrapper.class">org.jasig.cas.client.util.HttpServletRequestWrapperFilter</param>
<param name="filtermapping.caswrapper.3">/</param>

<param name="rice.ldap.username">uid=user,ou=Ldap Users,dc=localhost</param>
<param name="rice.ldap.password">[secret]</param>
<param name="rice.ldap.url">ldaps://localhost:636</param>
<param name="rice.ldap.base">ou=People,dc=localhost</param>
<param name="rice.additionalSpringFiles">com/rsmart/kuali/rice/ldap/KIMLdapSpringBeans.xml</param>

<!-- Sample Application Flag -->
<param name="sample.enabled">false</param>

<param name="dev.mode">false</param>
</config>

Notice that the DummyLoginFilter is nowhere to be found. This is important: the DummyLoginFilter handles logins by default, and you don't want it around when you're configuring CAS.

4 Make Changes to Config

Right now, CAS is configured for localhost. At your institution, you will want to point this to your REAL CAS server. Also, make sure to set the cas.context.name. It is probably better to use https as well.
    <param name="cas.context.name">webauth</param>
<param name="cas.url">https://webauth.arizona.edu/${cas.context.name}</param>
<param name="cas.require.https">false</param>
<param name="cas.validate.password">true</param>
<param name="cas.rice.server.name">${appserver.url}</param>

<param name="filter.login.class">org.jasig.cas.client.authentication.AuthenticationFilter</param>
<param name="filter.login.casServerLoginUrl">${cas.url}/login</param>
<param name="filter.login.serverName">${appserver.url}</param>
<param name="filtermapping.login.1">/</param>

<param name="filter.validation.class">org.jasig.cas.client.validation.Cas20ProxyReceivingTicketValidationFilter</param>
<param name="filter.validation.casServerUrlPrefix">${cas.url}</param>
<param name="filter.validation.serverName">${appserver.url}</param>
<param name="filtermapping.validation.2">/</param>

<param name="filter.caswrapper.class">org.jasig.cas.client.util.HttpServletRequestWrapperFilter</param>
<param name="filtermapping.caswrapper.3">/</param>

<param name="rice.ldap.username">uid=user,ou=Ldap Users,dc=eds,dc=arizona,dc=edu</param>
<param name="rice.ldap.password">[secret]</param>
<param name="rice.ldap.url">ldaps://eds.arizona.edu:636</param>
<param name="rice.ldap.base">ou=People,dc=eds,dc=arizona,dc=edu</param>
<param name="rice.additionalSpringFiles">com/rsmart/kuali/rice/ldap/KIMLdapSpringBeans.xml</param>

5 Configure LDAP

At UA, there is an institutional directory service called EDS. Your institution may also have one. I configured it like this:
  <param name="rice.ldap.username">uid=user,ou=Ldap Users,dc=eds,dc=arizona,dc=edu</param>
<param name="rice.ldap.password">[secret]</param>
<param name="rice.ldap.url">ldaps://eds.arizona.edu:636</param>
<param name="rice.ldap.base">ou=People,dc=eds,dc=arizona,dc=edu</param>
<param name="rice.additionalSpringFiles">com/rsmart/kuali/rice/ldap/KIMLdapSpringBeans.xml</param>

6 Love

That's it. This is a runtime configuration, so simply restarting my application server will reload this configuration and make the changes live.

The Only Logger You'll Ever Need

The Status Quo for Logging in Kuali

One of the most frustrating things about developing with Kuali Foundation software is the logging. Setup aside, just adding logging is frustrating, mostly because of the immense amount of copy/paste it encourages. Here's an example of some of the boilerplate.
...
...
import org.apache.log4j.Logger;
import org.apache.ojb.broker.query.Criteria;
import org.apache.ojb.broker.query.QueryByCriteria;
import org.kuali.kfs.coa.dataaccess.impl.ChartDaoOjb;
import org.kuali.kfs.fp.businessobject.TravelMileageRate;
import org.kuali.kfs.fp.document.dataaccess.TravelMileageRateDao;
import org.kuali.rice.kns.dao.impl.PlatformAwareDaoBaseOjb;

/**
* This class is the OJB implementation of the TravelMileageRate interface.
*/
public class TravelMileageRateDaoOjb extends PlatformAwareDaoBaseOjb implements TravelMileageRateDao {
private static Logger LOG = Logger.getLogger(ChartDaoOjb.class);
...
...
Let's go through this. First, examine the import:
import org.apache.log4j.Logger;
Every single class you write with logging will require this weird, foreign class that really has nothing to do with the functionality of your software. It's awkward, and it's boilerplate. It's needlessly everywhere, and in some cases you can forget it's even there. Next is my favorite part. We declare the logger on top of having to import it:
public class TravelMileageRateDaoOjb extends PlatformAwareDaoBaseOjb implements TravelMileageRateDao {
private static Logger LOG = Logger.getLogger(ChartDaoOjb.class);
See anything unusual? This is really what I was getting at about the copy/paste. To my knowledge, this still exists in the KFS source code. It's misleading: this TravelMileageRateDaoOjb is logging as ChartDaoOjb. I doubt this is on purpose. Rather, it is the result of copy/pasting the logger declaration from another class. Many do this because declaring the logger is tedious, and as a result, many forget to change the class name.

I am not going to blame the developer for this. In my mind, it shouldn't even be necessary to do this. Shouldn't the framework just know what class I'm logging from? Is that really so hard?

Researching the Performance of Logging

After putting together this post on another blog, I became determined to devise a simpler way to handle logging. Here were my goals.

  • Limit Logger boilerplate to the import statement
  • Efficient logging where the full log message is not concatenated until it is determined whether the message would be used or not
  • printf style formatting if possible.

Back to the Question

Shouldn't the framework just know what class I'm logging from? Is that really so hard?

No. It's not.

I have created two classes called BufferedLogger and FormattedLogger. These are "The Only Loggers You'll Ever Need".

How to use them

It's easy. Before now, you probably thought static imports were pretty useless. Think again.
import static org.kuali.kra.logging.BufferedLogger.*;

That's your boilerplate. Next, let's use it:
public ActionForward insertProposalPerson(ActionMapping mapping, ActionForm form, HttpServletRequest request, HttpServletResponse response) throws Exception {
...
...
// if the rule evaluation passed, let's add it
if (rulePassed) {
document.getDevelopmentProposal().addProposalPerson(pdform.getNewProposalPerson());
info(ADDED_PERSON_MSG, pdform.getNewProposalPerson().getProposalNumber(), pdform.getNewProposalPerson().getProposalPersonNumber());
// handle lead unit for investigators respective to coi or pi
if (getKeyPersonnelService().isPrincipalInvestigator(pdform.getNewProposalPerson())) {
getKeyPersonnelService().assignLeadUnit(pdform.getNewProposalPerson(), document.getDevelopmentProposal().getOwnedByUnitNumber());
}
...
...

Very easy stuff. Notice that multiple objects are passed to the info method. They are not concatenated yet: info first checks whether the message will actually be logged before concatenating. This is a huge time saver when you consider how costly '+' and '+=' string concatenation can be.
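I won't reproduce BufferedLogger's actual source here, but the two ideas it combines can be sketched in a few lines. This is a simplified, hypothetical version: the real class delegates to log4j and its level checks, while here a boolean constant stands in for the level check, and the class name is invented for illustration.

```java
// Hypothetical sketch of the BufferedLogger idea -- NOT the actual source.
// The real class delegates to log4j; a constant stands in for the level check.
public class BufferedLoggerSketch {
    private static final boolean INFO_ENABLED = true;

    // Infer the calling class from the stack trace, so callers never
    // have to declare (or mis-copy/paste) their own Logger instance.
    static String callingClass() {
        for (StackTraceElement frame : Thread.currentThread().getStackTrace()) {
            String name = frame.getClassName();
            if (!name.equals(BufferedLoggerSketch.class.getName())
                    && !name.startsWith("java.lang.")) {
                return name;
            }
        }
        return BufferedLoggerSketch.class.getName();
    }

    // Varargs are only concatenated if the level check passes.
    public static String info(Object... parts) {
        if (!INFO_ENABLED) {
            return null; // nothing concatenated, nothing logged
        }
        StringBuilder msg = new StringBuilder();
        for (Object part : parts) {
            msg.append(part);
        }
        String line = callingClass() + " INFO " + msg;
        System.out.println(line);
        return line;
    }
}
```

The key trick is walking the stack trace to find the caller, so no class ever declares its own Logger, and the message is only built once we know it will be used.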

What about printf style logging? Here's another example:
void prepare(ActionForm form, HttpServletRequest request) {
ProposalDevelopmentForm pdform = (ProposalDevelopmentForm) form;
request.setAttribute(NEW_PERSON_LOOKUP_FLAG, EMPTY_STRING);
ProposalDevelopmentDocument document=pdform.getDocument();
List proposalpersons=document.getDevelopmentProposal().getProposalPersons();
for (Iterator iter = proposalpersons.iterator(); iter.hasNext();) {
ProposalPerson person=(ProposalPerson) iter.next();
if (person.getRole() != null) {
person.getRole().setReadOnly(getKeyPersonnelService().isRoleReadOnly(person.getRole()));
}
}

pdform.populatePersonEditableFields();
handleRoleChangeEvents(pdform.getDocument());

debug(INV_SIZE_MSG, pdform.getDocument().getDevelopmentProposal().getInvestigators().size());

try {
boolean creditSplitEnabled = this.getParameterService().getIndicatorParameter(ProposalDevelopmentDocument.class, CREDIT_SPLIT_ENABLED_RULE_NAME)
&& pdform.getDocument().getDevelopmentProposal().getInvestigators().size() > 0;
request.setAttribute(CREDIT_SPLIT_ENABLED_FLAG, new Boolean(creditSplitEnabled));
pdform.setCreditSplitEnabled(creditSplitEnabled);
}
catch (Exception e) {
warn(MISSING_PARAM_MSG, CREDIT_SPLIT_ENABLED_RULE_NAME);
warn(e.getMessage());
}        
}
You can see that this looks no different than the info illustrated earlier. There is one difference though. Examine the warn statement:
warn(MISSING_PARAM_MSG, CREDIT_SPLIT_ENABLED_RULE_NAME);
It uses a constant called MISSING_PARAM_MSG. This is actually a format string that looks like:
private static final String MISSING_PARAM_MSG = "Couldn't find parameter '%s'";
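FormattedLogger's side of this can be sketched the same way. Again, this is hypothetical (the real class resolves the logging level through log4j first); the point is that the format string is only expanded once we know the message will be used.

```java
// Hypothetical sketch of the FormattedLogger idea -- NOT the actual source.
// String.format() is only invoked after the level check passes.
public class FormattedLoggerSketch {
    private static final boolean WARN_ENABLED = true; // stand-in for log4j level check

    public static String warn(String format, Object... args) {
        if (!WARN_ENABLED) {
            return null; // skip the (relatively costly) formatting entirely
        }
        String line = "WARN " + String.format(format, args);
        System.out.println(line);
        return line;
    }
}
```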

Which is better: BufferedLogger or FormattedLogger?

FormattedLogger has its downside: formatting actually takes more clock cycles than concatenation. It's friendlier to developers, though, and with a rather large number of parameters to format it can be relatively fast. I reach for BufferedLogger most of the time, but FormattedLogger has its uses.

There you have it. No more copy/paste, and printf-style logging.

KIS Me Kate - RPM Packaging KFS Part 1

The default packaging option from the Kuali Foundation is typically JAR or WAR packaging. This is fine for libraries and/or web applications: it is platform independent and follows the standards of software deployment. What follows are my observations from changing a typical deployment of KFS at the University of Arizona over to RPMs. So

Why Repackage KFS?

There are shortcomings to WAR and JAR packaging. WAR packaging was created with the intent that the application is actually WebSphere, WebLogic, JBoss, etc., and the WAR is a webapp that is deployed within it. With that intent comes the concept that everything is contained within the WAR. What is lacking:
  • No notion of pre/post-processing at installation and deployment
  • No verification of dependencies or requisites
  • No workflow for software installation
  • No maintenance of documentation vs. configuration files
  • No platform-specific task hooks
  • No upgrade software management or configuration management

Some of these may not make sense at first. Take "No platform-specific task hooks": WAR is platform independent, so why would you want that, right? Well, that's just it. I think it's great that WAR is platform independent. It lets you independently define your own packaging around it. Again, why would you want to do that? Double packaging? Let's approach each of these.

Pre/Post-Processing at Installation and Deployment

There may be server-specific information that the application needs at the point of installation and deployment: for example, server name, database configuration, SSH key generation, certificate authority verification, SSL configuration, etc. These are normally configured by the system administrator manually after installing a WAR. What if the application needs to be installed on several servers in a cloud? Some of this can be automated; it does not require interaction or input from a user, so why require it to be done manually? That shouldn't be necessary.
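In RPM terms, this kind of pre/post-processing lives in the spec file's scriptlets. A hypothetical fragment, with the user name and paths invented purely for illustration:

```spec
# Hypothetical spec-file fragment -- user name and paths are made up.
%pre
# runs before the files are installed
/usr/sbin/useradd -r tomcat 2>/dev/null || :

%post
# runs after installation: e.g., generate an ssh key for batch transfers
[ -f /home/tomcat/.ssh/id_rsa ] || \
    su - tomcat -c "ssh-keygen -q -N '' -f ~/.ssh/id_rsa"
```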

Verification of Dependencies or Requisites

Sure, WAR files can contain all required libraries, but there's no guarantee of this. Further, what if the server has a configuration that supersedes the WAR configuration? There's no way to know that either. Libraries aren't the only requisites a WAR can have. You can ship with the BSF (Bean Scripting Framework), but that does you absolutely no good whatsoever if no native scripting languages are installed on your server. What about the application server? The WAR doesn't come with that. Wouldn't it be nice if installing the application meant that even the appserver was installed with it, along with any software (libraries or not) it depends on? Yes it would, and that's why proper software packaging starts to look pretty good.
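With RPM, those requisites become declarative lines in the spec file that the package manager verifies at install time. A hypothetical fragment (package names and versions are illustrative, not what UA actually used):

```spec
# Hypothetical spec-file fragment -- names and versions are illustrative only.
Requires: tomcat6 >= 6.0.26
Requires: mysql-connector-java
Requires: java-1.6.0-openjdk
```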

Workflow for Software Installation

Like your development process, installation itself can have its own phases. A good installation infrastructure allows you to augment or even create and define your own phases. For example, pre/post-processing scripts (described earlier), patching, build, file installation, permission assignments, documentation handling, cleanup, etc...

Maintenance Over Documentation vs. Configuration Files

Documentation and configuration files are very delicate items in your deployment. Any configuration deployed with the application is usually a reference implementation; that is, it is typically replaced manually at or before the first deployment. Upon upgrading, configuration files are not something you want to overwrite. You typically want to back these up with each upgrade and identify changes in configuration formats between versions. For example, new configuration entries can be added to a configuration file with each version, while some become obsolete. It is undesirable to keep the obsolete ones, but you typically do not want to sacrifice the rest of your configuration for this.

Kuali software is no stranger to this situation. Kuali Foundation projects have the concept of an "external" configuration directory that exists outside of the WAR. Its purpose is to keep sensitive information out of the webapp itself: passwords, uploaded financial information batch files, even log files. Such information is kept out of reach for security purposes within the external configuration. The external configuration files will not likely change, so when deploying upgrades, these should remain unchanged.
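RPM addresses exactly this with the %config(noreplace) directive: on upgrade, an untouched config file is replaced, while a locally modified one is kept and the incoming version is written alongside it as a .rpmnew file. A hypothetical %files fragment (the paths are invented for illustration):

```spec
# Hypothetical %files fragment -- paths are illustrative only.
%files
%doc docs/README docs/UPGRADE
%config(noreplace) /home/tomcat/app/rice-config.xml
```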

Platform-specific Task Hooks

The trouble with WAR files is that they're platform independent. This is their greatest strength, but also a huge weakness. WAR files know nothing about the system you are deploying to. This is a hassle for system administrators because the software has to be treated separately from other installations, and it is difficult to observe changes and possible security threats. For example, if there is an exploit in the version of Bouncy Castle that KFS uses, there is no way for a system administrator to know. Further, if installation can be simplified by using platform-specific knowledge about the system at installation time, that knowledge can be used to automate the process more. For example, knowing what software is installed, and at what version, can help determine what arguments to pass to utilities at install time.

Upgrade Software Management and Configuration Management

Basically, this is software intelligence: having metadata about your software before and after it's installed, for dependency management (illustrated earlier), configuration file handling (illustrated earlier), bug reporting, vulnerability monitoring (illustrated earlier), etc. One huge use is when a system administrator needs to manage multiple software installations across several servers. Knowing what version, build number, and configuration are on a server should be as easy as checking that system's software database. Each platform has one. If a webapp is packaged and installed using that system, the software database will know about it and be able to report that information back to the system administrators. With further scripting and automation tools, system administrators can have much better control over the systems they maintain.

Why is Packaging Important to KFS?


KFS is victim to all of the shortcomings illustrated above.
  • At the University of Arizona, there are application sanity and maintenance tasks that run before packaging and before package installation. Packaging is done on a system separate from installation because there are actually several installation systems. To provide a repeatable installation process, it was decided that if any were going to be on a remote system then all should. For security purposes I cannot go into much detail about what pre/post-processing the University of Arizona does, but I can say that I would expect just about any university to require it.
  • Since building, packaging, and installation happen on separate systems, a workflow has to be maintained. Further, the software is not the only thing installed for some implementing institutions. Some may couple data or even database schema information to the software, so this information may be distributed and deployed with it. It cannot all happen at once. If there is a problem anywhere during the installation, a failover path must exist. Installation workflows help this work out.
  • KFS makes use of something called an "external" settings directory. This directory is created and populated with reference information at installation. However, this information is overwritten at each installation by default. It would be good to not have to do this each time. When dealing with configurations on several servers, it can be tedious and problematic to rebuild each time and reconfigure each time. Mistakes are made. It is best to just configure once, and then make modifications only when necessary.
  • I have noticed that some institutions like to grant access to the shared files directory via setfacl on Red Hat and other Linux systems. Such capabilities are not available by just dropping in a WAR.
  • Currently KFS does not install all other required software with it. Also, there are no reports or observations on vulnerabilities of libraries distributed with KFS. When upgrading from one release to another, there is no verification of version or software compatibility.


How can KFS Further Benefit from Software Packaging?


I used RPM packaging of KFS at the University of Arizona. When I did, it greatly streamlined installation. System administrators were familiar already with RPM. They were adept with installing packages, handling logs and dependencies, and found it easy to modify and maintain configuration. We were able to integrate RPM building into our CI (continuous integration). Whenever a release was due, a new RPM was created. This RPM could then be manually or automatically installed by the package management system. RPM information could quickly be verified with

% rpm -qa | grep kuali
kuali-coeus-2.0-10
kuali-coeus-settings-dev-2.0-10
kuali-coeus-kittdb-2.0-10

or

% rpm -qi kuali-coeus-settings-dev-2.0-10
Name : kuali-coeus-settings-dev Relocations: (not relocatable)
Version : 2.0 Vendor: (none)
Release : 10 Build Date: Tue 28 Dec 2010 07:19:51 AM MST
Install Date: Tue 11 Jan 2011 01:23:25 AM MST Build Host: uaz-kr-a02.mosaic.arizona.edu
Group : System/Base Source RPM: kuali-coeus-2.0-10.src.rpm
Size : 200685086 License: EPL
Signature : (none)
Packager : leo [at] rsmart.com
Summary : External configuration settings for Kuali Coeus
Description :
Mosaic Kuali Coeus external configuration and settings. These files are located
in /home/tomcat/app.

With dependency management, system administrators can make sure that all the necessary tools exist on a system before the software is installed or updated. Database upgrades are streamlined and integrated into the installation process.

Package Repositories

I mentioned that UA has a package management repository. Right now, it is a crude CIFS share; YUM makes it possible to be more elegant with updates. In the future, I plan to create my own YUM repository for managing Kuali software upgrades.

Another package management and build system is APT, which uses DEB packages. A package maintainer can create a project on Launchpad, a portal for Ubuntu package/build management. The packages are then published to a package repository (PPA, or Personal Package Archive); a system administrator can point a server at this PPA and gain updates to all the software on it. The software and repository are verified against a PGP (Pretty Good Privacy) key. See my launchpad. My goal is to eventually have a working packaged KFS distribution by Summer 2011.

Between YUM and APT, it is possible to get automatic updates and patches to your Kuali Software. This is one of the biggest reasons to use software packaging for KFS.

Saturday, April 2, 2011

Implementing KC 2.0 as an Overlay

Overview

Maven overlays are a way to take two web application projects and combine them, with one overlaying the other. It sounds attractive if you want to use another project as the base for your own, but it only makes sense when your project literally builds upon another project as its foundation. You would only make another project the foundation when you are in absolute control of that other project, or close to it. Kuali is community source, so the community runs the foundation project, and the other project is intended to be an implementation of Kuali Coeus where the implementing institution is a member of the community. In that case, an overlay technically makes sense. You wouldn't want to overlay just any project you found that you thought was cool and wanted to change; it would be much better to fork that project instead. If you overlay, you run the risk of the foundation changing from underneath you. When implementing Kuali software, this is fine because you have the community supporting you and the software.

The Problem with Enterprise Software Maintenance at a University

To truly understand why overlays are a good idea at institutions and particularly universities, you need to understand the problem universities have had implementing software pretty much throughout history.

Universities are used to getting this software implemented, but as time goes on, business processes and practices change. Business is changing. Universities are changing. Why shouldn't their business practices change? It makes sense. The trouble is that they bankrupt their budgets on implementing the software and have nothing left to maintain it. All they budget for maintenance is fixing leaks and band-aiding. Eventually, these institutions are left with failing systems on the brink of demise. It all comes down to maintenance. These systems can't just be good now. They need the potential to be good 10 years from now.

Kuali is no different in that respect. Compared to other enterprise software, the source code is very extensive. Making bug fixes to it will have effects on upgrading. Every change an institution makes to the source code that does not get back to the trunk will cause problems, because the institution's code base diverges that much more from what is being upgraded. The mindset is: change as little of the original codebase as possible. Only build upon it if possible. If you're going to change something, find a way to do it without modifying the original distribution. No matter what you do, keep a record between versions so you know what changed. Among all of these, the consistent idea is to modify the distribution as little as possible.

That is what overlays allow. Modifying and customizing the distribution by overlaying it affords institutions the ability to
  • make changes without making patching or upgrading difficult in the future
  • track what changes you made
  • simplify your local distribution

The Screencast


This is a screencast based on the instructions laid out in KC 2.0 Customization.

Instructions

Written instructions for following along with the screencast.

1 Checkout the KC Project

First, you need to download the full KC project. I created a path in my workspace to store all this.
% mkdir -p .workspace/rsmart
Then check out the source code. I used export because, eventually, I want to import this into my own svn repository.
% cd .workspace/rsmart
% svn export https://test.kuali.org/svn/kc_project/tags/kc-release-2_0-tag

The above creates a new kc-release-2_0-tag directory.

2 Install KC WAR and JAR Files

To install the WAR file in our Maven repository, we use
% mvn -Dmaven.test.skip=true install

To create the JAR file, we use
% mvn jar:jar

Installing the JAR is a little different.
% mvn install:install-file -Dpackaging=jar -DgroupId=org.kuali.kra -DartifactId=kc_project -Dversion=2.0 -DgeneratePom=true -Dfile=target/kc_project-2.0.jar

That should be the end of our work with the kc_project.

3 Setup kc_custom

Create the directory structure and pom.xml.

3.1 Create Directory Structure

% mkdir kc
% mkdir -p kc/src/main/java/com/rsmart/kuali/kc
% mkdir -p kc/src/main/java/org/kuali/kra/infrastructure
% mkdir -p kc/src/main/config
% mkdir -p kc/src/main/resources/com/rsmart/kuali/kc
% mkdir -p kc/src/main/webapp/WEB-INF/

3.2 Copy Some Files Over

Might as well copy a couple of files from the kc_project.
% cp kc-release-2_0-tag/src/main/webapp/WEB-INF/web.xml kc/src/main/webapp/WEB-INF/
% cp kc-release-2_0-tag/src/main/java/org/kuali/kra/infrastructure/KraServiceLocator.java kc/src/main/java/org/kuali/kra/infrastructure

3.3 Create pom.xml

The new overlay project needs its own pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<groupId>com.rsmart.kuali.kc</groupId>
<artifactId>kc</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>

<name>kc</name>
<url>http://maven.apache.org</url>

<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>

<dependencies>
<dependency>
<groupId>org.kuali.kra</groupId>
<artifactId>kc_project</artifactId>
<version>2.0</version>
<type>war</type>
</dependency>
<dependency>
<groupId>org.kuali.kra</groupId>
<artifactId>kc_project</artifactId>
<version>2.0</version>
<scope>provided</scope>
<type>jar</type>
</dependency>
<dependency>
<groupId>org.kuali.rice</groupId>
<artifactId>rice-kns</artifactId>
<version>1.0.2.1</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
</dependencies>
<repositories>
<repository>
<id>kuali</id>
<name>Kuali Repository</name>
<url>https://test.kuali.org/maven</url>
<snapshots><enabled>true</enabled></snapshots>
</repository>
<repository>
<id>codehaus</id>
<name>Codehaus</name>
<url>http://dist.codehaus.org</url>
</repository>
<repository>
<id>apache</id>
<name>apache</name>
<url>http://people.apache.org/repo/m2-ibiblio-rsync-repository</url>
</repository>
<repository>
<id>jboss</id>
<name>jboss</name>
<url>http://repository.jboss.com/maven2</url>
</repository>
<repository>
<id>atlassian</id>
<name>atlassian</name>
<url>http://maven.atlassian.com/repository/public</url>
</repository>
<repository>
<snapshots />
<id>maven-repo1</id>
<name>maven2 repo</name>
<url>http://repo1.maven.org/maven2</url>
</repository>

</repositories>

<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.1.1</version>
<configuration>
<overlays>
<overlay>
<groupId>org.kuali.kra</groupId>
<artifactId>kc_project</artifactId>
</overlay>
</overlays>
</configuration>
</plugin>
</plugins>
</build>
</project>

The important parts to notice are the
  • dependencies - there are 2 kc_project dependencies. One is for the JAR we installed and the other is for the WAR.
        <dependencies>
    <dependency>
    <groupId>org.kuali.kra</groupId>
    <artifactId>kc_project</artifactId>
    <version>2.0</version>
    <type>war</type>
    </dependency>
    <dependency>
    <groupId>org.kuali.kra</groupId>
    <artifactId>kc_project</artifactId>
    <version>2.0</version>
    <scope>provided</scope>
    <type>jar</type>
    </dependency>
    Notice that the JAR dependency uses the provided scope. This controls how the JAR is used on the classpath when building. Here is an excerpt from the Maven POM Reference:
    provided - this is much like compile, but indicates you expect the JDK or a container to provide it at runtime. It is only available on the compilation and test classpath, and is not transitive.
  • repositories - I added the rice repository to pick up all the Rice dependencies at build time
     <repositories>
    <repository>
    <id>kuali</id>
    <name>Kuali Repository</name>
    <url>https://test.kuali.org/maven</url>
    <snapshots><enabled>true</enabled></snapshots>
    </repository>
    </repositories>
  • plugins - here's what actually does the overlay
      <build>
    <plugins>
    <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-war-plugin</artifactId>
    <version>2.1.1</version>
    <configuration>
    <overlays>
    <overlay>
    <groupId>org.kuali.kra</groupId>
    <artifactId>kc_project</artifactId>
    </overlay>
    </overlays>
    </configuration>
    </plugin>
    </plugins>
    </build>

4 Add a Custom O/R Mapping File

Create a file in src/main/resources/com/rsmart/kuali/kc which is my institution's module path. I call it rsmart-repository.xml
<?xml version="1.0" encoding="UTF-8"?>
<descriptor-repository version="1.0">
</descriptor-repository>

5 Add a Custom Spring Beans File

To load our O/R mapping, we'll need to wire it up with Spring. Kuali has a facility to handle this. We just create a CustomSpringBeans.xml file
<?xml version="1.0" encoding="UTF-8"?>
<!--
Copyright 2005-2010 The Kuali Foundation.

Licensed under the Educational Community License, Version 1.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.opensource.org/licenses/ecl1.php

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:p="http://www.springframework.org/schema/p"
xmlns:aop="http://www.springframework.org/schema/aop"
xmlns:tx="http://www.springframework.org/schema/tx"
xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-2.0.xsd
http://www.springframework.org/schema/tx
http://www.springframework.org/schema/tx/spring-tx-2.0.xsd
http://www.springframework.org/schema/aop
http://www.springframework.org/schema/aop/spring-aop-2.0.xsd">
<bean id="customModuleConfiguration-parentBean" class="org.kuali.rice.kns.bo.ModuleConfiguration" abstract="true">
<property name="databaseRepositoryFilePaths">
<list>
<value>com/rsmart/kuali/kc/rsmart-repository.xml</value>
</list>
</property>
</bean>
</beans>

6 Add Custom Struts Config XML File

Struts has a concept of context-specific configuration where you can have more than one configuration file. This is very helpful, but we need to list ours in the web.xml and create it. This is a bare one, ready for new forms, actions, forwards, etc., in src/main/webapp/WEB-INF/struts-custom-config.xml:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE struts-config PUBLIC "-//Apache Software Foundation//DTD Struts Configuration 1.2//EN" "http://struts.apache.org/dtds/struts-config_1_2.dtd" [
<!ENTITY protocol_forwards SYSTEM "struts_protocol_forwards.xml">
]>
<struts-config>
<data-sources>
</data-sources>

<form-beans>
</form-beans>

<global-exceptions>
</global-exceptions>

<global-forwards>
</global-forwards>

<action-mappings>
</action-mappings>

<controller processorClass="org.kuali.kra.web.struts.action.KraRequestProcessor" />
<message-resources factory="org.kuali.rice.kns.web.struts.action.KualiPropertyMessageResourcesFactory" parameter="" />
<plug-in className="org.kuali.kra.web.struts.action.GlobalFormatterRegistry" />
</struts-config>

Of course, I change the web.xml from
<servlet>
<servlet-name>action</servlet-name>
<servlet-class>org.kuali.rice.kns.web.struts.action.KualiActionServlet</servlet-class>
<init-param>
<param-name>config</param-name>
<param-value>/WEB-INF/struts-config.xml</param-value>
</init-param>
<init-param>
<param-name>debug</param-name>
<param-value>3</param-value>
</init-param>
<init-param>
<param-name>detail</param-name>
<param-value>3</param-value>
</init-param>
<load-on-startup>0</load-on-startup>
</servlet>

to
<servlet>
<servlet-name>action</servlet-name>
<servlet-class>org.kuali.rice.kns.web.struts.action.KualiActionServlet</servlet-class>
<init-param>
<param-name>config</param-name>
<param-value>/WEB-INF/struts-config.xml,/WEB-INF/struts-custom-config.xml</param-value>
</init-param>
<init-param>
<param-name>debug</param-name>
<param-value>3</param-value>
</init-param>
<init-param>
<param-name>detail</param-name>
<param-value>3</param-value>
</init-param>
<load-on-startup>0</load-on-startup>
</servlet>

I just added ,/WEB-INF/struts-custom-config.xml to the parameter.
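The "config" init-param is just a comma-separated list of configuration files that the action servlet loads in order. A quick sketch of how the value splits (the paths mirror the web.xml above):

```shell
# The Struts "config" init-param is a comma-separated list; sketch of the split.
config="/WEB-INF/struts-config.xml,/WEB-INF/struts-custom-config.xml"
first=${config%%,*}    # everything before the first comma
second=${config#*,}    # everything after the first comma
echo "$first"
echo "$second"
```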

7 Replace the KraServiceLocator.java

This is a little strange because you are not extending but overriding the KraServiceLocator class. That means that if any changes are made to it upstream, you will probably not pick them up unless you explicitly know about them. This is, of course, a maintenance issue, but we're only making minor changes; remaking them is not a hassle.

7.1 Add the Custom Spring Beans as a Constant

First, we create a constant in KraServiceLocator called CUSTOM_SPRING_BEANS
private static final String CUSTOM_SPRING_BEANS = "com/rsmart/kuali/kc/CustomSpringBeans.xml";

7.2 Add the Constant to the springFiles Array

Now we make use of the constant.
...
...
private static final class ContextHolder {

static String[] springFiles = new String[] {COMMON_SPRING_BEANS,BUDGET_SPRING_BEANS, AWARD_SPRING_BEANS, IRB_SPRING_BEANS, COMMITTEE_SPRING_BEANS,
INSTITUTIONAL_PROPOSAL_SPRING_BEANS, QUESTIONNAIRE_SPRING_BEANS, TIME_AND_MONEY_SPRING_BEANS, CUSTOM_SPRING_BEANS};
...
...
}

8 Love

Now you have your overlay project. Just run the following to create your war.
% mvn -Dmaven.test.skip=true package

You will see a new WAR file in target
leo@behemoth~/.workspace/rsmart/kc
(18:54:16) [48] ls target/
classes  kc-1.0-SNAPSHOT  kc-1.0-SNAPSHOT.war  maven-archiver  war

KIS Me Kate - RPM Packaging KFS Part 4

KC Packaging

Packaging KC is much the same as KFS. The only difference is that when you build, you are using maven instead of ant.

Setting up Hudson

Therefore, to get around adding my own hooks to the build, all I did was use Hudson. First, I set up a Maven build.
It is equivalent to
% mvn -Dmaven.test.skip=true package

Then, I created an invoke shell command

TARFILE=kuali-coeus-2.0-$(cut -d= -f 2 kc-2.0/version.properties).tar.gz
rm -rf kuali-coeus-2.0
mkdir kuali-coeus-2.0
cp -rf kc-2.0/target/kc_custom-2.0 kuali-coeus-2.0

for x in $HOME/kuali/main/[a-z]*; do 
mkdir -p kuali-coeus-2.0/kuali/main/$(basename $x)/
cp -rf $HOME/kuali/main/$(basename $x)/kc-config.xml kuali-coeus-2.0/kuali/main/$(basename $x)
done
cp $HOME/lib/ojdbc* kuali-coeus-2.0/kc_custom-2.0/WEB-INF/lib
cp $HOME/*.properties kuali-coeus-2.0
tar -czf $TARFILE kuali-coeus-2.0 kc-cfg-dbs/
mv $TARFILE /mosaic/data/KITT/SOURCES

You can see that I am using a version.properties. It looks like
leo@behemoth~/.workspace/kc
(11:02:12) [21] cat version.properties 
release=11
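Given that release=11, the cut in the Hudson shell step resolves the tar name like this (writing a throwaway version.properties just for illustration; the /tmp path is not part of the real build):

```shell
# Demo of the TARFILE expansion from the Hudson step, using a scratch
# copy of version.properties.
echo "release=11" > /tmp/version.properties
TARFILE=kuali-coeus-2.0-$(cut -d= -f 2 /tmp/version.properties).tar.gz
echo "$TARFILE"
```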

Finally, I create the package by calling the build-rpm.xml using ant.
The build-rpm.xml looks like
<?xml version="1.0" encoding="UTF-8"?>
<project name="kc"
         default="build"
         xmlns:kitt-tools="urn:com.rsmart.ant">
<target name="build" depends="filter-spec">
<exec executable="rpmbuild">
<arg value="-bb" />
<arg value="kc.spec" />
</exec>
</target>

<target name="filter-spec">
<property file="version.properties" />
<kitt-tools:filter srcfile="kc.spec.template"
filename="kc.spec" />

</target>

<macrodef uri="urn:com.rsmart.ant" name="filter">
<attribute name="srcfile" />
<attribute name="filename" />
<sequential>
<loadfile property="buildroot.filter.template"
srcfile="@{srcfile}">
<filterchain>
<expandproperties/>
</filterchain>
</loadfile>

<echo file="@{filename}">${buildroot.filter.template}</echo>
</sequential>
</macrodef>
</project>

A lot like our original build.xml for KFS. The differences here are
  • No vendor path, so we use build-rpm.xml instead of build.xml
  • no workflow
  • no changelogs
  • Building doesn't happen here since hudson did that for us
  • No importing the other build.xml
You can see we use a kc.spec.template and filter that just as we did with kfs.

KC spec file


Here is the spec file I used
%define __os_install_post %{nil}
%define debug_package %{nil} 

Summary: Kuali Coeus
Name: kuali-coeus
Version: 2.0
Release: %release
Provides: kuali-coeus
License: EPL
BuildArch: noarch
Source0: kuali-coeus-2.0-%release.tar.gz
BuildRoot: /tmp/kc
Requires: ant
Group: Development/Tools
Packager: przybyls@arizona.edu

%package settings-${build.environment}
Summary: External configuration settings for Kuali Coeus
Group: System/Base
Requires: kuali-coeus

%package changelogs
Summary: Kuali Coeus KITT Customization Schema
Group: System/Base
Requires: kc,liquibase,wget

%description
The Kuali Foundation research administration software

%description changelogs
Mosaic Kuali Coeus Environment Database Schema for KITT customizations based on KITT
modification set %release

%description settings-${build.environment}
Mosaic Kuali Coeus external configuration and settings. These files are located
in /home/tomcat/app

%prep
%setup -q

%install
TOOLSDIR=%{_builddir}/kuali-coeus-2.0/kitt-tools-1.0
mkdir -p %{buildroot}/home/tomcat/kitt-tools
mkdir -p %{buildroot}/home/tomcat/kitt-tools/bin
mkdir -p %{buildroot}/home/tomcat/kitt-tools/config
mkdir -p %{buildroot}/home/tomcat/kitt-tools/lib
mkdir -p %{buildroot}/home/tomcat/kuali/main/2.0-%release/changesets/
mkdir -p %{buildroot}/usr/share/tomcat5/webapps/
mkdir -p %{buildroot}/usr/share/tomcat5/webapps/kra-${build.environment}/WEB-INF/
mkdir -p %{buildroot}/usr/share/tomcat5/webapps/kra-${build.environment}/WEB-INF/classes/META-INF

cp ${build.environment}.properties %{buildroot}/home/tomcat/kitt-tools
cp $HOME/apache-ant/lib/ant.jar %{buildroot}/home/tomcat/kitt-tools/lib
cp $HOME/apache-ant/lib/ant-launcher.jar %{buildroot}/home/tomcat/kitt-tools/lib

cd kc_custom-2.0/WEB-INF/classes/META-INF/
sed -e 's/\(.*build.environment.*"false">\).*/\1${build.environment}<\/param>/' kc-config-defaults.xml | sed -e 's/\(.*build.version.*"false">\).*/\12.0-%release<\/param>/' > /tmp/kc-config-defaults.xml
mv /tmp/kc-config-defaults.xml %{buildroot}/usr/share/tomcat5/webapps/kra-${build.environment}/WEB-INF/classes/META-INF
cd -

cp kc_custom-2.0/WEB-INF/web-${build.environment}.xml %{buildroot}/usr/share/tomcat5/webapps/kra-${build.environment}/WEB-INF/web.xml
cp -rf kuali/main/${build.environment} %{buildroot}/home/tomcat/kuali/main/


cat <<EOF > %{buildroot}/home/tomcat/.rpmmacros
%%_topdir /home/tomcat/.workspace/redhat
%%_dbpath /home/tomcat/rpm
EOF

cat <<EOF > %{buildroot}/home/tomcat/kitt-tools/.envrc
${build.environment}
EOF

mv %{_builddir}/kuali-coeus-2.0/kc_custom-2.0 %{buildroot}/usr/share/tomcat5/webapps/kra/
mv %{_builddir}/kc-cfg-dbs/update* %{buildroot}/home/tomcat/kuali/main/2.0-%release/changesets/

set -x 

ant -f /dev/stdin -Dbuild.environment=${build.environment}<<EOF
<?xml version="1.0"?>
<project name="kitt-tools" default="run" basedir="." xmlns:kitt-tools="urn:com.rsmart.ant">
<target name="run">
<property file="$TOOLSDIR/\${build.environment}.properties" />
<echo file="%{buildroot}/home/tomcat/kitt-tools/credentials.properties">
source.driver=\${dbcopy.default.driver}
source.url=jdbc:oracle:thin:@uaz-kf-d02.mosaic.arizona.edu:1521:UAZKRDEV
source.username=sandbox
source.password=kulowner
source.schema=SANDBOX
target.driver=\${dbcopy.default.driver}
target.url=\${oracle.datasource.url}
target.username=\${datasource.username}
target.schema=KULOWNER
encrypted.password=\${encrypted.password}
</echo>
</target>
</project>
EOF

%clean
rm -rf %{buildroot}

%files settings-${build.environment}
%defattr(2770,tomcat,kuali)
/home/tomcat/.rpmmacros
/home/tomcat/kitt-tools
/home/tomcat/kuali/main/${build.environment}/kc-config.xml
/usr/share/tomcat5/webapps/kra-${build.environment}/WEB-INF/web.xml
/usr/share/tomcat5/webapps/kra-${build.environment}/WEB-INF/classes/META-INF/kc-config-defaults.xml


%files changelogs
%defattr(-,tomcat,kuali)
%config /home/tomcat/kuali/main/2.0-%release/changesets/update.xml
%config /home/tomcat/kuali/main/2.0-%release/changesets/update/

%files
%defattr(2770,tomcat,kuali)
/usr/share/tomcat5/webapps/kra/

%pre
rm -rf /usr/share/tomcat5/webapps/kra-*

%post
set -x

MYPWD=$PWD
cd /usr/share/tomcat5/webapps
mv kra kra-$(cat ~tomcat/kitt-tools/.envrc)
cd $MYPWD

%post changelogs 
set -x

VERSION=%release
CURRENT=$(ls -t ~tomcat/kuali/main/|grep -v %release|head -1|cut -d- -f 2)
REPO_URL=
KC_VERSION=2.0

for x in $(seq $(expr $CURRENT + 1) $VERSION);
do
if [ ! -e ~tomcat/kuali/main/$KC_VERSION-$x ];
then
cd ~tomcat/kuali/main
wget -r -l 2 --no-parent -nH --cut-dirs=5 $REPO_URL/$KC_VERSION-$x/
fi

cd ~tomcat/kuali/main/$KC_VERSION-$x/changesets
liquibase --changeLogFile=update.xml --logLevel=finest update
liquibase --logLevel=finest tag $KC_VERSION.$x
cd 
done

cd -

if [ -e ~tomcat/kuali/main/$KC_VERSION-$(expr %release + 1) ];
then
echo '%release' > /tmp/lquninst
fi

%postun changelogs
if [ -e /tmp/lquninst ];
then

START=%release
END=$(cat /tmp/lquninst)
for x in $(seq $START -1 $END); 
do 
cd /home/tomcat/kuali/main/2.0-$x/changesets
liquibase --changeLogFile=update.xml rollback 2.0.$(expr $x - 1)

cd -;
done
rm /tmp/lquninst
fi 
exit
The script is pretty crazy looking, so I will take a jab at explaining it. It will help to explain the major differences in project organization and structure between KFS and KC.
  • New XML Configuration for KC/Rice in kuali/main/dev
  • no more security.properties (normally where database passwords are stored)
  • No build.properties since we're not using ant. Configuration is now in kc-config.xml in kuali/main/<environment>

Handle PreProcessing

Keeping that in mind, let's have a look at the %install section, which is where most of the work happens. Since we can't bootstrap or hook into the regular Maven packaging for the WAR, I handle that preprocessing here instead. Keep in mind that since this is the %install directive, this is not packaging yet. This is just creating the build sandbox we're going to package later. So first we need to prepare the structure
TOOLSDIR=%{_builddir}/kuali-coeus-2.0/kitt-tools-1.0
mkdir -p %{buildroot}/home/tomcat/kitt-tools
mkdir -p %{buildroot}/home/tomcat/kitt-tools/bin
mkdir -p %{buildroot}/home/tomcat/kitt-tools/config
mkdir -p %{buildroot}/home/tomcat/kitt-tools/lib
mkdir -p %{buildroot}/home/tomcat/kuali/main/2.0-%release/changesets/
mkdir -p %{buildroot}/usr/share/tomcat5/webapps/
mkdir -p %{buildroot}/usr/share/tomcat5/webapps/kra-${build.environment}/WEB-INF/
mkdir -p %{buildroot}/usr/share/tomcat5/webapps/kra-${build.environment}/WEB-INF/classes/META-INF
You can see this is building out the path where the webapp will live. There is also this new concept of a kitt-tools path. These are environment utilities for administering KC. They are outside the scope of this explanation, but we do have to add a few things to them during installation. One of these is the credentials.properties
ant -f /dev/stdin -Dbuild.environment=${build.environment}<<EOF
<?xml version="1.0"?>
<project name="kitt-tools" default="run" basedir="." xmlns:kitt-tools="urn:com.rsmart.ant">
<target name="run">
<property file="$TOOLSDIR/\${build.environment}.properties" />
<echo file="%{buildroot}/home/tomcat/kitt-tools/credentials.properties">
source.driver=\${dbcopy.default.driver}
source.url=jdbc:oracle:thin:@uaz-kf-d02.mosaic.arizona.edu:1521:UAZKRDEV
source.username=sandbox
source.password=kulowner
source.schema=SANDBOX
target.driver=\${dbcopy.default.driver}
target.url=\${oracle.datasource.url}
target.username=\${datasource.username}
target.schema=KULOWNER
encrypted.password=\${encrypted.password}
</echo>
</target>
</project>
EOF
Ant is used to populate this based on values set in the configuration. Each environment gets its own kc-config.xml. For example, TST would be kuali/main/tst/kc-config.xml. Usually there are only very slight differences between environment configurations; often the only difference is the database password. For most of the common settings, we use a kc-config-defaults.xml. These files are maintained on the build system (where Hudson lives). As a result, our build version is usually clobbered by the default, so I do a little hacking to get around that. The following takes care of it for us.
cd kc_custom-2.0/WEB-INF/classes/META-INF/
sed -e 's/\(.*build.environment.*"false">\).*/\1${build.environment}<\/param>/' kc-config-defaults.xml | sed -e 's/\(.*build.version.*"false">\).*/\12.0-%release<\/param>/' > /tmp/kc-config-defaults.xml
mv /tmp/kc-config-defaults.xml %{buildroot}/usr/share/tomcat5/webapps/kra-${build.environment}/WEB-INF/classes/META-INF
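To make the sed pipeline concrete, here is what it does to a made-up kc-config-defaults.xml line (the real file's attribute layout may differ slightly; "tst" stands in for ${build.environment}):

```shell
# Hedged demo: rewrite the build.environment param value to "tst".
# The input line is illustrative, not copied from the real file.
line='<param name="build.environment" override="false">dev</param>'
result=$(echo "$line" | sed -e 's/\(.*build.environment.*"false">\).*/\1tst<\/param>/')
echo "$result"
```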
We have environment specific web.xml files for any weird changes that are by environment
cp kc_custom-2.0/WEB-INF/web-${build.environment}.xml %{buildroot}/usr/share/tomcat5/webapps/kra-${build.environment}/WEB-INF/web.xml
cp -rf kuali/main/${build.environment} %{buildroot}/home/tomcat/kuali/main/
Once a package is installed, tools that we install take advantage of knowing what environment they're on. We handle this by creating an .envrc file on the environment.
cat <<EOF > %{buildroot}/home/tomcat/kitt-tools/.envrc
${build.environment}
EOF
The last thing done is to move the changelogs and the webapp to their environment-agnostic locations. Remember from Part 2 that we move these files into their appropriate locations during post-processing
mv %{_builddir}/kuali-coeus-2.0/kc_custom-2.0 %{buildroot}/usr/share/tomcat5/webapps/kra/
mv %{_builddir}/kc-cfg-dbs/update* %{buildroot}/home/tomcat/kuali/main/2.0-%release/changesets/

New packages

Aside from the %install, the spec file and packages are very much the same as, or at least familiar from, the KFS packages. For KC, it was decided that workflow would be handled manually instead of automatically, so the packages are:
  • kuali-coeus-2.0-1.noarch.rpm
  • kuali-coeus-settings-2.0-1.noarch.rpm
  • kuali-coeus-changelogs-2.0-1.noarch.rpm

KIS Me Kate - RPM Packaging KFS Part 3

Overview/Recap


This is the last part of a 3 part series on packaging KFS with RPM. Just when you thought there wasn't anything left to say about the subject, there's more. What didn't we cover last time?
  • Workflow packaging - this is actually useful to separate from the main package. Sometimes, you do not want to install/upgrade/reinstall your workflow definitions
  • Database Upgrade/Installation - if you use liquibase, this is a very useful thing to split out from the main package. It lets you upgrade your database on demand.
  • KC Setup/Packaging (Part 4) - backtracking a little and it doesn't really have anything to do with KFS, but you have to admit that if you're interested in packaging KFS, you're also interested in packaging KC.

Workflow Packaging

This refers to workflow as in the workflow XML that one ingests (usually manually). Sometimes workflow changes are coupled to Java source code changes in Rice. As a project manager/release manager, you want your project to deploy with as few hiccups as possible. That includes deploying all of your interdependent changes together. If you deploy documents that require workflow changes, your application may not work unless you get those changes in somehow. I wouldn't trust a person to do it, so how do you get this done automatically?

In a previous post, I showed some configuration source code
rice.dev.mode=false
rice.standalone=false
rice.kew.xml.pipeline.lifecycle.enabled=true
These are important. I'll explain:
  • rice.kew.xml.pipeline.lifecycle.enabled - turns on a thread that runs periodically to ingest KEW XML (it does not use quartz, but an internal scheduler)
  • rice.dev.mode - you want this set to false because the XML pipeline will not run in dev mode (regardless of the previous setting). There's a good reason for this: you generally do not want it running in any kind of production environment.
  • rice.standalone - for now we are building KFS with Rice running bundled/embedded. If this were set to true, then our Rice would run separately, and we wouldn't be ingesting workflow through the KFS application.

Now, let's move from theory to practice. Things are mostly set up through our configuration. What we need next is
  1. Copy workflow xml to the appropriate ingestion directory during the build process
  2. Setup the spec file so that the files are included in their correct locations at packaging

Modify Build to Move Workflow XML

We now need to modify our build.xml in vendor/<your institution>/. In a previous post, there is a target called dist-rpm. It looks like,
<target        name="dist-war" 
description="Kuali distribution plus post processing."
depends="init-classpath,dist">
<fail unless="build.environment">Need the build.environment to build</fail>

</target>

<target name="dist-rpm" depends="prepare-rpm,dist-war" />

We add a new target dist-workflow
 <target        name="dist-war" 
description="Kuali distribution plus post processing."
depends="init-classpath,dist">
<fail unless="build.environment">Need the build.environment to build</fail>

</target>

<target name="dist-workflow"
description="Kuali post processing for KEW XML."
depends="init-classpath">
<fail unless="build.environment">Need the build.environment to build</fail>

<deploy:workflow-sieve release="${build.version}" kfspath="${basedir}" />

<mkdir dir="${rpm.ingestion.directory}" />

<copy todir="${rpm.ingestion.directory}" flatten="true">
<fileset dir="${work.directory}/src/com" erroronmissingdir="false">
<include name="**/workflow/*.xml" />
</fileset>
<fileset dir="${work.directory}/src/edu" erroronmissingdir="false">
<include name="**/workflow/*.xml" />
</fileset>

</copy>
</target>

<target name="dist-rpm" depends="prepare-rpm,dist-war" />

Obviously, we want to find the workflow xml and copy it to our desired location which is
rpm.ingestion.directory=${rpm.external.work.directory}/staging/workflow/pending/
set in the rpm.properties file mentioned in KIS Me Kate - RPM Packaging KFS Part 2. There is a caveat though. In recent versions of Rice, a new ingestion of a workflow document type does NOT replace the old document type. It creates a new one. The old type still exists. This means that with subsequent ingestions, new document types will be created regardless of their differences. If your institution fancies daily building/packaging, you could find yourself with a rather large list of document types with very little difference between them. How do we get around this? What I did was create a workflow-sieve task in macros-rpm.xml that determines whether the workflow XML has any changes. This is what it looks like:
<project  xmlns:deploy="urn:com.rsmart.kuali">
...
...
<macrodef uri="urn:com.rsmart.kuali" name="workflow-sieve">
<attribute name="release" />
<attribute name="kfsPath" />
<sequential>
<echo file="/tmp/workflow-sieve.py"><![CDATA[
#!/usr/bin/env python

import os.path
import re
import sys
from subprocess import *

svnpath = "https://subversion.uits.arizona.edu/kitt-anon/kitt/financial-system/kfs/branches"
trunkpath = "https://subversion.uits.arizona.edu/kitt-anon/kitt/financial-system/kfs/trunk"

def findWorkflowFiles(basedir):
    retval = []
    for root, dirs, files in os.walk(basedir):
        for file in (files):
            if (re.match(".*workflow$", root)):
                newroot = root.split('/work/')[1]
                retval.append('/'.join(['work', newroot, file]))

    return retval

def getLastReleaseRevision(release):
    releaseLoc = svnpath + "/3.0-" + str(release)
    return getRevisionFor(releaseLoc)

def getRevisionFor(path):
    retval = Popen(["svn", "info", path], stdout=PIPE).communicate()[0]
    retval = int(retval.split("Last Changed Rev: ")[1].split("\n")[0])
    return retval

def command(command):
    print 'Executing: ' + command
    os.system(command)


release = int("@{release}".split("-")[1]) - 1
workflowFiles = findWorkflowFiles('@{kfsPath}/work/src/edu/')
workflowFiles.extend(findWorkflowFiles('@{kfsPath}/work/src/com/'))
revision = getLastReleaseRevision(release)

for workflow in workflowFiles:
    filename = trunkpath + "/" + workflow
    print "Checking revision on " + filename
    fileRev = getRevisionFor(filename)
    if (revision > fileRev):
        print "Removing " + workflow + " from package"
        os.remove(workflow)
]]>
</echo>
<exec executable="${user.home}/python/bin/python">
<arg value="/tmp/workflow-sieve.py" />
</exec>
<delete file="/tmp/workflow-sieve.py" />
</sequential>
</macrodef>
...
...
</project>

Above is a simple Python script that runs as part of the workflow-sieve task during the dist-workflow target! This is great! Now our build is altered sufficiently to handle the workflow XML.

Define Workflow Package

Now I will add workflow to the kfs.spec.template
%define __os_install_post %{nil}

Summary: Kuali Financial System
Name: kfs
Version: ${version}
Release: ${release}
Provides: kfs
License: EPL
BuildArch: noarch
Requires: tomcat5
BuildRoot: /tmp/kfs/
Source0: kfs-${build.version}.tar.gz
Group: Development/Tools
Packager: leo [at] rsmart.com

%package workflow
Summary: Kuali Financial System Workflow Document Types
Group: System/Base
Requires: kfs
...
...

We set our requirement on KFS. This will ensure that all our KFS prerequisites exist before this package is installed.
After that, I add my description
%description  workflow
Workflow XML for %release

Notice that after %description, I give the string "workflow". Normally there is no qualifier for %description, which indicates that it applies to the default package. We qualify it with "workflow", which means a workflow qualifier is added to the package name. The resulting package will be called kfs-workflow-4.0-1.noarch.rpm.
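The name composition is mechanical; here is a sketch of how the pieces from the spec line up (values mirror the example in the text):

```shell
# How the subpackage file name is assembled from the spec fields.
NAME=kfs; SUB=workflow; VERSION=4.0; RELEASE=1; ARCH=noarch
PKG="${NAME}-${SUB}-${VERSION}-${RELEASE}.${ARCH}.rpm"
echo "$PKG"
```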

Next, we specify our %files just like we did with the default package.
%files workflow
%defattr(-,tomcat,tomcat)
/home/tomcat/app/work/kfs/staging/workflow/


Now when you rerun your packaging, you will have a workflow package that can be installed with your KFS application as well as all your workflow customizations for that release!

Packaging Database Changes

In particular, packaging your liquibase changelogs and having them run on installation! This is actually a really good idea. Just like workflow changes, your database changes are coupled to your Java source code. This is especially the case thanks to ORM. KFS may not even start correctly if you do not have the tables mentioned in your mappings. The best way to make sure they exist is to add your liquibase change logs to the installation. As an added bonus, you can now have visibility on all changes made to your database from release to release.

Update build.xml

Just like we did with our workflow packaging, we will need to change the build to include the liquibase change logs.
<target name="dist-ddl" depends="init-classpath">
<mkdir dir="${rpm.ddl.directory}" />

<copy todir="${rpm.ddl.directory}">
<fileset dir="${work.directory}/db/">
<include name="changesets/**/*" />
<include name="scripts/**/*" />
</fileset>
</copy>

</target>

<target name="dist-rpm" depends="prepare-rpm,dist-war,dist-ddl,dist-workflow" />
I have now added dist-ddl to the dist-rpm dependencies and created the target for it. All it does is copy files out of kfs-4.0/work/db/changesets and kfs-4.0/work/db/scripts into my build directory for packaging.

Add Spec Information

Just like I did with workflow, I now add the %package for changelogs
%package changelogs
Summary: Kuali Financial System KITT Customization Schema
Group: System/Base
Requires: liquibase,kfs,wget

This will create a new package with the name kfs-changelogs-4.0-1.noarch.rpm. Also, notice that my dependency is liquibase. This means that liquibase, as well as KFS, will be required before the changelogs can be run! A program called wget is required too; I will explain why later.

Now I add %description and files.
%description changelogs
Liquibase change logs for %release
...
...
%files changelogs
%defattr(-,tomcat,tomcat)
/home/tomcat/app/ddl/${build.version}/changesets/latest
/home/tomcat/app/ddl/${build.version}/changesets/install.xml
/home/tomcat/app/ddl/${build.version}/changesets/constraints.xml
%config /home/tomcat/app/ddl/${build.version}/changesets/update.xml
%config /home/tomcat/app/ddl/${build.version}/changesets/update/
...
...

The above shows that a new changeset path is created for each ${build.version}. The reason is that we want to keep all the changelogs from previous builds, which allows us to quickly revert in case we need to undo packages. One of the advantages of package management is being able to cleanly and systematically remove a software change without any evidence that it ever happened. The update.xml is listed as a configuration item because we never want it overwritten.

Installation Post processing Script

Unlike with workflow, we now have some pre/post processing to do. Until now, files are just being dropped in at installation; we don't do anything with them. That is, liquibase is required, but it never runs, and the database isn't actually changed yet. We still need to run liquibase on the changelogs.
%post changelogs
set -x

VERSION=%release
CURRENT=$(ls -t ~tomcat/app/ddl/ | grep -v %release | head -1 | cut -d- -f 2)
REPO_URL=<your institution's SVN repo accessible via http>
KFS_VERSION=%version

for x in $(seq $(expr $CURRENT + 1) $VERSION);
do
    if [ ! -e ~tomcat/app/ddl/$KFS_VERSION-$x ];
    then
        cd ~tomcat/app/ddl
        wget -r -l 2 --no-parent -nH --cut-dirs=5 $REPO_URL/$KFS_VERSION-$x/
    fi

    cd ~tomcat/app/ddl/$KFS_VERSION-$x/changesets
    liquibase --changeLogFile=update.xml --logLevel=finest update
    liquibase --logLevel=finest tag %version.$x
    cd
done

if [ -e ~tomcat/app/ddl/%version-$(expr %release + 1) ];
then
    echo '%release' > /tmp/lquninst
fi

The above script runs liquibase on the current installation. This is pretty tricky, which is why it's a shell script. When we upgrade or install our liquibase changelogs, we need to run all of the changes between the last version and the current one, in order. For example, if we are upgrading from release 3 to release 15, we need to run every script in between, in order. The caveat is that when we upgrade directly (going from the package for release 3 to the package for release 15), we skip the packages that contain the intermediate changelogs. This means we cannot assume the system has those changelogs installed. This is where wget comes in. We set up an anonymous, read-only SVN repository URL in the script
REPO_URL=<your institution's SVN repo accessible via http>
and use wget to retrieve the missing changelogs from SVN and install them before processing them through liquibase.
    liquibase --changeLogFile=update.xml --logLevel=finest update
    liquibase --logLevel=finest tag %version.$x
It is important to note that liquibase tags a version after each update. We will use this in a moment.
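To make the loop bounds concrete, here is a quick sketch (with hypothetical release numbers) of how the seq/expr arithmetic enumerates the releases that still need to be applied:

```shell
# Sketch: upgrading from installed release 3 straight to release 15
# (hypothetical numbers); every release in between must run, in order.
CURRENT=3
VERSION=15
for x in $(seq $(expr $CURRENT + 1) $VERSION); do
    echo "would apply changelogs for release $x"
done
```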

Uninstallation Post-processing Script

That handles installations and upgrades, but what about uninstallations? An uninstallation is essentially reverting liquibase to the state of the previous changelog. We know that state because it was tagged on installation (mentioned earlier), so we know where to downgrade to. To handle uninstallations, RPM has a directive called %postun. We use that here:
%postun changelogs
if [ -e /tmp/lquninst ];
then
    START=%release
    END=$(cat /tmp/lquninst)
    for x in $(seq $START -1 $END);
    do
        cd /home/tomcat/app/ddl/%version-$x/changesets
        liquibase --changeLogFile=update.xml rollback %version.$(expr $x - 1)
        cd -
    done
    rm /tmp/lquninst
fi
exit 0

Just as before, the uninstallation script determines which versions it needs to uninstall, then loops through each one doing a "rollback" through liquibase.
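The rollback order comes straight from the descending seq call; a quick sketch with hypothetical release numbers:

```shell
# Sketch: uninstalling back from release 15 to release 4 (hypothetical
# numbers); seq with a -1 step walks the releases newest-first, and each
# step rolls back to the tag of the release before it.
START=15
END=4
for x in $(seq $START -1 $END); do
    echo "would roll back to tag $(expr $x - 1)"
done
```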

Part 4 will cover packaging KC, which is really interesting compared to KFS because it uses maven and XML configuration by environment instead of property-based configuration.

Friday, April 1, 2011

Constants Done with Spring

Kuali's Antipattern

The Rice pattern for handling constants in Kuali software is really an antipattern. It is basically an interface full of public static finals. An interface was chosen because it cannot be instantiated. This is an antipattern because it isn't object-oriented: an interface/class is created with no prospect of ever being instantiated or used polymorphically. It is also applied inconsistently: in one place a class with inner classes is used, in another an interface. Here's an example:
public class KFSConstants extends JSTLConstants implements ParameterKeyConstants {
private static final long serialVersionUID = 2882277719647128949L;

public static final String APPLICATION_NAMESPACE_CODE = "KFS";    

public static class ParameterNamespaces {
public static final String KFS = "KFS-SYS";
public static final String CHART = "KFS-COA";
public static final String FINANCIAL = "KFS-FP";
public static final String GL = "KFS-GL";
...
...
}
...
...
}

Now look at:
public interface DisbursementVoucherConstants extends ParameterKeyConstants {

// Text limits
public static final int MAX_NOTE_LINE_SIZE = 90;

// payment methods
public static String PAYMENT_METHOD_CHECK = "P";
public static String PAYMENT_METHOD_WIRE = "W";
public static String PAYMENT_METHOD_DRAFT = "F";

// payee types
public static final String DV_PAYEE_TYPE_EMPLOYEE = "E";
public static final String DV_PAYEE_TYPE_VENDOR = "V";
public static final String DV_PAYEE_TYPE_CUSTOMER = "C";
public static final String DV_PAYEE_TYPE_SUBJECT_PAYMENT_VENDOR = "VSP";
public static final String DV_PAYEE_TYPE_REVOLVING_FUND_VENDOR = "VRF";

public static final List VENDOR_PAYEE_TYPE_CODES = Arrays.asList(DV_PAYEE_TYPE_VENDOR, DV_PAYEE_TYPE_SUBJECT_PAYMENT_VENDOR, DV_PAYEE_TYPE_REVOLVING_FUND_VENDOR);

// document location
public static final String NO_DOCUMENTATION_LOCATION = "N";

public static final String TAX_CONTROL_CODE_ALLOWS_EMPLOYEES = "A";
public static final String TAX_CONTROL_CODE_BEGIN_WITHHOLDING = "B";
public static final String TAX_CONTROL_CODE_HOLD_PAYMENT = "H";

public static class DocumentStatusCodes {
public static final String APPROVED = "A";
public static final String EXTRACTED = "E";
}
...
...
}

One is a class with an inner class. The other is an interface with an inner class. It's really inconsistent; everyone does it differently.
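The constant-interface style also has a side effect worth seeing: any class that implements the interface inherits the constants into its own public API. A minimal sketch (the class names here are made up for illustration):

```java
// Sketch of the constant-interface antipattern: implementing the
// interface leaks its constants into the implementing class's API.
interface PaymentConstants {              // stand-in for DisbursementVoucherConstants
    String PAYMENT_METHOD_CHECK = "P";    // implicitly public static final
}

class VoucherService implements PaymentConstants {
    // This class now exposes PAYMENT_METHOD_CHECK as part of its public
    // face, even though it never declared it.
}

public class ConstantInterfaceDemo {
    public static void main(String[] args) {
        // The constant is reachable through the implementing class:
        System.out.println(VoucherService.PAYMENT_METHOD_CHECK); // prints "P"
    }
}
```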

Use Spring for Constants

I propose getting constants from Spring. Here is an example I used in the KIM Ldap Integration:

Create Constants Interface

package org.kuali.rice.kim.util;

import java.util.Collection;

import org.kuali.rice.kim.bo.entity.dto.KimEntityInfo;

/**
*
* @author Leo Przybylski (leo [at] rsmart.com)
*/ 
public interface Constants {    
Collection getTestPrincipalNames();

String getDefaultChartCode();

/**
* Gets the value of entityPrototype
*
* @return the value of entityPrototype
*/
KimEntityInfo getEntityPrototype();

/**
* Gets the value of externalIdTypeProperty
*
* @return the value of externalIdTypeProperty
*/
String getExternalIdTypeProperty();

/**
* Gets the value of taxExternalIdTypeCode
*
* @return the value of taxExternalIdTypeCode
*/
String getTaxExternalIdTypeCode();

/**
* Gets the value of externalIdProperty
*
* @return the value of externalIdProperty
*/
String getExternalIdProperty();

/**
* Gets the value of employeePhoneLdapProperty
*
* @return the value of employeePhoneLdapProperty
*/
String getEmployeePhoneLdapProperty();

/**
* Gets the value of employeeMailLdapProperty
*
* @return the value of employeeMailLdapProperty
*/
String getEmployeeMailLdapProperty();

/**
* Gets the value of defaultCountryCode
*
* @return the value of defaultCountryCode
*/
String getDefaultCountryCode();

/**
* Gets the value of personEntityTypeCode
*
* @return the value of personEntityTypeCode
*/
String getPersonEntityTypeCode();

/**
* Gets the value of uaidLdapProperty
*
* @return the value of uaidLdapProperty
*/
String getKimLdapIdProperty();

/**
* Gets the value of uidLdapProperty
*
* @return the value of uidLdapProperty
*/
String getKimLdapNameProperty();

/**
* Gets the value of snLdapProperty
*
* @return the value of snLdapProperty
*/
String getSnLdapProperty();

/**
* Gets the value of givenNameLdapProperty
*
* @return the value of givenNameLdapProperty
*/
String getGivenNameLdapProperty();

/**
* Gets the value of entityIdKimProperty
*
* @return the value of entityIdKimProperty
*/
String getEntityIdKimProperty();

/**
* Gets the value of parameterNamespaceCode
*
* @return the value of parameterNamespaceCode
*/
String getParameterNamespaceCode();

/**
* Gets the value of parameterDetailTypeCode
*
* @return the value of parameterDetailTypeCode
*/
String getParameterDetailTypeCode();

/**
* Gets the value of mappedParameterName
*
* @return the value of mappedParameterName
*/
String getMappedParameterName();

/**
* Gets the value of unmappedParameterName
*
* @return the value of unmappedParameterName
*/
String getUnmappedParameterName();

/**
* Gets the value of mappedValuesName
*
* @return the value of mappedValuesName
*/
String getMappedValuesName();

/**
* Gets the value of employeeIdProperty
*
* @return the value of employeeIdProperty
*/
String getEmployeeIdProperty();

/**
* Gets the value of departmentLdapProperty
*
* @return the value of departmentLdapProperty
*/
String getDepartmentLdapProperty();

/**
* Gets the value of employeeTypeProperty
*
* @return the value of employeeTypeProperty
*/
String getEmployeeTypeProperty();

/**
* Sets the value of employeeTypeProperty
*
* @param argEmployeeTypeProperty Value to assign to this.employeeTypeProperty
*/
void setEmployeeTypeProperty(String argEmployeeTypeProperty);

/**
* Gets the value of employeeStatusProperty
*
* @return the value of employeeStatusProperty
*/
String getEmployeeStatusProperty();

/**
* Sets the value of employeeStatusProperty
*
* @param argEmployeeStatusProperty Value to assign to this.employeeStatusProperty
*/
void setEmployeeStatusProperty(String argEmployeeStatusProperty);

/**
* Gets the value of defaultCampusCode
*
* @return the value of defaultCampusCode
*/
String getDefaultCampusCode();

/**
* Sets the value of defaultCampusCode
*
* @param argDefaultCampusCode Value to assign to this.defaultCampusCode
*/
void setDefaultCampusCode(String argDefaultCampusCode);

/**
* Gets the value of the employee affiliation code
* 
* @return the value of employeeAffiliationCode
*/
String getEmployeeAffiliationCodes();

/** 
* Gets the mappings between LDAP and KIM affiliations
* @return mappings of the form "staff=STAFF,affiliate=AFLT"
*/
String getAffiliationMappings();
}

Notice that there are only get methods (give or take a few setters that slipped in). You'll see how in a second, but this is how we make Spring beans behave as read-only constants.
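The trick is that callers only ever hold the bean through the getter-only interface, so nothing outside the wiring code can mutate it. A minimal sketch of the idea (names simplified from the Constants/ConstantsImpl pair above):

```java
// Sketch: the implementation has setters for Spring to wire, but callers
// are handed the getter-only interface, making the bean read-only to them.
interface Constants {
    String getDefaultCampusCode();
}

class ConstantsImpl implements Constants {        // what Spring instantiates
    private String defaultCampusCode;
    public String getDefaultCampusCode() { return defaultCampusCode; }
    public void setDefaultCampusCode(String c) { defaultCampusCode = c; }
}

public class ReadOnlyConstantsDemo {
    public static void main(String[] args) {
        ConstantsImpl impl = new ConstantsImpl();
        impl.setDefaultCampusCode("MN");          // normally Spring does this at startup
        Constants constants = impl;               // callers see only the interface
        // constants.setDefaultCampusCode("X");   // would not compile: no setter exposed
        System.out.println(constants.getDefaultCampusCode());
    }
}
```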

Create the Constants Implementation

It's just going to be a POJO. We can easily put it within the original interface as an inner class, since only Spring is going to use it.
/**
* KIM Related Constants Implementation
*
* @author Leo Przybylski (leo [at] rsmart.com)
*/ 
static class ConstantsImpl implements Constants {    
private Collection testPrincipalNames;
private KimEntityInfo entityPrototype;
private String externalIdTypeProperty;
private String taxExternalIdTypeCode;
private String externalIdProperty;
private String employeePhoneLdapProperty;
private String employeeMailLdapProperty;
private String defaultCountryCode;
private String personEntityTypeCode;
private String kimLdapIdProperty;
private String kimLdapNameProperty;
private String snLdapProperty;
private String givenNameLdapProperty;
private String entityIdKimProperty;
private String parameterNamespaceCode;
private String parameterDetailTypeCode;
private String mappedParameterName;
private String unmappedParameterName;
private String mappedValuesName;
private String employeeIdProperty;
private String departmentLdapProperty;
private String employeeTypeProperty;
private String employeeStatusProperty;
private String defaultCampusCode;
private String defaultChartCode;
private String employeeAffiliationCodes;
private String affiliationMappings;

public Collection getTestPrincipalNames() {
return testPrincipalNames;
}

public void setTestPrincipalNames(Collection testPrincipalNames) {
this.testPrincipalNames = testPrincipalNames;
}

/**
* Gets the value of entityPrototype
*
* @return the value of entityPrototype
*/
public KimEntityInfo getEntityPrototype() {
// return this.entityPrototype;
return (KimEntityInfo) getService("entityPrototype");
// return (KimEntityDefaultInfo) getService("entityPrototype");
}

/**
* Sets the value of entityPrototype
*
* @param argEntityPrototype Value to assign to this.entityPrototype
*/
public void setEntityPrototype(KimEntityInfo argEntityPrototype) {
this.entityPrototype = argEntityPrototype;
}

/**
* Gets the value of externalIdTypeProperty
*
* @return the value of externalIdTypeProperty
*/
public String getExternalIdTypeProperty() {
return this.externalIdTypeProperty;
}

/**
* Sets the value of externalIdTypeProperty
*
* @param argExternalIdTypeProperty Value to assign to this.externalIdTypeProperty
*/
public void setExternalIdTypeProperty(String argExternalIdTypeProperty) {
this.externalIdTypeProperty = argExternalIdTypeProperty;
}

/**
* Gets the value of taxExternalIdTypeCode
*
* @return the value of taxExternalIdTypeCode
*/
public String getTaxExternalIdTypeCode() {
return this.taxExternalIdTypeCode;
}

/**
* Sets the value of taxExternalIdTypeCode
*
* @param argTaxExternalIdTypeCode Value to assign to this.taxExternalIdTypeCode
*/
public void setTaxExternalIdTypeCode(String argTaxExternalIdTypeCode) {
this.taxExternalIdTypeCode = argTaxExternalIdTypeCode;
}

/**
* Gets the value of externalIdProperty
*
* @return the value of externalIdProperty
*/
public String getExternalIdProperty() {
return this.externalIdProperty;
}

/**
* Sets the value of externalIdProperty
*
* @param argExternalIdProperty Value to assign to this.externalIdProperty
*/
public void setExternalIdProperty(String argExternalIdProperty) {
this.externalIdProperty = argExternalIdProperty;
}

/**
* Gets the value of employeePhoneLdapProperty
*
* @return the value of employeePhoneLdapProperty
*/
public String getEmployeePhoneLdapProperty() {
return this.employeePhoneLdapProperty;
}

/**
* Sets the value of employeePhoneLdapProperty
*
* @param argEmployeePhoneLdapProperty Value to assign to this.employeePhoneLdapProperty
*/
public void setEmployeePhoneLdapProperty(String argEmployeePhoneLdapProperty) {
this.employeePhoneLdapProperty = argEmployeePhoneLdapProperty;
}

/**
* Gets the value of employeeMailLdapProperty
*
* @return the value of employeeMailLdapProperty
*/
public String getEmployeeMailLdapProperty() {
return this.employeeMailLdapProperty;
}

/**
* Sets the value of employeeMailLdapProperty
*
* @param argEmployeeMailLdapProperty Value to assign to this.employeeMailLdapProperty
*/
public void setEmployeeMailLdapProperty(String argEmployeeMailLdapProperty) {
this.employeeMailLdapProperty = argEmployeeMailLdapProperty;
}

/**
* Gets the value of defaultCountryCode
*
* @return the value of defaultCountryCode
*/
public String getDefaultCountryCode() {
return this.defaultCountryCode;
}

/**
* Sets the value of defaultCountryCode
*
* @param argDefaultCountryCode Value to assign to this.defaultCountryCode
*/
public void setDefaultCountryCode(String argDefaultCountryCode) {
this.defaultCountryCode = argDefaultCountryCode;
}

/**
* Gets the value of personEntityTypeCode
*
* @return the value of personEntityTypeCode
*/
public String getPersonEntityTypeCode() {
return this.personEntityTypeCode;
}

/**
* Sets the value of personEntityTypeCode
*
* @param argPersonEntityTypeCode Value to assign to this.personEntityTypeCode
*/
public void setPersonEntityTypeCode(String argPersonEntityTypeCode) {
this.personEntityTypeCode = argPersonEntityTypeCode;
}

/**
* Sets the value of kimLdapIdProperty
*
* @param kimLdapIdProperty value to set
*/
public void setKimLdapIdProperty(String kimLdapIdProperty) {
this.kimLdapIdProperty = kimLdapIdProperty;
}

/**
* Gets the value of kimLdapIdProperty
*
* @return the value of kimLdapIdProperty
*/
public String getKimLdapIdProperty() {
return kimLdapIdProperty;
}

/**
* Sets the value of kimLdapNameProperty
*
* @param kimLdapNameProperty the value to assign to this.kimLdapNameProperty
*/
public void setKimLdapNameProperty(String kimLdapNameProperty) {
this.kimLdapNameProperty = kimLdapNameProperty;
}

/**
* Gets the value of kimLdapNameProperty
*
* @return the value of kimLdapNameProperty
*/
public String getKimLdapNameProperty() {
return kimLdapNameProperty;
}

/**
* Gets the value of snLdapProperty
*
* @return the value of snLdapProperty
*/
public String getSnLdapProperty() {
return this.snLdapProperty;
}

/**
* Sets the value of snLdapProperty
*
* @param argSnLdapProperty Value to assign to this.snLdapProperty
*/
public void setSnLdapProperty(String argSnLdapProperty) {
this.snLdapProperty = argSnLdapProperty;
}

/**
* Gets the value of givenNameLdapProperty
*
* @return the value of givenNameLdapProperty
*/
public String getGivenNameLdapProperty() {
return this.givenNameLdapProperty;
}

/**
* Sets the value of givenNameLdapProperty
*
* @param argGivenNameLdapProperty Value to assign to this.givenNameLdapProperty
*/
public void setGivenNameLdapProperty(String argGivenNameLdapProperty) {
this.givenNameLdapProperty = argGivenNameLdapProperty;
}

/**
* Gets the value of entityIdKimProperty
*
* @return the value of entityIdKimProperty
*/
public String getEntityIdKimProperty() {
return this.entityIdKimProperty;
}

/**
* Sets the value of entityIdKimProperty
*
* @param argEntityIdKimProperty Value to assign to this.entityIdKimProperty
*/
public void setEntityIdKimProperty(String argEntityIdKimProperty) {
this.entityIdKimProperty = argEntityIdKimProperty;
}

/**
* Gets the value of parameterNamespaceCode
*
* @return the value of parameterNamespaceCode
*/
public String getParameterNamespaceCode() {
return this.parameterNamespaceCode;
}

/**
* Sets the value of parameterNamespaceCode
*
* @param argParameterNamespaceCode Value to assign to this.parameterNamespaceCode
*/
public void setParameterNamespaceCode(String argParameterNamespaceCode) {
this.parameterNamespaceCode = argParameterNamespaceCode;
}

/**
* Gets the value of parameterDetailTypeCode
*
* @return the value of parameterDetailTypeCode
*/
public String getParameterDetailTypeCode() {
return this.parameterDetailTypeCode;
}

/**
* Sets the value of parameterDetailTypeCode
*
* @param argParameterDetailTypeCode Value to assign to this.parameterDetailTypeCode
*/
public void setParameterDetailTypeCode(String argParameterDetailTypeCode) {
this.parameterDetailTypeCode = argParameterDetailTypeCode;
}

/**
* Gets the value of mappedParameterName
*
* @return the value of mappedParameterName
*/
public String getMappedParameterName() {
return this.mappedParameterName;
}

/**
* Sets the value of mappedParameterName
*
* @param argMappedParameterName Value to assign to this.mappedParameterName
*/
public void setMappedParameterName(String argMappedParameterName) {
this.mappedParameterName = argMappedParameterName;
}

/**
* Gets the value of unmappedParameterName
*
* @return the value of unmappedParameterName
*/
public String getUnmappedParameterName() {
return this.unmappedParameterName;
}

/**
* Sets the value of unmappedParameterName
*
* @param argUnmappedParameterName Value to assign to this.unmappedParameterName
*/
public void setUnmappedParameterName(String argUnmappedParameterName) {
this.unmappedParameterName = argUnmappedParameterName;
}

/**
* Gets the value of mappedValuesName
*
* @return the value of mappedValuesName
*/
public String getMappedValuesName() {
return this.mappedValuesName;
}

/**
* Sets the value of mappedValuesName
*
* @param argMappedValuesName Value to assign to this.mappedValuesName
*/
public void setMappedValuesName(String argMappedValuesName) {
this.mappedValuesName = argMappedValuesName;
}

/**
* Gets the value of employeeIdProperty
*
* @return the value of employeeIdProperty
*/
public String getEmployeeIdProperty() {
return this.employeeIdProperty;
}

/**
* Sets the value of employeeIdProperty
*
* @param argEmployeeIdProperty Value to assign to this.employeeIdProperty
*/
public void setEmployeeIdProperty(String argEmployeeIdProperty) {
this.employeeIdProperty = argEmployeeIdProperty;
}

/**
* Gets the value of departmentLdapProperty
*
* @return the value of departmentLdapProperty
*/
public String getDepartmentLdapProperty() {
return this.departmentLdapProperty;
}

/**
* Sets the value of departmentLdapProperty
*
* @param argDepartmentLdapProperty Value to assign to this.departmentLdapProperty
*/
public void setDepartmentLdapProperty(String argDepartmentLdapProperty) {
this.departmentLdapProperty = argDepartmentLdapProperty;
}

/**
* Gets the value of employeeTypeProperty
*
* @return the value of employeeTypeProperty
*/
public String getEmployeeTypeProperty() {
return this.employeeTypeProperty;
}

/**
* Sets the value of employeeTypeProperty
*
* @param argEmployeeTypeProperty Value to assign to this.employeeTypeProperty
*/
public void setEmployeeTypeProperty(String argEmployeeTypeProperty) {
this.employeeTypeProperty = argEmployeeTypeProperty;
}

/**
* Gets the value of employeeStatusProperty
*
* @return the value of employeeStatusProperty
*/
public String getEmployeeStatusProperty() {
return this.employeeStatusProperty;
}

/**
* Sets the value of employeeStatusProperty
*
* @param argEmployeeStatusProperty Value to assign to this.employeeStatusProperty
*/
public void setEmployeeStatusProperty(String argEmployeeStatusProperty) {
this.employeeStatusProperty = argEmployeeStatusProperty;
}

/**
* Gets the value of defaultCampusCode
*
* @return the value of defaultCampusCode
*/
public String getDefaultCampusCode() {
return this.defaultCampusCode;
}

/**
* Sets the value of defaultCampusCode
*
* @param argDefaultCampusCode Value to assign to this.defaultCampusCode
*/
public void setDefaultCampusCode(String argDefaultCampusCode) {
this.defaultCampusCode = argDefaultCampusCode;
}


public void setDefaultChartCode(String chartCode) {
this.defaultChartCode = chartCode;
}

public String getDefaultChartCode() {
return defaultChartCode;
}

public String getEmployeeAffiliationCodes() {
return employeeAffiliationCodes;
}

public void setEmployeeAffiliationCodes(String employeeAffiliationCodes) {
this.employeeAffiliationCodes = employeeAffiliationCodes;
}

public String getAffiliationMappings() {
return affiliationMappings;
}

public void setAffiliationMappings(String affiliationMappings) {
this.affiliationMappings = affiliationMappings;
}
}

It is static, so it can be instantiated, but it is also package-private. This is important: normal classes outside the package cannot simply instantiate a Constants instance and override Spring programmatically. Spring can still instantiate it and set its values at startup, though, so we are safe. Spring, and only Spring, gets to create it.
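Spring can do this because it creates beans reflectively; a plain-Java sketch of the same trick (the class and field here are hypothetical, and the container does the equivalent of this internally):

```java
import java.lang.reflect.Constructor;

// Sketch: a package-private class can still be instantiated reflectively,
// which is how a container like Spring creates it even though ordinary
// code in other packages cannot reference the type directly.
class HiddenConstants {                       // package-private, like ConstantsImpl
    String defaultCampusCode = "MN";
}

public class ReflectiveInstantiationDemo {
    public static void main(String[] args) throws Exception {
        Constructor<HiddenConstants> ctor = HiddenConstants.class.getDeclaredConstructor();
        ctor.setAccessible(true);             // what the container effectively does
        HiddenConstants c = ctor.newInstance();
        System.out.println(c.defaultCampusCode);
    }
}
```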

Define the Constants Bean in Spring

Now let us set our constants. The bean definition looks something like the following (the property values here are illustrative; substitute your institution's LDAP attribute names and default codes):

<bean id="kimConstants" class="org.kuali.rice.kim.util.Constants$ConstantsImpl">
    <property name="defaultCountryCode"    value="US" />
    <property name="defaultCampusCode"     value="MN" />
    <property name="defaultChartCode"      value="UA" />
    <property name="kimLdapIdProperty"     value="uaid" />
    <property name="kimLdapNameProperty"   value="uid" />
    <property name="snLdapProperty"        value="sn" />
    <property name="givenNameLdapProperty" value="givenName" />
    <property name="employeeMailLdapProperty"  value="mail" />
    <property name="employeePhoneLdapProperty" value="telephoneNumber" />
    ...
    ...
</bean>

Inject the Constants Bean into a Service

There: the Constants implementation is now instantiated and stored in Spring as a bean, and we can easily inject it into other classes that need it.
<bean id="ldapPrincipalDao" class="org.kuali.rice.kim.dao.impl.LdapPrincipalDaoImpl">
<property name="ldapTemplate"     ref="ldapTemplate" />
<property name="parameterService" ref="parameterService" />
<property name="kimConstants"     ref="kimConstants" />
...
...

Now use it

Here is an example of the usage:
String affiliationCode = getAffiliationTypeCodeForName(primaryAffiliation);
KimEntityAffiliationInfo aff1 = new KimEntityAffiliationInfo();
aff1.setAffiliationTypeCode(affiliationCode == null ? "AFLT" : affiliationCode);
aff1.setCampusCode(getConstants().getDefaultCampusCode());
aff1.setEntityAffiliationId("" + affiliationId++);
aff1.setDefault(true);
aff1.setActive(true);

Use Without Injection

...
String affiliationCode = getAffiliationTypeCodeForName(primaryAffiliation);
KimEntityAffiliationInfo aff1 = new KimEntityAffiliationInfo();
aff1.setAffiliationTypeCode(affiliationCode == null ? "AFLT" : affiliationCode);
aff1.setCampusCode(getConstants().getDefaultCampusCode());
aff1.setEntityAffiliationId("" + affiliationId++);
aff1.setDefault(true);
aff1.setActive(true);
...
public Constants getConstants() {
return SpringContext.getService("kimConstants");
}

That's it! I have found this very useful when I need to change constants or find out what their values are. The public static final boilerplate always makes code difficult to read, and the inner classes really make things a mess.