Tuesday, December 27, 2005

Using conVOT

Introduction
conVOT converts ASCII or FITS files to VOTable (1.0) format. For ASCII files, it supports both files with column delimiters and files with fixed-width columns.

Steps for downloading and running conVOT
(1) Download the conVOT.zip (version 0.9) file from http://vo.iucaa.ernet.in/~voi/conVOT.htm
(2) Unzip the downloaded conVOT.zip.
(3) To run conVOT, the Java Runtime Environment (1.3 or above) is required. You can download it from http://java.sun.com
(4) To run the conVOT applet, type "java -jar conVOT.jar" at the command prompt.

Steps for converting an ASCII file (Data.txt) containing tabular data into VOTable format
(1) The Data.txt file has the following content, delimited by the special character ';':
RA;DEC;NAME;RVEL;E_RVEL;R
010.86;+41.27;N 224;-297;5;0.7
287.43;-63.85;N 6744;839;6;10.4
023.48;+30.66;N 598;-182;3;0.7
The file above has six columns: one header line with the column names, and three rows of actual data.
(2) Run the conVOT applet by typing "java -jar conVOT.jar" at the command prompt. The following figure shows the conVOT applet.



(3) Select the option 'Convert ASCII'. This displays a screen that lets you browse the directory structure and select the ASCII file. The following figure shows the selection of the 'Data.txt' file.



(4) Click the 'Load' button, which loads the 'Data.txt' file as follows.



(5) Click the 'Ok' button, as there is no comment line in the file. Clicking 'Ok' displays the next screen. Setting the options 'Unit Line: No Unit Line for the table' and 'First data line: 2' results in the following screen.



(6) After clicking 'Ok', the following screen is displayed.



(7) Clicking 'Ok' displays a screen with delimiter options. Select the option 'other!! special character only' and enter the ';' delimiter in the column. This looks as follows.



(8) Clicking 'Ok' displays a screen containing the table, with 'metadata' and 'data' sections, as follows.



(9) Clicking 'Ok' generates the VOTable file, as follows.



(10) Finally, save the file to an appropriate location.
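The GUI steps above amount to a simple mechanical transformation: header columns become field descriptions, data rows become table rows. As a rough illustration (this is not conVOT's actual code; the class and method names are invented), a sketch of that mapping:

```java
// Rough sketch, NOT conVOT's actual code: how delimited ASCII rows map onto
// VOTable elements. The names AsciiToVOTable and convert are invented.
public class AsciiToVOTable {

    // lines[0] is the header with column names; the rest are data rows.
    // Real cells would also need XML escaping; omitted here for brevity.
    public static String convert(String[] lines, String delimiter) {
        StringBuilder out = new StringBuilder();
        out.append("<VOTABLE version=\"1.0\">\n<RESOURCE>\n<TABLE>\n");
        // Each header column becomes a FIELD; datatype "char" is a lazy default.
        for (String name : lines[0].split(delimiter, -1)) {
            out.append("<FIELD name=\"").append(name)
               .append("\" datatype=\"char\" arraysize=\"*\"/>\n");
        }
        out.append("<DATA>\n<TABLEDATA>\n");
        // Each data row becomes a TR, each cell a TD.
        for (int i = 1; i < lines.length; i++) {
            out.append("<TR>");
            for (String cell : lines[i].split(delimiter, -1)) {
                out.append("<TD>").append(cell).append("</TD>");
            }
            out.append("</TR>\n");
        }
        out.append("</TABLEDATA>\n</DATA>\n</TABLE>\n</RESOURCE>\n</VOTABLE>\n");
        return out.toString();
    }

    public static void main(String[] args) {
        String[] lines = {
            "RA;DEC;NAME;RVEL;E_RVEL;R",
            "010.86;+41.27;N 224;-297;5;0.7",
            "287.43;-63.85;N 6744;839;6;10.4",
            "023.48;+30.66;N 598;-182;3;0.7"
        };
        System.out.println(convert(lines, ";"));
    }
}
```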

Introduction to VOTable 1.1

****************************************************************
Introduction to VOTable 1.1: http://www.ivoa.net/Documents/latest/VOT.html
****************************************************************
* What is VOTable?
The VOTable format is an XML representation of tabular data. Because it is XML, VOTable can easily be used in web services, and with a Java parser API for VOTable one can build spreadsheet-like tools.

Example: the following is the "Velocities and Distance Estimation" table, with six columns and three rows.

Velocities and Distance Estimation
RA      DEC     NAME    RVEL  E_RVEL  R
010.86  +41.27  N 224   -297  5       0.7
287.43  -63.85  N 6744  839   6       10.4
023.48  +30.66  N 598   -182  3       0.7

This can be represented by a VOTable document as follows:

<?xml version="1.0"?>
<VOTABLE version="1.1" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="http://www.ivoa.net/xml/VOTable/v1.1">
   <COOSYS ID="J2000" equinox="J2000." epoch="J2000." system="eq_FK5"/>
   <RESOURCE name="myFavouriteGalaxies">
      <TABLE name="results">
         <DESCRIPTION>Velocities and Distance estimations</DESCRIPTION>
         <PARAM name="Telescope" datatype="float" ucd="phys.size;instr.tel" unit="m" value="3.6"/>
         <FIELD name="RA" ID="col1" ucd="pos.eq.ra;meta.main" ref="J2000" datatype="float" width="6" precision="2" unit="deg"/>
         <FIELD name="Dec" ID="col2" ucd="pos.eq.dec;meta.main" ref="J2000" datatype="float" width="6" precision="2" unit="deg"/>
         <FIELD name="Name" ID="col3" ucd="meta.id;meta.main" datatype="char" arraysize="8*"/>
         <FIELD name="RVel" ID="col4" ucd="src.veloc.hc" datatype="int" width="5" unit="km/s"/>
         <FIELD name="e_RVel" ID="col5" ucd="stat.error;src.veloc.hc" datatype="int" width="3" unit="km/s"/>
         <FIELD name="R" ID="col6" ucd="phys.distance" datatype="float" width="4" precision="1" unit="Mpc">
            <DESCRIPTION>Distance of Galaxy, assuming H=75km/s/Mpc</DESCRIPTION>
         </FIELD>
        <DATA>
            <TABLEDATA>
               <TR><TD>010.86</TD><TD>+41.27</TD><TD>N 224</TD><TD>-297</TD><TD>5</TD><TD>0.7</TD></TR>
               <TR><TD>287.43</TD><TD>-63.85</TD><TD>N 6744</TD><TD>839</TD><TD>6</TD><TD>10.4</TD></TR>
               <TR><TD>023.48</TD><TD>+30.66</TD><TD>N 598</TD><TD>-182</TD><TD>3</TD><TD>0.7</TD></TR>
            </TABLEDATA>
         </DATA>
      </TABLE>
   </RESOURCE>
</VOTABLE>

* Document Structure
VOTABLE is the root XML element of a VOTable document. A VOTABLE element is composed of one or more RESOURCE elements. A resource is a collection of tables, so a RESOURCE element can contain one or more TABLE elements. A table is a collection of metadata (descriptions of the columns) and table data (the actual data), so a TABLE element is composed of FIELD elements (column descriptions) and TABLEDATA (rows of data).
It is not necessary to have actual data in a TABLE element: the TABLEDATA element may be absent from a VOTable document. One can also store binary data by using a STREAM element instead of TABLEDATA.

From the VOTable document above:
VOTable = hierarchy of Metadata + associated Table Data, arranged as a set of Tables
Metadata = parameters + Infos + Descriptions + Links + Fields + Groups
Tables = List of Fields + TableData
TableData = Stream of Rows
Row = List of Cells
Cell = primitive, variable-length list of primitives, or multidimensional array of primitives
Primitive = integer, float, floatComplex, character, etc.
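Because a VOTable is ordinary XML, the hierarchy above can be walked with any XML parser, not only the dedicated APIs. A minimal sketch using the JDK's DOM parser (the class name VOTableWalk is invented; for real work, use an API such as JAVOT or SAVOT):

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

// Minimal sketch: walk a VOTable's VOTABLE -> RESOURCE -> TABLE hierarchy
// with the JDK's plain DOM parser. The class name VOTableWalk is invented.
public class VOTableWalk {

    // Count FIELD (column metadata) and TR (data row) elements.
    public static int[] countFieldsAndRows(String votable) {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(votable.getBytes("UTF-8")));
            return new int[] {
                doc.getElementsByTagName("FIELD").getLength(),
                doc.getElementsByTagName("TR").getLength()
            };
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        String vot = "<VOTABLE><RESOURCE><TABLE>"
                + "<FIELD name=\"RA\"/><FIELD name=\"Dec\"/>"
                + "<DATA><TABLEDATA><TR><TD>010.86</TD><TD>+41.27</TD></TR>"
                + "</TABLEDATA></DATA></TABLE></RESOURCE></VOTABLE>";
        int[] counts = countFieldsAndRows(vot);
        System.out.println(counts[0] + " fields, " + counts[1] + " rows");
    }
}
```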

* Tools for VOTable
(1) conVOT: Tool for converting ASCII or FITS tables to VOTable format
WebSite: http://vo.iucaa.ernet.in/~voi/conVOT.htm
(2) JAVOT (NVO): A Java API for reading VOTable 1.0
WebSite: http://www.us-vo.org/VOTable/JAVOT/
(3) SAVOT (European VO): Simple Access to VOTable 1.1 for reading, writing and editing
WebSite: http://simbad.u-strasbg.fr/public/cdsjava.gml

Saturday, December 10, 2005

Using Maven 2, Part 3

****************************************************************
Maven 2 Notes, Part 3
****************************************************************

----------------------------------------------------------------
Section 1: Writing a plugin intro
----------------------------------------------------------------
* Previously we looked at how to use Maven 2 to deploy WAR files (and portlet applications) from a remote
repository into a local Tomcat directory. This wasn't too successful, so now we're going to look at
writing a plugin to do this.

* In quick summary, the plugin should do the following:
1. Grab a war file from the local repository.
2. Put it in the directory of our choice.
3. That's it.
Note the remote download isn't necessary, since this is done through the existing <dependency/> mechanism.

* There is a Tomcat plugin under development from Codehaus that can perform tasks such as load and install
war files into a running tomcat, so we'll look at that later, maybe, if I can get it to compile.

* Before we go on, here is a quick summary of the important parts:
1. Plugins (or MOJOs) use a simple POJO/javabean structure with directives included
in the comments for some reason.
2. The plugin's execute() method is where you do the thing (ie useful code goes here).
3. You can pass all kinds of variables (not just strings) into the plugin from the POM.

----------------------------------------------------------------
Section 2: Writing a do-nothing plugin
----------------------------------------------------------------
* The real documentation for writing a plugin is here:

http://maven.apache.org/guides/plugin/guide-java-plugin-development.html

Maven plugins are called "Mojos", a play on "POJOs" for reasons that will become clear later.

* First, let me say I was disappointed that there is no plugin archetype. Recall that archetypes are Maven 2's way
of generating boilerplate code. So I used the standard way, discussed in Part 1:

mvn archetype:create -DgroupId=test.plugin -DartifactId=PortletAppPlugin

* The resulting POM requires a few modifications: we specify the packaging to be a maven-plugin, and
we add a dependency on the maven plugin api. This is detailed in the link above, but here is mine:

<project>
<modelVersion>4.0.0</modelVersion>
<groupId>test.plugin</groupId>
<artifactId>PortletAppPlugin</artifactId>
<!-- This specifies the plugin packaging -->
<packaging>maven-plugin</packaging>

<version>1.0-SNAPSHOT</version>
<name>Portlet Webapp Plugin</name>
<url>http://maven.apache.org</url>
<dependencies>
<!-- This is needed for the plugin classes. -->
<dependency>
<groupId>org.apache.maven</groupId>
<artifactId>maven-plugin-api</artifactId>
<version>2.0</version>
</dependency>
</dependencies>
</project>

I omitted the junit dependency for brevity. You need to either include it or else eliminate the src/test directory.

* Write the code. The starter plugin looks like this:

package test.plugin;

import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;

/**
* @goal portlet-war-deploy
* @description Copy a war to a tomcat directory.
*/
public class PortletAppPlugin extends AbstractMojo {

public void execute() throws MojoExecutionException
{
getLog().info("Hollow World!");
}
}

* One really interesting feature of the plugin is that it REQUIRES two comment fields:
@goal
@description
These are used later, and we will also see this when setting parameters.

* To now install, we follow the usual path: "mvn compile" to make sure we have made no mistakes in the java code, and then
"mvn install" to put it in the local repo.

* We can now run our plugin on the command line like so:

mvn test.plugin:PortletAppPlugin:1.0-SNAPSHOT:portlet-war-deploy

Note that the "portlet-war-deploy" at the end of this is taken from "@goal" in the comment field. When we run this
command, the getLog() message will be printed to the screen.

* Note for clarity that we will eventually want to integrate this plugin into other POMs. The above command
is a useful way of testing things in stand-alone mode.

----------------------------------------------------------------
Section 3: Writing a do-something plugin
----------------------------------------------------------------

* To be useful, our plugin needs to have some parameter settings. These can then be used in the execute
statement. This also reveals a bit of POJOing combined with a little Too Much Magic with bean creation.

* Let's modify our code a bit to use a simple string parameter.

public class PortletAppPlugin extends AbstractMojo {

public void execute() throws MojoExecutionException
{
getLog().info("Hollow World!");
System.out.println("Sample Param:" + sampleParam);
}

/**
* Sample parameter setting.
* @parameter expression="Hollow"
*/
private String sampleParam;

}

Re-install this ("mvn install") and then run the plugin.

* The thing to notice is that we have again used comments as annotations: the @parameter tag not only describes
the sampleParam string, but also provides it with a default value. When this MOJO is executed, you will see that
the sampleParam value has been correctly populated.

* We can provide alternative values for sampleParam in the plugin configuration section of the
POM that uses the plugin. This is described in detail at

http://maven.apache.org/guides/mini/guide-configuring-plugins.html


* Note also that the sampleParam string is private. The container that creates this and other MOJOs must be
generating get and set methods.
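Generated setters are one possibility; another common container technique is reflective field injection, which needs no accessors at all. A self-contained sketch of how a private field can be populated from outside (the Injector and FakeMojo names are invented, not Maven classes, and whether Maven's container works this way is an assumption here):

```java
import java.lang.reflect.Field;

// Sketch of how a container could inject a value into a private MOJO field
// without generated getters/setters: reflective field access. The names
// Injector and FakeMojo are invented; that Maven's container does exactly
// this is an assumption, not documented fact.
public class Injector {

    public static void inject(Object target, String fieldName, Object value) {
        try {
            Field f = target.getClass().getDeclaredField(fieldName);
            f.setAccessible(true);   // bypass 'private'
            f.set(target, value);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    // Stand-in for a MOJO with a private parameter field.
    public static class FakeMojo {
        private String sampleParam;
        public String report() { return sampleParam; }
    }

    public static void main(String[] args) {
        FakeMojo mojo = new FakeMojo();
        inject(mojo, "sampleParam", "1.0-SNAPSHOT");
        System.out.println(mojo.report());
    }
}
```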

* The expression stuff really starts to become useful when we insert Maven parameter expressions. Try the following:

/**
* Sample parameter setting.
* @parameter expression="${pom.version}"
*/
private String sampleParam;

This will correctly print out (in the execute() method) the version in the pom.xml. But what happens when I put this
plugin in another project's pom.xml? Will it print the plugin's version or the container project's version? My guess
from Maven's inheritance behavior is that it will use the parent project's ${pom} information, rather than the plugin's.

* So let's test this out and in the process show how to add our plugin to another project. Take the "jobsubmit-portlet"
project from part 1 of this series and add the following (just put it below the Ant plugin).
<project>
...
<build>
...
<plugins>
<plugin>
<artifactId>PortletAppPlugin</artifactId>
<groupId>test.plugin</groupId>
<executions>
<execution>
<phase>process-resources</phase>
<configuration>
<sampleParam>
${pom.version}
</sampleParam>
</configuration>
<goals>
<goal>portlet-war-deploy</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
...
</project>

There are several things to note. First, Maven will find the plugin in my local plugin repository because
I previously installed it there. Second, <sampleParam></sampleParam> is used to set the plugin value--this is passed to
my code above. Third, I must finally specify the plugin's goal. Recall that this was specified using @goal at the top
of the MOJO java code.

* When we run this, we will in fact see that the value of sampleParam that gets printed is the version of the
jobsubmit-portlet project, not the PortletAppPlugin, as expected. Further, we can remove the <configuration> entirely
so that we use default values. We will again see that the value of the sampleParam is the version of the jobsubmit-portlet,
NOT the plugin's version, again as expected.

----------------------------------------------------------------
Section 4: Using more complicated parameters.
----------------------------------------------------------------
* One of the primary reasons for reviewing plugins is that we can no longer write simple Jelly scripts for processing
the POM. For example, recall that the ${pom.dependencies} gives you access to all the dependencies as one big
string. Echoing this value using the Ant plugin gives something like

[Dependency {groupId=junit, artifactId=junit, version=3.8.1, type=jar}]

* The problem is that the POM is actually returning an object (org.apache.maven.model.DependencyManager), and we are
seeing the output of its toString() method.

But now with plugins, we can actually get access to this stuff.

* However, we immediately run into a problem: Maven 2 has no javadoc lying around for some reason. So we instead
have to either browse the SVN repo online or else check out the code and see for ourselves. The source actually has a
non-intuitive layout. The real code for doing stuff seems to be under bootstrap:



http://svn.apache.org/viewcvs.cgi/maven/components/trunk/bootstrap/bootstrap-mini/src/main/java/org/apache/maven/bootstrap/


* By downloading the Maven 2 source code and by snaking around in the on-line SVN repo, I found the following works.
Here is the required dependency in the POM for our plugin:


<dependency>
<groupId>org.apache.maven</groupId>
<artifactId>maven-model</artifactId>
<version>2.0</version>
</dependency>

Add this to the pom.xml for the plugin. You need this to compile the plugin with the additional Dependency class below.


* And here is the Plugin Code:

package test.plugin;

import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;

import org.apache.maven.model.Dependency;
import java.util.ArrayList;

/**
* @goal portlet-war-deploy
* @description Copy a war to a tomcat directory.
*/
public class PortletAppPlugin extends AbstractMojo {

public void execute() throws MojoExecutionException
{
getLog().info("Hollow World!");
System.out.println("Sample Param: "+sampleParam);

for(int i=0; i<theDepends.size();i++) {
Dependency deps=(Dependency)theDepends.get(i);
System.out.println(deps.toString());
}
}

/**
* Sample parameter setting.
* @parameter expression="${project.version}"
*/
private String sampleParam;

/**
* Second sample parameter setting.
* @parameter expression="${project.dependencies}"
*/
private ArrayList theDepends;
}

* To test this, clean, compile, install and then run the plugin.

mvn clean install test.plugin:PortletAppPlugin:1.0-SNAPSHOT:portlet-war-deploy

The output is

[INFO] Hollow World!
Sample Param: 1.0-SNAPSHOT
Dependency {groupId=junit, artifactId=junit, version=3.8.1, type=jar}
Dependency {groupId=org.apache.maven, artifactId=maven-plugin-api, version=2.0, type=jar}
Dependency {groupId=org.apache.maven, artifactId=maven-model, version=2.0, type=jar}

* That is, it worked as expected. We can now use the Dependency class to get/set various properties since
it is really just a JavaBean:

for(int i=0; i<theDepends.size();i++) {
Dependency deps=(Dependency)theDepends.get(i);
System.out.println("Artifact ID: "+deps.getArtifactId());
System.out.println("groupId: "+deps.getGroupId());
}

* Thus we can see that the execute() method can be used to do most of the work. Our next step will
be to use more meaningful parameters:
o The path to the Tomcat directory.
o The path to the specified dependency's WAR file.
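The "path to the specified dependency's WAR file" can be assembled from the coordinates the Dependency bean exposes, assuming the standard Maven 2 repository layout (dots in the groupId become directory separators). The RepoPath class below is an invented helper, not part of Maven:

```java
// Invented helper (not part of Maven): build the local-repo path to a WAR
// from its Maven coordinates, assuming the standard Maven 2 repo layout
// where groupId dots become directory separators.
public class RepoPath {

    public static String warPath(String localRepo, String groupId,
                                 String artifactId, String version) {
        return localRepo + "/" + groupId.replace('.', '/') + "/"
                + artifactId + "/" + version + "/"
                + artifactId + "-" + version + ".war";
    }

    public static void main(String[] args) {
        System.out.println(warPath("/home/gateway/.m2/repository",
                "xportlets.jobsubmit", "jobsubmit-portlet", "1.0-SNAPSHOT"));
        // -> /home/gateway/.m2/repository/xportlets/jobsubmit/jobsubmit-portlet/1.0-SNAPSHOT/jobsubmit-portlet-1.0-SNAPSHOT.war
    }
}
```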

Monday, December 05, 2005

Using Maven 2, Part 2

****************************************************************
Maven 2 Notes, Part 2: Using Remote Repositories
****************************************************************

----------------------------------------------------------------
Section 1: Putting and getting with remote repos.
----------------------------------------------------------------

* The Maven documentation covers this in detail, so I will instead look at it from a particular point of
view: how to put/get WAR files in a repository. My higher purpose is to use Maven to upload/download
war files for portlets and other web applications so that they can be deployed into Tomcat containers.

* So first, let's put our earlier war project (see part 1) into the local repository. This is pretty
simple:

mvn install

* But where is the repo? $HOME/.m2. Taking a look, you should see the directory structure


~/.m2/repository/xportlets/jobsubmit/jobsubmit-portlet/1.0-SNAPSHOT/

This directory structure is all determined by our POM header:

<project>
<modelVersion>4.0.0</modelVersion>
<groupId>xportlets.jobsubmit</groupId>
<artifactId>jobsubmit-portlet</artifactId>
<packaging>war</packaging>
<version>1.0-SNAPSHOT</version>
...
</project>

* Note that <version/> now maps to a separate directory. The war file itself is


jobsubmit-portlet-1.0-SNAPSHOT.war

So each project is required to have a specific version so that the directory structure can be
created properly. If you don't believe me, try deleting <version>..</version> from the above.

* Also in the repo directory that contains the above war are two additional files:

maven-metadata-local.xml
jobsubmit-portlet-1.0-SNAPSHOT.pom

I'll assume that remote repositories have a similar structure. And they do: surf around in
http://www.ibiblio.org/maven2 and you will see the equivalents, plus some additional hash files
to verify.

* The POM file in the repository, by the way, is similar to the pom.xml in your project directory. You
may notice a few differences between the original POM and the repository POM.

o All the namespaces in <project> got removed.

o Is there a POM schema? Probably not, but the .pom in your repo will be normalized a bit,
so if some of your tags are out of the canonical order, they will get moved around.

o Interestingly, <echo message="hollow world"/> got changed to <echo message="hollow world"></echo>.
OK, it isn't really that interesting.

o I assume the above is needed for some lightweight canonicalization, since these XML files must get
hashed when you put them in a remote repo.

* More importantly, the inclusion of the POM in the repository guarantees that you know all dependent jars
for a given project. This MAY mean that maven will also search for these additional jars and download them
as well. I'll have to check that out in Part X.


* You can also create an "internal" repository: this publishes your project's packaging (war, jar, etc.) to a
convenient place using ssh/scp, etc. This is not to be confused with your "local" repository in ~/.m2. "Internal"
means "your group's internal repo."

Specify the internal repo in pom.xml as follows:

<project>
....
<distributionManagement>
<repository>
<id>Test</id>
<url>file:///tmp/Test/</url>
</repository>
</distributionManagement>
</project>

Note this does not mean that this repo will be used to find dependencies (?). That is done with a
<repositories/> tag, as shown below.

This specifies that we publish to an internal repository that happens to be on the local file system. Use the
command

mvn deploy

to put the results of "mvn package" into your internal repository.



This internal repository differs a bit from the normal "local" repository ~/.m2: everything is message
digested with MD5 and SHA1 digests.

ls -ltr /tmp/Test/xportlets/jobsubmit/jobsubmit-portlet/1.0-SNAPSHOT/
total 36
-rw-rw-r-- 1 gateway gateway 40 Dec 1 20:53 maven-metadata.xml.sha1
-rw-rw-r-- 1 gateway gateway 32 Dec 1 20:53 maven-metadata.xml.md5
-rw-rw-r-- 1 gateway gateway 330 Dec 1 20:53 maven-metadata.xml
-rw-rw-r-- 1 gateway gateway 40 Dec 1 20:53 jobsubmit-portlet-1.0-20051202.015334-1.war.sha1
-rw-rw-r-- 1 gateway gateway 32 Dec 1 20:53 jobsubmit-portlet-1.0-20051202.015334-1.war.md5
-rw-rw-r-- 1 gateway gateway 3564 Dec 1 20:53 jobsubmit-portlet-1.0-20051202.015334-1.war
-rw-rw-r-- 1 gateway gateway 40 Dec 1 20:53 jobsubmit-portlet-1.0-20051202.015334-1.pom.sha1
-rw-rw-r-- 1 gateway gateway 32 Dec 1 20:53 jobsubmit-portlet-1.0-20051202.015334-1.pom.md5
-rw-rw-r-- 1 gateway gateway 1203 Dec 1 20:53 jobsubmit-portlet-1.0-20051202.015334-1.pom
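Those .md5 and .sha1 sidecar files are just hex digests of the artifact bytes. A sketch with the JDK's MessageDigest (the Checksums class is invented for illustration; that these digests match what "mvn deploy" writes is the observed convention, not verified against Maven's source here):

```java
import java.security.MessageDigest;

// Illustration (invented class name): the .md5/.sha1 files in the repo are
// hex digests of the artifact bytes, computable with the JDK's MessageDigest.
public class Checksums {

    public static String hexDigest(byte[] data, String algorithm) {
        try {
            MessageDigest md = MessageDigest.getInstance(algorithm);
            StringBuilder sb = new StringBuilder();
            for (byte b : md.digest(data)) sb.append(String.format("%02x", b));
            return sb.toString();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) throws Exception {
        byte[] pom = "<project/>".getBytes("UTF-8");
        System.out.println(hexDigest(pom, "MD5"));   // 32 hex chars
        System.out.println(hexDigest(pom, "SHA-1")); // 40 hex chars
    }
}
```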


* You should note that you can stick any final product in the repo (ie the result of "mvn deploy" for a web app is
a WAR file), but your project build dependencies require jars.


----------------------------------------------------------------
Section 2: The war comes home
----------------------------------------------------------------
* Previously we saw that the war file could be placed in a "remote" repository, just like a jar. But you can't/don't
compile against WAR files, so we show here how to grab a WAR from a remote repository and put it in your local repo.


* So let's start with a new project (use an archetype to create it). We now need to modify the pom.xml in two ways:
o Specify the repository to use.
o Specify the WAR file dependency (requires non-default settings).

* Here is how you configure an "internal" repo:
<project>
...
<repositories>
<repository>
<id>blah</id>
<url>file:///tmp/Test/</url>
</repository>
</repositories>
...
</project>


Maven will look here (and Ibiblio) for your jars. One interesting question is the order that it does this. The
artifacts should be adequately named to be distinct, but download times can vary a great deal. Also, internal
repos tend to come and go, so you want to avoid the Maven 1 problem of waiting for a connection time out. This is
frustrating, especially when you do multiproject builds (and have to wait for N remote repo connection time outs).

* Anyway, the second thing you need to do is set the dependency correctly, since they default to jars. Use this:

<project>
...
<dependencies>
<dependency>
<groupId>xportlets.jobsubmit</groupId>
<artifactId>jobsubmit-portlet</artifactId>
<version>1.0-SNAPSHOT</version>
<type>war</type>
<scope>package</scope>
</dependency>
...
</dependencies>
...
</project>

That is, you must specify <type>war</type>. Note actually the scope value isn't too important here. This will
download the indicated war from the indicated repository (above) and put it in the local ~/.m2 repository. Note
that it does NOT put the WAR anywhere in the project directory (that I can tell).


* So now we want to take the war file out of the local repository and do something with it. This seems to be harder
than you might think, since Maven only pulls JARS out of the repo into the local build file, as far as I can tell.

* The solution options seem to be
o Find some mvn trick that does this;
o Write some Ant to do it; or
o Write a plugin that does it.


The third option is the way to go, probably, since the plugin increases code reusability and fills an important
general need of the maven community. It is so important and crucial that we will skip it for now.

This leaves the first and second options for now: look for a trick and let Ant do the dirty work.
If you know me, you should have had no doubt that this is what I would do. First, however, we have to take
a detour through Maven 2 properties.

----------------------------------------------------------------
Section 3: Maven 2 properties, or "What happened to ${maven.repo.local}?"
----------------------------------------------------------------

* So let's put a couple of things together: we will download the war file from the "remote" repository and then
use Ant to move it to an appropriate "Tomcat" directory. We do all this in the pom.xml.


* Our first problem is accessing various built-in properties in Maven 2. The first thing I noticed (and it took me
an inordinately long amount of time to find this), is that the good old property

${maven.repo.local} //Where'd this go? I'll miss it.

is NOT set within your pom.xml. However (possibly for backward compatibility),

mvn install -Dmaven.repo.local=...

still works. Don't use this, by the way, unless you want to set up a new local repo.

* Luckily, I eventually stumbled across the answer: settings.xml. This file is located in $MAVEN_HOME/conf/. You can
edit it to localize your Maven 2 configuration, but the real value is that it lets you programmatically work with
all sorts of maven "system" properties. See

http://maven.apache.org/maven-settings/settings.html

for the big list.

* So if you want to get the directory path of your local repository, you just need to use ${settings.localRepository}.
The following pom.xml fragment (which uses the Ant plugin) can be used to verify that it works.

<project>
...
<build>
<finalName>test-webapp</finalName>
<plugins>
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<phase>process-resources</phase>
<configuration>
<tasks>
<echo message="${settings.localRepository}"/>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>

Thus you now have a handle on the property that points to the repo. As we go forward a bit, we will build up
(section by section) the necessary steps to accessing a particular war.


----------------------------------------------------------------
Section 4: Setting/getting arbitrary name-value pairs with Maven 2.
----------------------------------------------------------------
* OK, now let's say you want to set an arbitrary name-value pair for use somewhere else in your POM. You
do this using build profiles (in general). See

http://maven.apache.org/guides/introduction/introduction-to-profiles.html

* For the simplest case, you can add a <profile> tag to your POM like so

<project>
...
<profiles>
<profile>
<activation>
<property>
<name>env</name>
<value>dev</value>
</property>
</activation>

<properties>
<junk.stuff>test</junk.stuff>
</properties>
</profile>
</profiles>
...
</project>

I'll leave the description of this to the Maven docs, but basically the first part tells Maven when to activate
the property (the necessary condition), and the second part specifies the actual property.

So if we added

<echo message="${junk.stuff}"/>

to our Ant echos (see previous section), we would get the following output

[shell-prompt> mvn process-resources
...
[echo] ${junk.stuff}
...

I chose the process-resources goal (uh, phase) since it is mostly harmless.

* To actually get it to run correctly, you need to do this:

[shell-prompt> mvn process-resources -Denv=dev

This will do the thing: [echo] test

----------------------------------------------------------------
Section 5: Derived POMs and inheritance
----------------------------------------------------------------
* One of the powers of Maven 2 is that local POMs not only can derive from other local POMs, but can
also go out to remote repositories and find the parent POM.

* For example, look at section 2, in which we install the sample WAR file into an "internal" repository. Note
again that "internal" repos just mean "internal to your project," i.e. on your local file system. Let's say we
run
[shell-prompt> mvn deploy

on our project using the /tmp/Test repository. This will place the WAR file in /tmp/Test under the appropriate
target directory. For completeness, the POM is something like this:

<project>
<modelVersion>4.0.0</modelVersion>
<groupId>xportlets.jobsubmit</groupId>
<artifactId>jobsubmit-portlet</artifactId>
<packaging>war</packaging>
<version>1.0-SNAPSHOT</version>
<name>Maven Webapp Archetype</name>
<url>http://maven.apache.org</url>
<build>
<finalName>jobsubmit-portlet</finalName>
<plugins>
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<phase>process-resources</phase>
<configuration>
<tasks>
<echo message="hollow world"/>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>

<distributionManagement>
<repository>
<id>Test</id>
<url>file:///tmp/Test/</url>
</repository>
</distributionManagement>

</project>


* Now let's configure a child POM that derives from this parent. We need to configure this to use our
alternative /tmp/Test repository, and then we use the <parent></parent> tags to configure things.

* The entire POM might look like this:
<project>

<!-- Define the parent -->
<parent>
<artifactId>jobsubmit-portlet</artifactId>
<groupId>xportlets.jobsubmit</groupId>
<version>1.0-SNAPSHOT</version>
</parent>

<!-- Must define a new artifactId or mvn complains -->
<artifactId>testjunk</artifactId>
<modelVersion>4.0.0</modelVersion>

<!-- This part specifies the "internal" repository -->
<repositories>
<repository>
<id>blah</id>
<url>file:///tmp/Test/</url>
</repository>
<repository>
<id>codehause</id>
<url>http://mojo.codehaus.org/maven2</url>
</repository>
</repositories>

<dependencies>
<!-- This dependency is used to get the war file -->
<dependency>
<groupId>xportlets.jobsubmit</groupId>
<artifactId>jobsubmit-portlet</artifactId>
<version>1.0-SNAPSHOT</version>
<type>war</type>
<scope>runtime</scope>
</dependency>
</dependencies>

<!-- This is the part that runs the Ant command -->
<build>
<finalName>test-webapp</finalName>
<plugins>
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<phase>process-resources</phase>
<configuration>
<tasks>
<echo message="${settings.localRepository}"/>
<echo message="${pom.artifactId}"/>
<echo message="${pom.groupId}"/>
<echo message="${pom.version}"/>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>


* Basically, this means that our testjunk POM is a child of the jobsubmit-portlet. Now the cool thing is that Maven
will go to your repository and download the specified POM from the remote repository. Thus you can basically
distribute a POM like the above (ie send it to me in an email). I then run the command

[shell-prompt> mvn process-resources

This will download the jobsubmit SNAPSHOT war file to my local repository (because I specified it as a
dependency). This will allow me to specify the real path to the local repository (settings.localRepository),
the artifactId, versionId, and groupId.

* That is, we now at last have enough information to construct the full path to a specific war file in the
local repository (after we download it).

* But wait, there is a problem: the output of "mvn process-resources" is

...
[echo] /home/gateway/.m2/repository
[echo] testjunk ! The problem is here.
[echo] xportlets.jobsubmit
[echo] 1.0-SNAPSHOT
....


That is, pom.artifactId is not inherited from the POM's parent. We are stuck: each project must define its
own <artifactId>, so I can't just delete the tag, and the sensible ${parent.artifactId} property you would hope
for is present but also set to the child's ID (ie testjunk).

* Darn.

* Well, we almost got there, but not quite.

----------------------------------------------------------------
Section 6: Using Maven to collect and deploy remote WAR files.
----------------------------------------------------------------

* Before I get any further, let's restate the general problem: I want to use Maven to download a bunch of
WAR files (which are really portlet apps) from various remote repositories and install them into a local
Tomcat. For this particular POM, I don't want to do any compilation, just some simple remote WAR management.

* Our challenge now is to get out the dependency information. Unfortunately, now that there is no more Jelly,
this is a little difficult. To see, use

<echo message="${pom.dependencies}"/>

You should see something like this in your standard output when you run a mvn goal:

[echo] [Dependency {groupId=junit, artifactId=junit, version=3.8.1, type=jar},
Dependency {groupId=xportlets.jobsubmit, artifactId=jobsubmit-portlet, version=1.0-SNAPSHOT, type=war}]

In the good old Jelly days (which I refused to use), you could throw this into some nested loops and
extract the information you needed (ie you could construct a real path to a jar file in the repo).

* So what do you do in Maven 2? Again, the real answer is probably "write a plugin" but for the
current problem, I thought of a trick that I have to try. It involves multiproject builds,
so this will segue nicely into another blog.


* THE SOLUTION: we have seen how to do most of this stuff with various Maven tricks:

* Use a "shell" POM that is a child of your real POM. This lets you construct the full path to the
war file in your local repo.
* Use dependencies to download the WAR file to your local repo.
* Use a build profile to specify the correct artifact ID name.
* Use Ant to actually move the WAR file from the repo to the Tomcat webapp.


* That is, we can now embed Ant calls like following in our "shell" POM.

<copy file="${settings.localRepository}/${groupId}/${my.parent.artifactId}/${version}/*.war"
todir="${tomcat.home}"/>

(where my.parent.artifactId is set using build profiles)
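A profile that pins the parent's artifactId as a property might look like the sketch below. (This is only an illustration: the profile id and the my.parent.artifactId property name are invented here, not taken from a working build.)

```xml
<!-- Sketch: a default-active profile in the "shell" POM that hard-codes
     the parent's artifactId as a property, since ${pom.artifactId}
     resolves to the child's Id (testjunk) rather than the parent's -->
<profiles>
  <profile>
    <id>jobsubmit-war</id>
    <activation>
      <activeByDefault>true</activeByDefault>
    </activation>
    <properties>
      <my.parent.artifactId>jobsubmit-portlet</my.parent.artifactId>
    </properties>
  </profile>
</profiles>
```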

* If you are still paying attention, you might notice that I slipped on the ${groupId} as well, since
xportlets.jobsubmit maps to xportlets/jobsubmit in the repo.

* OK, the above exercise was pretty questionable, and next time I will actually write a plugin to do this,
but the above detours gave me a pretty good idea of what Maven 2 can and cannot do.

Checking your servers with Ant

Let's say you need to regularly check to make sure that several web servers are up and running. The little Ant script below shows how to do this. On a side note, it also illustrates Ant 1.6 "if" and "unless" conditionals. Add a top level target to call all of your hosts as dependencies.

The target will connect to a port (8080) and see if it is running. If it is up, then darya.server.available is set to not-null (the value doesn't matter) and everything is OK. If it is not running, then darya.server.available is null and the failure condition is applied. You run it with the command "ant test.darya.all".

This will also email you on target failure, assuming you have mail server running on the host that executes the ant script.

Set it up in a CRON job to run regularly.


<!-- These targets test darya -->
<target name="test.darya.ports">
<condition property="darya.server.available">
<socket server="darya.myplace.indiana.edu" port="8080"/>
</condition>
</target>

<target name="darya.success" if="darya.server.available" depends="test.darya.ports">
<echo message="Darya server is running"/>
</target>

<target name="darya.failure" depends="test.darya.ports" unless="darya.server.available">
<echo message="Darya server is down."/>
<mail mailhost="myserver.myplace.indiana.edu"
mailport="25"
subject="Darya server is down">
<from address="marpierc@indiana.edu"/>
<to address="marpierc@indiana.edu"/>
<message>
Darya server is down.
</message>
</mail>
</target>

<target name="test.darya.all" depends="darya.failure,darya.success"/>

Wednesday, November 30, 2005

Using Maven 2, Part 1

****************************************************************
Maven 2 Notes, Part 1
****************************************************************

----------------------------------------------------------------
Section 1: Getting Started
----------------------------------------------------------------

* Just download the binary and unpack. Put Maven 2's bin/ in your PATH.

* This following section is a quick review of the slightly longer intro here:

http://maven.apache.org/guides/getting-started/index.html

The Maven folks have done a good job with their introduction, so my guide below cherry-picks
the parts most relevant to me. I've also tried to supplement the official docs with some pithy observations.

* To create a new project, use the following example command. Both groupId and artifactId are required.

mvn archetype:create -DgroupId=xportlets.jobsubmit -DartifactId=jobsubmit-portlet

This will create the proper directory structure for your project:
1. The "archetype:create" goal will create a directory named after artifactId.
2. It will create a source directory like this: jobsubmit-portlet/src/main/java/xportlets/jobsubmit/
3. The tests will similarly go in src/test.

Maven 1 had issues with some directory structures matching source code packaging. It's not clear yet
if this is still true, but for now I will take the groupId to be the same as the package name, even
though this is not necessarily the case.

The "create" step unfortunately also populates your src/main and src/test directories with App.java and
AppTest.java, so you probably want to remove these.

* The pom.xml file is the maven2 equivalent of maven 1's project.xml. Fortunately, it looks very similar to the
old project.xml file format, except for a few changes:
1. Projects specify <packaging/>, which controls what happens when you use the "package" goal.
2. <dependencies/> can specify their scope: a jar needed only for testing gets <scope>test</scope>.
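For instance, a jar needed only for unit tests would be declared like this (using the same junit coordinates that show up in the scope examples later in these notes):

```xml
<!-- test scope: on the compile/run classpath for tests only,
     and never included in the packaged jar or war -->
<dependency>
  <groupId>junit</groupId>
  <artifactId>junit</artifactId>
  <version>3.8.1</version>
  <scope>test</scope>
</dependency>
```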

* To compile your project, just use this:

mvn compile

This will put files in the usual target/ directory, as in Maven 1. Note this creates classes but does no packaging. To
create a packaged project, use

mvn package

This will first compile, then run any tests, then create a jar, war, etc, in target/. BUT what if you want to
skip the tests? We need this for OGCE builds, which use HttpUnit and so depend on deployment. Luckily the
maven.test.skip property is still around. Use this:

mvn package -Dmaven.test.skip=true


* To run tests by themselves, use "mvn test". If the classes have not yet been compiled, Maven will do this.

* Run "mvn clean" to clean up.

* One last note here: the various goals (compile, package, clean, etc) are all "lifecycle phases" in Maven 2. More
later.

----------------------------------------------------------------
Section 2: Making Web Applications
----------------------------------------------------------------
* Web application projects obviously have slightly different requirements for their directory structure, so to create
this, use the following convenience method.

mvn archetype:create -DgroupId=xportlets.jobsubmit -DartifactId=jobsubmit-portlet -DarchetypeArtifactId=maven-archetype-webapp

This will make the directories src/main/webapp/ and src/main/resources.

* The problem here is that it is not clear where the actual java code and java tests go. Presumably they can go under
src/main/java and src/tests as before. To test this simply, I created a dummy project (see first section) and copied
over the src/main/java and src/test directories.

This looks like it worked correctly: the resulting war file had all the classes in the right places.

* Next problems: I must specify that Jar A should be included in the WAR's WEB-INF/lib, while Jar B is needed for
compilation but not to be included in the WAR. Simple enough. See the discussion on "scope" at

http://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html.

I first used the "compile" scope--this means a) use the jar to compile, and b) include it in WEB-INF/lib. Beautiful. Your
POM dependency looks like this:

<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<!-- Magic happens here -->
<scope>compile</scope>
</dependency>
</dependencies>


* Now if I use the jar to compile but do NOT want to include it in the WAR file, I do this:

<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<!-- More magic -->
<scope>provided</scope>
</dependency>
</dependencies>

That is, the "provided" scope compiles with the jar but doesn't include it. A quick look in the target/ directory confirms
this.

* But we need to write an archetype that will do all of this automatically.

----------------------------------------------------------------
Section 3: Integrating with Apache Ant
----------------------------------------------------------------
* Maven 2 no longer has that great, cheap maven.xml stuff that let you stick in all of your last-mile deployment stuff
with some ant. However, you can now embed Ant directly within your pom.xml file. The general overview is here:

http://maven.apache.org/guides/mini/guide-using-ant.html.

And the antrun plugin is here:

http://maven.apache.org/plugins/maven-antrun-plugin/

* Basically, the markup in your pom.xml looks something like the following. See the links above for full
examples.

<build>
<plugins>
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<phase>process-resources</phase>
<configuration>
<tasks>
<!-- Insert arbitrary Ant -->
<echo message="hollow world"/>
</tasks>
</configuration>
<!-- Run the ant target -->
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>

There are two important parts: the configuration and the execution. Your ant scriptlet goes inside the
<tasks> element of the configuration section. You must also run your scriptlet using the "run" goal.

* The mysterious <goal>run</goal> just means that we call the "run" goal of the associated maven-antrun-plugin.
See (again) http://maven.apache.org/plugins/maven-antrun-plugin/.


* This is typical of Maven 2 plugin behavior, BTW.

Some JSF Portlet Notes

* Formatting notes: I use quotes to denote HTML tags in this blog.

* SRB and file system portlets (gridFTP, for example) would make good use cases for JSF's general listing/tree view
classes.

* The point is that both of the above share the same general abstract file view of life, so the code can be reused.

* UISelectItems is potentially important, since it holds entire groups of values. The display of the values
seems to be tied to the component that surrounds it. For example, the HtmlSelectManyListbox always renders these
with "select" and "options" and "optgroups" if there are multiple values. There are several display objects
associated with these groups (select boxes, checkboxes, radios, etc.)
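As a sketch of what that looks like in a page (JSF 1.x JSP syntax; the backing-bean and property names here are made up for illustration):

```xml
<!-- Sketch: the listbox renders an HTML "select"; f:selectItems pulls a
     whole List of SelectItem objects from a (hypothetical) backing bean,
     so the surrounding component decides how the group is displayed -->
<h:selectManyListbox value="#{fileBean.selectedFiles}">
  <f:selectItems value="#{fileBean.availableFiles}"/>
</h:selectManyListbox>
```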

* HtmlDataTable is just a "table" at its heart. But it does lots of fancy output--basically you can line up
your multiple-part data into columns and have JSF figure out the for loops.
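A minimal sketch of letting the data table do the looping (again JSF 1.x JSP syntax, with invented bean and property names):

```xml
<!-- Sketch: dataTable iterates over a collection ("var" names the loop
     variable); each h:column becomes a table column, so the for loop
     over rows is JSF's job, not yours -->
<h:dataTable value="#{listingBean.files}" var="f">
  <h:column>
    <f:facet name="header"><h:outputText value="Name"/></f:facet>
    <h:outputText value="#{f.name}"/>
  </h:column>
  <h:column>
    <f:facet name="header"><h:outputText value="Size"/></f:facet>
    <h:outputText value="#{f.size}"/>
  </h:column>
</h:dataTable>
```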

* But the question now is this: how do you need to extend HtmlDataTable goodness? Maybe you don't and the
whole issue is mapping this to a backing bean.

* Also, note again that the backing bean can be independent of the display. So for example a tree view of a
directory and a table view (as per current gridftp) are two different views. So a neat trick would be to
convert between one view and another.

* Another common data model may be the "user session data model" in which the user wants to navigate a
logical tree of results. This may be output as either tables with a top navigation link (as per gridftp) or
it may be the common tree view, or so on.

* So the currently unanswered question is this: what is the data model for a big glob of results? If I do a
gridftp on a directory, how do I abstractly handle the results? That is, I have to map this into some
HTMLDataTable or Tree or something at some point in the future.

* Basic problem statement: portlets are good at some level, but they are not fine-grained enough. For example,
a job submission portlet is marginally useful since what you really want to do is make an application submission
portlet. A GridFTP portlet has some usefulness, but what you really want to do is collect all the files
(input, output, stdout, error, etc) into a single grouping and associate this with common GridFTP-like
tasks. That is, there is a logical filespace associated with a specific application. This may include other
stuff like metadata.

Similarly, SRB is a file system in its own right. So I need to be able to navigate this. But back to the
basic problem: how do I build an application page this way?

I cannot build a generic portlet that is an abstract version of an OGSA-DAI client. This doesn't have
much usefulness. Instead what I would like is a set of widgets and data models that can hold generic
OGSA-DAI stuff in them. These can then be used to build up specific OGSA-DAI application portlets.

* Another issue is this: how do I build up a workflow page? This is not really intended to be a workflow
composer but rather a lower level thing: I often need to associate the backend resources (code, data) with
each other. This is a common task.

* Yet another common job: need a list of all my jobs (past, pending, current, etc). This should be an
abstract view (an arbitrary collection) that gets mapped to a different view (tables, trees, etc.). These
view objects can be associated with events, where events (action events) are really COMMON GRID TASKS.

* So one can think of an ACTION EVENT LISTENER of type "run job" or of type "download file" or of type
"remove value", and so on. These are common actions. There can also be other action events associated with
a particular instance.

* Another possible use: the pages may be constructed using Sakai-like standard components for a common
look to everything. So we can adopt/modify this for a standard OGCE look.

* I think the problem with the current approach is that we confuse ACTIONS with PORTLETS. For example, the GRAM
portlet is really a single action portlet (for the most part): do the thing. GridFTP is a collection of actions
for remote file manipulation, and so on. In the current (velocity) set up, these actions are tied to specific
forms by name. But this is too rigid. I can't (for example) connect the action code of a button to multiple
listeners.

* Some JSF virtues:
- Multiple views to same data: my GridFTP view can toggle between Tree and Table views.
- Reusable views for different data: GridFTP, SRB, Job Monitors, OGSA-DAI can all reuse my nifty tree widget.
- Multiple event listeners on a single form. Compare this to Velocity, JSP, which basically have
1 action per form. Consider the problem of reusing the action code of a velocity portlet.
- Input form names are auto-generated, so you don't have to keep track of these or break your
portlet if you change names. Or, you don't have to adopt naming conventions to reuse your action
code. Consider again the Velocity example.
- Lots of built-in validators (for strings, ints, etc, or for ranges of values, and so on).
- We can create libraries of stuff: events, nifty display widgets, etc.

* Validators: can write custom validators to make sure that the user has a credential.

Some Globus 4 Notes

* Install as normal. Note however that all of the Globus WS stuff depends upon a postgres DB being properly configured. See the instructions
under the RFT manual at http://www.globus.org/toolkit/docs/4.0/admin/docbook/ch10.html. If you don't get this running, then your globus
remote command executions will work but they will not clean up properly and so will never exit.

* TO RESTART POSTGRES SERVER: this is needed to do anything with the ws-gram container. The steps are these:
1. Login to gf1 as the user "postgres".
2. Use the installation in the directory /usr/local/pgsql. There is another installation at /var/lib that should be ignored.
3. Set this environment variable: export PGDATA=/usr/local/pgsql/data.
4. Run the command "/usr/bin/pg_ctl start -o -i" as the postgres user. If everything is OK, you will get something like this:

bash-2.05b$ pg_ctl start -o -i
pg_ctl: Another postmaster may be running. Trying to start postmaster anyway.
LOG: database system was interrupted at 2005-06-08 18:13:55 EST
LOG: checkpoint record is at 0/873128
LOG: redo record is at 0/873128; undo record is at 0/0; shutdown FALSE
LOG: next transaction id: 1543; next oid: 25213
LOG: database system was not properly shut down; automatic recovery in progress
LOG: ReadRecord: record with zero length at 0/873168
LOG: redo is not required
postmaster successfully started
bash-2.05b$ LOG: database system is ready


5. Test to make sure the rftDatabase is running correctly. Use the command "psql rftDatabase". You should see


bash-2.05b$ psql rftDatabase
Welcome to psql 7.3.4-RH, the PostgreSQL interactive terminal.

Type: \copyright for distribution terms
\h for help with SQL commands
\? for help on internal slash commands
\g or terminate with semicolon to execute query
\q to quit

rftDatabase=#


6. The GF1 database is configured to work with the user "manacar" so su manacar. Set up your globus environment as necessary:
"export GLOBUS_LOCATION=/home/globus/nmi-all-7.0-red9/" and then "source $GLOBUS_LOCATION/etc/globus-user-env.sh".

7. Start the container as user manacar: "globus-start-container".

Some Historical GridFTP Issues

* Summary:
- The CoG's file listing GridFTP operations don't work with GridFTP 2.0 servers in GT 4.
- PWD is flaky (sometimes works, sometimes not).

* GridFTP Installations
- palermo.ucs.indiana.edu runs GridFTP v 1.17 from NMI R5 (~ 1 year old).
- danube.ucs.indiana.edu runs GridFTP v 2.0 from the GT4 release (from globus.org).
- gf1.ucs.indiana.edu runs GridFTP v 2.0 (the same) from the NMI R7 release.

* Set up:
- I run all servers in the standard way, as root, started by xinet.d.
- I checked installations using the globus-url-copy tool in various combinations
(gf1 connect to danube, danube to palermo, danube to gf1). After lots of fun with
certificates and signing policies, it all works.
- Note of course this does not let me verify the list operation that causes problems.

* Here is my cog-kit set up:
- I use Mike's maven stuff to download all cog-jars.
- After unpacking, I run

maven ogceDeploy:ogceDeploy -Dtomcat.home=$HOME/GridFTPTest/test-shared-libs

- I set the classpath to point to the bazillion jars in the shared/lib that gets created.
- I then compile the demo FileOperations.java program from the CoGKit website
(I downloaded it directly from the website).

* Sanity check:
- I run several commands successfully against the old GridFTP 1.17 server on palermo.
- OK, rmdir doesn't work but I may just be using it incorrectly.

* Now for GridFTP on gf1. The following work OK.
- cd works (verified with mkdir)
- mkdir
- rmdir (actually doesn't work but doesn't report errors either).
- isDirectory
- exists
- rename
- putfile
- getfile


* Below are the commands that cause problems. These are the same for both the NMI R7 server
and the GT4 version (not unexpected since they are identical versions).

---------------------------------------------------------------------------------
* Problem: ls
---------------------------------------------------------------------------------

Please Enter your command with its arguments
ls
- Control channel sending: PASV

- Control channel received: 227 Entering Passive Mode (156,56,104,81,145,163)
- Control channel sending: LIST -d *

- Getting default credential
Operation failed: File operation failed
Submission Exception: File Operation failed
Please Enter your command with its arguments

----------------------------------------------------------------------------------

----------------------------------------------------------------------------------
Problem: ls
----------------------------------------------------------------------------------
Please Enter your command with its arguments
ls /home/gateway
- Control channel sending: PWD

- Control channel received: 257 "/home/gateway" is current directory.
- Control channel sending: CWD /home/gateway

- Control channel received: 257 "/home/gateway" is current directory.
- Control channel sending: PASV

- Control channel received: 250 CWD command successful.
Operation failed: File operation failed
Submission Exception: File Operation failed
Please Enter your command with its arguments
----------------------------------------------------------------------------------


----------------------------------------------------------------------------------
Problem: PWD. Actually, this always seems to fail when used immediately
after another client command (ls, cd, etc), but then works the second time.
----------------------------------------------------------------------------------
Please Enter your command with its arguments
pwd
- Control channel sending: PWD

- Control channel received: 227 Entering Passive Mode (156,56,104,81,145,164)
Operation failed: File operation failed
Submission Exception: File Operation failed
Please Enter your command with its arguments
----------------------------------------------------------------------------------

----------------------------------------------------------------------------------

SVN Notes

* Installed svn on gf2.ucs.indiana.edu with "yum install svn". This was my first time using yum--much better than unadorned rpm.

* Logged into pamd.csit.fsu.edu, created a new repository with

svnadmin create $HOME/VLAB_SVN

* Imported vlab source with svn import vlab file:///home/myname/VLAB_SVN/vlab.

* Also was able to list files: svn list file:///home/myname/VLAB_SVN/vlab.

* After fooling around for a bit, I got SVN to work over ssh from gf2. Since my usernames on pamd and gf2 are not the same, I had to use

export SVN_SSH="ssh -l myname"

on the command line. Then the usual SVN command worked:

svn list svn+ssh://pamd.csit.fsu.edu/home/myname/VLAB_SVN/vlab

* Checked out on gf2 successfully with

svn checkout svn+ssh://pamd.csit.fsu.edu/home/myname/VLAB_SVN/vlab

* I deleted the unnecessary "target" directory with the commands

svn delete target svn+ssh://pamd.csit.fsu.edu/home/myname/VLAB_SVN/vlab
svn commit -m "Deleting build directory"

* The "vlab" group already exists on pamd, so I changed group ownerships to that. I then had to set the group bit with

chmod -R g+s .

This means that all new files created in any subdirectories are created with "vlab" group ownership.

* Finally, the umask needs to be set correctly. I used

umask 007

since this gives group write permission. But it is a problem since my default group is "visitor" so anyone in "visitor" can
create files in my account. I think this needs to go in each user's .login or .cshrc file. I got around this by changing all of
my group permissions to "vlab".

* Looks like a cheap way to set permissions is to alias the svn command to "umask 007; svn". This must be done on each client
machine, however, whereas the "put umask 007 in the .login file" requires less work.

uPortal Notes

* Started with vanilla install of the Quick Start version. I recommend doing this since you a) realize you have
to run the DB server separately (not well documented in the full build stuff), and b) you can see what it is
supposed to do.

* Doing vanilla install of the "uportal only" distribution of 2.3.3.

* Got Hypersonic from http://sourceforge.net/projects/hsqldb.

* Had some documentation issues. Finally settled on the instructions in the distribution's
README, but also the ones below. The README worked but had an important omission about starting the
database as a separate process. You need to run this command from the hsqldb/lib directory:
java -cp hsqldb.jar org.hsqldb.Server -database uPortal2 -port 8887

* Note you should (always) explicitly state your classpath (as in the above line) or you will regret it.

* You also must change out the version of hsqldb.jar in the uportal lib directory if using hsqldb-1.7 or later.

* For some reason, I had problems with localhost: I had to change localhost in rdbm.properties's jdbcUrl variable
to the full machine name before uPortal would build.

* The other installation documents had problems--wrong version of Tomcat, etc.
- Found lots of documentation at http://www.uportal.org/administrators/index.html, using
http://www.fair-portal.hull.ac.uk/downloads/uPortalGuide1.pdf. But it had some problems--it was written for
the pre-JSR 168 version.
- There is also the install guide from http://www.uportal.org/administrators/install_v2.html, but it
suffers from excessive hyperlinks and didn't have a printer-friendly version of the whole thing.

* Also ran into some Tomcat issues: 5.0.19 DID NOT WORK but 5.0.27 was OK. I note that 5.0.18 comes with
Quick Start, so 5.0.19 must have broken something. And in fact I now see everywhere (including the Pluto site)
that it has a well known bug.

* The portlet war files are located in lib/portlets and include the pluto testsuite and some other stuff. These
do not include the Pluto testsuite source code, so to look at this I had to get them from the pluto site using

[prompt> cvs -d :pserver:anoncvs@cvs.apache.org:/home/cvspublic login
[prompt> cvs -z3 -d :pserver:anoncvs@cvs.apache.org:/home/cvspublic co jakarta-pluto

* Useful Links: there are tons of documentation, a lot of which is on non-jasisg sites, so you should download and save
locally--the links were often broken while I was surfing. Of all the stuff, I found the following useful for
these specific things:
a) Adding tabs, portlets/channels, and content: http://mis105.mis.udel.edu/ja-sig/docs/uPortal_Documentation_Chapter_1.pdf.
Quick and dumb. Don't forget to save those changes.
b) Changing the ugly default (guest) layout and adding users: http://www.uportal.org/implementors/usermgmt.html
c) 168 Portlets in uPortal: http://www.uportal.org/implementors/portlets/workingWithPortlets.html

* Adding portlets is a bit tricky, as described in the above link. You have to get the name correct when editing uPortal's
channels. This is somewhat scruffy:
a) The first set of forms, used for naming the portal, don't matter too much and are probably for human consumption.
b) The second set of forms, "portlet definition", has to be done this way: [war_file_name].[portlet_name], where
[war_file_name] is the name of the war/webapp that contains the portlet code, and [portlet_name] is the value
of portlet.xml's <portlet-name> tag.
c) So for the proxy manager portlet, proxymanager-portlet.war, the name to enter into the uportal form to properly
create this channel is "proxymanager-portlet.ProxyManagerPortlet".


* Experimenting with adding a portlet: edited the testsuite's portlet.xml file to include TestPortlet3, a clone of TestPortlet1 and 2.
- Restarted Tomcat just in case and logged in as developer/developer.
- Added a new, empty tab for this.
- Clicked the channel manager and went through the steps
- Then edited the blank tab to add the new portlet/channel (it appeared correctly on the list).
- Don't forget to save the changes.

* And it failed--skip ahead to see how to do this correctly.
I got this error:
Channel ID: 53
Message: IChannelRenderer.completeRendering() threw
Error type: Channel failed to render (code 1)
Problem type:
Error message Initialization of the portlet container failed.

* The stackTrace gave
java.lang.NullPointerException
at org.apache.pluto.invoker.impl.PortletInvokerImpl.invoke(PortletInvokerImpl.java:109)
at org.apache.pluto.invoker.impl.PortletInvokerImpl.load(PortletInvokerImpl.java:80)
at org.apache.pluto.PortletContainerImpl.portletLoad(PortletContainerImpl.java:205)
at org.jasig.portal.channels.portlet.CPortletAdapter.initPortletWindow(CPortletAdapter.java:261)
at org.jasig.portal.channels.portlet.CPortletAdapter.renderCharacters(CPortletAdapter.java:453)
at org.jasig.portal.MultithreadedCharacterChannelAdapter.renderCharacters(MultithreadedCharacterChannelAdapter.java:71)
at org.jasig.portal.ChannelRenderer$Worker.run(ChannelRenderer.java:481)
at org.jasig.portal.utils.threading.Worker.run(Worker.java:88)

* Neither logging out/in nor restarting the server helped.

* Also, I could add TestPortal2 with no problem.

* Noticed I had added two different versions of Test Portal #3. Redid everything and now get
Channel ID: 61
Message:IChannelRenderer.completeRendering() threw
Error type: Channel failed to render (code 1)
Problem type:
Error message Initialization of the portlet container failed.

* The stack trace from Pluto was essentially the same as above, despite the different uPortal error. In fact, the
same Pluto error seemed to cause several different uPortal errors.

* So, after a tip from Eric, I redeployed and that worked. I unpacked testsuite.war, edited the portlet.xml file to
add test portlet #3, recreated the war (jar -cf) and redeployed using
ant deployPortletApp -DportletApp=./lib/portlets/testsuite.war
(the usual way) from the uPortal directory. Restarted Tomcat, published the channel, then subscribed and all was well.

JSR 168 Portlet Notes

* These are written to be used with uPortal 2.3.3 or so, but they should be general.

* I'm taking this as the directory structure
src---build.xml [ant file for building, deploying, and warring the portlet]
|
--external_libs [pluto/portlet jar files not to be included in the war]
|
--java [portlet source code]
|
--webapp [the deploy directory that gets warred up at the end]

The webapp directory has this structure:
webapp---images
|
--jsp [jsp pages loaded by the portlet]
|
--WEB-INF [classes, lib, web.xml, portlet.xml]

* If using ant, make sure you read the task documents carefully. The directory
structure it creates may not match your source directory--you have to specify lib and classes
explicitly.

* I wasted a lot of time trying to deploy a simple portlet into uPortal and finally
found that my web.xml file had a problem. I originally copied this over from the
testsuite's web.xml file:


<web-app>
    <display-name>Junk Testsuite</display-name>
    <security-role>
        <role-name>tomcat</role-name>
    </security-role>
</web-app>



The uPortal tool "ant deployPortletApp" constantly gave "Exception in thread "main" java.lang.StackOverflowError".
This is in the Deployer.java file from uPortal, in the prepareWebArchive() method. The lines

WebApplicationUnmarshaller wau = new WebApplicationUnmarshaller();
wau.init(new FileInputStream(webXml), webModule);

were the source of the error--probably this wraps some castor-like thing that processes the xml. I don't know why it
failed for my test portlet but worked for the testsuite. Anyway, I changed this to


<web-app>
    <display-name>Junk Testsuite</display-name>
</web-app>


and it worked like a charm.

* I'm doing everything below as uPortal's "admin" user. I don't know or want to learn the uPortal channel control
permissions yet.

* After deploying the portlet, I can add it using the channel manager. Had problems because, as usual, I did not read the
(uPortal portlet) instructions carefully. You must make sure that you give the channel the correct Portlet
Definition ID. If you don't you will get errors (not repeated here) telling you that uPortal couldn't find
that ID. Just make sure your ID is [webapp_name].[portlet_name] where [webapp_name] is the name of the webapp
that contains the portlet (same name as the .war) and [portlet_name] is defined in the portlet's portlet.xml file.

* And my portlet content did not show up, although I got no errors. After futzing around for a while, I realized I had made
the dumb mistake of omitting the call to the request dispatcher's include() method. Along the way, I managed to learn that
the portlet context is indeed for the webapp that runs it (not uPortal, but "testsuite" or whatever webapp), so the
request dispatcher should be initialized with paths relative to the specific webapp. In other words, if you want to
load webapps/junk/junk.jsp, you use context.getRequestDispatcher("junk.jsp").
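The working doView ends up looking roughly like this--a sketch assuming the javax.portlet API, with JunkPortlet and junk.jsp as made-up names (note the spec wants dispatcher paths to begin with "/"):

```java
// Sketch only -- assumes javax.portlet on the classpath and a webapp named "junk"
import java.io.IOException;
import javax.portlet.*;

public class JunkPortlet extends GenericPortlet {
    protected void doView(RenderRequest request, RenderResponse response)
            throws PortletException, IOException {
        response.setContentType("text/html");
        // The portlet context belongs to the "junk" webapp, so this path
        // resolves against webapps/junk/, not against uPortal's own webapp
        PortletRequestDispatcher dispatcher =
                getPortletContext().getRequestDispatcher("/junk.jsp");
        dispatcher.include(request, response); // the call I had omitted
    }
}
```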

* After a couple of times recreating the war and redeploying it with uPortal's build target, I realized this procedure could
be shortened by just recompiling the portlet directly into the deployment location (ie the
tomcat webapp) and restarting the webserver to reload the classes rather than redeploying the portlet through uPortal
(which introduces uncertainties). This worked fine.

* System.out.println's indicate that the portlet runs doDispatch() after running init() and before doView()--more precisely, doView
is not called if doDispatch is there. If I remove doDispatch(), then doView() is called. Tra-la.
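That ordering makes sense if you model what GenericPortlet does: render() delegates to doDispatch(), and it is the *default* doDispatch() that routes VIEW mode on to doView(). The toy classes below (stand-ins for the real javax.portlet types, runnable without a container) show why overriding doDispatch() starves doView():

```java
import java.util.ArrayList;
import java.util.List;

// Minimal model of GenericPortlet's dispatch order (not the real API):
// render() calls doDispatch(), whose default body routes VIEW mode to doView().
class MiniGenericPortlet {
    final List<String> calls = new ArrayList<>();
    void render(String mode) { doDispatch(mode); }
    void doDispatch(String mode) {              // "default" implementation
        calls.add("doDispatch");
        if (mode.equals("view")) doView();
    }
    void doView() { calls.add("doView"); }
}

// Overriding doDispatch() without calling super takes over the routing,
// so doView() is never reached -- the behavior observed above.
class OverridingPortlet extends MiniGenericPortlet {
    @Override void doDispatch(String mode) { calls.add("doDispatch"); }
}

class DispatchDemo {
    public static void main(String[] args) {
        MiniGenericPortlet plain = new MiniGenericPortlet();
        plain.render("view");
        System.out.println(plain.calls);      // [doDispatch, doView]

        OverridingPortlet overriding = new OverridingPortlet();
        overriding.render("view");
        System.out.println(overriding.calls); // [doDispatch]
    }
}
```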

* My portlet definition xml file never shows up in uPortal's WEB-INF/classes/properties/chanpub directory. May have to
do this by hand, but I don't yet see that this is a problem.

* Another flakiness: if you delete a channel, it will still appear in the list of available channels until you log out.
This may be an update issue--if I wait long enough, the system configuration will update itself. But I did not follow up
on this--it is quicker to just logout and login.

* A useful debugging note: load your jsp pages from their deployment directories first to make sure they are debugged before
loading them through portlets. This only gets you so far, but at least it can catch simple compiler errors. Your portlet variables
(from the Pluto defineObjects tag) will have null values, so the page won't work in standalone mode outside the container (ie by
loading directly in the browser).

* If you have mistakes in your JSP content (things that prevent it from compiling, noted above) you get:

    Channel ID: 94
    Message: IChannelRenderer.completeRendering() threw
    Error type: Channel failed to render (code 1)
    Problem type:
    Error message null [based on exception: null]

* One issue that will be important (and I think is undefined) is portlet caching rules. That is, how differently do different
containers handle caching? This is important for linked content that makes heavy use of forms. You don't want, for example,
to repost form parameters every time you reload a tab. Likewise you don't want to lose state just by clicking on different tabs
before returning to your original portlet.

* Interestingly, JSP session variables and cookies loaded by portlets from different webapps have the same ID and the same
session cookie. But session attributes are different! Session attributes put into the session in one jsp page can be retrieved
by other JSP pages loaded/managed by the same portlet, but they can't be retrieved by JSP pages running in other webapps that
are managed by other portlets. The isolation itself is not unexpected; the surprise is that the two sessions report the same ID at all.