JavaFX Kit Missing required modules


Missing required modules for Plugin JavaFX Kit: requires javafx.sdk

I've been trying to install NetBeans IDE 6.1 with JavaFX. According to the Sun JavaFX technology web page, it provides all the tools you need to build a JavaFX application.
There are two ways to install it:

  1. The recommended one is to download a build of NetBeans IDE 6.1 with JavaFX support included.
  2. If you already have NetBeans 6.1 installed you can install the JavaFX plugin manually.
Sun recommends that you install the NetBeans IDE 6.1 with JavaFX support included. By doing so, you minimize the steps needed to get started with JavaFX application development. I decided to follow the recommendation and use the NetBeans IDE 6.1 with JavaFX support included. When I looked at the different versions on the NetBeans IDE 6.1 with JavaFX download page, I noticed that there are versions only for Windows and Mac users. No version for Linux.

With no other choice I had to go the manual way. I downloaded NetBeans 6.1 from the NetBeans IDE download page and followed the Adding JavaFX Support to a Previously Installed IDE instructions up to step 6. I typed javafx in the Search text field to help locate the JavaFX plugins. Two plugins showed up on the list: JavaFX Kit and JavaFX Weather Sample. I marked both of the JavaFX plugins and pressed Install. To my surprise I got an error message:
Missing required modules for Plugin JavaFX Kit: requires javafx.sdk
Apparently there is a known issue for this missing required modules for plugin JavaFX Kit problem. According to the issue report, it was fixed and verified in the preview branch.

The only option I was left with was to download and install a development build of the JavaFX Script plugin for NetBeans IDE 6.1.
  1. I downloaded a binary build from the JavaFX Hudson site.
  2. I extracted the compressed file to a temporary folder.
  3. In NetBeans I added the temporary folder as a plugin folder: Tools > Plugins > Downloaded > Add Plugins. I selected all the .nbm files from the temporary folder /nbms/javafx/. I got a list of 20 plugins to install. I pressed the Install button and followed the instructions. The installer installed all the JavaFX plugins and restarted the NetBeans IDE.
This solved the problem. Now I was able to create a JFX Script Application project using NetBeans.

VisualVM Java Profiler Causes Profiled JVM to Crash


VisualVM is a tool that integrates several management and monitoring tools for local and remote Java applications, including a Java profiler. The VisualVM profiler enables you to analyze the CPU and memory usage of local Java applications.
When attaching the VisualVM profiler to a Java application running on Java 6, the profiler may cause the Java application to crash. To prevent this from happening, you need to turn off class data sharing for the profiled application. Any Java application can be started with class sharing turned off by passing the -Xshare:off argument, as in the example below.
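For example, the profiled application could be launched like this (the jar name here is just a placeholder):

java -Xshare:off -jar myapplication.jar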

Version Mismatch on BIRT Report Viewer


My BIRT report viewer, which is deployed on a Tomcat server, used to work without any problems. Lately I have been getting the following error:
Exception
version mismatch

and "show exception stack trace" link.

When I clicked on the "show exception stack trace" link I got an empty stack trace. This was strange, because nothing had changed on the server which runs the database or the BIRT report viewer. The only thing that did change was my operating system: I upgraded my Ubuntu Linux to Ubuntu Hardy Heron 8.04 LTS, which comes with Firefox 3 beta. It appears that the BIRT report viewer version mismatch problem is somehow related to Firefox 3. The version mismatch problem does not exist when using a different browser.

BIRT is an open source Eclipse-based Java reporting system.

This problem has been fixed in BIRT 2.3.

Hadoop File System Java Tutorial


Hadoop Distributed File System (HDFS) Java tutorial.


This Java tutorial contains examples and Java code on how to create, rename, delete and do much more on the Hadoop Distributed File System using the Hadoop Java API.
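All the Java examples below assume imports along the following lines (a sketch; the DistributedFileSystem and DatanodeInfo classes used in the last example live under org.apache.hadoop.dfs in the 0.x releases and moved under org.apache.hadoop.hdfs in later releases):

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.fs.BlockLocation;
  import org.apache.hadoop.fs.FSDataOutputStream;
  import org.apache.hadoop.fs.FileStatus;
  import org.apache.hadoop.fs.FileSystem;
  import org.apache.hadoop.fs.Path;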

Copy a file from the local file system to HDFS


The srcFile variable needs to contain the full name (path + file name) of the file in the local file system. The dstFile variable needs to contain the desired full name of the file in the Hadoop file system.
  Configuration config = new Configuration();
  FileSystem hdfs = FileSystem.get(config);
  Path srcPath = new Path(srcFile);
  Path dstPath = new Path(dstFile);
  hdfs.copyFromLocalFile(srcPath, dstPath);

Create HDFS file


The fileName variable contains the file name and path in the Hadoop file system. The content of the file is the buff variable which is an array of bytes.
  //byte[] buff - The content of the file
  Configuration config = new Configuration();
  FileSystem hdfs = FileSystem.get(config);
  Path path = new Path(fileName);
  FSDataOutputStream outputStream = hdfs.create(path);
  outputStream.write(buff, 0, buff.length);
  outputStream.close();

Rename HDFS file


In order to rename a file in the Hadoop file system, we need the full name (path + name) of the file we want to rename. The rename method returns true if the file was renamed, otherwise false.
  Configuration config = new Configuration();
  FileSystem hdfs = FileSystem.get(config);
  Path fromPath = new Path(fromFileName);
  Path toPath = new Path(toFileName);
  boolean isRenamed = hdfs.rename(fromPath, toPath);

Delete HDFS file


In order to delete a file in the Hadoop file system, we need the full name (path + name) of the file we want to delete. The delete method returns true if the file was deleted, otherwise false.
  Configuration config = new Configuration();
  FileSystem hdfs = FileSystem.get(config);
  Path path = new Path(fileName);
  boolean isDeleted = hdfs.delete(path, false);

Recursive delete:
  Configuration config = new Configuration();
  FileSystem hdfs = FileSystem.get(config);
  Path path = new Path(fileName);
  boolean isDeleted = hdfs.delete(path, true);

Get HDFS file last modification time


In order to get the last modification time of a file in the Hadoop file system, we need the full name (path + name) of the file.
  Configuration config = new Configuration();
  FileSystem hdfs = FileSystem.get(config);
  Path path = new Path(fileName);
  FileStatus fileStatus = hdfs.getFileStatus(path);
  long modificationTime = fileStatus.getModificationTime();

Check if a file exists in HDFS


In order to check the existence of a file in the Hadoop file system, we need the full name (path + name) of the file we want to check. The exists method returns true if the file exists, otherwise false.
  Configuration config = new Configuration();
  FileSystem hdfs = FileSystem.get(config);
  Path path = new Path(fileName);
  boolean isExists = hdfs.exists(path);

Get the locations of a file in the HDFS cluster


A file can exist on more than one node in the Hadoop file system cluster for two reasons:
  1. Based on the HDFS cluster configuration, Hadoop saves parts of files on different nodes in the cluster.
  2. Based on the HDFS cluster configuration, Hadoop saves more than one copy of each file on different nodes for redundancy (the default is three).
  Configuration config = new Configuration();
  FileSystem hdfs = FileSystem.get(config);
  Path path = new Path(fileName);
  FileStatus fileStatus = hdfs.getFileStatus(path);

  BlockLocation[] blkLocations = hdfs.getFileBlockLocations(path, 0, fileStatus.getLen());
    
  int blkCount = blkLocations.length;
  for (int i=0; i < blkCount; i++) {
    String[] hosts = blkLocations[i].getHosts();
    // Do something with the block hosts
  }

Get a list of all the node host names in the HDFS cluster


This example casts the FileSystem object to a DistributedFileSystem object. It will work only when Hadoop is configured as a cluster; running Hadoop on the local machine only, in a non-cluster configuration, will cause this code to throw an exception (the cast to DistributedFileSystem fails).
  Configuration config = new Configuration();
  FileSystem fs = FileSystem.get(config);
  DistributedFileSystem hdfs = (DistributedFileSystem) fs;
  DatanodeInfo[] dataNodeStats = hdfs.getDataNodeStats();
  String[] names = new String[dataNodeStats.length];
  for (int i = 0; i < dataNodeStats.length; i++) {
      names[i] = dataNodeStats[i].getHostName();
  }



libstdc++2.10-glibc2.2 on Linux Ubuntu Hardy 8.04


The libstdc++2.10-glibc2.2 package is not in the Ubuntu Hardy 8.04 repositories, so it can't be installed using apt-get or the Synaptic package manager.


$ sudo apt-get install libstdc++2.10-glibc2.2
Reading package lists... Done
Building dependency tree
Reading state information... Done
Package libstdc++2.10-glibc2.2 is not available, but is referred to by another package.
This may mean that the package is missing, has been obsoleted, or is only available from another source
E: Package libstdc++2.10-glibc2.2 has no installation candidate

In order to install libstdc++2.10-glibc2.2 on Ubuntu Hardy 8.04, you need to download libstdc++2.10-glibc2.2_2.95.4-24_i386.deb and install it using dpkg:

wget mirrors.kernel.org/ubuntu/pool/universe/g/gcc-2.95/libstdc++2.10-glibc2.2_2.95.4-24_i386.deb
sudo dpkg --install libstdc++2.10-glibc2.2_2.95.4-24_i386.deb

libstdc++2.10-glibc2.2 contains libstdc++-libc6.2-2.so.3, which is required by the Eclipse TPTP Agent Controller, an Eclipse-based Java profiler.
The upgrade from Ubuntu Gutsy 7.10 to Ubuntu Hardy 8.04 removes the package. Because of this, the Eclipse TPTP Agent Controller fails to start:

$ ./ACStart.sh
Starting Agent Controller.
ACServer: error while loading shared libraries: libstdc++-libc6.2-2.so.3: cannot open shared object file: No such file or directory
ACServer failed to start.

Distributed Caching Essential Lessons


InfoQ has a presentation about distributed caching which describes some of the challenges and points to consider when you plan to implement or integrate a distributed caching system. The presentation was given by Cameron Purdy from Tangosol (Oracle) and was recorded at JavaPolis 2005. Although that was a long time ago, I think it is still relevant and might be helpful to whoever needs a distributed caching system.

Create log4j configuration file using wizardforge log4j wizard


Log4j is a widely used Java-based logging framework. If you have worked with Java open source projects, you have almost certainly used log4j in your applications. The log output is controlled through the log4j configuration. If you decide to use log4j in your Java application, or you are using a 3rd party Java library which uses log4j for logging, you will need to write a log4j configuration file.
Using the wizardforge log4j wizard we can create a log4j configuration file. This log4j wizard comes in handy when you do not know the log4j configuration file structure or syntax, or if you forgot one of the appender parameters.
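For reference, a minimal log4j.properties file that logs to the console could look like the following (a hand-written sketch, not output of the wizard):

  # Root logger at INFO level, writing to the console appender
  log4j.rootLogger=INFO, stdout
  log4j.appender.stdout=org.apache.log4j.ConsoleAppender
  log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
  # Timestamp, level, thread, logger name and message
  log4j.appender.stdout.layout.ConversionPattern=%d{ISO8601} %-5p [%t] %c - %m%n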

How to use JNDI with your J2SE application


Have you ever wished you could have a JNDI (Java Naming and Directory Interface) server in your J2SE application? Do you need to unit test a Java class which gets its resources from JNDI? Well, you can. One of the nice things about JBoss is that it is a Java open source project built from different components which can be used on their own. One of these components is the Java Naming Provider (JNP). Using the JBoss JNP we can add naming abilities to our J2SE application.

What we need are the following Java jar files:

   jnpserver.jar
   jbossall-client.jar
   log4j.jar

Of course we need to have a log4j configuration file (log4j.properties or log4j.xml) in our Java classpath.

In order to start the JNP we need to write the following code:
  System.setProperty("java.naming.factory.initial", "org.jnp.interfaces.NamingContextFactory");

  NamingBeanImpl jnpServer = new NamingBeanImpl();
  jnpServer.start();

This will start the JNP in a different thread.

We can put the following jndi.properties file in the classpath:
   java.naming.factory.initial=org.jnp.interfaces.NamingContextFactory
and then we can remove the System.setProperty(...) call from the code.

Now let's see how we can use it to test JDBC code which gets its connection to the database from a DataSource object bound to JNDI, using JUnit and DbUnit.
Here is the class which we want to test:
public class DatabaseClass {

  public static final String DB_JNDI_NAME = ...

  public Connection getConnection() throws SQLException, NamingException {
    Context ctx = new InitialContext();
    DataSource ds = (DataSource) ctx.lookup(DB_JNDI_NAME);
    return ds.getConnection();
  }

  public void insertData() {
    Connection con = null;
    try {
      con = getConnection();
      ...
      ...
    }
    ...
  }
}

You can see that the method gets the connection to the database from JNDI.
Here is the JUnit test class:
public class TestClass {

  @Test
  public void testData() throws Exception {
    System.setProperty("java.naming.factory.initial", "org.jnp.interfaces.NamingContextFactory");

    DataSource ds = new EmbeddedDataSource();
    ((EmbeddedDataSource) ds).setUser("");
    ((EmbeddedDataSource) ds).setPassword("");
    ((EmbeddedDataSource) ds).setDatabaseName("testdb");
    ((EmbeddedDataSource) ds).setCreateDatabase("create");

    NamingBeanImpl jnp = new NamingBeanImpl();
    jnp.start();

    Context initContext = new InitialContext();
    initContext.createSubcontext(DatabaseClass.DB_JNDI_NAME);
    initContext.rebind(DatabaseClass.DB_JNDI_NAME, ds);

    DatabaseClass db = new DatabaseClass();
    db.insertData();

    // assert the database content using DbUnit
  }
}

Then we can use DbUnit to compare the content of the database with the expected content. This way we can unit test the class even though it relies on the J2EE application server naming service.

Some problems you might encounter:
  Exception in thread "main" java.lang.NoClassDefFoundError: org/jboss/logging/Logger
  at org.jnp.server.NamingBeanImpl.<init>(NamingBeanImpl.java:48)
You need to add jbossall-client.jar to your classpath.

I used to start the JNP by writing the following Java code:
  org.jnp.server.Main.main(new String[] {});
as opposed to:
  NamingBeanImpl jnpServer = new NamingBeanImpl();
  jnpServer.start();
The last time I tried, I got the following exception:
  java.lang.NullPointerException
at org.jnp.server.Main.getNamingInstance(Main.java:301)
at org.jnp.server.Main.initJnpInvoker(Main.java:354)
at org.jnp.server.Main.start(Main.java:316)
at org.jnp.server.Main.main(Main.java:104)
Looking at the source code I noticed the NamingBeanImpl class and Scott Stark's comment:
  * A naming pojo that wraps the Naming server implementation. This is
* a refactoring of the legacy org.jnp.server
Using the NamingBeanImpl class I managed to solve the problem.

How to Create Connection to an Embedded Derby Database (Java DB)


The Derby database, also known as Java DB, is an open source Java database. It is part of the Apache project and it is distributed with Sun Java JDK 6.
In order to create a Connection or a DataSource object to an embedded Derby database, we need to have the Derby jar file in our Java classpath. Derby comes with JDK 6; if you don't have it, you can download it from the Derby web site or from the Sun site. On Ubuntu Linux you can install it using apt-get:
apt-get install sun-java6-javadb

In the following Java examples the database files are at data/testdb, the user name and the password are empty, and the driver is set to create the database if it doesn't exist.

Creating a single connection to an embedded Derby database:
Class.forName("org.apache.derby.jdbc.EmbeddedDriver");
return DriverManager.getConnection("jdbc:derby:data/testdb;create=true", "", "");


Creating a DataSource Object to an embedded Derby database:
javax.sql.DataSource ds = new EmbeddedDataSource();
((EmbeddedDataSource) ds).setUser("");
((EmbeddedDataSource) ds).setPassword("");
((EmbeddedDataSource) ds).setDatabaseName("data/testdb");
((EmbeddedDataSource) ds).setCreateDatabase("create");
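Either way, once the DataSource object is created, a connection is obtained from it in the usual JDBC way:

Connection con = ds.getConnection();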

Eclipse DbUnit Plugin


The Eclipse DbUnit plugin in Eclipse 3.3 is an empty shell project and is missing the DbUnit jars.
Because of this, the DbUnit Test Case wizard fails with the following error:


Creation of element failed.

java.lang.reflect.InvocationTargetException
at org.eclipse.jface.operation.ModalContext.runInCurrentThread(ModalContext.java:383)
at org.eclipse.jface.operation.ModalContext.run(ModalContext.java:313)
at org.eclipse.jface.wizard.WizardDialog.run(WizardDialog.java:934)
at org.eclipse.ui.internal.progress.ProgressManager$5.run(ProgressManager.java:1149)
at org.eclipse.swt.custom.BusyIndicator.showWhile(BusyIndicator.java:67)
at org.eclipse.ui.internal.progress.ProgressManager.runInUI(ProgressManager.java:1142)
at org.eclipse.datatools.enablement.jdt.dbunit.internal.wizards.DbUnitWizard.finishPage(DbUnitWizard.java:69)
at org.eclipse.datatools.enablement.jdt.dbunit.internal.wizards.NewTestCaseCreationWizard.performFinish(NewTestCaseCreationWizard.java:63)
at org.eclipse.jface.wizard.WizardDialog.finishPressed(WizardDialog.java:742)
at org.eclipse.jface.wizard.WizardDialog.buttonPressed(WizardDialog.java:373)
at org.eclipse.jface.dialogs.Dialog$2.widgetSelected(Dialog.java:618)
at org.eclipse.swt.widgets.TypedListener.handleEvent(TypedListener.java:227)
at org.eclipse.swt.widgets.EventTable.sendEvent(EventTable.java:66)
at org.eclipse.swt.widgets.Widget.sendEvent(Widget.java:1101)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Display.java:3319)
at org.eclipse.swt.widgets.Display.readAndDispatch(Display.java:2971)
at org.eclipse.jface.window.Window.runEventLoop(Window.java:820)
at org.eclipse.jface.window.Window.open(Window.java:796)
at org.eclipse.ui.internal.actions.NewWizardShortcutAction.run(NewWizardShortcutAction.java:135)
at org.eclipse.jface.action.Action.runWithEvent(Action.java:498)
at org.eclipse.jface.action.ActionContributionItem.handleWidgetSelection(ActionContributionItem.java:546)
at org.eclipse.jface.action.ActionContributionItem.access$2(ActionContributionItem.java:490)
at org.eclipse.jface.action.ActionContributionItem$5.handleEvent(ActionContributionItem.java:402)
at org.eclipse.swt.widgets.EventTable.sendEvent(EventTable.java:66)
at org.eclipse.swt.widgets.Widget.sendEvent(Widget.java:1101)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Display.java:3319)
at org.eclipse.swt.widgets.Display.readAndDispatch(Display.java:2971)
at org.eclipse.ui.internal.Workbench.runEventLoop(Workbench.java:2389)
at org.eclipse.ui.internal.Workbench.runUI(Workbench.java:2353)
at org.eclipse.ui.internal.Workbench.access$4(Workbench.java:2219)
at org.eclipse.ui.internal.Workbench$4.run(Workbench.java:466)
at org.eclipse.core.databinding.observable.Realm.runWithDefault(Realm.java:289)
at org.eclipse.ui.internal.Workbench.createAndRunWorkbench(Workbench.java:461)
at org.eclipse.ui.PlatformUI.createAndRunWorkbench(PlatformUI.java:149)
at org.eclipse.ui.internal.ide.application.IDEApplication.start(IDEApplication.java:106)
at org.eclipse.equinox.internal.app.EclipseAppHandle.run(EclipseAppHandle.java:169)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.runApplication(EclipseAppLauncher.java:106)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.start(EclipseAppLauncher.java:76)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:363)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:176)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:508)
at org.eclipse.equinox.launcher.Main.basicRun(Main.java:447)
at org.eclipse.equinox.launcher.Main.run(Main.java:1173)
Caused by: java.lang.NoClassDefFoundError
at org.eclipse.datatools.enablement.jdt.dbunit.internal.wizards.NewDbUnitTestCaseWizardPage1.class$(NewDbUnitTestCaseWizardPage1.java:674)
at org.eclipse.datatools.enablement.jdt.dbunit.internal.wizards.NewDbUnitTestCaseWizardPage1.createGetDataSet(NewDbUnitTestCaseWizardPage1.java:673)
at org.eclipse.datatools.enablement.jdt.dbunit.internal.wizards.NewDbUnitTestCaseWizardPage1.createTypeMembers(NewDbUnitTestCaseWizardPage1.java:520)
at org.eclipse.jdt.ui.wizards.NewTypeWizardPage.createType(NewTypeWizardPage.java:2053)
at org.eclipse.jdt.ui.wizards.NewTypeWizardPage$7.run(NewTypeWizardPage.java:2543)
at org.eclipse.ui.actions.WorkspaceModifyDelegatingOperation.execute(WorkspaceModifyDelegatingOperation.java:68)
at org.eclipse.ui.actions.WorkspaceModifyOperation$1.run(WorkspaceModifyOperation.java:101)
at org.eclipse.core.internal.resources.Workspace.run(Workspace.java:1797)
at org.eclipse.ui.actions.WorkspaceModifyOperation.run(WorkspaceModifyOperation.java:113)
at org.eclipse.jface.operation.ModalContext.runInCurrentThread(ModalContext.java:369)
... 46 more
Caused by: java.lang.ClassNotFoundException: org.dbunit.dataset.IDataSet
at org.eclipse.osgi.framework.internal.core.BundleLoader.findClassInternal(BundleLoader.java:434)
at org.eclipse.osgi.framework.internal.core.BundleLoader.findClass(BundleLoader.java:369)
at org.eclipse.osgi.framework.internal.core.BundleLoader.findClass(BundleLoader.java:357)
at org.eclipse.osgi.internal.baseadaptor.DefaultClassLoader.loadClass(DefaultClassLoader.java:83)
at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:169)
... 56 more

In order to fix the problem we need to fetch the missing DbUnit jars by running the fetch-lib.xml Ant script in the DbUnit plugin directory (plugins/org.dbunit_2.2.0.200706071):

<?xml version="1.0"?>

<project default="main" basedir=".">
<target name="main">
<mkdir dir="libs" />
<property name="base.url" value="http://superb-west.dl.sourceforge.net/sourceforge/dbunit"/>
<get usetimestamp="true" src="${base.url}/dbunit-2.2.jar" dest="libs/dbunit-2.2.jar" />
<get usetimestamp="true" src="${base.url}/dbunit-2.2.jar" dest="libs/dbunit-2.2-javadoc.jar" />
<get usetimestamp="true" src="${base.url}/dbunit-2.2.jar" dest="libs/dbunit-2.2-sources.jar" />
<eclipse.refreshLocal resource="org.dbunit" depth="4"/>
</target>
</project>

Another option is to download the DbUnit jar manually into the plugin's libs directory. We will need to copy the file to libs/dbunit-2.2.jar, libs/dbunit-2.2-javadoc.jar and libs/dbunit-2.2-sources.jar.

JUnit 4 Features


One of the main differences between JUnit 3 and JUnit 4 is the use of annotations instead of naming conventions.

Using @Test to define a test method
Using the @Test annotation we can define any method as a unit test, no matter what the name of the method is. In JUnit 3 we needed to start the method name with the word 'test'. In JUnit 4 we can still start the method name with 'test', but it will have no effect; what matters is the @Test annotation before the method:

@Test
public void testMethod() {
...
}
or we can just write:
@Test
public void method() {
...
}

Expecting Exceptions
If our test case needs to check that an exception is thrown, we can use the @Test(expected=Exception.class) annotation. The test will fail if no exception is thrown:
@Test(expected=IllegalArgumentException.class)
public void checkForException() {
throw new IllegalArgumentException();
}

Test Timeout
If we want to make sure that our code executes within a time limit, we can define a timeout for the test. The following test will fail after 100 milliseconds:
@Test(timeout=100)
public void method() throws InterruptedException {
Thread.sleep(1000);
}

Disable Test
In case we want to disable a test but still want to see it in the test list, we can use the @Ignore annotation. This way we can also document why we disabled it; the comment will be visible in the final report:
@Ignore("This is an example on how to ignore JUnit tests")
@Test()
public void method() {
...
}

@Before and @After instead of setUp() and tearDown()
@Before
public void doSomething1() {
...
}
@After
public void doSomething2() {
...
}

We can choose whatever method names we want.
We can define as many methods as we want with @Before and @After. Each one of them will be executed before or after every unit test method in the class.

In order to define a method that will be executed once, before or after all the tests in the class, we can use the @BeforeClass and @AfterClass annotations, as in the sketch below.
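A minimal sketch (the method names are arbitrary; note that, unlike @Before and @After methods, these must be static):

@BeforeClass
public static void doOnceBefore() {
...
}

@AfterClass
public static void doOnceAfter() {
...
}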

Test Suites
This test suite will run the classes Test1, Test2, Test3 and Test4 one after the other.
package com.shimi.example;

import org.junit.runner.RunWith;
import org.junit.runners.Suite;

@RunWith(Suite.class)
@Suite.SuiteClasses( {
Test1.class,
Test2.class,
Test3.class,
Test4.class
})

public class TestSuite {

}

Parameterized Unit Tests
Parameterized unit tests are a feature that helps with cases where we need to run the same unit test with different parameters. To define a parameterized unit test we need to annotate the class with @RunWith(value=Parameterized.class). We need to define a static method, annotated with @Parameters, which returns a Collection of parameter values, and a constructor which receives the parameter values and sets them in the class fields. In the following example the unit test will run 5 times; each time JUnit will call the class constructor with a different pair of Strings from the collection. The constructor initializes the class members with the given values, and then the unit test method checks whether the two Strings are equal.

@RunWith(value=Parameterized.class)
public class ParameterizedTest {

  private String param1;
  private String param2;

  @Parameters
  public static Collection paramsValues() {
    return Arrays.asList(new Object[][] {
      { "abc", "abc" },
      { "ab", "ba" },
      { "a", "b" },
      { "ab", "a" },
      { "abcd", "abcd" },
    });
  }

  public ParameterizedTest(String param1, String param2) {
    this.param1 = param1;
    this.param2 = param2;
  }

  @Test
  public void test() {
    assertEquals(param1, param2);
  }
}

Assert Results
Using the JUnit assert methods we can assert different result values and assign an error message:
assertEquals("Object 1 and Object 2 are different", object1, object2);
The assertEquals method compares Objects. If we use it with primitives:
assertEquals("Object 1 and Object 2 are different", 1, 1);
Java will use autoboxing to convert the primitives to Objects. We can also compare primitives directly using assertTrue():
int param1 = 1;
int param2 = 1;
assertTrue("Param 1 and Param 2 are different", param1 == param2);

Hidden Iterator in Collections toString()


The toString() method of the Java collection classes invokes the toString() method of each item using an Iterator.
In a multi-threaded environment, a call to toString() on a shared collection should therefore be synchronized, just like any explicit iteration, to prevent a ConcurrentModificationException.
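A minimal sketch of the idiom, using a synchronized list (the class and values here are just examples):

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class HiddenIteratorExample {
  public static void main(String[] args) {
    List<String> list = Collections.synchronizedList(new ArrayList<String>());
    list.add("a");
    list.add("b");
    // toString() iterates over the list internally, so it needs the same
    // lock that guards any other iteration over a synchronized list
    synchronized (list) {
      System.out.println(list); // calls list.toString()
    }
  }
}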

Deploying BIRT to J2EE Server


The Eclipse BIRT report viewer can be deployed on a J2EE server.

In order to use a JDBC driver in the BIRT viewer, the driver jar file needs to be in the viewer's drivers directory. In BIRT 2.2.2 it is:
/WEB-INF/platform/plugins/org.eclipse.birt.report.data.oda.jdbc_2.2.2.r22x_v20071206/drivers
In BIRT 2.3 it is:
/WEB-INF/platform/plugins/org.eclipse.birt.report.data.oda.jdbc_2.3.0.v20080610/drivers


If you are using Firefox 3 you might have noticed the version mismatch error. In this case you will need to use a different browser or upgrade your BIRT version to 2.3.

TPTP Agent Controller on Ubuntu 7.10


Prerequisites:

The Eclipse TPTP Agent Controller is compiled against libstdc++-libc6.2-2.so.3. If we don't have it under /usr/lib, then we need to install it:

sudo apt-get install libstdc++2.10-glibc2.2

Using Ubuntu Hardy 8.04? Take a look here on how to install libstdc++2.10-glibc2.2.

Agent Controller Installation:

$ sudo mkdir /opt/agntctrl.linux_ia32-TPTP-4.4.1
$ sudo ln -s /opt/agntctrl.linux_ia32-TPTP-4.4.1 /opt/tptpAC
$ cd /opt/tptpAC/
$ sudo unzip /home/shimi/agntctrl.linux_ia32-TPTP-4.4.1.zip

Run the configuration script. The script will create the Eclipse TPTP Agent Controller configuration file /opt/tptpAC/config/serviceconfig.xml:

$ cd bin
$ sudo SetConfig.sh

Add the following lines to ~/.bashrc

export PATH=$PATH:/opt/tptpAC/bin
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/tptpAC/lib
export TEMP=/tmp


Change the first line in the scripts ACStart.sh and ACStop.sh from
#!/bin/sh
to
#!/bin/bash

How to share objects between different classloaders on Terracotta


Object identity in Terracotta is based on a combination of classloader name and object reference. When using Terracotta to share an object between different applications, where the object class is loaded by classloaders with different names, an exception is thrown. In my case I shared an object between a deployed ear application on JBoss (app-ee.ear) and a console application.

Exception in thread "Thread-10" com.tc.exception.TCClassNotFoundException: java.lang.ClassNotFoundException: No registered loader for description: JBoss.UnifiedClassLoader3:deploy/app-ee.ear, trying to load org.terracotta.datagrid.workmanager.routing.RoutableWorkItem
at com.tc.object.bytecode.ManagerUtil.lookupObject(ManagerUtil.java:335)
at java.util.concurrent.LinkedBlockingQueue$Node.getItem(LinkedBlockingQueue.java:65)
at java.util.concurrent.LinkedBlockingQueue.extract(LinkedBlockingQueue.java:139)
at java.util.concurrent.LinkedBlockingQueue.poll(LinkedBlockingQueue.java:386)
at org.terracotta.datagrid.workmanager.routing.RoutingAwareWorker.start(RoutingAwareWorker.java:46)
at com.shimi.example.StartRoutingWorker$1.run(StartRoutingWorker.java:20)
Caused by: java.lang.ClassNotFoundException: No registered loader for description: JBoss.UnifiedClassLoader3:deploy/app-ee.ear, trying to load org.terracotta.datagrid.workmanager.routing.RoutableWorkItem
at com.tc.object.loaders.StandardClassProvider.getClassFor(StandardClassProvider.java:43)
at com.tc.object.ClientObjectManagerImpl.lookup(ClientObjectManagerImpl.java:522)
at com.tc.object.ClientObjectManagerImpl.lookupObject(ClientObjectManagerImpl.java:423)
at com.tc.object.ClientObjectManagerImpl.lookupObject(ClientObjectManagerImpl.java:412)
at com.tc.object.bytecode.ManagerImpl.lookupObject(ManagerImpl.java:651)
at com.tc.object.bytecode.ManagerUtil.lookupObject(ManagerUtil.java:333)
... 5 more


In order to solve the problem, all the VMs which are synchronized by Terracotta need to load the shared class with a classloader with the same name. Since the name of the classloader which loaded the class on JBoss was JBoss.UnifiedClassLoader3:deploy/app-ee.ear, I needed to load the class in the console application using a classloader with the same name. I used a wrapper main class around the application main class. The wrapper class renames the system classloader using the Terracotta NamedClassLoader interface and starts the application.



import java.lang.reflect.Method;
import java.net.URL;
import java.net.URLClassLoader;
import com."com/caucho/loader/EnvironmentClassLoader"tc.object.bytecode.hook.impl.ClassProcessorHelper;
import com.tc.object.loaders.NamedClassLoader;

public class CustomLoader {

  // usage: java CustomLoader com.example.MyRealMainClass [arg1] [arg2]

  public static void main(String[] args) throws Exception {
    URL[] systemURLs = ((URLClassLoader) ClassLoader.getSystemClassLoader()).getURLs();

    ClassLoader loader = new URLClassLoader(systemURLs,
        ClassLoader.getSystemClassLoader().getParent());

    // here is the custom classloader name
    ((NamedClassLoader) loader).__tc_setClassLoaderName("JBoss.UnifiedClassLoader3:deploy/app-ee.ear");
    ClassProcessorHelper.registerGlobalLoader((NamedClassLoader) loader);

    Thread.currentThread().setContextClassLoader(loader);
    Class mainClass = loader.loadClass(args[0]);

    Method main = mainClass.getMethod("main", new Class[] { args.getClass() });

    // skip args[0], which is the name of the real main class
    String[] nextArgs = new String[args.length - 1];
    System.arraycopy(args, 1, nextArgs, 0, nextArgs.length);
    main.invoke(null, new Object[] { nextArgs });
  }
}

How to configure Firefox with Java


In order to configure Firefox with an installed JRE, a link needs to be created in the browser plugins directory pointing to the JRE plugin library:

cd ~/.mozilla/plugins/
ln -s /usr/lib/jvm/java-6-sun/jre/plugin/i386/ns7/libjavaplugin_oji.so

JMX remote connection failure to Java application running on Ubuntu


I have been trying to remotely monitor my application, which was running on an Ubuntu server, using JConsole, and I kept getting connection failures.
When I ran the same application on my Ubuntu desktop, I did manage to connect to it using JConsole.

I wrote a small program to help me investigate the problem:


public class JmxTest {
  public static void main(String[] args) {
    try {
      Thread.sleep(Integer.MAX_VALUE);
    } catch (InterruptedException e) {
      e.printStackTrace();
    }
  }
}

and I ran it with the following properties on my Ubuntu desktop:
-Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.port=9004 -Dcom.sun.management.jmxremote.authenticate=false -Dcom.sun.management.jmxremote.ssl=false

I tried to connect to it using JConsole from a different machine on my network and got the same error. I then tried to connect to the application using MX4J and got the following exception:
java.rmi.ConnectException: Connection refused to host: 127.0.1.1; nested exception is:
java.net.ConnectException: Connection refused: connect
at sun.rmi.transport.tcp.TCPEndpoint.newSocket(TCPEndpoint.java:574)
at sun.rmi.transport.tcp.TCPChannel.createConnection(TCPChannel.java:185)
at sun.rmi.transport.tcp.TCPChannel.newConnection(TCPChannel.java:171)
at sun.rmi.server.UnicastRef.invoke(UnicastRef.java:94)
at javax.management.remote.rmi.RMIServerImpl_Stub.newClient(Unknown Source)
at javax.management.remote.rmi.RMIConnector.getConnection(RMIConnector.java:2239)
at javax.management.remote.rmi.RMIConnector.connect(RMIConnector.java:271)
at javax.management.remote.JMXConnectorFactory.connect(JMXConnectorFactory.java:248)
at org.mc4j.console.connection.JSR160ConnectionNode.connect(JSR160ConnectionNode.java:132)
at org.mc4j.console.connection.ReconnectAction.performAction(ReconnectAction.java:47)
at org.openide.util.actions.NodeAction$3.run(NodeAction.java:440)
at org.openide.util.actions.CallableSystemAction$ActionRunnable.actionPerformed(CallableSystemAction.java:247)
at org.netbeans.core.ModuleActions.invokeAction(ModuleActions.java:74)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:585)
at org.openide.util.actions.CallableSystemAction.invokeAction(CallableSystemAction.java:179)
at org.openide.util.actions.CallableSystemAction.access$000(CallableSystemAction.java:31)
at org.openide.util.actions.CallableSystemAction$ActionRunnable.doRun(CallableSystemAction.java:241)
at org.openide.util.actions.CallableSystemAction$2.run(CallableSystemAction.java:111)
at org.openide.util.Task.run(Task.java:136)
at org.openide.util.RequestProcessor$Task.run(RequestProcessor.java:330)
at org.openide.util.RequestProcessor$Processor.run(RequestProcessor.java:686)
Caused by: java.net.ConnectException: Connection refused: connect
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:333)
at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:195)
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:182)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
at java.net.Socket.connect(Socket.java:520)
at java.net.Socket.connect(Socket.java:470)
at java.net.Socket.<init>(Socket.java:367)
at java.net.Socket.<init>(Socket.java:180)
at sun.rmi.transport.proxy.RMIDirectSocketFactory.createSocket(RMIDirectSocketFactory.java:22)
at sun.rmi.transport.proxy.RMIMasterSocketFactory.createSocket(RMIMasterSocketFactory.java:128)
at sun.rmi.transport.tcp.TCPEndpoint.newSocket(TCPEndpoint.java:569)
[catch] ... 23 more

I noticed the "Connection refused to host: 127.0.1.1" part. My /etc/hosts file (on the machine where the application is running) contains a line with the entry 127.0.1.1:
127.0.0.1 localhost
127.0.1.1 ubuntu-desktop


I removed the 127.0.1.1 line and I moved the host name to the end of the first line:
127.0.0.1 localhost ubuntu-desktop

After that I tried again to connect to my application using MX4J. This time I got the same exception, but instead of 127.0.1.1 I got 127.0.0.1:
 java.rmi.ConnectException: Connection refused to host: 127.0.0.1; nested exception is: java.net.ConnectException: Connection refused: connect
at sun.rmi.transport.tcp.TCPEndpoint.newSocket(TCPEndpoint.java:574)
at sun.rmi.transport.tcp.TCPChannel.createConnection(TCPChannel.java:185)
at sun.rmi.transport.tcp.TCPChannel.newConnection(TCPChannel.java:171)
at sun.rmi.server.UnicastRef.invoke(UnicastRef.java:94)
at javax.management.remote.rmi.RMIServerImpl_Stub.newClient(Unknown Source)
at javax.management.remote.rmi.RMIConnector.getConnection(RMIConnector.java:2239)
at javax.management.remote.rmi.RMIConnector.connect(RMIConnector.java:271)
at javax.management.remote.JMXConnectorFactory.connect(JMXConnectorFactory.java:248)
at org.mc4j.console.connection.JSR160ConnectionNode.connect(JSR160ConnectionNode.java:132)
at org.mc4j.console.connection.ReconnectAction.performAction(ReconnectAction.java:47)
at org.openide.util.actions.NodeAction$3.run(NodeAction.java:440)
at org.openide.util.actions.CallableSystemAction$ActionRunnable.actionPerformed(CallableSystemAction.java:247)
at org.netbeans.core.ModuleActions.invokeAction(ModuleActions.java:74)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:585)
at org.openide.util.actions.CallableSystemAction.invokeAction(CallableSystemAction.java:179)
at org.openide.util.actions.CallableSystemAction.access$000(CallableSystemAction.java:31)
at org.openide.util.actions.CallableSystemAction$ActionRunnable.doRun(CallableSystemAction.java:241)
at org.openide.util.actions.CallableSystemAction$2.run(CallableSystemAction.java:111)
at org.openide.util.Task.run(Task.java:136)
at org.openide.util.RequestProcessor$Task.run(RequestProcessor.java:330)
at org.openide.util.RequestProcessor$Processor.run(RequestProcessor.java:686)
Caused by: java.net.ConnectException: Connection refused: connect
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:333)
at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:195)
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:182)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
at java.net.Socket.connect(Socket.java:520)
at java.net.Socket.connect(Socket.java:470)
at java.net.Socket.<init>(Socket.java:367)
at java.net.Socket.<init>(Socket.java:180)
at sun.rmi.transport.proxy.RMIDirectSocketFactory.createSocket(RMIDirectSocketFactory.java:22)
at sun.rmi.transport.proxy.RMIMasterSocketFactory.createSocket(RMIMasterSocketFactory.java:128)
at sun.rmi.transport.tcp.TCPEndpoint.newSocket(TCPEndpoint.java:569)
[catch] ... 23 more


After that I removed the host name from the file and tried again; this time everything worked.

I found this blog entry about troubleshooting connection problems in JConsole. In my case the problem was different, but it might be helpful to others.

How to Add Command Line Properties to JMeter


Command line properties are very helpful when running JMeter in non-GUI mode.

Passing a property value to the test plan:
-J[prop name]=[value]

Accessing the property value from the test plan:
${__P(prop name,default-value)}
The first value is the property name; the second is the default used if the test was started without passing a value for the property.

For example, if we want to configure the host name and port number of the server that we want to test, we can write ${__P(server.host,localhost)} in the server name and ${__P(server.port,80)} in the port number of the test plan HTTP Request Defaults config element.
The command line will look like this:
./jmeter -t testplan.jmx -n -Jserver.host=differenthostname -Jserver.port=8080

How to Disable Hadoop File System Permissions


HDFS permissions are a new feature in Hadoop 0.16.0, and they are enabled by default. In order to disable them, you need to add the following block to your Hadoop configuration file (hadoop-site.xml):


<property>
<name>dfs.permissions</name>
<value>false</value>
</property>

How to enable JMX on a Java Application


To enable the JMX agent on a Java application, we need to add the following to the application VM arguments:
-Dcom.sun.management.jmxremote

This will enable only local monitoring. To enable remote JMX connections, we also need to specify the port on which the JMX server will listen for remote connections:
-Dcom.sun.management.jmxremote.port=9004 (or any other port number)

The JMX remote connection is secured by default. To disable authentication and SSL for the JMX remote connection:
-Dcom.sun.management.jmxremote.authenticate=false -Dcom.sun.management.jmxremote.ssl=false
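Putting it all together, launching an application with remote JMX enabled and security turned off could look like this (the jar name here is just a placeholder):

java -Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.port=9004 -Dcom.sun.management.jmxremote.authenticate=false -Dcom.sun.management.jmxremote.ssl=false -jar myapplication.jar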

Much more information about monitoring and management using JMX can be found here.