Compilation tools


If you are developing applications as your day job then you are more than likely used to compiling your code and then running it (I know there are exceptions to the rule among those using interpreted languages, but it seems a majority of us still use compilers). Some of you might even remember “the old days”, when compiling a medium-sized project with a Borland, Watcom or Microsoft C++ compiler on a 486 used to take an hour, time which your team spent around the coffee machine discussing the latest football results, only to find out at the end that “compilation failed” 🙂 When the Pentium and subsequent faster processors came on the market, and with compilers themselves getting faster and cleverer (for a while), this sort of thing happened less and less… until recently!

I use Java mostly nowadays, with the Eclipse IDE as my editor, coupled with build tools like ant and maven. And I have to say I’m not that happy with the setup! Before you start preaching about other IDEs and how good they are, let me explain that I don’t think the problem is with Eclipse, but rather with the build tools.

When Eclipse came out (and I embraced it straight away) ant was already around as a build tool; however, Eclipse also had its own builder, which I would use in most cases as it was well integrated with Eclipse and made debugging very easy. Once I was happy with the final result of my work I’d run an ant build from Eclipse, which compiled and packaged the whole thing ready to be shipped. Then some egg-heads added ant plugins to Eclipse, and turning those on meant that every time I saved a file a full ant build would start (and like any developer I hit that Ctrl+S after each line of code written!). Granted, the plugins were clever enough to only compile the sources modified — but the fact that Eclipse had to spawn an external process and wait for it made it take forever (as opposed to the internal builder, which would compile just the single file you saved!).

Thing is, the compilation tools are trying to be so clever nowadays, and in this “cleverness” they introduce so much complication that seems unnecessary to me. While arguably one occasionally has to pay the price of complexity, the problem is that even doing the smallest things adds such an overhead that these compilation tools become not just unfriendly but pretty much unusable.

I’ll walk you through a few examples with something just a tad more complex than the standard “Hello, World!”: assume you have two libraries, one maintained by a friend, which handles the GUI side of things, and one maintained by you, which uses it to show some sort of “Hello World”-like message. For simplicity, I’ll call the GUI library A and the other one B.

Back in command-prompt times, one would simply write this to compile the 2 packages:

javac -d out/ A/*.java B/*.java

Sorted! This produces all the .class files in the out/ directory, creating the package/directory structure as needed.
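For instance, with a hypothetical A/Gui.java and B/Hello.java (the file names here are made up for illustration), the out/ directory would end up looking something like:

```
out/
  A/Gui.class
  B/Hello.class
```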

Obviously, in time we get a bit cleverer and think we could separate the two libraries into two jars: a.jar and b.jar — so when I compile b.jar I simply put a.jar on the classpath and don’t worry about compiling it all the time. So my little script becomes:

javac -d out/ -cp a.jar B/*.java

Done! a.jar now gets maintained outside the scope of our B project and we only use it.
Now I might need other “default” libraries in my compilation (log4j? Apache Commons? etc.), so I might change my script slightly to make it more “portable”:

javac -d out -cp "${CLASSPATH}:a.jar" B/*.java

and that is pretty standard bash scripting for a Java compilation.

With just a few changes you could turn this into a very similar MS-DOS batch script:

javac -d out -cp "%CLASSPATH%;a.jar" B\*.java

If at any point you want to do a full rebuild, simply delete the out folder and you’re ready to re-compile everything:

rm -rf out/

And if you want to package it all up as a jar (the -C flag keeps the out/ prefix out of the entry names):

jar cf b.jar -C out .
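Put together, the whole “build system” fits in a handful of lines of shell. A sketch, assuming the A/ and B/ layout and the a.jar/b.jar names used above:

```shell
#!/bin/sh
# build.sh -- compile project B against a.jar and package it up.
set -e

case "$1" in
  clean)
    # Full rebuild: just delete the outputs.
    rm -rf out b.jar
    ;;
  *)
    mkdir -p out
    javac -d out -cp "${CLASSPATH}:a.jar" B/*.java   # compile
    jar cf b.jar -C out .                            # package
    ;;
esac
```

Run `./build.sh` to build and `./build.sh clean` to start from scratch.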

Now you take something like Ant — and the above one-line compilation command turns into this:

<project name="B" default="dist" basedir=".">
 <description> simple example build file for project B (b.jar) </description> 
 <property name="build" location="out"/> 
 <property name="dist" location="dist"/>
   <target name="init">
     <!-- Create the time stamp -->
     <tstamp/>
     <!-- Create the build directory structure used by compile -->
     <mkdir dir="${build}"/>
   </target>
   <target name="compile" depends="init" description="compile the source files" >
     <!-- Compile the java code from current directory into ${build} --> 
     <javac srcdir="." destdir="${build}"/> 
   </target>
   <target name="dist" depends="compile" description="generate the distribution jar" > 
     <!-- Create the distribution directory --> 
     <mkdir dir="${dist}/lib"/>
     <!-- Put everything in ${build} into the b.jar file --> 
     <jar jarfile="${dist}/lib/b.jar" basedir="${build}"/> 
   </target>
   <target name="clean" description="clean up" > 
     <!-- Delete the ${build} and ${dist} directory trees --> 
     <delete dir="${build}"/> 
     <delete dir="${dist}"/> 
   </target> 
</project>

You tell me this simplifies my work?? Bearing in mind that, having written this whole file, I still have to invoke:

ant compile

And God forbid you haven’t named your file build.xml, because then you need yet another parameter (ant -f mybuild.xml compile) just to point ant at it!

And next let’s have a look at Maven — which is gonna save the world apparently:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<groupId>com.mycompany.app</groupId>
	<artifactId>my-app</artifactId>
	<packaging>jar</packaging>
	<version>1.0-SNAPSHOT</version>
	<name>Maven Quick Start Archetype</name>
	<url>http://maven.apache.org</url>
 
	<distributionManagement>
		<site>
			<id>dev.site</id>
			<name>dev.site</name>
			<url>scp://server/path</url>
		</site>
		<repository>
			<uniqueVersion>true</uniqueVersion>
			<id>dev.repository.release</id>
			<name>dev.repository.release</name>
			<url>scp://server/path</url>
		</repository>
		<snapshotRepository>
			<uniqueVersion>false</uniqueVersion>
			<id>dev.repository.snapshot</id>
			<name>dev.repository.snapshot</name>
			<url>scp://server/path</url>
		</snapshotRepository>
	</distributionManagement>
 
	<repositories>
		<repository>
			<snapshots>
				<enabled>false</enabled>
			</snapshots>
			<id>central</id>
			<name>Maven Repository Switchboard</name>
			<url>http://repo1.maven.org/maven2</url>
		</repository>
		<repository>
			<snapshots>
				<enabled>true</enabled>
			</snapshots>
			<id>cognitivematch</id>
			<name>My Own</name>
			<url>http://server/path</url>
		</repository>
	</repositories>
 
	<pluginRepositories>
		<pluginRepository>
			<id>onejar-maven-plugin.googlecode.com</id>
			<url>http://onejar-maven-plugin.googlecode.com/svn/mavenrepo</url>
		</pluginRepository>
	</pluginRepositories>
 
	<dependencies>
		<dependency>
			<groupId>junit</groupId>
			<artifactId>junit</artifactId>
			<version>3.8.1</version>
			<scope>test</scope>
		</dependency>
	</dependencies>
</project>

Don’t know if you noticed, but in the above we haven’t called this module B any more, and there’s no reference to a.jar either — because you can’t quite do that so easily with Maven. Nope, if a.jar is just another jar I put together (maybe just to gather a set of util functions/classes), then to make it available via Maven to the other applications I write, I have to jump through hoops:

  • first, I need a Maven repository set up
  • secondly, I need to decide on a “proper” artifact name for my simple a.jar — one which would be unique in the whole wide world, should my a.jar ever decide to step outside my PC!
  • then I need to deploy this a.jar into my local repository (oh yeah, did I mention the scp/https configuration and all that malarkey?)
  • and finally I can reference it in my Maven file.

Ahem, can’t I just copy it locally and use -cp a.jar???
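For the record, the hoop-jumping for a purely local a.jar looks something like this (the com.example coordinates are made up for illustration; install:install-file at least drops the jar straight into your local ~/.m2 repository, sparing you the remote-repository ceremony):

```shell
# Install a.jar into the local Maven repository under invented coordinates.
mvn install:install-file \
    -Dfile=a.jar \
    -DgroupId=com.example \
    -DartifactId=a \
    -Dversion=1.0 \
    -Dpackaging=jar
```

After which B’s pom.xml can declare a dependency on com.example:a:1.0 like any other.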

And at the end of it, if I simply want to make sure that the refactoring I’ve done compiles (not interested at this point in whether it does what it’s supposed to do), I need to remember to use:

mvn -DskipTests clean install

Or, if I know my last refactoring has broken the tests but I’m not worried about that yet — I just want the whole hierarchy of classes I’ve changed to compile so I can get on with the coding, and unit-test it once everything compiles fine — it gets even more obscure, because -DskipTests still compiles the tests; to skip compiling them as well you need:

mvn -Dmaven.test.skip=true clean install

Oh, and you have to use versions! Even though you are writing the code for your own benefit, you must use strict versioning. No shit! So instead of simply compiling stuff (in under a minute, as before) and concentrating on the main aspect of what I do (coding, that is!), I now have to dedicate a fair amount of that time to project structure management, version management and http/ssh configuration; but hey, I’ll be safe in the knowledge that when I finally run a compile (and the damn thing works!) the whole infrastructure I built in order to compile my sources is rock solid!

Or I could just go to a command prompt and type

javac -d out A/*.java B/*.java

and save myself a few months of setting it all up, plus a load of EC2 instances, probably, to host the repository, the authentication and God knows what else!

So, to all of you out there involved in building these “great” tools: you know, it would be a good idea to actually use them occasionally — if you can! 😀
