It's All About Dependencies
Posted on April 27, 2008
In the effort not to reinvent the wheel over and over again, reusability has quickly become a highly desired virtue in software development. Object-orientation was once meant to deliver this, but it soon became obvious that objects are much too small a unit to be truly reusable across projects or companies. So in today's world we have ended up with modules all over the place - some call them libraries, plugins or bundles, but most often these are simple jar files; in general, a module is just a bunch of classes and resources, sometimes enriched with some meta-data.
As with objects, modules only make sense if there are dependencies between them. For objects, these dependencies were often hard-coded, but in the age of dependency injection they are rather expressed as meta-data that accompanies the classes (e.g. in a Spring application context xml).
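As a minimal sketch of this shift (all class names here are made up for illustration), compare a hard-coded dependency with an injected one - the wiring in the second case is exactly what a container like Spring reads from its meta-data:

```java
interface Greeter {
    String greet(String name);
}

class ConsoleGreeter implements Greeter {
    public String greet(String name) { return "Hello, " + name + "!"; }
}

// Hard-coded style: the class picks its own collaborator at compile time.
class HardWiredService {
    private final Greeter greeter = new ConsoleGreeter();
    String welcome(String name) { return greeter.greet(name); }
}

// Injected style: the collaborator is supplied from the outside - in a Spring
// application the choice would live in the context XML, not in the code.
class GreetingService {
    private final Greeter greeter;
    GreetingService(Greeter greeter) { this.greeter = greeter; }
    String welcome(String name) { return greeter.greet(name); }
}

public class Main {
    public static void main(String[] args) {
        // This hand-wiring is what a DI container does behind the scenes.
        GreetingService service = new GreetingService(new ConsoleGreeter());
        System.out.println(service.welcome("OSGi")); // prints "Hello, OSGi!"
    }
}
```

In a Spring context, the wiring in `main` would be replaced by a bean definition declaring that a `ConsoleGreeter` is to be injected into `GreetingService`.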
For modules, two main solutions have evolved over the past years to describe dependencies: Maven pom-files and OSGi manifest-files. Both have quite a lot in common: they specify names and versions of modules and list their mandatory or optional dependencies. Of course there are many details where one offers more than the other, but in general it is a very similar concept.
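To illustrate how similar the two descriptors are, here is roughly how the same dependency might look in each (module names and versions are made up for illustration):

```text
<!-- Maven pom.xml: a dependency on another module -->
<dependency>
  <groupId>org.example</groupId>
  <artifactId>example-core</artifactId>
  <version>1.0.0</version>
</dependency>

--- OSGi MANIFEST.MF: the analogous requirement, expressed as manifest headers ---
Bundle-SymbolicName: org.example.client
Bundle-Version: 1.0.0
Require-Bundle: org.example.core;bundle-version="1.0.0"
```

Both say the same thing at heart: "this module, in this version, needs that module".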
The difference between Maven and OSGi is their use case: Maven targets the provisioning, build and distribution of modules, while OSGi targets the development and - more importantly - the (dynamic) runtime of applications. Both are well established in their domain and have a big community of users - but slowly people realize that they actually want both at the same time. On the one hand, OSGi is introduced into "classical" applications by Spring Dynamic Modules, where the build process is best left to Maven. The OSGi crowd, on the other hand, has noticed that it is a pain to collect all required dependencies and that there should actually be something like a repository infrastructure to easily retrieve and distribute modules.
The OSGi Alliance has therefore set up the OSGi Bundle Repository (OBR), which looks very similar to Maven repositories - I just doubt that it will be equally successful. The Eclipse Foundation, one of the foremost adopters of OSGi, went one step further and introduced p2, the new provisioning system that will come with Eclipse 3.4 aka Ganymede. p2 replaces the Eclipse Update Manager, brings a new infrastructure for bundle repositories and supports dynamically installing and managing bundles in a running application. In the next phase (after the release of Ganymede), it is planned to extend p2 towards the build process as well.
Although competition is usually very welcome, this means a dilemma for the average configuration manager: building dynamic applications does not mean that one can decide for one option or the other - both approaches (Maven and OSGi) need to be combined. There are multiple tools and plugins available that help to convert Maven pom-files into OSGi manifests or vice versa, or that allow calling the plugin build from Maven or calling Maven out of Eclipse.
The problem is that when generating one descriptor file out of the other, some information is usually lost, or at least the comfortable tools that exist to maintain the files can no longer be used. So the right decision very much depends on your application: if you already use Maven for continuous integration, the pom-files will be the source of everything for you, and you can introduce OSGi by generating manifest files from them. If you instead work on an Eclipse RCP application and are used to all the comfort that Eclipse offers for editing the manifest or the plugin.xml, nobody will be able to convince you to manually put your changes into a pom-file and generate the rest from it.
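For the Maven-centric route, one option is the Apache Felix maven-bundle-plugin, which generates the OSGi manifest from the pom during the build. A rough sketch (package names are made up for illustration):

```text
<!-- pom.xml: let Maven produce an OSGi bundle instead of a plain jar -->
<packaging>bundle</packaging>

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.felix</groupId>
      <artifactId>maven-bundle-plugin</artifactId>
      <extensions>true</extensions>
      <configuration>
        <instructions>
          <!-- These instructions end up as headers in the generated MANIFEST.MF -->
          <Export-Package>org.example.api</Export-Package>
          <Private-Package>org.example.internal</Private-Package>
        </instructions>
      </configuration>
    </plugin>
  </plugins>
</build>
```

The pom stays the single source, and the manifest becomes a build artifact - which is exactly the trade-off described above: you lose the comfortable Eclipse manifest editors in exchange.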
Although p2 is a cool feature, it might make things worse, at least in the beginning: for OSGi-centric applications its use will be very desirable, while it does not offer all the flexibility of Maven. So if Maven is already in place, it is unlikely to be replaced by p2 soon. Instead there will be adapters mapping Maven repositories or OBRs to p2, and Maven builds pushing build artifacts back to them.
I would not object to using different tools if the integration were smooth - but experience shows that both worlds do not really pay much attention to each other and leave the integration part out of their scope. So usually the integration is done by some annoyed people who actually want to use things productively and have to help themselves to sort things out. Unfortunately, these kinds of integrations are often only pursued until there is a version that somewhat does the expected thing - and then it is left to the depths of SourceForge. Maybe I am too pessimistic, but the Equinox team has already stated that they are not interested in any direct support of Maven, so the way ahead might be bumpy...
Still, at some point in the far future when Spring Dynamic Modules have become a de-facto standard and p2 is stable and fully supports the build process, I am confident that building, integrating and deploying applications can be real fun :-)
Some Goodies...
Posted on April 26, 2008
During JAX'08, I learned about some helpful tools - some older, some newer, but maybe helpful for Eclipse RCP programmers or Java programmers in general. So here's a short bullet list of what I came across:
- The Nebula Project: pretty cool widgets to include in your Eclipse RCP applications. If you want something that does not look all too ordinary but has some wow-effect, that's the place for you to go.
- Many people complain that RCP applications look too similar to the Eclipse IDE (and are therefore too technical and not fancy enough), so here's another possibility to make something special out of your RCP app: use Eclipse presentations! Kai Tödter provides an easy example of how to do this with his MP3 Manager. If you want to see how far you can go with these presentations, have a look at Lotus Notes Hannover - but make sure to employ some full-time resources just for developing your RCP skin...
- The Plugin Spy will come as a feature of Ganymede (Eclipse 3.4) and makes it much easier to find out what you are looking at (which class, extension, id etc.).
- The Eclipse Core Tools can be very useful for debugging - they give you some insight into places that are otherwise hidden from your view.
- Everybody who has problems with memory leaks in Java applications will very much enjoy the new contribution from SAP, the Memory Analyzer.
- Something already quite old, but definitely useful, is Pack200, a packager specialized in shrinking jar files - which it does very efficiently. Although jars are usually already zipped, their size can be reduced immensely by Pack200. This is useful for distribution, but you should be aware that you need to un-pack200 your jars before they can be read by the JVM. The new Eclipse provisioning system p2 will make use of Pack200, which is how I came across it. Apologies if you have already known this tool for years...
- If you want to "patronize" your developers a bit more, you can use AspectJ to enforce the architectural rules that you define. A nice starting point is the PatternTesting project, which might give you an idea of how powerful AspectJ is for this purpose.
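The Pack200 round-trip mentioned above boils down to two commands - the pack200 and unpack200 tools ship with the JDK (the file names here are made up for illustration):

```shell
# Shrink a jar for distribution; --gzip additionally gzips the packed stream
pack200 --gzip mylib.jar.pack.gz mylib.jar

# Restore a loadable jar on the receiving side - the JVM cannot read
# the packed form directly
unpack200 mylib.jar.pack.gz mylib.jar
```

The pack/unpack cycle is lossy with respect to byte layout but not semantics: the restored jar contains the same classes, just not byte-identical archives, so signed jars need special treatment.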
Concurrency in the Java World
Posted on April 24, 2008
No, this post is not about multi-core CPUs or about making your code more thread-safe. It is instead about the latest trends in the Java universe. As Ted Neward says, Java was seen for a long time as the universal (God-given) programming language that can and should be used for each and every (software) problem in the world. This resulted in a bloated JDK, an abundance of core APIs and an always busy JCP. Furthermore, Java has become an established programming language. As Rod Johnson notes, "Nobody wants to date a Java developer." Java is now mainstream and not sexy anymore.
For these two reasons, one can see a clear trend away from "classical" Java.
The very first step in this direction was, IMHO, the rise of Eclipse - the chosen name already shows that it was intended to break the Sun monopoly in the Java world. All of a sudden, there was a "parallel" UI widget framework (SWT) available besides AWT/Swing, which are an integral part of every JRE.
The second step was an alternative to J2EE: Spring, as I have already mentioned in my last post.
Then followed the hype around dynamic languages, first of all Ruby with its web app development extension Rails, to which many Java (web) developers turned. Those who did not want to miss everything about the Java platform chose the "light" version of turning their back on Java and went with JRuby or Groovy.
It wasn't only dynamic languages that prospered on the JVM; new statically typed languages such as Scala were also showing up and getting lots of attention and interest.
The world was just seeking something new and sexy again - the only common denominator was the JVM, as everyone agreed on the great benefits of its write-once-run-anywhere concept.
But then Google came around and used Java just the other way round: GWT and Android both let the developer use Java syntax, but the source code is compiled into something very different from Java bytecode: in the former case it's JavaScript, in the latter it's dex files for the Dalvik VM - an alternative VM with its own compiler.
So has Sun's battle to keep the Java community together failed? Despite all of the above, I tend to say no. Obviously, the idea of an omnipotent programming language must be given up - not because Java is bad, but because there simply is no one-size-fits-all in programming. In contrast to .NET, the Java community promotes its own freedom, and all the movement in the market is a clear sign that this freedom is used to drive innovation and that things will evolve.
The bad news is that you cannot rest on your knowledge of the Java language - the language itself will change in the upcoming years, and there might be different favourable options depending on the environment that you are working in. So stay informed and don't forget to learn at least one new language per year, before it's too late :-)
The Emperor's New Clothes
Posted on April 23, 2008
I have been puzzled many times by how complicated and difficult to understand some things are done in the software development world. For example, I never fully understood the need for fully-fledged J2EE application servers for all kinds of web applications. I have seen so many cases where EJBs were hardly used at all (because of their inherent complexity and their drawbacks) and there existed only a set of stateless session EJBs, of course without any distribution or clustering needs.
With this background, I very much enjoyed a session with Brian Chan, the CEO of the OS enterprise portal Liferay, who explained why and how they migrated their application from a pure J2EE application to a Spring-based web application running on Tomcat.
Brian came up with a very striking analogy: he stated that the J2EE/EJB history very much resembles the tale of the Emperor's New Clothes. For years, whoever wanted to build "enterprise" applications automatically bet on EJBs. There was no questioning this - instead, it used to be a perfect selling point, as nobody challenged the product's architecture; whatever was based on EJBs was automatically considered good and professional. Nobody dared to admit that EJBs were bloody complex at that time and did not solve the real problems, as nobody wanted to appear a fool. Until Rod Johnson came around and told the world in 2004 that there are indeed ways to build enterprise applications without EJBs - he introduced Spring. So Brian rightly praised Rod for having the guts to play the child, stand up and point out that the emperor was indeed naked. This was really the advent of a big change in people's mindset about enterprise applications. It does not mean that J2EE application servers have no right to exist, but that there are many cases where there are actually better (i.e. simpler, cheaper and faster) solutions.
Now I am just waiting for somebody pointing out that HTML is not at all the right technology for building enterprise user interfaces...
Keeping Process Alive
Posted on April 21, 2008
New year, new conference, new blog entries - this time I am attending JAX'08, and I have just an interesting opening "Agile Day" behind me. It was all about agile development and management practices and how these can be introduced into an enterprise that has no experience with them yet.
There are many agility-oriented consultants who know how to enable the transformation of a team or company into an agile player. This is not an easy path to follow, but with enough social skills, it should be possible to get things going. Full support from management as well as from the team members is key to success. If either side has doubts or does not welcome the change, the transformation project might be doomed before it has even started.
Now imagine that despite all possible obstacles the transformation was a success and your department is now applying agile practices. The external consultant leaves and you are left on your own. What happens? For a while, it is taken as normal that after such a big organisational change there is still some lack of clarity about one thing or another, and people try to figure out how to fill the gaps. Usually one fills the gaps with something familiar, and so the agile process gets bent a bit towards the "old" style of doing things in the department. Things are not smooth yet (to really "live" agile practices, people might need several months or years), so no real improvement can be seen, and only few people will enthusiastically support it (now that the expert's backing is missing). New people will arrive, others will leave the company. Spin this further for a couple of months to a few years and most of your agile process will have eroded. It is likely that all you will see left is continuous integration - that's the one practice that is so immensely successful that nobody wants to miss it, so its wide acceptance is almost guaranteed.
But what to do to avoid the erosion of the rest of your practices?
From what I learned, there are a few key things that you need to ensure in the long run:
- In each agile team, you must have at least one change agent who actively pursues the agile way and continuously drives the team to follow the practices and to improve its adoption. "Supporters" of the agile way are not sufficient - if things start to slip, the change agent must take immediate action, so he plays a very active role.
- New team members must be fully introduced to the practices - not by providing some Wiki pages with coding conventions to follow, but with an active mentor or foster parent during the initial weeks or months.
- You need to pay special attention to your retrospectives - these really force the whole team to think about the process, where it is not appropriate and what can be improved. Only through these retrospectives can it be achieved that team members identify themselves with the practices and see them as something they really own (and therefore need to care about).
- Progress should be made visible (e.g. in burn-down charts) and achievements should be celebrated - each successful iteration motivates the team members, and it gives both the team itself and the management confidence in the team's ability. A positive side effect is that with each iteration, future estimations will become more precise.
If you respect all of the above points, my guess is that you have a good chance that your people really pick up the agile way, live it and improve it. That's what you should be striving for.