Technistas

Matthew D. Laudato writes about software and technology

Archive for the ‘build management’ Category

A Framework for Evaluating Continuous Integration Tools


For those of you interested in the methodology behind my ongoing series on CI tools, you should check out a new article that I wrote for CMCrossroads. In it I provide a framework for evaluating CI tools, and give you a checklist and ranking system to help you organize and rate your evaluation.

The 2nd installment of the hands-on tool evaluation is a few days away – stay tuned for how the four tools (Hudson, Mojo, Bamboo and TeamCity) fare on providing access to common development tools and on enabling you to assemble complex build workflows.

Happy Building!

– Matt

Written by Matthew D. Laudato

June 16, 2010 at 6:36 pm

Comparing Continuous Integration Tools, Part 1


One of the more enjoyable parts of my job at OpenMake Software is getting to examine and analyze the various build tools on the market. This is partly to see what the competition is up to, and partly to make sure that I can effectively communicate the technical bits to our customers, many of whom have multiple build tools in their environments.

To that end, I recently embarked on a continuous integration tool evaluation. I chose to look at Hudson, an open source tool commercially supported by Sun Microsystems; TeamCity, a commercial tool from JetBrains; Bamboo, a commercial tool from Atlassian; and Mojo, a freeware and commercially supported tool from OpenMake Software. My goal was to compare the tools along several vectors:

  • Installation
  • Configuration
  • Running a simple job
  • Viewing logs
  • Interacting with source control
  • Performing complex distributed build workflows

I decided to break the effort into two parts. The first part, covered in this post, is the ‘getting my feet wet’ portion of the evaluation. I tackled the first four bullets above to get a sense of how the tools were installed and configured, and to see if I could get them each to do something useful. The useful thing was to run a job that spits out the current environment, the equivalent of running the ‘set’ command from a DOS prompt in Windows.

The summary below lays out my findings feature by feature, and after that I give some general impressions about the tools and the evaluation process.

Product versions evaluated: Mojo 7.31, Bamboo 2.5.5, Hudson 1.332, TeamCity 5.1.1

Installation method
  • Mojo: Windows installer
  • Bamboo: Windows installer
  • Hudson: Executable war file
  • TeamCity: Windows installer

Download size
  • Mojo: 50 MB
  • Bamboo: 84 MB
  • Hudson: 27 MB
  • TeamCity: 268 MB

Need a license?
  • Mojo: Free, unlimited single-server license
  • Bamboo: 30-day trial
  • Hudson: Free, unlimited single-server license
  • TeamCity: Free, unlimited single-server license

Installation notes
  • Mojo: Does not ask for a default port as part of the install; that is configured once you have started the client. The server starts as part of the installation and is installed as a Windows service. A Start Menu group and icons are installed for access to the thick and web clients.
  • Bamboo: Asks for a default port as part of the install but does not start the server as part of setup. When you do start the server, it does not recognize your port choice.
  • Hudson: No issues, though it is hard to figure out how to change the default ports.
  • TeamCity: No issues. Asks for a default port in the install wizard, starts the server and build agent as Windows services as part of the install, and then runs the web interface.

Initial setup
  • Mojo: None. If you like the defaults, you can create a workflow immediately through the thick client.
  • Bamboo: Asks you to ‘Create a Plan’ as the first activity. I did not like this, as it forces me to digest their meaning of the generic word ‘Plan’.
  • Hudson: None. If you like the defaults, you can create a workflow immediately.
  • TeamCity: Wants you to create projects and build configurations, but does not define exactly what these are.

Configuring a simple job (ENVPEEK – prints the build server environment variables to the build log)
  • Mojo: Easy. Create a workflow, add a ‘Mojo | Execute shell command’ activity, and type in the command (‘set’).
  • Bamboo: Difficult. In order to ‘Create a Plan’ you need to go through an 8-step wizard. The second wizard screen requires you to select an SCM system and a repository location; I had to give it a repository location from my Subversion server to get past this screen. Annoying, since for this job I don’t care about SCM. The rest of the wizard was OK, but there are way too many steps just to set up a simple job.
  • Hudson: Easy. Create a new build job, use the ‘Execute Windows batch command’ option, and type in the command (‘set’).
  • TeamCity: Moderate. TeamCity asks you to create a project, which is pretty easy. You then have to create at least one Build Configuration. There is a web-based wizard that, like Bamboo’s, has an SCM screen, but you can choose to ignore it. You can then choose a command line Build Runner, in which you specify the ‘set’ command.

Running a simple job
  • Mojo: Easy. Open the workflow, either in the thick client or in the web interface, and press the run button. Runs successfully.
  • Bamboo: Moderate. From the Bamboo home, select the Plan, then select ‘Run Build’ from the Plan Actions menu on the right. Because of the SCM choice, even jobs that don’t require SCM will check out from Subversion. The tool is geared toward building code projects – it does not appear to be a general workflow tool.
  • Hudson: Easy. Select the job and click the ‘Schedule a build’ button.
  • TeamCity: Easy. From the Projects tab, find the project that you want to run and click the Run… button.

Viewing job logs
  • Mojo: Easy. In the thick client, open the workflow and go to the History/Trends tab. Select the run that you want to see and double-click. In the web interface, select the workflow and submit a query to retrieve the run information, then select the specific run you want to view.
  • Bamboo: Moderate. You have to click on the plan, then the Completed Builds tab, then the build you want, then its Logs tab. Lots of drilling down required.
  • Hudson: Easy. Click on the job name and then select any link from the Build History.
  • TeamCity: Easy. From the Projects tab, click on the project that you want to view, then select the link for the run that you want to view.

Overall, Hudson and Mojo were the easiest tools to install and use. Hudson definitely takes the cake when it comes to installation, since you don’t have to install it – you just run the executable war file from the command line. Mojo, TeamCity and Bamboo have more traditional installers, of which the Mojo install was the most straightforward, asking the fewest questions before proceeding with the install. Atlassian’s Bamboo has the most restrictive trial license, but Mojo, Hudson and TeamCity all have a more open approach – you can use them in very useful forms without any cost or special licensing.
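If you want to try Hudson’s no-install approach yourself, it really is a one-liner from the directory where you downloaded the war file – something like the following (the --httpPort argument is optional; by default Hudson listens on port 8080):

java -jar hudson.war --httpPort=8080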

Once the tools were installed, I next looked at the initial configuration, which I define loosely as ‘stuff the tool requires me to do before it lets me do what I really want to do’. On this measure, I again put Mojo and Hudson in the lead, as I didn’t have to do anything – I just went straight to thinking about the job I wanted to run. TeamCity wanted me to create a project and a build configuration, which was fairly easy – but I had to figure out what it meant by ‘project’ and ‘build configuration’. Bamboo was by far the most difficult tool to configure. Any time I see an 8-step wizard just to turn the engine over and get the motor running, my initial response is ‘who wrote this thing?’

Getting the actual job configured was again easy in Mojo and Hudson. The Mojo interface is very straightforward – you select a machine to run on, and then start adding workflow steps (called activities). There is a large built-in list of activities (around 50) for interacting with commercial and open source tools. I used the ‘Execute shell script’ activity type to run the set command, and that constituted the entirety of my ‘ENVPEEK’ job. Hudson was also easy to set up. TeamCity and Bamboo were the most painful to set up for actual jobs – you are forced into their concepts, instead of just being able to think about the job at hand. The other comment on TeamCity and Bamboo is that they are both very ‘source code biased’. By that I mean that they have an implicit assumption that your jobs require interaction with source control. In both tools I was required to specify a location in a source control tool (I used Subversion from Collabnet). Since my initial job was a codeless one, this was annoying.

Running jobs in all of the tools is fairly easy, as is reviewing the logs – though in Bamboo I did have to drill down quite a bit to get to my logs. Going back to my ‘source control bias’ comment, Bamboo needed to check out code from the repository location that I specified – and then ignored it, since my initial job was just to run ‘set’.

Next installment: doing actual code builds with each of the tools, and then putting together complex build processes.

Happy Building!

– Matt

Written by Matthew D. Laudato

June 7, 2010 at 4:09 pm

The Build Engineer’s Desktop


Programming environments have come a long way from when I started in this business. I can recall loading programs from cassette tape into my Timex Sinclair computer in high school, and fumbling with the VAX editor in college. By the early 80’s, I found myself in graduate school with a mix of new (a MicroVAX) and old (a military surplus Raytheon 700, on which debugging amounted to reading hex codes from lights on the front panel and literally pressing the ‘step’ switch to move through the program).

Fast forward through the 90’s and into the new century, and things have changed quite a bit. Java programmers have Eclipse and other rich programming environments. If you work with the Microsoft technologies, Visual Studio has given you an increasingly powerful and convenient desktop over the past 15 years. Even database engineers have integrated environments where they can program and manage their database deployments. It seems that no matter what your role in the software business, there is a desktop tool for you. Which brings me to the topic of today’s post – the build engineer’s desktop.

It seems to me that the build engineer has drawn the short straw from software vendors. From this engineer we expect solutions to hard problems – complex compile and link sequences, deployments to test, staging and production environments, and a great deal of programming to make it all happen. But as a build engineer, your tool set is limited. You are expected to code, debug and deploy using a plain text editor, and cobble your scripts together ad hoc, with no centralized platform or desktop environment to act as your command center.

Enter OpenMake Meister. If you’re a build engineer, sitting down at the Meister client is like stepping into the cockpit of a 747. In one powerful desktop environment, you can assemble complex compile, link and archive services, manage deployments, do dependency analysis, create distributed workflows, write reusable scripts, and fully control the build, test and deploy services that your company demands of you.

I won’t go into all the details here, but build engineers, here’s a tip for you: stop scripting and start managing your build process. Take a look at OpenMake Meister, the build engineer’s desktop.

Happy Building!

– Matt

Written by Matthew D. Laudato

March 22, 2010 at 2:00 pm

Continuous Build Automation with Subversion and Meister


I recently got a chance to work on a project using Collabnet Subversion and OpenMake Meister and put together a short demo on how to get the two tools to work together doing continuous integration. You can view it at http://www.openmakesoftware.com/flashdemo/Meister-SVN/omsvn_small/omsvn_small.html

Meister, like most CI tools, has several ways to kick off a CI build. You can do a scheduled build, or you can poll the SCM system. The third way is to call the build from a Subversion hook. In the demo I show two of these methods: a scheduled build in Meister, and calling Meister from the Subversion post-commit hook.

The setup is pretty simple. I have a repository in Subversion that has working copies for developers, and what I’ll call a ‘hands off’ working copy that only the build process uses (meaning, no developers are ever in that copy making changes; it receives changes strictly through an ‘svn update’ command run by the CI process). In Meister, I have a workflow that knows how to build a small DOS application from some code in the repository.
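For readers who want to replicate the setup, the ‘hands off’ copy is just a normal working copy that only the CI process touches – a one-time checkout, then updates driven by the build workflow. A rough sketch (the repository URL and local path here are illustrative, not the ones from the demo):

svn checkout http://svnserver/repos/myapp/trunk c:\build\handsoff_wc
svn update c:\build\handsoff_wc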

In the demo, I first show Meister running a build on a schedule. Meister updates the ‘hands off’ working copy and then compiles and links the code. In the second case, I turn off the scheduler and instead activate the post-commit hook in the Subversion repository. The hook code calls the Meister command line, which looks like this:


java -cp c:\openmake-meister\client\bin\omcmdline.jar com.openmake.cmdline.Main -BUILD "WINDOWS BUILD WITH SVN"


The same workflow runs in both cases. The advantage of running from the hook is that you are always guaranteed that every transaction in Subversion gets built. On the other hand, setting a scheduler to run every hour is easy and might be more appropriate for shops with less frequent code changes. In both cases Meister is driving the build with its dependency analysis engine, so the builds are fast and highly parallelized.
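If you want to wire up the hook approach yourself, the post-commit hook on a Windows-hosted repository is just a batch file in the repository’s hooks directory. Here is a minimal sketch that reuses the Meister command line shown above (Subversion passes the repository path and the revision number as the first two arguments, which this simple version ignores; the exact hook in the demo may differ):

rem hooks\post-commit.bat (Subversion passes the repository path as %1 and the revision as %2)
java -cp c:\openmake-meister\client\bin\omcmdline.jar com.openmake.cmdline.Main -BUILD "WINDOWS BUILD WITH SVN"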

Overall it was pretty easy both to get the Subversion repository configured, and to get the Meister workflow up and running. The Meister command line lets you do things like set environment variables (not shown above), so you can control the workflow at a fine level of detail.

Happy Building!
– Matt

Written by Matthew D. Laudato

January 22, 2010 at 7:59 pm

Speedy Java builds slow down productivity


I never cease to be amazed at how software development management ignores the build problem. In a recent article in SDTimes, Alex Handy reports on a survey undertaken by RedMonk that found Java developers spend nearly 8 minutes per hour doing software builds. In round numbers, that’s an hour a day, 5 hours a week, and for a typical 48 week year, nearly 30 work days every year. Let me repeat that. Your Java developers, who are presumably paid $80-120k per year, are spending 30 days per year staring at their screens waiting for builds. If we take the middle of this range, this is a productivity loss of over $12,000 per year per developer. For a small 5-10 person team, you could hire an extra developer just with the savings from improving your build process.
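For anyone who wants to check the math, the rough arithmetic behind those numbers (assuming the survey’s 8 minutes per hour, an 8-hour day, and a $100k salary – the midpoint of that range) looks like this:

8 min/hour x 8 hours/day = ~1 hour/day waiting on builds
1 hour/day x 5 days/week x 48 weeks = 240 hours = ~30 eight-hour work days/year
$100,000 / 240 work days = ~$417/day, so 30 days = ~$12,500 per developer per year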

The article goes on to say that Java developers “have learned to build smaller portions of their projects at a time, or to compile small pieces of code and inject them into running applications”. I’m hard pressed to think of two worse practices in professional software development. If you build a smaller portion of your project, you risk having unsatisfied dependencies, and thus wasting time when your runtime fails to operate properly. Similarly, injecting code into a running application is dangerous even in a test environment, for similar reasons – you haven’t spent any time understanding the dependencies, and are therefore at risk of dependency-related failures.

As my readers know, I work for a software company (OpenMake Software) that sells a software build and workflow automation product. We have tens of thousands of Java developers using our product through our Eclipse plugin, which gives you true incremental builds that take dependencies into account. You can build, test and do a test deploy all from within Eclipse, and not have to resort to trickery such as injecting partial code into your test server. Since the cost of Meister (which I am not at liberty to publish here) is significantly less per developer than the annual productivity loss calculated above, it makes sense for development managers to move their Java developers away from the wild west of desktop builds, and consider a formal build system. Anything less and you’re ignoring one of your primary missions as a manager: controlling the development process.

Happy Building!

– Matt

Written by Matthew D. Laudato

December 3, 2009 at 8:27 pm

Integrating OpenMake Meister with Archiva


Archiva (http://archiva.apache.org) is an open source repository manager that lets users access binary and other objects for use during software builds and deploys. Its functionality is similar in many ways to traditional version control systems, and OpenMake Meister (http://www.openmakesoftware.com) integrates with it as such. There are three ways to integrate with Archiva using Meister.

1. Via the file system. Archiva stores objects transparently in a structured way on the file system. To include the contents of an Archiva repository, create an entry in a Meister Dependency Directory with an appropriate name that contains a path to the repository location. For example, the default Archiva installation includes JUnit 3.8.1. Creating a Dependency Directory with the name ‘JUNIT381’ and the value ‘C:\tools\archiva-1.2.2\data\repositories\internal\junit\junit\3.8.1’ will enable Meister to include any libraries found in this directory as part of Meister builds. I don’t really recommend this method, since Archiva’s on-disk layout may change in a future release. But it works in a quick and dirty way.

2. Through a WebDAV client. The native interface to Archiva is WebDAV (see http://www.ietf.org/rfc/rfc2518.txt for the WebDAV RFC). A Meister activity can easily be created to execute a command line GET operation against the Archiva repository. For example, if you use the BitKinex file transfer client, you can retrieve junit-3.8.1.jar by using a Meister activity that runs the following command:

URL=http://localhost:8080/archiva/repository/internal/junit/junit/3.8.1/junit-3.8.1.jar
bitkinex.exe cp /noinfo /force $(URL) c:\temp

This retrieves the file and copies it to c:\temp, where it can be used as part of a build. This is probably the best solution. When building, you really want a local copy of the files, at the very least for audit purposes, and there are plenty of WebDAV clients out there.
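If you don’t have BitKinex handy, any command line HTTP/WebDAV client can perform the same simple GET. For example, with curl (using the same Archiva URL and destination as above):

curl -o c:\temp\junit-3.8.1.jar http://localhost:8080/archiva/repository/internal/junit/junit/3.8.1/junit-3.8.1.jar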

3. Mapping a web drive. Most operating systems support mapping a network drive to a web location. For example, on Windows, you can add access to the JUnit 3.8.1 repository by mapping a drive in DOS as:

net use Z: "http://localhost/archiva/repository/internal/junit/junit/3.8.1"

Once this is complete, you can access the files in the repository simply by referencing drive Z: in a Meister Dependency Directory. One restriction: on Windows XP your repository must be running on port 80, as Windows does not support web drives on any other port. This is also a pretty good method, but it eats up drive letters fairly quickly if you have many libraries that you want to map uniquely.
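To tie this back to the first method, once the drive is mapped you could point a Meister Dependency Directory at the drive rather than at the raw file system path – for example, a hypothetical entry following the same naming convention as above:

JUNIT381 = Z:\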

I’ve tried all three methods in my build lab, and while I don’t think I would use Archiva in a production environment (my bias is towards actual SCM systems or a managed file system for storing 3rd party build dependencies), it definitely ‘worked’ and was fairly easy to integrate with.

Happy Building!

– Matt

Written by Matthew D. Laudato

November 12, 2009 at 3:12 am

A Tale of Four Builds


After a long break from blogging, I’m back. I’d like to invite all my readers to a webinar that I’m hosting on Wednesday, October 21 at 2pm EST. The webinar is titled ‘A Tale of Four Builds’. In it I take a single piece of code and build it using four different build technologies. This survey of build methodologies moves from manual, error-prone processes to highly controlled and repeatable processes. If you’re a software engineer, development manager, or build and tools manager, this webinar will help you sort out the pros and cons of the various build technologies that are common in today’s software environment.

Hope to see you there. To register, go to: https://www1.gotomeeting.com/register/554771256

Written by Matthew D. Laudato

October 16, 2009 at 1:09 pm

Build Management with AccuRev and Maven


This is a cross-post of a blog I did for the company I work for, AccuRev, Inc. In it I describe how to use the recently released m2eclipse Maven integration with the AccuRev software configuration management (SCM) product. Maven is a very cool build and project management tool, and the combination of working with Maven via m2eclipse and AccuRev via the AccuBridge for Eclipse plugin is pretty powerful. If you are interested, please click here to view the original post.

Written by Matthew D. Laudato

June 25, 2008 at 6:43 pm