Archive for the ‘Design’ Category

Book Review: Domain-Specific Languages by Martin Fowler

Friday, March 18th, 2011

Executive Summary

The book succeeds quite well at demystifying and rehabilitating the development of domain-specific languages. It points out, in particular, all the small specialized languages that many of us use every day. It also provides a good catalog of useful techniques, illustrated with examples in various languages.

 

Full Review

Introduction

Domain-Specific Languages was written by Martin Fowler with Rebecca Parsons and published by Addison Wesley in 2011. It aims at both making the case for using domain-specific languages in everyday programming and providing a set of techniques often used in their implementation.

Structure

The book is divided into five parts. The first is a discussion of DSLs: what they are, when they can be useful, and a quick overview of how to get started. It is the only part of the book meant to be read linearly. The remaining four parts present patterns, for lack of a better word, that can be applied when creating a DSL. It is useful to get acquainted with that material, but in-depth reading can be done as needed.

Content

The book is very rich in content, accompanied by many tailored examples. It is more than enough to get you started writing a DSL and making the case for one. There is also enough to leave you confused about the best course to follow for a particular application.
On the negative side, I regrettably still fail to form a good idea of what a ‘Semantic Model’ is and to truly feel what its benefits are. Martin Fowler is adamant that it is a very useful piece to have between the syntax and the domain model. It would have been nice to see a non-trivial example in the book, or a reference to a widely available one, to illustrate the concept.
The second regrettable element is the lack of references to the academic literature on the topic to support exploring some of the concepts in more depth. Not only does much of the research on languages in general apply in the simpler context of DSLs, but a lot has been written on this particular topic. Google certainly helps, but it would have been nice to have a few curated pointers to material written outside the ‘Thoughtworks world’, in particular on grammars and on semantics. I am also suspicious of the way Martin Fowler takes great care to install his own vocabulary; I sometimes had the feeling that the book was there as much to support ‘Thoughtworks’ business as to instruct the reader. That being said, I am suspicious by nature. This is particularly the case with how the author defines embedded vs internal vs external DSLs.

Style

This is the first book I have read by Martin Fowler, though I do follow his Bliki. Unfortunately the style of the book is close to that of the Bliki, and that does not work for me, as I expect a more formal, authoritative style from a book. As it is, the first-person singular is used far too much for my taste. The book often turns into an opinion piece (and that is usually stated in situ by the author himself). In other cases, the author debates alternatives, then concludes that one or the other is superior (in his opinion), rarely resorting to other sources to support either side of the argument. Finally, a lot of space is devoted to explaining why some topics are not covered, or not expanded upon, once again without redirecting the reader to other sources that discuss them. While much of this would be just fine in the context of the Bliki, it turned out to be very irritating when reading the book.

Final Words

The book may be self-serving and sometimes fails to expand beyond the author’s opinions into the wider world. Having said that, it is full of valuable information, and the last four parts will certainly prove useful to many of us. Many of my criticisms can be remedied by searching the web for material on DSLs, formal grammars, and formal semantics.

Test Driven Development and Design by Contract, the return

Thursday, July 8th, 2010

I wrote a long time ago about how I would like these two tools to be combined.

In the end I gave it a shot and just posted the result on GitHub. I have not tried it in real life yet, but I will soon. The README explains the intent and the design assumptions; check the tests for some examples.

On the other hand, nothing is more valuable than your input, so please let me know what you think about it.

Something for the week-end

Saturday, February 27th, 2010

I used to like overloading methods to provide a default behaviour for any object and a specialised one for some select types. That works when the language supports multiple dispatch. It does not work in Java, where overloads are resolved at compile time from the static type of the arguments:

import junit.framework.TestCase;

public class TestOverloading extends TestCase {
    public static final class OverloadedClass {
        public String thisIsA(final Object o) {
            return "Object";
        }

        public String thisIsA(final String s) {
            return "String";
        }
    }

    public void testIsCallingObjectFromObject() {
        final OverloadedClass tested = new OverloadedClass();
        final Object value = new Object();

        assertEquals("Object", tested.thisIsA(value));
    }

    public void testIsCallingObjectFromString() {
        final OverloadedClass tested = new OverloadedClass();
        // The runtime type is String, but the static type is Object:
        // the Object overload is chosen at compile time.
        final Object value = new String();

        assertEquals("Object", tested.thisIsA(value));
    }

    public void testIsCallingStringFromString() {
        final OverloadedClass tested = new OverloadedClass();
        final String value = new String();

        assertEquals("String", tested.thisIsA(value));
    }
}
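The usual Java workaround is double dispatch: let a virtual call on the argument pick the behaviour. A minimal sketch of the idea (the Shape/Circle/Square names are mine, not from the tests above):

```java
// Double dispatch: a virtual call on the argument recovers type-specific
// behaviour that plain overloading cannot provide in Java.
public final class DoubleDispatch {
    public interface Shape { String describeTo(Describer d); }

    public static final class Circle implements Shape {
        public String describeTo(Describer d) { return d.describe(this); }
    }

    public static final class Square implements Shape {
        public String describeTo(Describer d) { return d.describe(this); }
    }

    public static final class Describer {
        public String describe(Circle c) { return "Circle"; }
        public String describe(Square s) { return "Square"; }
    }

    public static String describe(Shape s) {
        // The virtual call selects the right describeTo at runtime; inside
        // it, 'this' has a precise static type, so overloading now works.
        return s.describeTo(new Describer());
    }

    public static void main(String[] args) {
        Shape s = new Circle(); // statically typed as Shape
        System.out.println(describe(s)); // Circle
    }
}
```

The cost, of course, is that every dispatchable type must implement the extra method.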

Configuration or DSL

Wednesday, July 1st, 2009

I have already mentioned that, in my opinion, configuration is often the result of a lack of design. In other words we introduce configuration as a substitute for a design decision. There are some cases however when it is desirable to make things configurable. For example, when a software vendor needs to enable clients to plug in custom behaviour without a restart of their system.

I would contend this is a rare scenario and that in most cases dependency injection addresses the actual need (think assembly with Spring, for instance).
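To make that contention concrete, here is a minimal constructor-injection sketch (the Notifier/Channel names are mine, purely illustrative): the variation point becomes a dependency chosen at assembly time rather than a flag read from a configuration file at run time.

```java
// Constructor injection: the variable behaviour is a dependency picked
// once at assembly time (e.g. by a Spring context), not a runtime flag.
public final class Notifier {
    public interface Channel { String deliver(String message); }

    private final Channel channel;

    public Notifier(Channel channel) { this.channel = channel; }

    public String send(String message) { return channel.deliver(message); }

    public static void main(String[] args) {
        // "Assembly": choose the concrete behaviour, here by hand.
        Notifier n = new Notifier(new Channel() {
            public String deliver(String m) { return "email: " + m; }
        });
        System.out.println(n.send("build failed")); // email: build failed
    }
}
```

Swapping the behaviour means re-assembling (and redeploying), which is exactly the trade-off discussed here.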

I can imagine cases where that is not enough, and I was wondering at what point configuration parameters start to form a DSL. I was also wondering when it becomes more efficient to create a DSL to program the adaptation to local conditions. An initial guideline could be: as soon as you think of storing the configuration in an XML document.

What is your take? Does it help to rethink configuration as a DSL? What is your experience?

Configuration and the lack of design

Friday, March 13th, 2009

I recently had a discussion with my newly appointed manager over his idea that having to redeploy the system in order to change its behaviour is bad, and that configuration files would be a superior solution. As it happens our system is assembled with Spring, so it does not lack configurability, and we take advantage of that to support diverse environments. However, that is not what he was thinking of. He was thinking of having the actual functions of the system be configurable. It took me a while to find a way to express why that is a bad idea, and usually the result of a lack of design.

So let’s say you want to implement a function that, in the context of the system, sends an email to the system’s support team. Natural parameters of that function would be the subject and the body, as these are known only during the course of the system’s operations. The configuration element here is the recipient’s email address. Depending on the circumstances you may hard-code it to, for instance, “support@digitalbrikes.com” and let the administrators of the e-mailing system figure out which actual people this address should map to. You may make it a configuration parameter passed to the system at start-up time, for instance “sales-support@digitalbrikes.com” or “web-support@digitalbrikes.com”. This can clearly be useful. Another solution would be to configure the actual list of people directly, for instance “denis@digitalbrikes.com, bob@digitalbrikes.com”.

So what configuration does is increase the number of parameters of a function. Say we call our function f(x,y); adding one configuration parameter means we now have a function g(x,y,z). Overall this is not too harmful, especially in an object-oriented context where the object carries the configuration parameter. The development and testing efforts remain similar.

Now let’s take a more complex example, where configuration parameters are used to determine the parameters of the executed function. Suppose we have our g(x,y,z) function that sends an email to z, and suppose we now want to send emails with a given subject prefix to management in addition to support. We now have three configuration parameters: the support recipient, the management recipient, and the prefix that determines who the email goes to.

Now the situation is that we have a function h(x,y,t,u,v) to send our email. Clearly this is more complex to code and test. Luckily for us, our particular function can be expressed as a composition of g with a routing function k, i.e. h(x,y,t,u,v) = g(x,y,k(t,u,v)), which is somewhat simpler to code and verify than a single five-parameter function.
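A minimal Java sketch of that decomposition (all names are hypothetical; note that the routing function must also see the subject to test the prefix, so strictly it is k(x,t,u,v) rather than k(t,u,v)):

```java
// Sketch of the decomposition h(x,y,t,u,v) = g(x,y,k(x,t,u,v)).
// All names are hypothetical illustrations, not real project code.
public final class MailExample {
    // g(x, y, z): the simple send function; rendered as a string here
    // instead of actually sending mail.
    public static String send(String subject, String body, String recipient) {
        return "To: " + recipient + "\nSubject: " + subject + "\n\n" + body;
    }

    // k: computes the recipient list from the configuration parameters
    // (support recipient, management recipient, prefix) and the subject.
    public static String route(String subject, String support,
                               String management, String prefix) {
        return subject.startsWith(prefix) ? support + ", " + management
                                          : support;
    }

    // h(x, y, t, u, v): simpler to code and verify as a composition
    // than as a single five-parameter function.
    public static String send(String subject, String body, String support,
                              String management, String prefix) {
        return send(subject, body, route(subject, support, management, prefix));
    }
}
```

Each piece (plain send, routing) can now be tested in isolation instead of through the five-parameter surface.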

Throughout this post, it is obvious that configurability may be legitimate in the context of the system. What the examples illustrate is that it quickly and significantly increases the complexity of a system; it is therefore expensive and should be used only when necessary. In other words, reducing configurability should be a design goal (simplicity).

Enforcing package dependencies with JDepend

Thursday, March 5th, 2009

For a long time we had a JDepend report running in our continuous build, but it was rarely used or consulted. When I saw xxx presentation at QCon I thought it would be nice to straighten up the structure of our code base and start paying attention to some of the architectural qualities of the project.

Now I am willing to give Sonarj a try. It looks great and makes it possible to signal breaks in the package structure before they happen. One of the drawbacks is that it uses the package structure to describe architectural modules, which may be possible in greenfield projects but is a bit challenging in an existing code base.

Meanwhile, I have given JDepend another look, and in particular the possibility of integrating it with a test suite (JUnit at the moment, but TestNG might make it possible to produce a nicer report rather than a blanket assertion). For now I am concentrating on the description and enforcement of the dependencies between packages.

@Test
public void verifyDependencies() {
    thisPackage()
        .dependsOn("my.super.fantastic.package")
        .andOn("my.sortof.ok.package")
        .andItShouldNotDependOn("that.pesky.vendor.package")
        .orOn("my.very.bad.package");

    assertItHasAllAndNoOtherDependencies();
}

This will break if new dependencies are created or existing dependencies are removed (see the base class DependencyTestBase).

A few things I would like to do in the future:

  • a report of undesired dependencies that would give developers clues as to where they can make improvements,
  • when the description of dependencies is not comprehensive, it would be nice to be able to forbid some specific dependencies,
  • be able to set thresholds for other metrics such as abstractness.

As always, what do you think? What other tools do you use for your architectural needs? Sonarj (the demo at QCon was quite interesting)? Structure 101?

The state of my testing art

Sunday, March 1st, 2009

This post is prompted by a recent (or fairly old by modern speed standards) entry by Jay Fields. My own experience over the past 3 years has been exactly what he describes: dead ends, changes, and improvements which leave the test code base in a state fit for archaeological research.

The biggest pitfalls I fell into (and made other people fall into, though they did not have anything better to offer): too many assertions per test, too many mock expectations, too much set-up, too much coupling with the implementation. A lot of those problems stem from the overall quality of our design at the object level, which in turn is partly explained by our immaturity in using tests to shape the low-level interactions between objects.

I think we are doing a bit better now, and I want to share what I currently try to do.

Here is an example for a unit test:

@Test
public void putShouldStoreValue() {
    givenAPropertyName();
    givenAPropertyValue();

    BaseMutableDataStore testedContainer = new BaseMutableDataStore();
    testedContainer.put(propertyName(), propertyValue());

    assertSame(propertyValue(), testedContainer.get(propertyName()));
}

This uses primarily the builder idea that Jay presented to remove the technical details of stubbing and set-up. The idea, taken from BDD frameworks, and RSpec/Cucumber in particular, is to focus the text of the test on the assumptions and the assertions. Nothing here is new, but had I read an example like this 3 years ago, I would have saved myself and my team-mates a lot of trouble.
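For readers unfamiliar with the idea, here is a sketch of what such given…() helpers and the store might hide (these are hypothetical stand-ins, not the actual project code):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-ins for the helpers used in the test above: the
// given...() methods hide fixture set-up behind intention-revealing names.
public final class PropertyFixture {
    private String propertyName;
    private Object propertyValue;

    public void givenAPropertyName()  { propertyName = "aProperty"; }
    public void givenAPropertyValue() { propertyValue = new Object(); }

    public String propertyName()  { return propertyName; }
    public Object propertyValue() { return propertyValue; }

    // A minimal BaseMutableDataStore stand-in backed by a map.
    public static final class BaseMutableDataStore {
        private final Map<String, Object> values =
                new HashMap<String, Object>();
        public void put(String name, Object value) { values.put(name, value); }
        public Object get(String name) { return values.get(name); }
    }
}
```

The point is that the test body reads as assumptions and assertions only; the fixture details live behind the named helpers.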

I am currently trying to expand the same presentation to integration tests that our QA could review and help improve and refer to as part of their testing plans. Hopefully more on this in a future post.

Also, as I am keen on continuously improving, please do not hesitate to help me get better (my special thanks to Jay for his posts already).

QCon San Francisco 2008 – Kent Beck – Responsive Design

Wednesday, December 3rd, 2008

It seems incredible that the room picked for this talk was so small. As could be expected it was filled and overfilled.

I always seem to walk out of Kent Beck’s talks with mild disappointment. This is probably an effect of me expecting a full hour of breathtaking and groundbreaking material from him. The reality is that the material he presents is so refined (reduced to its simplest form, but no simpler) that most of the talk is filling.

Building on previous work, he started with a definition of design and the influence of values (how do I evaluate the quality of a design) and principles (to some extent an operational effect of the values) on the design, as well as the use of patterns.

Then came the core of the talk: design strategies. Kent Beck identified four design strategies from his experience. They are really strategies to evolve a design.

  • First is the leap: jumping from one design to another in a single go.
  • Second is parallel, where the old design and the new one co-exist for a while.
  • Third is the stepping stone, whereby successive intermediary steps are defined and implemented before arriving at the new design.
  • Fourth is simplification, whereby the design is made for a simplified version of the problem and then enriched by reintroducing the complexity.

I will craft examples around boat building to better explain what I understood. Leap would be to transform a sailboat into a motorboat by cutting the mast and adding a motor. Parallel would be adding a motor and, once that motor works in a satisfying manner, removing the mast. Stepping stone would be, starting from scratch, to build a hull that floats, then add flooring and oars and verify the boat navigates, and finally add a motor. Simplification would be to build a boat that can navigate on a lake on a calm day, then enhance it to navigate on a stormy day, then adapt it to navigate close to the coast, and finally adapt it to navigate on the open ocean.

I wonder if prototyping is also a distinct design strategy.

Anyway, overall a valuable talk from Kent Beck. Much better than last year’s or this year’s keynotes.

QCon San Francisco 2008 – Tim Bray – The shifting storage spectrum

Thursday, November 27th, 2008

Those who arrived late that day (the day after the conference party) missed the most important talk of the conference. See Martin Fowler’s reflection on it to get an idea.

[Image: Tim Bray QCon 2008 storage hierarchy diagram]

For me it was a liberating talk. First because I had fallen into the habit of not thinking enough about storage, with the reflex of falling back on the familiar RDBMS, and this is costing my current project a lot of time and effort. Tim Bray managed in fifteen minutes to break that habit by presenting the storage choices available at each point in the storage hierarchy.

For instance, when you have to persist objects, why go to object-relational mapping and an RDBMS at all? Why not use the file system directly? As a side note, in my current project we did, and had to revert to an RDBMS because our system administrators could not make file-system replication work!

The talk continued with a very useful enumeration of the various alternatives for each element of the persistence stack.

He concluded with performance comparisons between storage media (memory, networked memory, solid-state disk, disk, and tape), and a quick review of the impact of the file-system implementation on disk performance.

A liberation I tell you !

The proxy pattern

Tuesday, October 14th, 2008

I recently had to prepare a four-minute talk presenting my favourite design pattern. Proxy was my choice, and it made me realise a few things about it.

First, that in the pattern (as defined by Erich Gamma, Richard Helm, Ralph Johnson, and John Vlissides) the proxy has the same responsibilities as the real subject. This is a much more restrictive definition than what I had come to think of as proxies. Also, in one form or another, the proxy is used to control (mediate?) access to the real subject. Nothing new here; this is the plain definition.

I have three typical uses for the Proxy. First, access to remote objects, which is now so commonplace that people hardly notice it. Second, error handling around resources that can become unavailable (connections, for instance). Finally, access to pooled or cached instances of objects. In that last case the proxy can be left as a permanent placeholder and negotiate the use of the actual resource when needed (see the impact of caching on your design). This last use also realises for dynamic objects what the flyweight pattern intends for static ones.

After thinking about it, I found why this pattern is significant for me. First, proxies enable some degree of separation of concerns (say, error handling versus the actual function), making the proxy a lightweight form of aspectisation (see how Spring does AOP). Second, I like it because it provides a great example of the usefulness of loose coupling through interfaces (another one being unit testing). Finally, I find that proxies give a more dynamic feel to an application. They enable easy reconfiguration, hot swapping of objects, and controls. This is particularly useful in a language like Java.

I will end this short presentation of my “favourite design pattern” with a note on dynamic proxies in Java. They do not necessarily lead to implementations of the proxy pattern; they can just as well be used to implement other structural patterns (from facade to bridge and adapter).
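As an illustration of dynamic proxies mediating access to a real subject, here is a minimal sketch using the standard java.lang.reflect.Proxy (the Greeter service and the logging concern are my own invented examples):

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

// A dynamic proxy that controls access to a real subject by logging
// every call before delegating. Greeter is a hypothetical service.
public final class ProxyDemo {
    public interface Greeter { String greet(String name); }

    public static Greeter loggingProxy(final Greeter real, final StringBuilder log) {
        return (Greeter) Proxy.newProxyInstance(
                Greeter.class.getClassLoader(),
                new Class<?>[] { Greeter.class },
                new InvocationHandler() {
                    public Object invoke(Object proxy, Method method, Object[] args)
                            throws Throwable {
                        log.append("call: ").append(method.getName()).append('\n');
                        return method.invoke(real, args); // delegate to the real subject
                    }
                });
    }

    public static void main(String[] args) {
        StringBuilder log = new StringBuilder();
        Greeter real = new Greeter() {
            public String greet(String name) { return "Hello, " + name; }
        };
        Greeter proxied = loggingProxy(real, log);
        System.out.println(proxied.greet("world")); // Hello, world
        System.out.print(log);                      // call: greet
    }
}
```

Because the handler only sees the interface, the same mechanism can just as easily implement an adapter or facade, which is exactly why dynamic proxies are not tied to the proxy pattern.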

And that made me realise that design patterns are first and foremost about intent. The implementations of different patterns can be identical, making them difficult to identify in a code base. One more reason to use sensible naming in your code, I guess.