Adventures of Bloom & Wallace

a work in progress

The Future Of HRM Software: Agile, Models-Driven, Definitional Development

 [Updated 6/18/2012 -- Huge shout-out to Stan Swete of Workday and Raul Duque of Ultimate (both of whose firms have been clients, as was Raul's former employer, Meta4) for their review and feedback on my much shorter first draft of this post, which addressed only models-driven development.  Based on their feedback, I've now touched on agile software development and metadata-driven definitional development.  And I would also like to thank Steve Miranda of Oracle for writing to me about the progress that Oracle is making along these same lines and Mike Rossi of SFSF/SAP for briefing me on their aggressive pursuit of metadata-driven applications.  As I learn more, outside of NDAs, about what Oracle, SFSF/SAP, and others are doing to adopt these approaches, I hope to update this post further. 

These three approaches, models-driven development, agile software development, and metadata-driven definitional development, combined and managed properly, are at the heart of achieving the order of magnitude improvement in software economics, including time-to-market and quality-to-market, which is the real focus of this post.  If I embarrass myself in front of the real experts in these areas, the fault is entirely my own, and largely due to my own lack of expertise.  Hopefully, you'll educate me and my readers with your comments.  But I should also say that I wrote this post to be accessible to as wide an audience as possible, to respect all the NDAs under which I operate, and to be just a short introduction to these important ideas.

Please note that I've added several additional resources at the end of this post.]

A Little History

And The Magic Happens Here!

In 1984 I published an article in Computerworld entitled “Secret to cutting backlog?  Write less Code!”  The entire article may be worth your time, but then perhaps I’m the only one who is amused by the memories of so distant a past in the history of business applications.  What’s relevant to the here and now is that, even then, I was painfully aware that we were buried in demands for business software — and in ever-changing requirements to extend and change that software — that we were never going to be able to meet unless we changed fundamentally our whole approach to designing and developing same. 

Thus began the intellectual journey that led to this post.  AMS, my employer in 1984, had been given a ton of money by the US Army to develop the requirements for a new personnel system.  But an important requirement of that contract — important to me personally because, on some level, it was the making of my career — was that we launch two parallel projects to define those requirements.  One project used traditional (what we now call Victorian novel) requirements definition, which was the norm in the then standard waterfall systems lifecycle.  The other project used one of the original requirements gathering CASE tools, PSL/PSA, along with the best (then) available event-partitioned data and process modeling techniques.  I ran both projects at AMS, and I learned, quite painfully, three professional life lessons about the limitations, even wrongness, of the then current approaches to systems design and development.

Lessons Learned At AMS

The first big lesson was the wrongness of depending entirely, for system requirements, on asking customers what they want “the system” to do.  What I learned on that project and have practiced ever since is to study the customer’s business, represent the essence of that business in models, and then conceive of how available — and even not quite yet available — technology could be used to reinvent that business.  And while the techniques of modeling have evolved considerably, great domain models have always been intended to let us experiment with a business domain, to understand it and to reinvent it, in ways that we could never do with an existing organization. 

The second big lesson was to focus on the pattern in the problem rather than to get buried in all of the details, to see the essential nature of HRM rather than just the surface confusion.  It’s that study of the pattern in the problem that was the genesis for so much of my thinking about preferred architectural behaviors.  For example, determining eligibility for something is a foundational pattern in HRM, e.g. for participation in a specific developmental event, qualifying for the payout in a specific project success compensation plan, enrollment in a specific health care benefits plan, or getting a yearly replacement for your smartphone.  Once it becomes obvious that the eligibility pattern will be needed across HRM, and that the eligibility criteria represent a bounded albeit large and complex set of Boolean expressions across a domain’s objects and their attributes, one can conceive of a metadata-driven eligibility “engine” which can be used across all of these examples and many more.  My emphasis on these preferred architectural behaviors — and there are many — is central to my quest not only for writing less code but also for elevating the use of metadata to drive HRM software.
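To make the eligibility pattern concrete, here is a deliberately tiny sketch (the rule format, attribute names, and function names are my own invention, not any vendor's schema): the criteria live as data, and one generic engine evaluates them for benefits enrollment, compensation plans, device refreshes, and anything else.

```python
# A minimal, hypothetical sketch of a metadata-driven eligibility "engine."
# Rule format and attribute names are illustrative only, not any vendor's.

OPERATORS = {
    "==": lambda a, b: a == b,
    ">=": lambda a, b: a >= b,
    "<":  lambda a, b: a < b,
    "in": lambda a, b: a in b,
}

def is_eligible(person: dict, criteria: list) -> bool:
    """Evaluate a conjunction of (attribute, operator, value) triples --
    the metadata -- against a domain object's attribute values."""
    return all(
        OPERATORS[op](person.get(attr), value)
        for attr, op, value in criteria
    )

# The same engine serves very different HRM use cases; only the metadata differs.
health_plan_rules = [("employment_status", "==", "full_time"),
                     ("service_months", ">=", 6)]
phone_refresh_rules = [("grade", "in", {"senior", "principal"})]

worker = {"employment_status": "full_time", "service_months": 18, "grade": "senior"}
print(is_eligible(worker, health_plan_rules))    # True
print(is_eligible(worker, phone_refresh_rules))  # True
```

Changing who is eligible for what then becomes a metadata edit, not a programming task — which is exactly the "write less code" payoff.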

The third big lesson was that the then best practice waterfall lifecycle was based on three fallacies and was, therefore, a complete fool’s paradise.  In the waterfall lifecycle, you spent a ton of time and resources pinning down requirements, written in a Victorian novel style that business users of the era could understand and validate, and then got them signed off in blood before moving on to design, development and more.  The first fallacy was that the users knew what they wanted when they mostly had no clue what was possible.  The second fallacy was that those pesky requirements, even assuming they were correct, would stand still long enough for us to build and deliver the software — let alone continue to stand still once that software was delivered.  We were awash in requirements documents that demanded traceability throughout the lifecycle, and very little energy or time was left for innovation.  But the third fallacy was the real killer.  By the time you were awash in requirements, it was practically impossible to discern those patterns in the problem that would have led to elegant designs, to producing less code because we could build and reuse those patterns.  The evolution of software lifecycles to what we now refer to as agile is a direct response to these and many more fallacies of that older waterfall approach.

AMS made major breakthroughs in all these areas, and the learnings noted above were definitely not achieved on my own. But I do believe that I was early in applying these learnings to the HRM domain.

Lessons Learned At Bloom & Wallace

When I left AMS, I proceeded to build upon what I had learned to model and remodel the HRM domain, using better modeling techniques as I went along and testing my thinking with a series of large, global clients and their strategic HRM/HRMDS planning projects.  And, something I hadn’t been able to do as fully while still at AMS, I saw many more patterns in the domain, both as to the subject matter and the system capabilities needed to deliver that subject matter, which I expressed as part of a growing body of preferred architectural behaviors.

Over the last twenty-five years, my work in modeling the HRM domain and seeing those patterns was the basis for creating and supporting my widely-licensed “starter kit.”  I believe this work has had some impact on the underpinnings of the best of HRM enterprise software, and influenced (again, I hope, at least in a small way) some HRM software product architects and business analysts.  Earlier this year, I announced that my domain model IP would not be licensed beyond 2012, but that doesn’t mean that our work in this area is done — not by a mile.  The good news is that there are now a number of HRM software vendors on this path to getting a lot more bang for their buck — and for their customers.

The promise of CASE tools, which had been the holy grail of software engineering, was that it would be possible to go directly from a completely modeled expression of the desired aspects of the domain directly to usable functionality, delivered functionality, without being touched by human hands.  The hope was that we could build a set of tools, putting all of our computer science and engineering talent to work on those tools, which would be able to gobble up those models, themselves defined to these tools, and presto, chango, out pops the application.  No compilation, no hand-tuning, and no messy/expensive/error-prone/slow-to-market applications programming. 

This concept, which was called models-driven development, has evolved into the more definitional development approaches used first (to my knowledge in the HRM domain) by Meta4 in the late ’90s and, more recently and with much greater visibility in the US, by Workday.  We’ve also added to our toolkit the creation of metadata-driven “engines” that can be used and reused across the HRM domain.  There are other HRM software vendors working with these techniques, to include reports on same from both Oracle and SFSF/SAP, and there’s a lot more of such work that is still in stealth mode but with very promising early results.  I hope to see a lot more of this become visible to the market before the end of 2012.

I believe that this combination of an agile lifecycle with models-driven and definitional development, to include the development of metadata-driven “engines,” is our best hope for being able, finally, to wrestle to the ground the very real challenge of having our business applications evolve as quickly as do our businesses — and without adding to the tremendous technical debt burden which so many HRM software vendors are facing.  Writing less code to achieve great business applications was my focus in that 1984 article, and it remains so today.  Being able to do this is critical if we’re going to realize the full potential of information technology — and not just in HRM.


There’s so much more that I should write about the strengths and some pitfalls of agile software lifecycles, about how modeling a domain in objects helps us see the patterns in that problem domain with enough clarity to build metadata-driven “engines” from those patterns (e.g. an eligibility, calculation, or workflow engine) rather than creating lots of single purpose applications, and how those models can become applications without any code being written or even generated.  Hopefully, the real experts in our industry will jump in to correct what I’ve written and to expand upon it.  And I’d sure love to hear from HRM software vendors who aren’t my clients but who are practicing and advancing these techniques.

What’s important here is to make as clear as I can the power of any HRM software architecture, of any development approach, whose robust domain object models become the functionality of the applications with a minimum of human intervention, whose business functionality is therefore built and modified only at the models level.  Such an approach can be very flexible initially and over time, easy and fast to implement, and inexpensive for both the vendor/provider and customer to acquire and maintain.  And this approach provides full employment for anyone who really knows how to elicit well-constructed domain models from the business ramblings of subject matter experts.  Most important, such an approach shortens the intellectual distance between our understanding of the problem domain and our automation of that domain.
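A deliberately tiny sketch of what "the models become the functionality" can mean in practice (entity names, the model format, and the `create` helper are invented for illustration — real definitional environments such as Workday's are vastly richer): the application "code" is a model definition, and one generic interpreter turns any definition into validated, working objects, so business functionality is built and modified only at the models level.

```python
# Hypothetical sketch of definitional development: the model IS the app.
# Entity names, attribute types, and the helper below are invented examples.

MODEL = {
    "Job": {
        "attributes": {"title": str, "grade": int},
        "required": ["title"],
    },
    "Position": {
        "attributes": {"job_title": str, "open": bool},
        "required": ["job_title"],
    },
}

def create(entity: str, **values):
    """Generic constructor: validates any instance purely from the model
    metadata -- no per-entity code is written or generated."""
    spec = MODEL[entity]
    for name in spec["required"]:
        if name not in values:
            raise ValueError(f"{entity}.{name} is required")
    for name, value in values.items():
        expected = spec["attributes"].get(name)
        if expected is None or not isinstance(value, expected):
            raise TypeError(f"{entity}.{name}: unknown attribute or wrong type")
    return {"_type": entity, **values}

job = create("Job", title="Data Architect", grade=7)
print(job["_type"], job["title"])  # Job Data Architect
```

Adding a new entity here means adding a `MODEL` entry, not writing a new module — which is the "shorter intellectual distance" between understanding the domain and automating it.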

I would be remiss if I didn’t point out that the challenges to accomplishing this are huge (but the moat created by any vendors who succeed is equally huge):

  • difficulty of accurately representing the domain in a rigorous modeling methodology, along with the need for extensibility, evolution and modifications over time;
  • difficulty of building tools which can translate those models into operating objects;
  • difficulty of seeing the patterns in the domain with enough clarity to recognize the needed “engines” — and then in building “engines” which can operate solely on metadata;
  • difficulty in abstracting complex HRM business rules to metadata;
  • difficulty of achieving operational performance with large volumes (although in-memory data/object management opens up a lot of possibilities here as well as with both embedded and predictive analytics);
  • difficulty of adjusting those operational objects as the models evolve without human intervention, i.e. without coding; and
  • many more that keep software engineers awake at night.
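On the second-to-last bullet — adjusting operational objects as the models evolve without coding — a self-contained toy illustration (the schema and function are my own invention, not any vendor's design): when behavior is looked up from live metadata at run time rather than compiled in, a model change takes effect immediately, with no code change and no redeployment.

```python
# Hypothetical sketch: behavior driven by live metadata lookup, so the
# "application" evolves when the model does, with no code being rewritten.

model = {"Worker": {"attributes": ["name", "hire_date"]}}

def describe(entity: str, record: dict) -> dict:
    """Project a record through whatever attributes the model defines *now*."""
    return {a: record.get(a) for a in model[entity]["attributes"]}

rec = {"name": "Ada", "hire_date": "2012-06-18", "grade": 7}
print(describe("Worker", rec))  # "grade" is not yet part of the model

# The model evolves -- an analyst adds an attribute; the engine is untouched.
model["Worker"]["attributes"].append("grade")
print(describe("Worker", rec))  # "grade" now appears; no code was written
```

The hard part, as the bullets above say, is doing this at enterprise scale and performance — not the lookup itself.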

As big as are the challenges, or even more so, are the benefits if those challenges can be met.  And it’s my opinion that the amount of lift I mentioned above, if it can be achieved and sustained, made to scale both operationally and in a business sense (e.g. finding those few individuals who understand the HRM domain in a profound way and who are able to express that domain in fully articulated models is a huge challenge to scaling these approaches), will change in a fundamental way the economics of the HRM enterprise software business.  If I’m right, you’ll want to be on the agile, models-driven, definitional development side of the moat thus created, whether you’re an HR leader, working in the HRM software vendor community, or an investor in that community.

 Some suggested readings:

Metadata-Driven Application Design and Development by Kevin S Perera of Temenos, January 2004

Summary: Presents an overview of using a metadata-driven approach to designing applications.  If the use of software frameworks can be defined by patterns, then metadata is the language used to describe those patterns.

Workday’s Technology Strategy:  A Revolutionary Approach To Redefining Enterprise Software from Workday, 2006

The Work Of Jean Bezivin

The work of Jean Bezivin at the University of Nantes, France, where he is now an emeritus professor.  You can follow him on Twitter @JBezivin, which I do religiously even though I can’t fathom a good bit of his citations, and not just because some of them are in French.  But this blog is in English.

The Work Of Curt Monash

Curt, an expert in all things database-related, has written some of the best pieces I’ve seen on the underlying Workday architecture, especially as regards their data architecture.

The Work of Johan Den Haan

Johan is the CTO of Mendix, a vendor of an applications delivery PaaS.  He writes on a wide range of topics related to the subject matter of my post, and I’ve learned a ton from following his work.

22 comments to The Future Of HRM Software: Agile, Models-Driven, Definitional Development

  • Hi Naomi. Really good to find this content (thanks, Google) and to know that what we are doing at Imperial College London and in a number of UK universities is not isolated. We have an ambitious initiative to bring together enterprise architecture, interface design and SOA, data warehouse design, data governance and quality, impact analysis, agile development and applications consolidation. We can only make this happen through metadata modelling and by building a queryable information repository and glossary. Have you any knowledge or experience of similar projects in other sectors, whether related directly to HRM or more broadly across other functional areas?

    Although our approach is ambitious, there are quick wins (for instance interface simplification, cross-system prototype development and change impact), which help us to continually monitor the value of this work.

    Without delivering seen value, like many enterprise architecture and modelling projects before, we risk being punted into the long grass while other projects are seen as higher priorities. Do you have any thoughts on keeping such initiatives in the eye of the stakeholders?

    • Naomi Bloom

      Toby, So glad to have your voice here. In my space, HR technology, Workday is the applications vendor which has gone the farthest down the road described in this post as well as having innovated in many other ways. They built their own tooling, and they’ve delivered thus far a pretty comprehensive HRMS/TM suite (recruitment is brand new and will be GA early next year, and learning has just been noted as a potential build down the road), a pretty comprehensive financials suite, and a number of items focused on higher ed, including grants management. At their recent user conference, they also announced a new initiative, their student applications. If you go to their investor Web site, there’s a replay of their financial analyst day that covers some of their plans as well as talking a little more about their architecture. And there are some good posts on their blog from Stan Swete about their technology. While there are many others in my space on the agile, models-driven, metadata-rich, definitional development journey, I believe that Workday is the farthest along. If it would be helpful, I would be happy to connect you to the right person there to discuss their approach to these topics. In full disclosure, they have been a consulting client as have been many of their competitors, and I have licensed my own set of HRM object models and architectural “starter kit” to a few dozen vendors across the HR tech industry.

      From a strictly tools perspective, you may want to look at Mendix, EnterpriseWeb and Procession. I’m not a tools expert by any means, but these three firms all appear to be doing interesting things around the topics of my post. I have good contacts at EnterpriseWeb and Procession if you need introductions. I’ll be on the road until Nov, so I would need your patience on any intros. I would also need to know a little about you, including your title and contact information, if you want me to connect you to anyone mentioned here. Best of luck with your plans.

  • [...] software vendors and outsourcing providers hot on the trail of bringing correct object models and Blooming architecture to their increasingly true SaaS platforms.  Yes, metadata frameworks, proper multi-tenant [...]

  • [...] object models (and getting to correct is no mean feat) as the foundation of their applications,  a metadata-driven definitional development approach to writing a whole lot less code (so code only for the tools and NOT for the [...]

  • [...] vendors and outsourcing providers would be hot on the trail of bringing correct object models and Blooming architecture to their increasingly true SaaS platforms?  Yes, metadata frameworks, proper multi-tenant [...]

  • [...] But there are some objects, like JOB, POSITION, KSAOC and WORK UNIT, whose definition, attributes, methods, and lifecycle events are much more directly under the control of the organization.  There may be relevant regulatory or contractual (and here I’m referring to labor contracts) constraints on how we define objects or on their lifecycles, but much of what goes on here relies on what the organization chooses to do.  And there are considerable similarities in the lifecycles of these constructed objects: the organization conceives of the need for a new JOB, it determines the attribute values and methods for that JOB, it opens up that JOB for use by particular parts of the organization, and so on.  When modeling the HRM domain, it becomes clear that the highest level object classes and many lower level objects fall into distinct groups — including person, external organization, constructed, event — whose object lifecycles are patterned by group.  Knowing these patterns is one key to being able to automate these objects in more of a definitional than procedural way. [...]

  • martin hoyes


    Joel’s ears are probably burning with this discussion here. His upfront vision and patience to go with this definitely deserve a mention.

    You are right in that Joel probably rolled Integral learnings into the second generation Ebiz suite. The initial Ebiz benefits team was a nice blend of ex-Integral and Ebiz core HR transplants based on one floor day and night in California during the dot-com boom. Interestingly, the meta-data schema design was firmly influenced by ex-Integral (utilizing CASE tools for schema maintenance). The life event processing perhaps was of equal influence, with the underlying batch API strategy a progression from our Ebiz core HR and HR self service experience.

    In addition to the climate that Joel created to allow this to occur we had very productive and committed deployment partnerships for business, functional and technical aspects which were probably also key outside of the architecture.

    Particularly topical and fresh innovation from the team was the in-memory processing integration directly into the benefits engine. This in-memory gamble by a few of us at the time probably ensured long term commercial viability of the Ebiz benefits platform. Thinking back we probably achieved this in several distinct Agile iterations over three years in parallel with customers deploying and scaling rapidly. Certainly we were a long way from waterfall.

    The biggest individual in-memory surgery was the mapping of the relational meta-data schema defined by the ex-Integral folks’ roots into the benefits eligibility engine session memory. The technical complexity here was working with tens of megabytes rather than the terabytes of today. At runtime this implementation did actually have some object based characteristics whilst working within the limitations of a procedural language.

    Joel probably got more than the architecture and methodology right here. Perhaps the team blend and the selection of partnerships were also as significant. The team did disperse their separate ways fairly rapidly after the dot-com bust but the platform lived on to incrementally evolve with the times at less of a pace.

    You probably know I have to state here that these are obviously my own views and not those of my employer. All of this is backward looking dot-com boom memories rather than forward looking product/platform direction.

    • Naomi Bloom

      Your team was so far ahead of its time, and I thank you for this stroll down memory lane with Joel. We lost a lot when he left the industry.

  • martin hoyes

    I know this is not necessarily associated with next generation architectures and not often mentioned in the same sentence as SaaS vendors. However, the second generation of Ebusiness suite offerings on the benefits foundation back in 2000 ticks a lot of the boxes above. Certainly not objects, but definitely eligibility/calculation engines and meta-data driven. This original benefits eligibility engine has been relatively seamlessly adapted over the decade to underpin areas such as absence management, compensation, grade step progression, contingent worker, iRecruitment etc. We definitely took a lot of architectural criticism inside and outside of Oracle back in 1998 when we went with 400 meta-data tables and 15 transactional. It was new then and people struggled with it, as mentioned in the article. Over the years I’d say the biggest difficulty has been preserving the purity of the engine against the solution development practice of case-by-case development. I can’t tell you how many proposals I’ve had over the years to re-architect our eligibility engine to accommodate a particular solution. So preserving the purity of the engine has been the standout challenge looking back.

    This is an interesting topic. Just thought I’d point out it’s not as new as people may believe and certainly not exclusive to SaaS architecture. I would say very good architectural practice for HRM platforms now and looking forward!

    • Naomi Bloom

      Martin, You are so right about many of these ideas not being new or about SaaS. I know a little something about the EBS work you reference above, and how far ahead of its time this work was. Well done you for preserving the integrity of your design in the face of many pressures. I had worked with Joel Summers at Integral, before he joined Oracle, and I believe we learned a lot together about models-driven development, metadata-driven “engines” and more when Joel led Integral’s (after Dave Duffield left to start PeopleSoft) next gen InPower project. What the newest generation, the true SaaS generation, of design/development has given us is one more opportunity to take everything we collectively know about good architecture, design, project management, etc. and apply it more completely. And I hope we do.

  • This is a great article and I hope it gets people thinking.

    The exploding diversity of sources and targets is driving people to reconsider current approaches. Conventional middleware stacks add accidental complexity as much as they add capability. Every problem in middleware is addressed by adding more middleware – it just doesn’t scale.

    It is time to reassess approaches and you are spot on with your assessment – the future is in 5GLs, constraint-based architecture. To paraphrase your description – a bounded albeit large and complex set of Boolean expressions across a domain’s objects and their attributes – a metadata-driven “engine”.

    We’ve got this today and as you and a few others rightly point out – the trick, once you have the right architecture, is in abstracting complexity so non-technical users can build and maintain their own apps, leveraging power of metaprogramming w/o having to understand implementation details.

    Dave Duggal

    • Naomi Bloom

      Thanks so much for your encouragement. I hope folks will take a few moments to check out your Web site and learn more about what you’re doing at Ideate. Some days it feels like I’m pushing noodles uphill with my nose, but I know that we’re on the right track here.

  • There’s a big challenge I’m always trying to wrap my head around with flexible meta-data driven configuration. The more ability there is for someone to configure something, the more complexity, and thus the more knowledge the person doing the configuring needs. At some point the knowledge required to create your own complex configuration to meet your business needs is akin to learning a programming language – so do you really have a configurable tool, or have you just written your own domain-specific programming language that needs programmers? We have such a system that we use to develop tailored talent solutions for global organizations, but our experts do the metadata configuration. To put it in the hands of someone who just understands the business need, we have to reduce the configurability to make the system simple enough to use. So it’s hard to see an achievable holy grail out there.

    • Naomi Bloom

      Your point on this is very well-taken. Where you’re dealing with full-scale models-driven, definitional development, very skilled people are needed to model and then to define the models to the development environment. And yes, they are developers, albeit working more with the rigors of developing and maintaining the models than with traditional programming languages. Quite a different matter is the level of configuration, of metadata options, that you expose to customers and partners. Here “guard rails” are needed because very few business analysts on the customer side are going to understand the full power of the models, let alone how to manipulate them without getting into awful messes. Thus, the configuration framework that’s intended for end-user analysts is very different from the development environment, however models-driven and definitional.

  • Jim Konandreas

    Love the post.

    One advantage of model driven development that I’ve always felt goes unnoticed is that “the domain model” acts as a language for everybody in the company – not just development – to understand the system.

    Too often the broken telephone setup of Market Requirement to Product Requirement to System Design Doc can never be adequately communicated back up to stakeholders on how the system behaves. That leaves critical portions of the company – implementation consultants, training staff and sales – potentially not in lock step with how the system really works. A domain model, described with real world entities (like Employee, Benefit Plan, Job etc), gets everybody speaking the same language – which just so happens to be a language our customers understand too.

    I require it for every feature we build.

    • Naomi Bloom

      Thanks so much for your feedback, and I agree totally with the need not only for a common language but one that is accessible to business people. I’m excited that these are techniques you’re practicing at Dayforce and am looking forward to our meetings in August.

      • The “Map is the App,” with every step described in business language and all workflow and rules exposed, allows all interested parties – users, managers, auditors and regulators – to readily understand the deployed application and, of course, collaborate on any required changes.

  • Jim

    Agile has been working with a fairly good rate of success (as in way better than waterfall) for the last 10 years. MDA on the other hand has been considered snake oil during this same time.

    • Jim
      MDA or MDD, just like “agile,” has been overhyped and abused by those that jump on bandwagons. Some are good, some are not. Believe me when I say that the good really does work – the “model” is the new build environment: core code does not change, no code generation or compiling. So where are you on this scale?
      All truth passes through three stages.
      First, it is ridiculed.
      Second, it is violently opposed.
      Third, it is accepted as being self-evident.
      – A quote from Arthur Schopenhauer, a German philosopher who was among the first to contend that at its core, the universe is not a rational place!
      Sounds like you are still at ridicule stage whereas I think Naomi might be a believer?

    • Naomi Bloom

      Jim, Because there are good examples of models-driven definitional development, of highly metadata-driven “engines” in the HRM domain (some visible, some not), I’m inclined to believe that it’s possible (but very difficult, hence the lift for those who can do it) to use these ideas successfully. You might want to take a closer look at Workday, which is pretty visible, before declaring the referenced development techniques to be “snake oil.”

      • Jim

        We are talking about software here, right? In the soft dev world agile has been a revolution. There are several books, ‘thought-leaders’ and companies involved. There are also many developers embracing it. I could list over 20 books and relevant personalities without having to google a thing.

        MDD on the other hand had its prime time during the 90s, with UML at the front lines. We’ve been chasing this dragon of business specs becoming code by fairy magic for quite some time. It’s appealing, makes sense, sounds great, good luck.

        Recently DSLs have finally become more mainstream. Within that wave there’s something called language workbenches which is a second attempt akin to MDD. Sadly there’s no good ref material about that. Fowler decided to skip it on his DSL book. The jury is still debating on this.

        Please don’t let MDD take a free ride on the success of agile. They are not even close at the level of adoption/success/over-hype or community involvement.

  • Hi Naomi
    What a great piece reviewing the aspirations of the past. Over our history we have met many who described the same story. I recounted our story here.
    To create the solution you need to think about people and what they require to achieve their individual and collective outcomes. After all, they create all the source information. The core design philosophy is explained here and is based upon the fact that business logic just does not change, so why should the underlying software code?
    So we have created the “moat,” but the forces of self-interest on the other side are indeed great!
