What's the BIG deal about end user development?
End User Developed Applications (EUDA), also called User Developed Applications (UDA) or End User Development (EUD), is the practice of non-technologists, or at least employees who are not in the technology department, developing applications themselves and using those applications in a production setting.[^1][^2][^3]
What's the little deal?
Many of you will be familiar with the issues that arise when non-experts develop applications and/or when applications are developed in non-typical ways. This is not particular to end user application development but rather reflects immature technology practice wherever it occurs. Here are a few of the issues that can arise:
- (probably) inexpert use of the language and tooling leads to wasted time
- no development standards or poor adherence to standards leads to poor quality work
- no independent testing leads to errors in production
- no version control and proliferation of old copies lead to wasted time and errors in production
- no quality control leads to errors
- no documentation leads to wasted time
- no resiliency planning leads to wasted time when unexpected circumstances arise
- opaque dependencies mean applications depend on data/connectivity/user input/accounts that are not obvious
- 'super glue' effect means UDAs get into the corporate production environment without a rigorous tollgate process and are then very hard to remove
- 'tail wagging the dog' effect means processes form around the application and pretty soon you are 'opening the confirmation spreadsheet' rather than 'confirming the transaction'.
All of that basically boils down to errors, poor process and inefficiency. That is all bad, I agree. But it's not the really BIG deal.
What's the BIG deal?
At one point in my professional past I worked for an organisation that did payment processing. They processed millions of transactions daily, all across the globe, and had very robust and suitable systems for this (and I dare say larger) workload.
They also had over 4000 user developed applications in current or recent use. Presuming every one of those applications had a single core use case, the end users in this business thought there were an additional 4000 ways to do this business.
That is the BIG deal!
This is interesting for so many reasons and I'll discuss a few in the next few paragraphs. Let me first state, however, that I didn't dig into this 4000 so I cannot make any empirical statements about the complexity of these applications or the duplication in this population (although I know from anecdotal evidence that it wasn't particularly high). That certainly would have been nice to have, but I'll continue without it and speak more generally about a mature business with a non-trivial number of user developed applications.
Let's go!
Firstly, looking at this from the point of view of wasted effort, there simply aren't 4000 extra use cases in a mature business operating in a well-understood market. The payment processing business is very competitive and is only profitable at scale; therefore any organisation that is in it must have the systems to do it. They've got the core use cases covered. They've got the non-core use cases covered. They are working on new use cases, at scale. If there were 4000 use cases they hadn't covered they'd be out of business. So those applications represent wasted effort.
Secondly, looking at this from the point of view of organisational dysfunction, end users are the ones in the trenches, doing the job and using the systems day in and day out, and they have requirements that aren't being met. They have 4000 of them! I know from first-hand experience that end users are not knocking up spreadsheets just for the fun of it. Many End User Developers find it a frustrating process, but that frustration must have been outweighed by the frustration of an unmet requirement. 4000 times. I think it's reasonable to suggest this evidences a degree of the technology department being out of touch with its user base.
Thirdly, looking at this from the point of view of individual responsibility, how can an end user who understands the payment processing business think there are an additional 4000 valid things the organisation should be doing? There aren't! Now, granted, there's typically no global view; in other words, the end user doesn't know that they are making the 4000th UDA. But they must know they are not expert developers and consequently that they are doing something potentially dangerous for the business. It's certainly true that the negative aspects of end user development are more widely known now than they once were. But very often it is easier to just 'bang out a spreadsheet' than to raise a requirement against a strategic system, present the case for escalation and argue successfully for prompt fulfilment. So users have been doing the easy thing and not the right thing.
Finally, looking at this from the point of view of business agility, these applications are business quicksand. For all the reasons stated in the little deal section earlier, these applications tend to be very brittle. They very often cannot be run: by a different user; on a different machine; on a different date; in a different context or with different data. They are almost never scale invariant (in other words they will break with more volume) and they are almost never defect free. They are invisible at a global level but indispensable at a local level meaning they don't feature in planning and strategy but they are an impediment to execution of that strategy.
That's the big deal about end user development.
What should be done about it?
Maybe nothing. I've focussed only on the negatives, but end user development delivers technology solutions in short order which capture business that would otherwise be lost and save time that would otherwise be wasted. Quick and dirty is quick after all.
Maybe something. Most operations in a mature business have to be performed in a consistent manner. Application development is no different, whether done by a user or an expert. End user development is chaotic, standards-lite or standards-free and generally badly done. It shouldn't continue unsupervised in the organisation. There are technology solutions to the little deal problems provided by the likes of Cimcon and ClusterSeven, among others. It's also reasonably straightforward to put surveillance in place to inventory end user applications and guard against proliferation, as the sketch below illustrates.
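As a rough illustration of what that surveillance might look like, here is a minimal sketch, assuming a Windows file share and a hypothetical set of file extensions, that walks the share and writes an inventory of candidate UDA files to a CSV. It is not any vendor's product, just the simplest possible sweep:

```python
"""Minimal UDA inventory sketch: walk a shared drive and record candidate
end user developed applications (spreadsheets, Access databases, scripts)
with basic metadata, so the population can at least be measured."""

import csv
from datetime import datetime
from pathlib import Path

# Assumptions for illustration: the share path and extension list are hypothetical.
SHARE_ROOT = Path(r"\\fileserver\departments")
UDA_EXTENSIONS = {
    ".xls", ".xlsx", ".xlsm", ".xlsb",   # spreadsheets
    ".mdb", ".accdb",                    # MS Access databases
    ".bat", ".cmd", ".vbs", ".ps1",      # batch and script files
}

def inventory(root: Path):
    """Yield (path, size in bytes, last modified) for each candidate UDA file."""
    for path in root.rglob("*"):
        if path.is_file() and path.suffix.lower() in UDA_EXTENSIONS:
            stat = path.stat()
            yield path, stat.st_size, datetime.fromtimestamp(stat.st_mtime)

if __name__ == "__main__":
    with open("uda_inventory.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["path", "size_bytes", "last_modified"])
        for path, size, modified in inventory(SHARE_ROOT):
            writer.writerow([str(path), size, modified.isoformat()])
```

Run on a schedule, even a simple sweep like this gives a baseline count and shows where new files are appearing; the dedicated tools mentioned above go much further, but visibility is the first step.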
Regarding the big deal problems, well, UDAs can actually be part of the solution. UDAs are like band-aids. The very existence of a UDA is a pointer to management saying 'things may not be right here, come and have a look'. Reactionary policies like blanket bans on UDAs and hyper-aggressive retirement schedules will hide the underlying problems again. UDAs are the hitherto acceptable expression of end user frustration and they should be considered a valuable indicator for technology, operations and management.
That said, the proliferation cannot continue unabated. Local operations management must better manage the circumstances that lead to the creation of UDAs; for example, they shouldn't agree to take on processes with technology gaps. Technology product owners must provide transparent issue management and product roadmaps and actively engage with their user communities. Look at the evangelism programs that technology vendors commit to for their products: that's practically unheard of inside a corporate, but it should exist. Technology strategy and business strategy must be hand-in-glove, and better communication must explain why users may need to suffer short-term pain in order for the organisation to gain in the long term. Quick and dirty is dirty after all.
As always, I'd be very interested in your comments.
Have fun!
[^1]: End User Computing (EUC), while sometimes mentioned in the same context, is something different. I define this as the practice of end users consuming technology rather than creating it. In my mind there is also an implication that the use of that technology is broadly in line with its expected use.
[^2]: I'm specifically discussing the creation of technology in this post as opposed to the use of an existing technology, for example using a browser plugin that is non-standard for your organisation. I include the latter practice under the broader umbrella of Non-standard Technology Use (NTU), to which many of the same arguments in this post apply, but it's not under discussion here.
[^3]: I'm mostly discussing spreadsheet applications as that seems to be the predominant technology for end user development in the corporate space. However, batch files, shell files, script files (given an execution context), MS Access databases, and MS Office and OpenOffice macros are all available to be 'developed' by the end user on most corporate desktops, and this discussion applies equally to them.