Often perceived as a key ingredient of a firm’s IT strategy, simplification is essential to achieving business goals. Scalability, robust and flexible data platforms and agility are all impeded by complex IT infrastructures. Introducing new technologies offers the opportunity to deliver a step change in capability, efficiency and robustness – enabling improved products, data analytics, client reporting and business intelligence.
Yet all too frequently, projects focus on adding new capabilities while missing the crucial benefits of decommissioning, simplification and rationalisation.
Simplification is not some passive inevitability: it needs to be actively managed. Implementing a new system or data tool does not necessarily mean legacy technologies are retired. Without a clear strategy to reduce complexity, firms end up with a shiny veneer sitting on top of a clunky, risky and inefficient core, all of which adds cost, creates risk and reduces agility.
This challenge is further exacerbated by the fact that so many core capabilities in asset management are now delivered by service providers. Particular care needs to be taken because the risks and issues related to legacy systems can sit one, two or possibly more steps removed from the latest initiative.
So what does good look like?
First, there needs to be a clear view of what constitutes IT simplicity. Is it better to have a few systems used by multiple functions or teams? Should the goal be to minimise the overall IT footprint and rely on services and cloud-based solutions? Does standardisation of the underlying technology stacks across applications help? Should touchpoints between systems and teams be minimised?
A good simplification strategy will address many of these goals in parallel. In particular it will identify:
- strategic technologies – firms need a clear view of which technologies to focus on and invest in, and which to decommission or replace. It is unlikely that firms will want to undertake a large-scale technical refresh purely for the sake of simplicity, so such goals often need to be delivered as a by-product of other initiatives.
- key-person risk – key-person dependencies need to be tracked and managed, whether they are specialists with the critical technical skills essential to keeping legacy systems running or people with knowledge of intricate and complex architectures.
- service provider constraints – getting your own shop in order is a good start, but could you also be held back because your service providers use out-of-date technologies and cannot adapt their offerings flexibly and rapidly? Ensuring your oversight and governance structures track and manage service providers is equally important.
As Drucker is often quoted, “What gets measured gets managed”. An actual complexity or simplification measure will enable the impact of change to be assessed, and business cases should incorporate clear simplification benefits. (Although we mustn’t forget that the full Drucker saying is “What gets measured gets managed — even when it’s pointless to measure and manage it, and even if it harms the purpose of the organization to do so.”)
Given the pain many firms experience because of IT complexity, it makes sense to incorporate a principle of simplification as a key tenet of your IT strategy. You may wish to target and track levels of complexity over time; if so, it is important to construct a hybrid measure that aligns with your definition of what good looks like. The key is an ongoing focus on keeping things simple!
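To make the idea of a hybrid measure concrete, here is one minimal sketch of what such a score could look like. All inputs, weights and thresholds are hypothetical illustrations, not a recommended methodology; any real measure would need to be calibrated to your own definition of what good looks like.

```python
# Illustrative hybrid IT complexity score (0-100).
# All weights and normalisation caps are hypothetical assumptions.

def complexity_score(systems: int, touchpoints: int,
                     legacy_share: float, key_person_deps: int) -> float:
    """Blend common complexity drivers into a single trackable number.

    systems:          count of distinct applications in the estate
    touchpoints:      integrations between systems and teams
    legacy_share:     fraction (0-1) of systems on non-strategic technology
    key_person_deps:  systems reliant on a single specialist
    """
    # Each driver is capped at 1.0 and weighted by its assumed drag on agility.
    score = (0.30 * min(systems / 50, 1.0)
             + 0.30 * min(touchpoints / 200, 1.0)
             + 0.25 * legacy_share
             + 0.15 * min(key_person_deps / 10, 1.0))
    return round(100 * score, 1)

# Track the same measure before and after an initiative to show its
# simplification benefit in the business case.
before = complexity_score(systems=60, touchpoints=240,
                          legacy_share=0.40, key_person_deps=8)
after = complexity_score(systems=45, touchpoints=150,
                         legacy_share=0.25, key_person_deps=4)
```

The design point is not the specific weights but that the measure is composite, repeatable and comparable over time, so the delta between `before` and `after` can be quoted directly in a business case.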
Read more from Phil Hannay
- Citizen Developed IT – harnessing the opportunities offered by new technologies
- Data maturity – it’s not all about the numbers