Organisations working in the superannuation industry stand to benefit significantly from getting data management systems and processes right. But doing so is not an easy task. While data is front of mind for most superannuation industry participants, we often see organisations that have a wealth of data but don’t know what to do with it.
I believe there are currently three main megatrends driving the superannuation sector with regard to data – data driven operating models, sector wide interoperability, and the data challenges of consolidation and mergers. These three ‘tectonic forces’ are making the industry very keen to work with us, as these are our key areas of expertise.
Given the changes in the regulatory environment for the superannuation industry in the last decade, I can hardly blame them for focusing on their extensive responsibilities in these areas. But getting data right can lead to a wealth of efficiencies which can make everything – including the regulatory burden – easier for the sector. The Productivity Commission’s “5 Year Productivity Inquiry: The Key to Prosperity Interim Report”, released last week, observes that Australia’s productivity has slowed over the past 20 years, largely driven by services industries. According to the report, “new approaches, such as digital technologies and the better use of data (through artificial intelligence, for example) hold great promise for broad based productivity gains, including in services”.
Fintan Thornton is Head of Institutional Solutions at Allianz Retire+ and has been working in the superannuation industry for decades.
“I think organisations historically have been focused on building out their core administration platforms,” he says.
“And I think there has been a gap in thinking beyond that to categorising and organising their data so that they could extract insights from it and … then merge that with third party data to build data driven operating models, especially for member-based organisations.”
Other sectors of financial services, such as banking, have already pushed forward with data driven operating models, which means super funds have some catching up to do.
1. Data driven operating models
Participants in the superannuation industry are starting to transition rapidly towards data driven operating models that offer them greater flexibility, increased automation and, ultimately, better member outcomes. Unfortunately, they are also rapidly realising that the data throughout their systems is often of very low quality.
As a result, they are examining their technical data infrastructure and realising they need to get that under control. This is leading to a significant number of large scale upgrades and initiatives to uplift internal data management platforms and the associated governance models and frameworks. Part of this process is employing the right people who understand data.
“We’re seeing a lot of appointments in both the retirement space but also in the data IT space of senior executives coming across from other financial services backgrounds,” Thornton explains.
“That’s a great start by the super industry to acknowledge that there’s a lot of work for them to do – supported by third parties, of course, who have got that expertise – in relation to data management and the implementation of data driven operating models.”
We are often engaged as one of those third parties to assist with this. For example, a large client recently appointed us to write their data strategy and is now working with us to implement it. A large component of that strategy was identifying the need for a data governance framework. That included measurement frameworks and a reorganisation of operating models so that people could collaborate better around data technical enhancements. It also looked at how their data could be stored more centrally and to a higher standard of quality.
2. Ecosystems and interoperability
There is no doubt that the superannuation industry is becoming an ecosystem of best of breed service providers, product providers and customer/member facing solutions. Financial services firms are realising that if they want to participate effectively in this increasingly open architecture and data ecosystem, they need to have their internal data management model in order first.
Adam Gee is Head of Strategy at technology company and super administrator GROW Inc. He also points out that super funds need to get their data requirements organised before they can even think about participating effectively in the ecosystem.
“The ecosystem model is obviously a benefit, but if your data is a week old, trying to use it effectively in the system doesn’t really help too much. The breadth of the data model, the ability to pass a range of information, rather than a limited amount, is also important,” he says.
Open architecture and interoperability are catch-all terms increasingly used in our sector when talking about data management but what do they really mean, and what do they mean for member-based organisations like super funds?
Essentially open architecture allows data to become interoperable between the entities (organisations and their applications) in a technical ecosystem. The ability for the various entities in the ecosystem to communicate with each other in this type of architecture is fundamental and requires standards and protocols to be established. This is a huge issue in our sector, where a peer to peer model has proliferated resulting in enormous complexity and cost to administer.
More and more organisations in the sector are seeking to exploit the opportunities that the industry wide ecosystem presents. Products will evolve and associated servicing models will become more efficient and flexible. For example, open architecture is critical to enable higher levels of automation of end to end business processes and requires the ability for all organisations, and their technical systems, involved in the process to be highly interoperable.
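To make the idea of interoperability standards concrete, here is a minimal sketch of how two parties in an ecosystem might exchange data against an agreed message schema rather than a point-to-point custom integration. The field names and schema here are hypothetical illustrations, not any actual industry standard:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical shared schema: both parties agree on these field names
# and types up front, independent of their internal data models.
SCHEMA_FIELDS = {"member_id": str, "balance": float, "as_at": str}

@dataclass
class MemberBalanceMessage:
    member_id: str
    balance: float
    as_at: str  # ISO 8601 date, as agreed in the standard

def serialise(msg: MemberBalanceMessage) -> str:
    """Producer side: emit the agreed wire format."""
    return json.dumps(asdict(msg))

def parse(raw: str) -> MemberBalanceMessage:
    """Consumer side: validate against the shared schema before use."""
    data = json.loads(raw)
    for field, ftype in SCHEMA_FIELDS.items():
        if field not in data or not isinstance(data[field], ftype):
            raise ValueError(f"message does not conform to schema: {field}")
    return MemberBalanceMessage(**data)

wire = serialise(MemberBalanceMessage("M-1001", 54230.55, "2022-08-01"))
msg = parse(wire)
```

The point of the sketch is that each participant only has to build one mapping – internal model to shared schema – rather than a bespoke integration per counterparty, which is what makes peer to peer models so costly at scale.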
Enabling these models – which allow the data within certain parts of an organisation, for example member services and administration, to integrate seamlessly with internal and external systems – is complex, and often beyond the capability of internal technology teams. It’s a specialist field, and our sector is only just waking up to the value and complexity of making the transition.
It is no longer really possible for a super fund to operate a black box administration model in which data is only integrated within the organisation. It’s simply not competitive to have that data locked away and then rely on manual mechanisms for it to integrate and interoperate with other participants in the sector.
Governments and regulators such as APRA are also taking a much more direct interest in the quality of data and the value that exists within organisations. They now want to interoperate in real time to see and govern the information and the underlying services it represents. Business to Government interoperability, such as the ATO’s platform, has been a significant driver and will continue to be a considerable challenge for the sector in coming years.
3. The challenges of merging data sets and replatforming
It is a unique time in the sector. I can’t think of a fund I engage with that is not involved in a merger, considering one, undertaking a technical replatforming project or, again, considering one. There is a massive amount of data migration happening across the sector.
Barely a month goes by without at least one super fund merger being announced. While at some point, mergers between funds – both large and small – will peter out, in the meantime those funds that merge have huge challenges ahead of them in terms of combining the data and systems they have in place.
GROW Inc’s Gee suggests that one thing super funds should consider before merging is a data quality review.
“We would love to see funds undertaking a data quality exercise prior to merger to clean up some of the data,” he says.
“The biggest challenge we have as an industry is we still don’t get fantastic data in the ‘front end’ of the ecosystem. So, through SuperStream, through which most members actually join funds, and through a range of other entry points, the data provided is still less than adequate, resulting in an inability to engage and appropriately service a member. As such, some form of data cleansing exercise or a data quality review prior to merger will add a lot of value.”
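A data quality review of the kind Gee describes typically starts with simple profiling: counting how many member records fail basic completeness and validity rules before any migration work begins. The sketch below illustrates the idea; the records and field names are invented for illustration and do not reflect any real fund’s data model:

```python
import re

# Illustrative member records; field names are assumptions for this sketch.
members = [
    {"member_id": "A1", "email": "jo@example.com", "dob": "1980-04-02", "tfn_held": True},
    {"member_id": "A2", "email": "",               "dob": "",           "tfn_held": False},
    {"member_id": "A3", "email": "not-an-email",   "dob": "1975-11-30", "tfn_held": True},
]

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
DATE_RE = re.compile(r"^\d{4}-\d{2}-\d{2}$")

def profile(records):
    """Count records failing each simple quality rule."""
    issues = {"missing_email": 0, "invalid_email": 0, "missing_or_bad_dob": 0, "no_tfn": 0}
    for r in records:
        if not r["email"]:
            issues["missing_email"] += 1
        elif not EMAIL_RE.match(r["email"]):
            issues["invalid_email"] += 1
        if not DATE_RE.match(r["dob"]):
            issues["missing_or_bad_dob"] += 1
        if not r["tfn_held"]:
            issues["no_tfn"] += 1
    return issues

report = profile(members)
```

A report like this gives the merging funds a shared, quantified view of the clean-up effort required, rather than discovering the gaps mid-migration.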
The process of actually moving data between systems is a complex problem, and data quality at the source is indeed a sector wide issue. The data in our sector is also uniquely complex, with highly configured and customised product definitions that have evolved over what are now fairly significant timeframes. Take, for example, the complexity that exists in the various defined benefit schemes. The evolution of insurance offerings in accumulation funds can also be very complex.
We have been working in this domain for some time now and have developed some interesting applications of advanced data techniques such as machine learning and AI to make the process more efficient and effective. There are significant risks and complexities at each of the data migration phases – Extraction, Transformation and Loading (ETL) – and working with a partner with the specialist skills, tools and experience is important.
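The Extraction, Transformation and Loading phases mentioned above can be sketched in miniature. This is a deliberately simplified illustration – the legacy field names and target model are assumptions made up for the example – but it shows where the risk concentrates: the transformation step, where legacy records that cannot be mapped must be quarantined rather than silently dropped:

```python
def extract(source_rows):
    """Extraction: pull raw rows from the legacy system."""
    return list(source_rows)

def transform(rows):
    """Transformation: map legacy fields to the target model,
    normalising formats and quarantining rows that cannot be mapped."""
    out, rejects = [], []
    for row in rows:
        try:
            out.append({
                "member_id": row["MBR_NO"].strip(),
                "balance_cents": int(round(float(row["BAL"]) * 100)),
            })
        except (KeyError, ValueError):
            rejects.append(row)  # held back for manual review
    return out, rejects

def load(rows, target):
    """Loading: write validated rows into the target platform."""
    target.extend(rows)

legacy = [{"MBR_NO": " 1001 ", "BAL": "542.30"}, {"MBR_NO": "1002", "BAL": "bad"}]
target_db = []
clean, rejects = transform(extract(legacy))
load(clean, target_db)
```

In a real migration each phase is far more involved – reconciliation counts, product-rule mapping, multiple rehearsal runs – but the quarantine-and-review pattern at the transform step is where much of the specialist effort sits.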
There are plenty of data challenges facing the superannuation sector in the current environment, but they are not insurmountable, and they are all areas where Novigi has specialised expertise. We’d love to help with any data issues your organisation may have so please reach out to one of our consultants.
Ash Priest is the Chief Executive Officer at Novigi
For more information about anything you’ve read here, or if you have a more general enquiry, please contact us.