According to TabbFORUM, investment managers spend millions of dollars annually in avoidable direct costs due to poor data management technology and operations architecture. Within many large organizations, the management of data is left solely to the business verticals, and in some cases to individual teams. This breeds data duplication, and the quality of the final data can be questionable at best. To add insult to injury, data is obfuscated through poor usage of tools like Excel. Business Managed Applications (BMAs), when built in Excel, are fraught with data lineage issues, and maintaining the sheets is, in the best of cases, tedious. Through these entrenched practices, companies are throwing millions of dollars away.
This blog originally appeared on TabbFORUM.
The big push over the past few years has been the move to create enterprise Data Lakes to provide a “single source of the truth,” along with control, efficiency and responsiveness. This is often done in line with major efforts, particularly in:
- Risk, where the aim was to create a consistent basis for drawing self-reconciling views across market, credit and finance; and in
- Legacy consolidation & re-engineering efforts, where the aim was to remove the duplication of data and its maintenance as part of a mop-up-and-displace process.