On a warm summer’s evening at Sea Containers London, overlooking the Thames, Hitachi and Google hosted a successful dinner roundtable with delegates from top global financial services institutions.

I chaired the event, with colleagues from various teams at Hitachi and our partners bringing to bear their knowledge and expertise.

It was a foregone conclusion that computation-heavy seasonal workloads, such as risk and pricing, are ideal candidates for the Cloud. A few key themes emerged throughout the evening:

The first was the regulatory conundrum. At this point, there is a fair amount of challenge from the regulators and internal compliance to ensure that the Cloud (and its associated ecosystem) is actually ready and secure enough to host or process confidential data. On the other hand, the volume of metrics demanded by the regulators is making it increasingly difficult to justify expanding on-premise and data centre spend. One of the delegates, responsible for the FRTB changes to CVA at a major global bank, quite rightly said, “I’m surely not going to buy six times my current compute footprint in my data centre to be ready for a computation workload that only happens for a 12-hour window once a month.”

Speaking of confidential data, the second theme focused on how data anonymisation and synthesis could be used to obfuscate confidential data for Cloud computation without compromising the statistical significance of the data. Gerry Rankins, from data privacy firm Anonos, has been working on this with Hitachi for the last couple of years. Gerry spoke of scenarios where banks can reach out to external clouds in a fully secure and, if necessary, GDPR-compliant way to process high-complexity, high-volume computations.

The third theme was cost vs value. “If a Cloud initiative is being led purely out of cost concerns, then there is surely something amiss. If my data centre is running at close to 100% capacity, then a move to the Cloud is up to three times more expensive”, noted the Head of Grid Computation at another Tier 1 investment bank. This should instead be seen as a value statement. The potentially infinite scalability of the Cloud brings with it opportunities hitherto unheard of. Computations can be distributed and brought down from days to minutes – and this speed and agility brings with it the power to react much faster to the market and regulators.

My colleagues Marwan Tarek, Director, Consulting Services, and Andrew Harding, Director, Consulting Services, Insights and Analytics, added a touch of reality to the conversation, drawing on the real-life scenarios they faced in the early years of the Cloud and on how organisations (within and beyond banking) have overcome the teething challenges that any new paradigm brings with it.

To recap, the key topics covered were:

  • Using the cloud for a hyperscale risk analytics platform to manage cost and performance
  • Data governance and data security in the cloud
  • A vision of how banks can adopt a risk analytics platform as a service