
Why financial services need to rethink authentication for a digital-first world

The last 10 years have seen significant changes in the debate over which channel matters most for business – brick-and-mortar or digital. But in the shadow of the pandemic, and the accelerated shift towards digital it spurred across the world, that debate has been well and truly put to rest. We’re now in the digital-first era, and financial organisations such as banks have needed to transform their business models significantly.

by Amir Nooriala, Chief Commercial Officer, Callsign

Banks have traditionally relied on branches to drive loyalty and deliver experiences to customers. But with digital channels such as websites and mobile apps now becoming the most important avenues for business, old methods of driving experience and loyalty have been replaced by digital factors, such as app ratings.

So, it’s no surprise that in a digital-first world, businesses must leverage digital-first solutions. But if satisfaction levels are judged by the quality of a digital experience, then businesses not only need to ensure smooth and seamless access to services but also make sure security is a top priority.

However, analogue solutions that many organisations in finance still use for customer authentication – such as one-time passwords (OTPs) – fall short of achieving these goals.

The pressure is on for businesses to find new, truly digital-first solutions that make their customers’ lives easier and differentiate their experience from competitors. But this is a balancing act that needs to also ensure the safety and security of every online interaction because if ease comes at a cost to security, businesses risk losing customers altogether.

Vulnerable systems built on outdated foundations

When it comes to financial institutions, in particular, the ability to adapt to the times has always been part of their success. Some banks are centuries old, and to remain relevant they’ve had to adapt their business models, services, and cultures countless times, all while maintaining the integrity of the sensitive information they hold.

However, over time, that imperative to maintain the safety of their data has led to the accrual of legacy systems for most organisations. While this has been a long-standing problem, in a digital-first world, legacy systems present a particularly glaring vulnerability when it comes to authentication.

Throughout the pandemic, hackers and ransomware attackers took advantage of global uncertainty and thousands of people faced a barrage of text message-based scams over this period. This highlighted the already significant problem with commonly used authentication methods that so many businesses rely on.

In fact, almost a quarter of people questioned for a recent Callsign survey said they received more texts from scammers than their own friends and family.

Organisations’ unwillingness or inability to stop utilising outdated authentication processes such as OTPs is fuelling a worsening crisis in scams and fraud. Businesses developing their digital transformation strategy need to see it as a chance to approach everything they do with a digital-first mindset, and not simply try to recreate digital versions of existing solutions.

And to make this a reality, organisations must rethink their systems in line with how their customers actually behave online and build their solutions accordingly.

The verified path to digital-first innovation

There are a number of technologies that are ideal for financial organisations looking to elevate their authentication methods to digital-first levels that seamlessly and frictionlessly integrate into their customer journeys.

One such solution is passive behavioural biometrics, software capable of taking into consideration millions of data points when verifying the identity of a user. The key difference between behavioural and physical biometrics – and why behavioural biometrics is the superior authentication method – is that it isn’t reliant on a single device (it’s device agnostic).

So, when combined with device and threat intelligence, the solution can circumvent the single point of failure issue that so many other authentication methods fall foul of.
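As a rough illustration of how such signal fusion might work, the sketch below combines hypothetical behavioural, device and threat scores into a single authentication decision. The signal names, weights and thresholds are invented for the example and are not Callsign's actual model:

```python
# Hypothetical sketch: fusing behavioural, device and threat signals into one
# risk score, so no single signal (or device) is a point of failure.

WEIGHTS = {"typing_rhythm": 0.5, "device_reputation": 0.3, "threat_intel": 0.2}

def risk_score(signals: dict) -> float:
    """Weighted average of per-signal risk values, each in [0, 1]."""
    total = sum(WEIGHTS[name] for name in signals)
    return sum(signals[name] * WEIGHTS[name] for name in signals) / total

def decide(signals: dict, step_up_threshold: float = 0.4,
           deny_threshold: float = 0.8) -> str:
    """Allow silently, ask for an extra factor, or deny outright."""
    score = risk_score(signals)
    if score >= deny_threshold:
        return "deny"
    if score >= step_up_threshold:
        return "step_up"   # escalate to an additional check
    return "allow"
```

Because the decision rests on a blend of signals, a low-risk session passes with no friction at all, which is the "passive" part of passive behavioural biometrics.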

This makes it ideal for our modern, digital-first world where customers want to use a variety of devices and channels to access online services. By making your authentication method device-agnostic, user experience can be ensured in a secure and non-disruptive way, whatever device is being used.

And as this solution can be seamlessly integrated into any point of the customer’s journey, it’s a much more fitting solution for a sector that has always been on the cusp of innovation.

Re-building for a new era of authentication

In order to thrive, organisations in the financial sector need to be prepared to interrogate every aspect of their operations to make sure they are delivering the most convenient and secure service to customers online.

To achieve this, organisations need to be prepared to re-lay their technological foundations. Elevating the importance of digital identity authentication is one such vital change, and businesses need to realise it will only continue to grow as a priority for customers going into the future.

Because the threat from bad actors and cyberattacks is only worsening. So, while innovation is key to attracting customers, security must be at the core if businesses are to retain loyalty. And it will take collaboration both internally and with partners to ensure that customers have the security they deserve.


Finance firms can strive for greater efficiency with easy access to trusted data

Financial services firms are seeing rapid growth in data volumes and diversity. Various trends are contributing to this growth of available data across the sector. One of the drivers is that firms need to disclose more to comply with the continuing push towards regulatory transparency.

by Neil Sandle, Head of Product Management, Alveo


In addition, a lot more data is being generated and collected through digitalisation, as a by-product of business activities, often referred to as ‘digital exhaust,’ and through the use of innovative new techniques such as natural language processing (NLP), to gauge market sentiment. These new data sets are used for a range of reasons by finance firms, from regulatory compliance to enhanced insight into potential investments.

The availability of this data and the potential it provides, along with increasingly data-intensive jobs and reporting requirements, means financial firms need to improve their market data access and analytics capabilities.

However, making good use of this data is complex. To avoid being inundated, firms need to develop a shopping list of the companies or financial products they want data on, then decide what information to collect. Once the data is sourced, they need to expose which data sets are available and show business users what the sources are, when the data was requested, what came back, and what quality checks were applied.

Basically, firms need to be transparent about what is available within the company and what its provenance is. They also need to capture all the contextual information: was the data disclosed directly, is it expert opinion or just sentiment from the internet, and who has permission to use it?

With all this information available, it becomes much easier for financial firms to decide what data they wish to use.
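A minimal sketch of such a provenance record, with illustrative field names chosen for the example rather than drawn from any particular product, might look like this:

```python
# Sketch of the contextual metadata a firm might record per data set, so
# business users can see its source, request time, quality checks and
# permissions at a glance. All field names are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DatasetRecord:
    name: str
    source: str                  # e.g. vendor feed, direct disclosure, web sentiment
    disclosed_directly: bool
    requested_at: datetime
    quality_checks: list = field(default_factory=list)
    permitted_users: set = field(default_factory=set)

    def can_use(self, user: str) -> bool:
        """Permission check: only named user groups may consume the data."""
        return user in self.permitted_users

record = DatasetRecord(
    name="esg_scores_q3",
    source="vendor_feed",
    disclosed_directly=True,
    requested_at=datetime(2021, 9, 30, tzinfo=timezone.utc),
    quality_checks=["completeness", "staleness"],
    permitted_users={"research", "risk"},
)
```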

There are certain key processes data needs to go through before it can be fully trusted. If the data is for operational purposes, firms need a data set that is high-quality and delivered reliably from a provider they can trust. As they are going to put it into an automated, day-to-day recurring process, they need predictability around the availability and quality of the data.

However, if the data is for market exploration or research, the user might only want to use each data set once but is nevertheless likely to be more adventurous in finding new data sets that give them an edge in the market. The quality of the data and the ability to trust it implicitly are still critically important.

Inadequate existing approaches

There is a range of drawbacks with existing approaches to market data management and analytics. IT is typically used to automate processes quickly, but the downside is that financial and market analytics are often hardwired to specific datasets and data formats.

With existing approaches, it is often difficult to bring in new data sets because new data comes in various formats. Typically, onboarding and operationalising new data is very costly. If users want to either bring in a new source or connect a new application or financial model, it is not only very expensive but also error-prone.

In addition, it is often hard for firms to ascertain the quality of the data they are dealing with, or even to make an educated guess of how much to rely on it.

Market data collection, preparation and analytics are also historically different disciplines, separately managed and executed. Often when a data set comes in, somebody will work on it to verify, cross-reference and integrate it. That data then has to be copied and put in another database before another analyst can run a risk or investment model against it.

While it is hard to gather data in the first place, it is then quite cumbersome to put that data into shape and place it where an analyst can work on it. Consequently, the logistics don’t really lend themselves to fast turnaround or a quick process.

The benefits of big data tools

The latest big data management tools can help a great deal in this context. They tend to use cloud-native technology, so they are easily scalable up and down depending on the intensity or volume of the data. Using cloud-based platforms can also give firms a more elastic way of paying and of ensuring they only pay for the resources they use.

Also, the latest tools are able to facilitate the integration of data management and analytics, something which has proved to be difficult with legacy approaches. The use of underlying technologies like Cassandra and Spark makes it much easier to bring business logic or financial models to the data, streamlining the whole process and driving operational efficiencies.

Furthermore, in-memory data grids can be used to deliver fast response times to queries, together with integrated feeds to streamline onboarding and deliver easy distribution. These kinds of feeds can provide last-mile integration both to consuming systems and to users, enabling them to gain critical business intelligence that in turn supports faster and more informed decision-making.

Maximising Return on Investment

In summary, all firms working in the finance or financial services sector should be looking to maximise their data return on investment (RoI). They need to source the right data and ensure they are getting the most from it. The ‘know your data’ message is important here because finance firms need to know what they have, understand its lineage and track its distribution, which is in essence good data governance.

Equally important, finance firms should also ensure their stakeholders know what data is available and that they can easily access the data they require. Ultimately, the latest big data management tools will make it easier for finance to gain that all important competitive edge.


How FS organisations can protect themselves from cyber threats during the peak period

Policymakers and regulators around the world have pointed to cyber threats from criminal and state actors as an increasing threat to financial stability. Last month, US Treasury Secretary Janet Yellen – along with finance ministers and central bank chiefs from the Group of Seven nations – conducted an exercise covering how G7 members will seek to cooperate in the hypothetical event of a significant, cross-border incident affecting the financial sector.

by Fabien Rech, EMEA Vice President, McAfee

Such concerns are widespread, with 80% of UK IT professionals anticipating a moderate or even substantial impact from increased demand for their services or products this holiday season. The extra demand is compounded by the reduced size of teams and greater online activity. With cyber threats to the financial industry front of mind, and organisations across the sector coming under scrutiny as to whether they are doing enough to protect themselves, this year’s peak season – and subsequent rise in online activity – is cause for concern.

While this paints a bleak picture, organisations can be proactive in defending their networks, data, customers, and employees, against the anticipated increase in holiday cybercrime by implementing certain security measures.

Using technology to bolster teams

Demand for cybersecurity is surging, and today there are a number of technologies that can help to bolster security measures, providing additional support for often stretched security teams. Threat intelligence can offer unique visibility into online dangers such as botnets, worms, DNS attacks, and even advanced persistent threats, protecting FS organisations against cyberthreats across all vectors, including file, web, message, and network.

In addition, taking a Zero Trust approach to security enforces granular, adaptive, and context-aware policies for providing secure and seamless Zero Trust access to private applications hosted across clouds and corporate data centres, from any remote location and device. This will be particularly useful as more employees choose to work remotely.
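To illustrate the idea, the sketch below evaluates each request against several pieces of context at once, granting access only when every check passes. The attribute names and the example policy (approved regions, managed devices) are assumptions made for the illustration, not any vendor's actual rule set:

```python
# Sketch of a context-aware ("Zero Trust") access decision: every request is
# judged on user, device and location context rather than network membership.

def authorise(request: dict) -> bool:
    """Grant access only when every contextual check passes."""
    checks = [
        request.get("mfa_passed") is True,                     # user identity proven
        request.get("device_managed") is True,                 # known, managed device
        request.get("geo") in {"UK", "EU"},                    # example region policy
        request.get("app") in request.get("entitlements", set()),  # per-app grant
    ]
    return all(checks)
```

Because the decision is made per request, the same policy applies whether the employee is in the office or working remotely.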

Prioritising employee awareness

Beyond technologies, the adoption of an awareness-first approach is vital. Proactive cybersecurity awareness training for all employees – not just those in the security team – is essential, especially when encountering holiday phishing emails. As the cyber threat is always evolving, so too must organisations – ensuring that their team’s knowledge and ability to identify, avoid and negate those threats also grow in turn.

This awareness-first strategy requires leaders to move away from a ‘breach of the month’ approach, instead using proactive training measures to build security into the fabric of their organisation and breaking down silos of threat and information intelligence across the business, so that all employees are aware of how they can contribute to the battle against cyberthreats during the peak period and beyond.

Some banks are already taking a proactive approach to testing employee understanding when it comes to cybersecurity, for example, resistance to spam or phishing emails, and knowing not to plug unknown USB keys into their laptop. If employees don’t appear to have sufficient knowledge of threats and best practices, they will automatically be required to take part in further training.

Other key steps to take in this proactive approach include increasing the frequency (and testing) of software updates, boosting the number of internal IT-related communications to keep everybody informed, and implementing new software solutions with due diligence.

Implementing a response plan

It’s also important to recognise that protective measures might not work 100% of the time. As hackers become ever more sophisticated, it’s vital for FS organisations to design a holistic, clearly communicable plan for if (and when) things do go wrong.

Developing a robust incident response plan could mean the difference between responding to and remedying a security breach in minutes rather than hours, ensuring the least amount of downtime possible. When asked, 43% of businesses reported suffering downtime due to a cyber concern in the last 18 months; for 80% this happened during peak season, and for almost a quarter (23%) it lasted more than 12 hours.

Again here, training forms a big part – making sure employees know what to do and who to inform when an incident does occur is at the heart of any effective response plan, as is encouraging a culture of honesty and transparency. An organisation in which employees are wary of acknowledging a mistake or informing someone of a possible accidental breach is not a secure one.

The year is full of challenging peak periods, from the public holidays at the end of the year to summer vacations and various religious/spiritual holidays. The need for vigilance has never been greater or more constant, and financial services organisations, in particular, have a need to protect the data and money of their customers, as well as the resilience of their own organisations.

By using technology, training, and incident response awareness, leaders in the sector can help to bolster teams against the increasing sophistication of cyberthreats, staying safe while staying connected. The peak season offers unique challenges, but ultimately the goal is to develop a resilient and adaptable organisation that can ensure security year-round, allowing employees to thrive, wherever they choose to work without having to worry about threats.


An ideal match: Why payments platforms are buying into machine learning

Buy Now Pay Later (BNPL) has seen a surge in growth since the start of the pandemic, but in order for the BNPL industry to sustain its development, it must be underpinned by comprehensive technology designed to optimise experience both for the merchant and the customer.

by Tom Myles, Chief Technology Officer, Deko

The most influential technology for business has been the rise of AI, machine learning and big data, having now permeated almost all sectors. Indeed, recent research highlights the key role that AI is expected to play in the future of fintech as a whole, with two-thirds of fintech firms predicting it to have more impact on the sector than any other tech in the next five years. As for BNPL products in particular, a forecast market of £26.4 billion by 2024 would mean it more than doubling in three years, so there is huge growth potential for AI to help unlock.

Payment optimisation


Lenders can use AI and machine learning to extract more value from BNPL platforms. Machine learning models, for instance, are built on data, and payments systems generate large quantities of data from which potential lenders can gain insights into consumer behaviour.

These data-driven decisions can streamline the process of matching the right lender to the right buyer at the right time, which will be all the more useful for unlocking the potential of platforms that operate on a multi-lender, multi-product basis. AI and machine learning models can match consumers to the right member of a multi-lender roster, enabling an increased provision of credit for consumers.
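As a toy illustration of multi-lender matching, the sketch below scores an application against an invented two-lender roster and returns the eligible lender with the most credit headroom. The lender names and eligibility rules are assumptions made for the example, not any real lender's criteria:

```python
# Illustrative sketch: matching a consumer to the best lender on a
# multi-lender roster. Each lender has invented eligibility rules.

LENDERS = {
    "lender_a": {"min_credit": 600, "max_amount": 5000},
    "lender_b": {"min_credit": 520, "max_amount": 2000},
}

def match_lender(credit_score: int, amount: float):
    """Return the eligible lender with the most credit headroom, or None."""
    eligible = [
        (credit_score - rules["min_credit"], name)
        for name, rules in LENDERS.items()
        if credit_score >= rules["min_credit"] and amount <= rules["max_amount"]
    ]
    return max(eligible)[1] if eligible else None
```

In practice the "score" would come from a trained model over far richer behavioural data, but the shape of the decision – rank eligible lenders, pick the best fit – is the same.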

Delivering a rewarding experience for consumers is just as important as the flexibility of the financial technology involved, as increasingly tech-savvy consumers continually elevate the minimum standard of the online experiences in which they are prepared to engage. In particular, consumers are increasingly reluctant to engage in elongated sign-up processes and demand a mobile-first approach. Offering a visibly streamlined, omnichannel payments process will therefore prove to be a vital differentiator in an ever more competitive market.

Moreover, empowering consumers and lenders to make more streamlined transactions will also be beneficial for merchants. AI-enabled technology will enhance the flexibility of retail finance products, enabling platforms to process more transactions at a higher pace. This, combined with reduced rates of basket abandonment, will help retailers increase their sales volume.

Fraud prevention

While merchants may relish the prospect of driving more sales, machine learning can also support them in another critical area that may be somewhat sobering to consider. As commerce continues to migrate online, the threat of fraudulent transactions looms larger than ever.

Leveraging AI and machine learning means that payment platforms can learn to recognise patterns in consumer behaviour, based on analysis of the data generated from previous transactions, so even the smallest changes in behaviour can be identified. These data-driven insights will enable automated flagging of potentially risky transactions, which will reliably protect merchants, lenders and consumers from fraud.

Automating this process is vital in a world where online fraud is increasingly sophisticated. An older, rules-based model might be able to test for numerous different types of fraud that have been recorded previously, but the system would remain vulnerable to as yet undetected types of fraud.
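The contrast can be illustrated with a simple statistical baseline: rather than encoding each known fraud pattern as a rule, the sketch below flags any transaction that deviates sharply from the customer's own history. The feature (transaction amount) and the z-score threshold are illustrative assumptions; a production system would learn over many behavioural dimensions at once:

```python
# Sketch of a learned baseline vs. a rules-based check: the statistical test
# can flag behaviour that no explicit rule anticipated.
from statistics import mean, stdev

def is_anomalous(history: list, value: float, z_threshold: float = 3.0) -> bool:
    """Flag a transaction amount far outside the customer's own baseline."""
    if len(history) < 2:
        return False                      # not enough data to learn a baseline
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return value != mu                # perfectly regular history: any change is odd
    return abs(value - mu) / sigma > z_threshold
```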

Future of FinTech

Using technology both to streamline payments and to prevent fraud will require FinTech platforms to strike a delicate balance. Security is of course vital, but rigorous fraud prevention processes should not come at the expense of a streamlined, speedy checkout from the user’s point of view. Indeed, for fintech platforms as well as for users, improved security and streamlined payment processes will ideally go hand-in-hand.

FinTech firms must therefore invest due consideration as well as resources into AI-enabled functionality for their platforms. And this investment has already proven well worth making. Given the benefits for both merchants and lenders, it is perhaps unsurprising that 83% of financial services professionals agreed that “AI is important to my future company’s success”. Indeed, these forward-thinking technologies and firms are integral to the development of the payments sector.


Banking in the public cloud

IBS Intelligence is partnering with Sopra Banking Software to promote the Sopra Banking Summit, which takes place 18-22 October 2021. The summit is tackling the biggest issues in the financial sector, including public cloud. This weeklong festival of FinTech will touch on the hottest topics in financial services and highlight the new paths industry leaders are taking.

The following article was originally published here.

Cloud computing has long been an attractive option for banks looking to optimise costs, improve flexibility and facilitate digital expansion. Historically, cloud adoption has meant private or limited deployments due to concerns ranging from security to compliance. But in a post-Covid world, the drumbeat of digitisation has gotten louder, and more banking leaders are moving past their reservations. And recently, the use of public cloud platforms is gaining traction. This is due to ongoing margin compression, the need to reduce costs, and the imperative for banks to innovate faster—all things the public cloud can help solve.

by Martin Lee, Head of Managed Services & Cloud, Sopra Banking Software

As more financial institutions move to leverage cloud providers’ huge investment in their tech stacks, it’s worth pausing to survey the landscape and understand the most important considerations for banks moving forward.

The state of cloud banking 

The phrase ‘cloud computing’ spans a range of classifications, types and architecture models. In simple terms, a private cloud means a dedicated cloud computing network. In contrast, a public cloud is cloud computing delivered via the internet and sharing underlying infrastructure across organisations. A hybrid cloud is an environment that utilises both physical and cloud hosting.

In the last decade, the use of public cloud computing via services like Microsoft Azure or Amazon Web Services has turned into a $240 billion industry. In banking, the public cloud is already commonplace for non-critical tools, and a typical bank’s computing environment already includes on-premise systems, off-premise systems and multiple clouds.

Indeed, 19 of the top 20 US banks have already announced public cloud initiatives. In late 2020, IBM rolled out a financial services-specific public cloud featuring 10 of the world’s largest banks as customers. While there is a lot of activity, maturity levels vary. For instance, according to a recent report, 80% of UK banks have migrated less than 10% of their business to the public cloud as of 2020.

But that’s rapidly changing.  Banks know they can no longer ignore the benefits if they want to stave off competition and remain profitable. Data from McKinsey underscores this point. According to one of its surveys, more than 60% of banks plan to move the bulk of their operations to the public cloud in the next 5 years.


Benefits of running a bank in the public cloud  

The trend of public cloud adoption is, to some degree, traditional banks following the model that FinTech proved, i.e., using the cloud to be flexible, agile, and responsive. Through our experience, we have seen several specific use cases where banks in the UK have benefited from the use of cloud-based services. These include:

  • Avoiding high CAPEX costs for new and replacement hardware with a more efficient OPEX model, removing the need for future one-off hardware investment
  • The ability to access and leverage a wide range of digital products, offerings, and integrations, which are only possible with cloud-based technologies
  • Reducing the risk and impact of Covid-19 and other potential business continuity issues by avoiding the need for staff to physically access specific locations to deliver services
  • Increasing and ensuring operational resilience in a cost-effective manner to meet the latest regulations
  • Improving efficiency via automation and infrastructure-as-code reduces manual effort and improves response times—plus, it reduces complexity and technical debt related to legacy systems

Considerations and keys to success

While there is a general trend toward public cloud adoption across banks in the UK and elsewhere, there are a few important areas to consider to ensure the greatest odds of success.

Be clear on the specific benefits that cloud is bringing to your organisation 

The public cloud brings many potential benefits, but not all of these will apply to every circumstance. And a move to the cloud is generally most effective as part of an overarching business strategy, rather than a strategy unto itself. It’s therefore critical to define which business benefits a cloud migration is expected to deliver.

Don’t assume that the cloud will be secure by default  

Most cloud-related data breaches are due to misconfiguration, not a flaw in the infrastructure itself. This means that it’s essential to either develop the expertise in-house or work with a managed-services provider to ensure that issues around configuration and methodology are avoided.

Understand where your data is 

A well-established benefit of the public cloud is that it enables users to ‘go global in minutes.’ However, for European banks, there are often specific requirements around data residency and transit.

A proven and audited system design, combined with the right technical controls, is key to making sure that data is administered correctly and housed only in approved locations.

Maintain the right level of capacity 

While virtually unlimited amounts of capacity are available in the cloud, utilising it means there is a chance of paying for unnecessary resources. To make sure cloud environments are being used efficiently, it’s important to collect, monitor, and react to the data and metrics available.
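A toy example of "collect, monitor, and react": the sketch below adjusts an instance count to recent CPU utilisation, so capacity is neither left idle nor allowed to saturate. The thresholds and the single-metric policy are invented for illustration; real autoscaling policies weigh several metrics and cooldown periods:

```python
# Illustrative sketch of reacting to collected metrics: scale out when average
# utilisation runs hot, scale in when resources sit idle.

def desired_instances(current: int, cpu_samples: list,
                      low: float = 0.3, high: float = 0.7) -> int:
    """Return the instance count suggested by recent CPU utilisation samples."""
    avg = sum(cpu_samples) / len(cpu_samples)
    if avg > high:
        return current + 1            # saturating: add capacity
    if avg < low and current > 1:
        return current - 1            # idle: stop paying for unused capacity
    return current                    # within band: no change
```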

Leverage cloud technologies to increase automation and agility 

A significant advantage of hosting applications in the public cloud is the ability to leverage its elasticity, integration and orchestration facilities. Yet, to make use of these, a suitable level of staff expertise, modern working practices and system design is required. A simple “lift and shift” of current hardware is unlikely to bring improvements if no other changes are made.

Improve operational resilience 

The major cloud providers—Amazon, Microsoft and Google—operate at scales that few can match. By adopting best practices like AWS’s Well-Architected Framework, it is possible to increase operational resilience via many layers of redundancy, recovery automation, and the use of multiple regions and availability zones. However, these areas can be maximised only if the right skills, experience and approaches are used.

Select the right partner 

While the public cloud presents exciting opportunities, little can be done without the right partner. Unless significant investment is made in recruiting staff with extensive experience in cloud migrations, transformations and optimisation, capturing the full potential is challenging. When selecting a public cloud partner, there are several vital aspects to consider. These include certification and standards, technologies, data security and governance, reliability, migration support, and service dependencies.

Moving forward 

Adopting cloud technology isn’t a catch-all solution for banks. However, increasingly, doing so has clear benefits when compared to traditional IT deployments. By choosing the right partner, banks can enjoy all the benefits of the public cloud while ensuring security, compliance and support are maintained. In today’s world, progressive banks are meeting customers where they are—and that’s increasingly using services hosted in the public cloud.


How to build a digital bank

Unsurprisingly, building a digital bank – either from scratch or from an existing infrastructure to become truly digital – is no mean feat, and there are plenty of obstacles along the way. (We’ll come onto those later).

by Max Johnson, Global Head of Business Solutions at Fidor Solutions, a Sopra Banking company


Meanwhile, the number of challenges that legacy banks face is stacking up: keeping pace with new and changing regulations; loss of market share caused by struggling to meet the needs of digital and non-digital native consumers alike; maintenance of outdated systems. These are all issues that banks have to contend with.

And while new industry entrants are also up against problems of their own, they’re often immune to some of the issues faced by their incumbent competitors.

To remedy this, many legacy banks look to their modern, more agile competitors for inspiration, and digital transformation is often at the heart of their strategic response. By digitising their existing processes – and doing so at speed – and offering customers truly digital, innovative products and services, they hope to beat the digital banks at their own game.

Likewise, non-legacy bank organisations sense an opportunity. The number of new digital banks being created has risen exponentially in recent years, quadrupling from 60 worldwide in 2018 to 256 in January of this year.

Nevertheless, whether it’s legacy banks looking to accelerate their digital transformation or new industry entrants interested in building something from scratch, the question remains: How can you build a digital bank?

What is a digital bank?

At a quick glance, a digital bank is simply an organisation that provides traditional banking services via a computer or mobile device. Indeed, the core products and services offered by digital banks don’t necessarily differ from those of their incumbent competitors. However, there are some key differences that set digital banks apart.

For a start, digital banks tend to target digital native customers who, oftentimes, feel neglected by legacy banks. Some important customer needs met by digital banks include:

  • Transparency: Digital banks rarely have hidden or excessive fees
  • Experience: Digital banks typically offer fast and easy-to-use services and support
  • Accessibility: Digital banks often allow their customers to access their services at any time, from anywhere

Targeted, customer-centric service is at the heart of what makes a digital bank. Rather than resting on their laurels, digital banks are known for continuously adapting their value proposition to better meet the needs of the market. Fidor Bank, for instance, has focused on customer engagement by giving customers a voice in how the bank is run, by "discussing the future interest rates, or naming the current account card that the bank will use."

Such an approach has reaped rewards, even during the pandemic. In the US, for instance, the number of customers served by digital banks rose by 40% from 2019 to 2020, per a recent Forrester report.

By offering user-friendly and relevant services, digital banks can set themselves apart from their incumbent competitors. It’s both the definition of what makes a digital bank and part of the key building blocks required in building one.

Challenges in building a digital bank

It’s easy to say that building a digital bank is the future, but there are of course challenges.

First and foremost, acquiring customers, deposits and active accounts is the biggest hurdle to overcome, especially getting money into the system to begin with. Offering prospective customers strong USPs and attractive products and services (such as those mentioned above) is a great place to start, but building such a portfolio can be time-consuming and costly.

There's also red tape to bear in mind. While it differs from region to region, banking is a heavily regulated sector, and there are plenty of administrative hoops to jump through. This can be a lengthy and expensive process, which is why many emerging digital banks partner with legacy banks. The US-based digital bank Chime, for instance, partnered with Bancorp, which provides its banking licence and deposit insurance.

Of course, legacy banks wishing to go digital are less concerned with banking licences and building a customer base from scratch, but they do have different challenges to overcome.

Many legacy banks have cultures and technologies that are difficult to change. Becoming truly digital means having agile technology capable of continuously adapting to an ever-changing market, as well as an open culture of change within the organisation. And just as creating the building blocks of a digital bank can be time-consuming and costly, so can implementing the change needed to go truly digital.

Building out a roadmap

Clearly, launching a digital bank is far from easy. There are plenty of boxes to tick and potential problems to navigate. Furthermore, launching a digital bank is by no means a guarantee of success. The market is becoming increasingly crowded, and plenty of digital banks have already failed, including Bo by Royal Bank of Scotland, Finn by JPMorgan Chase and Greenhouse by Wells Fargo.

It’s therefore vital to approach building a digital bank in the right way. We believe that approach starts with putting together a roadmap. On a macroscale, this involves outlining the objectives, mission and vision, and identifying the short and long-term values to be achieved. On a more detailed level, it’s about research and testing – identifying target segments, understanding customer needs and pain points, and using that data to create high-value USPs.

Building iteratively on this type of approach – including listening to and acting on customer feedback – gives organisations the best chance to succeed.

Of course, putting together and following a roadmap toward building a digital bank is not, in itself, straightforward. Organisations need to understand the intricacies involved, such as the customer journey, the technical architecture, and the associated costs and licences. That's why having the support of an experienced and trusted partner during this process is crucial.

As is increasingly the case in the banking world, partnerships here may be the key to survival. Banks and other organisations need to seek out tried-and-tested expertise to help them build out their digital roadmaps, as going it alone is unlikely to succeed.

IBS Intelligence partnered with Sopra Banking Software to promote the Sopra Banking Summit, which took place 18-22 October 2021. The summit tackled the biggest issues in the financial sector. This weeklong festival of FinTech touched on the hottest topics in financial services and highlighted the new paths industry leaders are taking.

This article was originally published here.


All that glitters is not gold: Is a golden source of data truly the way forward?

When seeking perfect data quality, firms often look for a golden source of truth. But what are the pitfalls of this approach? The fact that a golden source of data has historically been celebrated doesn't make it right. How can firms successfully lay the foundations for true data integrity?

by Neil Vernon, CTO, Gresham Technologies

A golden source of data – a single source of truth to supply a business with all the information that it needs to rely on – has historically been seen as the pinnacle of data quality. But while financial institutions have long strived towards a utopia of data perfection, this approach does present some drawbacks. What’s more, just because a golden source of data has long been sought after, it doesn’t necessarily make it the best option.

Today, in an era of waning tolerance for poor data integrity, more complex regulatory reporting requirements, and increasingly tight margins, we ask: how beneficial is a golden source of data in the current environment, and is there another path to take us to the top?

Enhancing reporting across multiple counterparties

The original focus of golden sources of data was a financial institution's internal data repositories. However, a firm's ability to report accurately, on time, and in full is often bound up with that of its counterparties – as is the case with the recently introduced Consolidated Audit Trail regulation.


This has led to questions over whether there should be golden sources developed across the industry which each firm can access as needed for regulatory reporting and other requirements. It is easy to see the appeal of this – one single source of truth, properly managed, would reduce error rates in transaction reporting dramatically, as well as decrease or eliminate time spent on counterparty communication. If you and your counterpart report from the same repository, how could you possibly report differently?

An innovation roadblock

However, creating such a golden source has a major drawback: it would significantly limit the abilities of financial institutions when it comes to innovation – another current ‘hot topic’ area. Coming with strict usage and management requirements, a true golden source would inhibit the kind of ‘fail fast’ experimentation with processes and products which the industry has been so at pains to encourage.

And of course, there's the unavoidable fact that everyone's truth is different. The questions that you are using your data to answer will determine the lens through which you should view it. For example, analysing data for product purposes will require you to consider an accurate product hierarchy.

Practical realities: Why accuracy holds the key

But if creating a true golden source is neither practical nor desirable, what should firms do instead?

As far as internal data goes, firms should certainly still strive towards a reliable data source – but they should also recognise that a general-purpose solution is not always possible.

Rather, appropriate control processes should be applied to ensure that there is no slippage in data quality and that the organisation fully understands its usage. Managing data lineage through systems that allow complete visibility of the data lifecycle makes the source of data easily traceable, enhancing understanding further and ensuring that any issues can be easily fixed.
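As a loose illustration of the lineage idea (not any particular vendor's system – the class, step names and figures here are invented), tracking can be as simple as carrying a record of every transformation alongside the value itself, so the source of any number is always traceable:

```python
from dataclasses import dataclass, field

@dataclass
class TracedValue:
    """A value that carries its own lineage: every transformation is recorded."""
    value: float
    lineage: list = field(default_factory=list)

    def apply(self, step_name, fn):
        # Record the step as the value is transformed, preserving full history.
        return TracedValue(fn(self.value), self.lineage + [step_name])

# Trace an illustrative figure from its source system through two adjustments.
raw = TracedValue(100.0, ["source:ledger_export"])
net = raw.apply("fx_conversion", lambda v: v * 1.25).apply("fee_deduction", lambda v: v - 5)

print(net.value)    # 120.0
print(net.lineage)  # ['source:ledger_export', 'fx_conversion', 'fee_deduction']
```

With the history attached, any suspect figure can be walked back to its origin, which is the visibility the control processes above are meant to provide.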

In addition, education is key: organisations should ensure that data consumers have sufficient understanding and knowledge that, when using data, the right ‘lens’ for the situation is applied.

These steps will also help to resolve many of the issues that banks experience when dealing with their counterparties, since each side will have improved the accuracy of its data. Resolving linkage issues with counterparties consumes valuable resources, particularly where escalation is required. But by giving themselves maximum visibility and control over their data, financial institutions can stop many of these issues before they start.

Financial institutions should not abandon their ambitions to create a strong data source, but a golden source is a naïve objective: it is too simplistic, and harms more than it helps. True data integrity does not come from such a one-size-fits-all approach. Full control and knowledge over the lifecycle of your data, together with the upgrading of legacy systems, are the only way financial institutions will be able to build the strong foundations of true data quality.


How US Credit Unions can ensure reliability in their digital banking systems

As the trend of digital transformation continues to impact the financial services industry, progressive Credit Unions in the US understand that to keep up, they must re-evaluate their digital platforms and convert to a new system that will offer their members a better banking experience.

by Michael Collins, Associate, Credit Union Practice, Qualitest Group

When we think about enhancing members' digital banking experience, we often think about adding new features and capabilities, and making the system easier to use. But what about simply making sure the new system can run without any slowdown in response times or, worse, crashing completely? After all, what could be a poorer experience for members than not having access to their money when they want it? This is where performance and stress testing is crucial for digital banking systems.

When undergoing a complex digital banking conversion, there needs to be rigorous testing of the system to ensure it is functioning properly and free of any bugs that will impact the user. Most Credit Unions understand how critical it is to test functionality and ensure all the data has been migrated over successfully to the new system, but they often overlook the performance and stress tests that ensure the system’s availability, responsiveness, and scalability. Performance and stress testing ensures that a new system can handle the expected usage of Credit Unions’ members and is well-equipped to scale as membership grows.


The need to ensure the availability and responsiveness of a digital banking system is more important now than ever before, as the pandemic has led more and more people to abandon in-branch transactions and to choose to do their banking digitally.

Why is performance testing so important?

Everyone can relate to the frustration of using an app or website that runs slowly or crashes altogether. This frustration is amplified when dealing with our money. People take access to their own money as a given, and are rightly very upset when they lose this access or cannot complete transactions in a reasonable timeframe. These situations can ultimately damage your Credit Union's reputation and lead members to lose faith in their banking system. To ensure that members never run into these issues, Credit Unions must run performance tests on their banking platforms.

Configuration is key

When slow response times or crashes occur, it is often due to the system not being able to handle the volume of usage it is experiencing. There are simply too many people trying to do the same thing at the same time. Performance tests are meant to simulate the usage a Credit Union can expect its system to encounter, and then measure the response times to ensure they are up to standard and being executed quickly. They allow for the creation of scripts that will run and mimic the workflows Credit Union members carry out when banking. A well thought out performance test will account for the factors below to accurately represent the actual usage by members:

  • Choosing the right workflows – A Credit Union will want the workflows being run in the performance tests to reflect the activities and transactions its members carry out most frequently – for example, paying a credit card bill, opening an account, or transferring money. A performance test can simulate many users performing these different types of transactions concurrently.
  • Establishing a baseline for average expected usage – It’s important to accurately determine the average number of members who are accessing the system at the same time. Performance tests allow for the simulation of this amount of activity to provide an accurate representation of the number of concurrent users trying to perform similar transactions at the same time.
  • Generating traffic from the right location – Another nice feature of performance tests is that they can simulate user activity from a specified location. If most of a Credit Union’s members reside in a particular geographical region, the performance tests will generate activity from the area specified.

To accurately capture all of the above factors in performance tests, a Credit Union will want to examine activity data from its old system and use it as a baseline for the activity a new system will face. If the Credit Union has plans for growth, that should also be accounted for, and the activity level simulated should be even higher than average.
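As a rough sketch of this baseline comparison – with a stubbed-out workflow standing in for real member transactions, and an illustrative 0.5-second baseline in place of figures measured from the old system – the shape of such a check might be:

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor
from statistics import quantiles

def simulated_transaction():
    """Stand-in for one member workflow (e.g. a bill-payment request)."""
    start = time.perf_counter()
    time.sleep(random.uniform(0.001, 0.005))  # placeholder for real request latency
    return time.perf_counter() - start

def run_load_test(concurrent_users=50, requests_per_user=4):
    """Fire the workflow from many threads at once; return the p95 response time."""
    total = concurrent_users * requests_per_user
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        timings = list(pool.map(lambda _: simulated_transaction(), range(total)))
    return quantiles(timings, n=20)[-1]  # 95th-percentile response time

# Illustrative baseline, drawn from the old system's measured activity.
BASELINE_P95 = 0.5  # seconds
p95 = run_load_test()
assert p95 <= BASELINE_P95, "response times degraded: investigate before go-live"
```

A real harness would replace `simulated_transaction` with actual requests against the new platform and use the historical activity data described above to set `concurrent_users` and the baseline.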

Once tests have been configured appropriately, it is time to run the tests and see how the new system stands up to the type of activity it can expect to face. The goal is to confirm that the new system has not suffered any degradation to end user response times and can handle the amount of activity expected from members, allowing them to carry out their banking transactions quickly. The response times that are measured can then be used as a baseline standard for the system moving forward. If the system has suffered any degradation to response time, this is an issue that should be fixed immediately.

Running performance tests will give a Credit Union the confidence that its system will be able to handle the expected usage from its members, but what happens when facing uniquely high levels of traffic? Will the system crash, leaving members scrambling for answers?

Stress test for success

Unlike a performance test, which aims to replicate the average amount of activity a system will face, a stress test looks to simulate and establish a baseline for the peak activity level the system can withstand before crashing. The method for configuring the test is the same, but this time with a much larger number of users being simulated. A good way to come up with the activity level is to look at the peak usage level the system has faced historically, run the test using that amount, and see if the new system is able to handle that load without any outages. If it is, then the usage level should be increased, until the system ultimately reaches the peak level it can withstand.
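The ramp-up method described above can be sketched as a simple loop. Here the load-generation step is faked so the example is self-contained; a real `run_step` would drive actual traffic against the system and report observed errors:

```python
def stress_test(run_step, start_users=100, step=100, max_error_rate=0.01):
    """Increase simulated load step by step; return the peak user level the
    system handled before the error rate crossed the acceptable threshold."""
    users, last_ok = start_users, 0
    while True:
        errors, total = run_step(users)
        if errors / total > max_error_rate:
            return last_ok  # the last load level the system withstood
        last_ok = users
        users += step

# Stub standing in for a real load-generation step: pretend the system
# starts failing once concurrent usage passes 400 users.
def fake_step(users):
    return (0, users) if users <= 400 else (users // 10, users)

print(stress_test(fake_step))  # 400
```

Starting the ramp at the historical peak usage level, as suggested above, turns the first iteration into a pass/fail check before the search for the true ceiling begins.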

Stress testing is not only a valuable way to mitigate risk and ensure the stability of a Credit Union’s digital banking system under unusually high activity levels. It can also give an institution confidence that its system will continue to perform as the business grows and the number of members using the system increases.

In conclusion, while it is important for Credit Unions to test their digital banking systems to make sure all of the features and functionality are working properly, it is equally important to confirm that the system is ready to withstand the traffic it will encounter from members. Running these performance and stress tests will provide confidence that members will be able to carry out their banking transactions quickly and without disruptions to service.


Retail banking, Covid, and the digital competition

IBS Intelligence is partnering with Sopra Banking Software to promote the Sopra Banking Summit, which takes place 18-22 October 2021. The summit is tackling the biggest issues in the financial sector. This weeklong festival of FinTech will touch on the hottest topics in financial services, including developments in retail banking, and highlight the new paths industry leaders are taking.

The following article was originally published here.

Retail banking was one of the sectors most affected by Covid. Nationwide lockdowns, sanitary measures and social distancing shook the day-to-day practice of banking to its core, with brick-and-mortar branches closing and digital demand skyrocketing.

by Bettina Vaccaro Carbone, Head of Research for SFP at Sopra Banking Software

And yet, despite this, there may be a silver lining for the incumbents in retail banking moving forward. The impacts caused by Covid-19 have forced them to accelerate their digital transformation strategies, while also damaging many of their challenger bank competitors, therefore levelling the playing field to some degree.

Legacy banks now have an opportunity to become the digital torch bearers for the financial services industry in the years, and perhaps even decades, to come. But it’s an opportunity they have to take now, because it won’t last long.

The decline of digital-only banks

The pandemic, to some degree, hit the reset button for the industry. Before 2020, incumbent banks were somewhat on the run from new, more agile competitors, and there are swathes of statistics highlighting the success gained and ground covered by industry entrants during the last decade – and the reasons incumbent banks were feeling the heat.

However, 2019 feels like a lifetime ago. Challenger banks and their ilk have taken a massive hit since the beginning of the pandemic. UK challenger bank Monzo, for example, laid off hundreds of employees and lost 40% of its valuation during the height of the pandemic last year; and others, such as Simple and Moven, called it quits altogether on their consumer activities.


Received wisdom suggests that customers are more risk averse during uncertain times, and that even though traditional banks are not soaring in terms of trust among end consumers, they have emerged as a preferred and stable choice.

Curious, given that challenger banks were supposed to be on the frontline of a digital revolution, and the impact of Covid demanded more digital banking services and bandwidth than ever before. But rather than flock to new digital-only banks during the pandemic, customers instead chose to stick with the traditional industry players.

Digitisation of banking services during the pandemic

Risk averse or not, there’s no doubt that customers want – and even need – digital banking services. This was true long before the pandemic, hence the rise of challenger banks last decade. A growing generation of digital-native consumers, a burgeoning digital ecosystem and the availability of new banking products and services all combined to ensure that the future of the financial services industry would be digital.

Traditional banks, by contrast, largely struggled to deal with this fast-growing trend. Burdened with legacy systems unfit for purpose, rafts of regulations (from which their challenger bank counterparts were largely exempt) and reluctant-to-change cultures, the industry's incumbents were falling behind fast. Indeed, 45% of banks and credit unions had not even launched a digital transformation strategy before 2019, per 2021 research by Cornerstone Advisors.

How things change. Catalysed by the pandemic, traditional banks – from big names like Bank of America and Chase through to regional incumbents – now boast of how digitally adept they are. In a period of intense digitisation, legacy banks have added a host of new and improved services, including video KYC, higher contactless payment limits and chatbot services, to name just a few.

Many of these technologies were in the pipeline for banks before Covid hit, but there’s no doubt that the pandemic accelerated plans. Speaking at the 2020 Bank Governance Leadership Network, one director said: “Suddenly the impossible became possible. Solutions that used to take 18 months to deliver are now happening in 18 days.”

Digital challenges for banks post-Covid

Despite this sudden surge, however, the traditional retail banking players may not have made the progress that the market demands. In some cases, far from it. A deeper look at some of the figures around the current state of legacy banks’ digital transformations makes for somewhat grim reading.

  • Approximately 40% of banks that state they are more than half-way through their digital transformation strategies have not deployed cloud computing or APIs
  • Only a quarter of these banks have implemented chatbot technology
  • Just 14% have deployed machine learning tools

It seems that many legacy banks have not made as many inroads into the digital future as some might claim, and certainly not as many as they need to, to be seen as progressive digital players. That will have to change, as customer expectations are becoming increasingly digital focused, and that’s reflected in their attitude toward retail banking. Many end-customers expect their banking habits to change over the long term because of Covid.

Worse than the supposed lack of progress, however, is an apparent lack of awareness among some incumbent banks of where they need to be on the digital roadmap. According to the same Cornerstone Advisors study that cites the aforementioned developments, over a third of banks believe they are more than half-way through their digital transformation.

Incumbent banks have made digital strides during the pandemic, edging closer toward being a bank that appeals to a digital generation of consumers. Suddenly, they are no longer playing catch-up and facing imminent disintermediation across the board, at least not to the same intensity as before.

However, the job isn’t done. One could even argue that because of the ever-evolving nature of digital technology, the job will never be done; rather, digital transformation is a state of constant change and adaptation. For now, though, traditional players in retail banking can take a moment to reflect on just how far they’ve come since the beginning of 2020 and pause on what could be considered a sector reset in their favor.

But it's a moment that should be taken quickly. Technologies will continue to develop; existing challenger banks will regroup, and new ones will be launched to challenge the status quo. Any complacency or navel-gazing will quickly see legacy banks lose ground to a new wave of resurgent digital players.


Cloud is the answer – what was the question?


Against the backdrop of the FinTech boom, technical innovation and turbulent post-pandemic markets, up to 90% of global bank workloads are estimated to be moving to the cloud in the next decade.

by Craig Beddis, CEO and Co-Founder, Hadean

Varying demand in compute power was one of the core motivations for moving to the cloud. Dr Michael Gorriz, CIO of Standard Chartered, recently described how the geographical spread of the multinational bank’s trading resulted in ‘varying compute need, dependent on the presence of different countries and regions at different times of day and the pattern of the activities’.

This pattern often creates an unpredictable compute load, meaning that infrastructures need to be able to scale to ensure reliable provisioning. Some cloud providers are solving this through a load balancing feature. This is where processing power is scaled across several machines dynamically, providing an overall much more reliable IT platform.
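To make the load-balancing idea concrete, here is a toy least-connections balancer: each incoming request is routed to the machine currently handling the fewest. It is a deliberate simplification (machine names are invented, and it never decrements a machine's count when a request completes), not how any particular cloud provider implements the feature:

```python
import heapq

class LeastLoadBalancer:
    """Route each request to the machine with the fewest active requests."""
    def __init__(self, machines):
        # Min-heap of (active_request_count, machine_name) pairs.
        self.heap = [(0, m) for m in machines]
        heapq.heapify(self.heap)

    def route(self):
        # Pop the least-loaded machine, bump its count, and push it back.
        load, machine = heapq.heappop(self.heap)
        heapq.heappush(self.heap, (load + 1, machine))
        return machine

lb = LeastLoadBalancer(["vm-a", "vm-b", "vm-c"])
print([lb.route() for _ in range(6)])  # each machine receives two requests
```

A production balancer would also track request completion and machine health, but the core scheduling decision – always send work to the least-busy node – is what lets processing power scale dynamically across several machines.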


Goldman Sachs, HSBC and Deutsche Bank have all recently announced major partnerships with cloud platforms. It’s a move indicative of a broader industry trend towards cloud adoption, one initiated in part due to the emergence of containerisation. With traditional banks wary of hosting large quantities of sensitive information in a singular, outsourced location – containers have paved the way for a multi-cloud solution.

Containers enable the repackaging of applications for different cloud environments. This provides much-needed flexibility in data abstraction and processing. Different cloud providers offer advantages for specific tasks: one might offer greater upload speed, while another might offer stronger security. Overall, being able to scale IT functions across these different clouds can help a financial organisation achieve greater agility in its services.

Multi-cloud also represents a positive move for the industry as a whole. By allowing businesses and consumers to choose from a greater range of providers, it forces those providers to compete on both price and delivery of service, improving the choice for the user. In essence, we have the recreation of the competition that previously existed between banks, but now in a more digitalised environment.

This migration however is no easy feat and while tech companies might have succeeded in convincing financial institutions of the merits of cloud computing, namely reduced overheads and a faster time to market, true success will lie in mapping out feasible cloud strategies. Decisions will need to be made on what to migrate, when and where.

Navigating the pitfalls that come with this change is difficult, particularly with the number of different options and choices that multi-cloud strategies offer. Taking a ‘cloud native’ approach has been popular among a number of financial institutions; for example, Standard Chartered’s digital bank Mox launched in Hong Kong as a cloud-based banking platform, while Capital One moved its entire service to run on AWS, saying that: “The most important benefit of working with AWS is that we don’t have to worry about building and operating the infrastructure.”

Wider economic effects, trends and hardship are also demanding change, with the pandemic putting on pressure to cut unnecessary costs. While changes of infrastructure have large initial costs, the move to cloud ultimately represents a more efficient mode of service delivery and will save money in the long run. The serious reduction of demand for in-person services that banks offer has also led to branches closing and the increased importance of digital services.

Open banking has been one of the most disruptive developments in finance in recent years, with customer information no longer exclusive to a single bank. It has led to both an increase in the exchange of data and a wave of innovation in banking services.

Increasing financial inclusion has also been a driving force on the consumer side, with FinTech start-ups forming to meet the demands of the rising number of people looking to increase ownership of their finances. This has put further pressure on accessibility in financial services, with the cloud’s flexibility primed to fill the gaps. It is cloud-native and scalable systems that will provide the ultimate platform for financial services and the various applications required to provide future innovative functions.
