
Visa outage highlights IT maintenance challenges – and the promise of predictability

Evan Kenty, Managing Director EMEA, Park Place Technologies

In June, Visa started rejecting one in 10 financial transactions across the U.K. and Europe – a problem lasting 10 hours and affecting 1.7 million cardholders. Even in an IT environment designed to support 24,000 transactions per second, a hardware failure crashed the system. The incident was a wake-up call for an industry reluctant to suspend services for scheduled, expensive repairs. Could predictive maintenance have prevented the crisis?

Predictive maintenance draws on machine learning, neural networking, and artificial intelligence. These learning technologies, commonly used in marketing, improve with use: every time you search Google, its accuracy improves.

Yet while AI can predict preference, it is still learning how to factor in context. Nirvana for marketers will be when technology shows my car purchase is followed by a caffeine urge, with my coffee advertised accordingly. It’s the search for the unforeseeable yet real relationship that can only be found with a deep data dive. We’re not there yet, but we’re on the way.

Maintenance that informs itself

The same neural networking technologies are being applied to hardware and networks. A data centre generates countless data points. Just as marketers want to utilise all the information available, so do data centre managers. The promise of machine learning is the ability to examine the full range of performance data in real time to detect patterns indicative of “faults-in-the-making”, uncovering relationships no human engineer would spot, like cars and caffeine.
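
As an illustration of the underlying idea only (not ParkView's method, which is proprietary and far richer), a rolling z-score over a single telemetry feed is enough to flag a fault-in-the-making:

```python
# Minimal sketch: flag readings that drift outside the recent norm.
# All numbers are illustrative; real systems fuse many feeds and models.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=30, threshold=3.0):
    """Yield (index, value) for readings more than `threshold` standard
    deviations away from the rolling stats of the previous `window` values."""
    history = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                yield i, value
        history.append(value)

# Usage: a stable ~40°C temperature feed with one sudden spike is flagged.
feed = [40.0 + 0.1 * (i % 5) for i in range(60)]
feed[45] = 55.0  # the fault-in-the-making
print(list(detect_anomalies(feed)))  # -> [(45, 55.0)]
```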

This application of AI algorithms to data centre maintenance underpins our ParkView advanced monitoring system, which contextualises patterns to “understand” infrastructure behaviours. This means instant fault identification and fewer false alarms. Future predictive systems will prevent the types of issues Visa experienced.

The next stage: predictive maintenance taps IoT

In the Tom Cruise sci-fi movie, Minority Report, police use “psychic technology” to prevent crimes before they happen. The twist comes when the crime-solver is accused of the future murder of a man he hasn’t yet met.

There is a parallel with data centres. Human error causes an estimated 75 percent of downtime, which is one reason data centres are becoming less populated. The perimeter has security staff, but the interiors are becoming vast and lonely server expanses, where the electric hum is rarely broken by the sound of footsteps. The downside is that no human is around to detect things like temperature changes and dripping water.

That’s where the IoT and the Industry 4.0 playbook developed in heavy industry come in: remote monitoring enables smart, predictive maintenance. A good example here is fixing a data centre air-conditioning system based on its predicted performance in relation to its surrounding environment. This concept can be applied across the entirety of a data centre and its cooling, power, networking, compute, storage, and other equipment. Emerging dynamic and largely automated predictive maintenance management will transform the data centres we know today into self-monitoring, self-healing technology hubs, enabling reliability as we move computing to the edge to support the IoT applications of tomorrow.

Evidence indicates a move from the reactive/corrective stance still dominant in many data centres to preventative maintenance delivering savings of up to 18%. The next leap, towards predictive maintenance, cuts spending by about a further 12%. Google, for instance, used such strategies to drive a 15% drop in overall energy overhead.
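
Taking those figures at face value and assuming they compound sequentially (the article does not state how they combine), the arithmetic works out as follows:

```python
# Hypothetical worked example of the savings figures quoted above,
# assuming they apply one after the other to a reactive baseline.
baseline = 1_000_000                  # annual maintenance spend (illustrative)
preventative = baseline * (1 - 0.18)  # up to 18% saved -> 820,000
predictive = preventative * (1 - 0.12)  # a further ~12% -> 721,600
print(f"Total reduction: {1 - predictive / baseline:.1%}")  # -> 27.8%
```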

Combating downtime with predictive technology

Enterprises must integrate predictive maintenance. Downtime kills reputations, profits, and customer relationships. Organisations like Visa can usually recover from unplanned outages, but reducing unscheduled maintenance is always preferable.

IT leaders must make hardware and facilities as downtime-proof as possible. This means using machine learning and AI to return a pound of ROI on every ounce of prevention possible. Banks are investing in AI for a range of purposes, from contract scanning to fighting fraud. It’s essential that the new technology is used to fix problems in advance.

By Evan Kenty, Managing Director EMEA, Park Place Technologies


Chatbot: A Friend You Can Bank Upon

The digital banking space has always been a hotbed of tech innovation, with almost every new tool putting customer comfort and convenience at its core. And why not? After all, the customer is king.

Wait. Scratch that.

The New Age business idiom has changed – now, the customer is a comrade. Smart financial institutions are building a sense of camaraderie with customers to enhance banking experience. For this, they’re turning to Artificial intelligence (AI).

Enter the chatbot.

The most effective chatbots – essentially computer programs that simulate human conversation – are built to make life breezy for the busy customer. To be like that finance-savvy friend – only, all smarts and zero sarcasm. Programmed to take requests, offer insightful advice and even crack the occasional bad joke (check out the philosophically quirky chatbot created by National Geographic to promote Genius, their show on Albert Einstein), chatbots are all about Empowering through Experience.

For a bank customer, this could mean:

  • Personalised assistance: Chatbots can simplify banking for customers by helping them open a new account, make money transfers and pay bills online – without going through multiple steps and checks. They can be intuitively programmed to provide personalised alerts based on customer habits and preferences. Salary credited. How about investing in a Fixed Deposit? Credit card outstanding settled. How about finally placing an order for that Bose sound system you’ve been Googling for the last year?
  • Round-the-clock support: I have a friend who often has nightmares that every cheque she’s written has bounced because she’s exhausted her salary account mid-month. What she needs is a chatbot to allay her fears, instantly, even if it is after business hours. So, imagine her having this rather reassuring text exchange with a banking chatbot at 2am:

Chatbot: Hello, Priya. How can I help you today?

Priya: How am I doing with my salary account till my next payday?

Chatbot: Well, you have a phone bill of Rs 2,238 due tomorrow. The balance thereafter would be Rs 43,034.

Priya: OK. And could you please transfer Rs 10,000 to my Demo Bank savings account right now?

Chatbot: Done. Your Demo Bank savings account balance is Rs 53,000. Do you want to add Rs 7,000 more and round it up to Rs 60,000?

Priya: Sure.

Chatbot: Done. The balance in your Demo Bank savings account now is Rs 60,000. That’s Rs 12,000 more than it was this time last year. Good going!

  • Financial guidance: Money management is challenging for a lot of people, especially millennials faced with a multitude of options to choose from. For this lot, chatbots can help make choices based on their needs and financial health. Erica, the Bank of America chatbot, for instance, shares tips on how customers can save better by cutting certain expenses and even offers advice on how much they can afford to spend based on their current financial status. (A minimal sketch of how such an assistant might work follows this list.)
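
To make the idea concrete, here is a toy, rule-based sketch in the spirit of Priya's exchange above. Real banking chatbots use trained language models and live core-banking APIs; the intents, account names and balances below are invented for illustration.

```python
# A toy rule-based banking assistant. The account data is made up, and
# the transfer amount is fixed at Rs 10,000 purely to keep the sketch short.
ACCOUNTS = {"salary": 45_272, "savings": 43_000}

def handle(message: str) -> str:
    text = message.lower()
    if "balance" in text or "how am i doing" in text:
        return f"Your salary account balance is Rs {ACCOUNTS['salary']:,}."
    if "transfer" in text:
        ACCOUNTS["salary"] -= 10_000
        ACCOUNTS["savings"] += 10_000
        return f"Done. Your savings account balance is Rs {ACCOUNTS['savings']:,}."
    return "Sorry, I can only help with balances and transfers right now."

print(handle("How am I doing with my salary account?"))    # balance intent
print(handle("Transfer Rs 10,000 to my savings account"))  # transfer intent
```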

While they definitely give customers more bang for their buck, chatbots can also have financial services providers laughing all the way to the (…well) bank. Creating well-strategized chatbots could mean:

  • Customer loyalty: Bringing in a personal touch, through services like 24-hour assistance and financial advice, can win over customers.
  • Customised marketing strategy: Information collected by chatbots during interactions with customers can be leveraged to deliver personalized suggestions and push targeted products based on customer profile and preferences.
  • Brand building: Chatbots can be designed to personify the ethos of an organisation – no-nonsense and business-like or casual and cool – and build brand identity.

The conversation around the use of artificial intelligence in business and service delivery is not new. However, what is heartening is that the interest hasn’t waned. Google Trends data shows that the chatbots narrative is still buzzing. If you are not part of this story yet, get on board ASAP – because the best is yet to come.

By Padmanabhan R, Head of Product Management, Clayfin

 


Assuring good customer outcomes in a digital world – the five key risks of digital

James Nethercott, Group Head of Marketing at Regulatory Finance Solutions

Online banking is fast becoming the norm and brings with it many benefits. However, it is not without risk. How can firms ensure that customers are being best served by these new ways of transacting?

Digital banking has many benefits. Customers can instantly manage their finances from any location using an ‘always-on’ service. Firms can scale, gain reach, save costs, capture data more easily and build loyalty. However, digital is not without risk. The same risks of mis-selling, poor servicing and inadequate complaint management are still present, albeit in different ways. Control frameworks need to be in tune with these new ways of interacting with customers. The FCA is clear that good customer outcomes should always result, regardless of the channel.

Research from Forrester indicates that rather than undergo a re-design, many products have simply been migrated online. Products designed for sale in branch or by telephone may not be suited to online. Digital demands an alternative way of thinking. When using an electronic interface, customers behave differently than when talking to an adviser. Natural cognitive biases go unchecked and people may be prone to making rushed and less optimal decisions.

Digital readiness demands more

On the provider side, digital typically tends toward a preoccupation with optimising conversion rates. Less attention can be given to end-to-end service design and compliance. Digital readiness means more than having high-performing front-end interfaces. It also demands the right back-end processes, policies and controls. Without these, good customer outcomes can easily be compromised.

As with most sectors, omnichannel experiences are standard. Customers will switch from one channel to another throughout their journey and firms need to ensure continuity. Typically, this demands good CRM processes so that customers are treated consistently and appropriately at all touchpoints.

Given the risks inherent in digital, a thorough testing programme is recommended. This provides assurance that each channel is working and, where it is not, gives the insight needed to put things right.

The five key risks of digital

Risks in digital may manifest in different ways to other channels. Here are the most critical areas where good customer outcomes need to be assured.

  1. Buying the right product

Without an adviser to carry out a thorough needs assessment and then recommend products, customers may select products that are not best suited to them. Online journeys need to guide customers through a process that is easy to follow and provides them with a good match to their needs and circumstances.

  2. Disclosure

Effective disclosure is particularly problematic in digital journeys. Customers may overlook important information and be prone to over-confidence in financial decision making. It is important that digital journeys provide clear, unambiguous and impartial information. Firms need to be sure that customers fully understand the risks, and this understanding needs to be explicit and tested.

  3. Decision making

The data that customers provide needs to be adequate, appropriate and verified. In addition, the decision-making processes used need to be made clear. This is so customers understand how their information is being used and the terms by which they have been approved, or denied, at any stage.

  4. Product servicing

During the life of the product, service must be effective. Documentation, account servicing, complaints, cancellations and renewals all need to be readily available and compliant. There also needs to be integration with other channels, so where need be, customers can rely on human advice to help them achieve good outcomes.

  5. Vulnerable customers

Firms need to ensure that vulnerable customers are supported and neither disadvantaged nor marginalised by digital. Some are unable to access online services, or to use them effectively. The same levels of service must be available offline, either for the whole or part of the customer journey. In addition, firms need to consider how vulnerability is identified in an online environment and then provide appropriate treatment to ensure good outcomes.

Technology vs. humans in a digital world

The industry is already speculating on how technology can be used to improve compliance. The first steps are simply to optimise existing sources of data so that they can be used for analysing compliance performance. More sophisticated approaches, such as applying voice recognition and semantic technology, are only a matter of time. However, humans are far from redundant in this.

Humans can spot patterns and anomalies in ways that have not yet been coded, and humans are also capable of moral and ethical judgements that machines are not. Machines also need to be taught, calibrated and checked, a task that needs ‘real’ input and intervention.

FCA concerns over robo-advice show that we may have gone too far in handing entire processes over to machines. Instead, a balance is needed that incorporates the best of technology and the best of people.

For the time being, at least, people still have a place in ensuring good customer outcomes.

By James Nethercott, Group Head of Marketing at Regulatory Finance Solutions


Legacy Systems and Data Security in Open Banking

Shuvo G. Roy

The Catalyst for Change

Billed as a game changer by most in the industry, Open Banking witnessed a managed roll out in the UK in April 2018, paving the way for customers to experience enhanced banking services through a variety of authorised providers. The Competition and Markets Authority ushered in Open Banking with the aim to improve the quality of banking and financial services, ensuring banks remain customer-oriented in an extremely competitive market.

Optimistic market forecasts estimate that Open Banking could generate more than £7.2bn by 2022 if various sectors tap into its massive potential.

Open Banking allows secure data sharing by using an integration technology called Application Programming Interface (‘API’) that accesses the account and transaction information of customers and even allows third party providers (‘TPPs’) to initiate payment on behalf of customers, only upon their explicit approval.
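
For a flavour of what such an API call looks like in practice, the sketch below queries a hypothetical account-information endpoint in the style of the UK Open Banking Read/Write specification. The host, token and response fields are placeholders; a real TPP would first obtain the customer's explicit consent via OAuth2 and present the required transport certificates.

```python
# Hypothetical AISP (account information) call. The base URL, token and
# field names are placeholders modelled on the UK Open Banking spec; a
# real call needs a consent-backed OAuth2 token and MTLS certificates.
import requests

BASE = "https://api.examplebank.co.uk/open-banking/v3.1/aisp"  # placeholder
TOKEN = "<access-token-granted-by-customer-consent>"           # placeholder

resp = requests.get(
    f"{BASE}/accounts",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "x-fapi-interaction-id": "93bac548-d2de-4546-b106-880a5018460d",
    },
    timeout=10,
)
resp.raise_for_status()
for account in resp.json()["Data"]["Account"]:
    print(account["AccountId"], account.get("Nickname", ""))
```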

As we move into 2019, what has actually changed and what lessons can we learn? Has this ‘great disruptor’ in the banking sector lived up to its initial hype?

A Closed Mind to Open Banking

The CMA reported that in June, there were 1.2 million uses of Open Banking APIs, describing it as a slow but positive start to changing consumer attitudes and revitalising the banking ecosystem for the better.

However, one senior source at a financial technology company told The Daily Telegraph: “The lack of promotion by the big banks has been disappointing and it’s the main reason for the slow take-up”.

So what are the reasons for the slow start? Why are the big banks taking their time?

Anne Boden, CEO and founder of Starling Bank, has been quoted as saying that the big banks “are all using legacy technology that’s 20, 30 or 40 years old… there’s no commercial reason why they want to do it [Open Banking]. Without that it’s a very difficult thing to do.”

Though public sentiment towards Open Banking is far from effusive, it is worth remembering that this is a complex change that will take time to transform the way banking is done. Open Banking inherently brings a raft of technological and economic risks for the traditional banking model, and navigating those changes is going to be an uphill task. One of the biggest teething problems in the banking sector is the legacy technology still used in the major banks, preventing them from quickly benefiting from this ambitious regulatory-driven process. In some instances, the technology could even be thirty or forty years old. The cost of overhauling legacy technology to allow integration with APIs is prohibitively high, adding further friction to the process of adoption. However, if banks and financial organisations are eager to monetise the myriad opportunities presented by Open Banking, they need to be quick about overhauling their systems and IT infrastructure. Further, they also need to constantly innovate and bring out banking apps and other technology-driven solutions to enhance the banking experience for their customers.

Though the CMA provides guidelines on security measures and details of regulated providers, it still fails to address the underlying issues of legacy technology to ensure that there is no loss in the transfer of customer data.

Driving the Change

Banks own valuable customer data and are fiercely protective of it. Also, consumers who are not familiar with the actual applications of Open Banking are reluctant to embrace it, as they fear fraudulent transactions and other complications arising from this technology. Adding to this hurdle is a lack of awareness of the risks and benefits associated with Open Banking, which has limited its appeal among the masses.

Therefore, the challenge for the banking sector is in implementing these concepts on the ground. Any compromise of customer data will result not only in regulatory penalties but also in damaging press. No wonder, then, that cyber and data security rank amongst the top priorities of every bank CIO and CEO.

Since Open Banking requires banks to share detailed customer information (other than sensitive payment data), they must undertake due diligence while sharing it, even with the express consent of the customer. Banks and TPPs need to ensure customer consent is taken with due emphasis on the customer’s ability to understand and appreciate the possible outcomes of providing their data. Since banks are deemed the final custodians of customer information, they have to secure their systems against financial crime and support fraud detection and anti-money laundering efforts, among other things. Further, a bank’s IT infrastructure will need to be more secure and resilient, as it will now be exposed to threats ported through TPP systems. Banks have to invest more effort and energy to analyse and discover potential points of vulnerability and take adequate measures to address these holistically. Core banking systems need to adopt open API-based peripheral development, delivering quicker implementation cycles and minimal customisation of the core product. Furthermore, the industry’s adoption of API standards should set a benchmark for all involved parties, and banks and TPPs should adhere to and promote development in line with these standards.

Finally, it is worth mentioning that many large payment systems and core banking providers have developed Open Banking-compliant solutions. Without going into a lengthy debate on the merits and demerits of each of them, it might suffice to recognise that these systems, along with robust identity and access management systems, can comprise a strong first line of defence for the Open Banking ecosystem.

The Best Has Yet to Come

While the consumer experience may not have altered significantly in the initial rollout of Open Banking, experts opine that it won’t be long before the positive effects of this innovative model trickle down to the end users.

Already, the market is charged with competition and has become riper for innovation. Positive changes are taking place internally and banks are strategising to become more customer-centric and proactive. This will bode well for the long-term relationships banks have with their customers. As we gear up for the next wave of Open Banking, we hope that its innovative model will lead to a level playing field for both customers and banks. For once, innovation will go hand in hand with pragmatism and plain grit, to script the winning equation for the future of banking.

By Shuvo G. Roy, Vice President & Head – Banking Solutions (EMEA), Mphasis


A GDPR storm is coming – are you prepared?

Julian Saunders, CEO and founder, PORT.im

Julian Saunders, CEO and founder of personal data governance company PORT.im, discusses how alleged breaches of GDPR by Facebook and Twitter may just be the beginning

Cast your mind back to early 2018. The world was alive with the sound of GDPR commentary. In the run-up to the May compliance deadline, everything was up for debate. Would it spell the end of marketing as we know it? Was anyone actually compliant? Was it good news or bad news for businesses? And, getting the most airtime – would GDPR be a damp squib like the Cookie Directive?

If you were of the opinion that GDPR was a lot of hot air, the intervening months may feel like vindication. GDPR has largely dropped off the agenda of most media publications and, with it, out of the minds of many business owners. However, we’re merely in the eye of the storm. In the last few weeks Facebook, and now Twitter, have been squarely in the crosshairs of regulators for allegedly failing to comply with GDPR. The EU has issued a stark warning that big fines will be handed down before the end of the year. Similarly, the ICO has ramped up its warnings that major action is likely to be taken. Added to this momentum has been a seemingly endless series of high-profile data breaches, with Google+ the latest casualty.

For business owners who put their GDPR compliance on the backburner since May, the warnings could not be clearer: If you aren’t GDPR compliant you’re likely to be in some serious trouble in the next few months.

Facebook has quickly become the poster boy for poor data governance procedures. Cambridge Analytica, data breaches, and GDPR failures have all come in quick succession and provide a case study for businesses on how not to collect and manage data. While it may be tempting to revel in some schadenfreude, a better approach is to see what every business can learn from Facebook and how they can protect themselves from the expected GDPR storm.

First, it should go without saying that financial organisations hold some of the most sensitive personal data. Thankfully, the most important data linked to account information has largely been well protected. However, having high security standards around bank accounts can breed complacency especially when you consider it’s not the only information the average financial company holds. The marketing, customer service and sales departments will all, usually, have their own customer databases which may be subject to vastly different security and governance standards. A breach related to any of this data could be fatal to a financial organisation and result in hefty GDPR fines.

General complacency is kryptonite for data management and protection. For Facebook, its complacency manifested itself in lax standards, questionable practices and a belief it would never be brought to account. For financial organisations, it can lead to blind spots related to data that is deemed less ‘sensitive’. Often, to enable smooth marketing, client management and sales operations, customer data is more readily accessible than financial information, shared with more parties, updated more frequently and inputted into more platforms. Each of these processes increases risk. Compounding this issue is a general lack of education related to the power of this data to do harm. Many would ask, what use is an email address to a hacker? The short answer is, a lot. This is why GDPR seeks to protect every piece of personal data.

If you’ve got to this point in this article and you’re beginning to feel some doubt surrounding your data practices – good. Now is the perfect time to audit and review all your data processes and security standards. The baseline should be – is everything GDPR compliant? If it was in May – is it still compliant? New technology, teams and initiatives can all impact your data processes and result in non-compliance.

If you avoided all of this in the faint hope that GDPR wasn’t going to be an issue, you need to get on it immediately. In this instance, buying in technology and availing yourself of the services of specialist consultants will be the fastest (but not the cheapest) option.

Next, what is the general understanding of your staff? All the procedures and technological safeguards will mean nothing if your colleagues do not understand what GDPR is and the danger of data breaches. Undertaking company-wide training regularly and incorporating data management expertise and ethics into staff development and assessment can be a powerful way to measure and improve education.

Finally, if the worst happens and there’s a breach – are you prepared? Time and again we see that a poorly handled response to a data breach generally does more damage than the breach itself. Again, I’ll point to Facebook and its slow, incomplete and unsatisfactory responses to each and every data issue it has encountered.

Slow responses are symptomatic of a failure to have the right procedures in place. This can be because there is no technology or expertise available to identify the breach in the first instance or the right people are not empowered to make quick decisions. You need to start from the position that any breach, no matter how minor it appears, is serious. It should be reported to a specialist team led by the CEO. Within that team should be the IT lead, marketing, customer service and legal. Consumers should be informed as quickly as possible, both to be GDPR compliant, and to reassure. The business needs to identify who is impacted, how, what went wrong, how it can be fixed and how consumers will be protected in the future. The faster these boxes are ticked and communicated the better the end result – especially if the ICO gets involved. As with anything, practice makes perfect. Conducting wargames and drawing up ideal responses and contingencies with this team could make all the difference.

We now live in a world where the reputation and future of a company can be destroyed by hacks and data breaches. Organisations are generally to blame for this environment. There has long been a culture that personal data is a commodity that businesses can deal with as they wish. Now the wheel has turned. If you’re one of the many business owners that still believe that data governance is just something for the IT department to worry about – you’re going to be in for a big surprise. By the end of the year, a number of large businesses will be hit with near-fatal fines as a warning to other companies. Acting now will ensure that your company is not one of these cautionary tales.


Indian FinTech sector has potential to cross $2.4 billion in earnings by end-2020

Abhishek Kothari, Co-founder, FlexiLoans

2020 is almost here, and it is a perfect time to look back on 2019 and appreciate the highs and lows. By this point in 2019, the words ‘FinTech’, ‘Data Science’ and ‘Machine Learning’ have become relatively common, and implications attached to these words have become apparent to anyone who is a part of the modern world.

FinTech in India has been growing at a significant pace for the last four years as a result of the increasing focus from RBI, government policies, advancing technology and affordable smartphones and data.

In turn, the Indian FinTech ecosystem has finally matured, with the public at large becoming more receptive towards digitization and tax automation. This is owing mainly to the demonetization of 2016 and the introduction of the Goods and Services Tax in 2017. In fact, the implementation of GST alone has led to dedicated startups and new business verticals from established brands to help small, medium and large businesses with their taxes.

2019 was expected to be a year with continued momentum, but it came with its share of surprises. The industry did not grow as fast as anticipated, but like everything else in life, there were also moments of delight.

Firstly, the IL&FS liquidity crisis had a massive trickle-down effect on NBFC lending, causing a considerable reduction in the debt available to smaller NBFCs. Liquidity is the raw material for financial services, and in the absence of a steady supply, many FinTechs grew slower than expected.

Secondly, the RBI continues to be silent on some key issues like e-KYC, e-sign and e-NACH, which were the catalysts for a seamless journey and growth. The relevant circulars were expected after the elections, but they have been delayed, leading to a lack of clarity.

Thirdly, UPI and Payments saw a great deal of growth and investments coming in. UPI has been recognized globally as a masterpiece of innovation. With 143 banks live on UPI clocking 1.2Bn transactions in November alone, it has completely transformed the way money moves in India.

2019 was also a year with many FinTechs building real-time, fully automated and intelligent solutions for lending and payments. AI and Machine Learning saw some real takers and many human-led processes were fully automated.

As liquidity continues to come back and the wait for the RBI to streamline KYC continues, the trends I see shaping FinTech startups in 2020 involve a highly aware customer and further innovations in data science and data engineering.

Trend 1: Indian consumers are rapidly moving towards a mobile-first approach for accessing financial services, and they prefer vernacular platforms.

With WhatsApp reaching 400Mn people and thousands of hours of content being created by OTT platforms, Indian consumers are online on their smartphones. YouTube in India has over 1,200 channels with one million subscribers, and this number was only 14 in 2014.

This provides an unparalleled opportunity for tech companies to build digital journeys and solutions to disrupt almost everything that we know today. Financial Services, Transportation, Logistics, Shopping, Telecom, Healthcare and Education are all going to see newer players challenging the status quo. There is nothing called Digital Strategy now; it’s just Strategy to survive in a Digital India!

FinTech, too, is witnessing the same behavioral shift, where 95%+ of users apply for a loan using a mobile device, while this number was less than 30% three years ago. We have seen a 2X conversion on our vernacular pages compared to English landing pages.

Trend 2: Data Science and Engineering are delivering substantial cost efficiencies and better decisions with cutting edge applications of Computer Vision, Optical Character Recognition and Pattern recognition.

FinTech is growing at an exponential pace in India with high applications of data science in aspects like lending, insurance, broking and wealth management. Several lending companies have used image, text, and voice as input data sources to provide accurate decisions and better experiences than their banking counterparts in the last couple of years in India. Optical Character Recognition was meant to read the text inside images and transform that into digital text data. Now, there is an integration of OCR in our daily lives – from scanning documents and credit cards to data entry. The traditional, time-consuming paper-based work has been replaced with an optimized way of collecting the same data. With the enhanced ease in collecting data, data scientists can start their analysis journey quicker.
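
A minimal OCR sketch follows, assuming the open-source Tesseract engine and the pytesseract and Pillow packages are installed; a lender's production pipeline would add image pre-processing, layout analysis and field validation on top of this.

```python
# Minimal OCR sketch: extract raw text from a scanned document image.
# The file name is a placeholder; requires Tesseract plus the
# pytesseract and Pillow packages.
from PIL import Image
import pytesseract

scan = Image.open("bank_statement.png")   # placeholder input image
text = pytesseract.image_to_string(scan)  # image -> digital text
print(text[:200])  # ready for downstream parsing and validation
```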

Data Science and Data Engineering are working more closely than ever, with T-shaped data scientists becoming more popular by the day.

India being one of the youngest nations in the world, a considerably large section of its population is significantly more receptive and adaptive. The result is tech-savvy, zealous entrepreneurs pushing the Indian fin-tech industry towards potential earnings to the tune of US$ 2.4 billion by end-2020.


Accenture to enhance core banking platform with SEC Servizi acquisition

Accenture has completed its acquisition of Italian banking technology service provider SEC Servizi SpA from the Intesa Sanpaolo Group. Accenture now has 80.8% ownership of SEC Servizi and will also be acquiring the remaining interests held by other shareholders.

Established in 1972, SEC Servizi is a consortium formed by Italian banks to provide IT services and outsourcing solutions for banks and other financial institutions in the country. Its offerings include application and facility management, centralized back office services and specialized multi-channel, consulting, education and support solutions. The company reportedly manages more than 21 million transactions per day for nearly 1,400 bank branches in Italy and had revenues of EUR 152 million at the end of 2017. Its clients include Banca di Credito Popolare, Banca Italo Romena, Banca Nuova, Veneto Banca and Allianz Bank Financial Advisors, among others. Intesa Sanpaolo acquired SEC Servizi in 2017 as part of the acquisition of certain assets, liabilities and legal relationships of Banca Popolare di Vicenza S.p.A. and Veneto Banca S.p.A., both in compulsory administrative liquidation.

The acquisition of SEC Servizi’s expertise, technology and operational assets will enable Accenture to create an advanced and innovative core banking platform that can support banks in their transition to digital. This transaction will help establish Accenture as a leader in the banking technology market in Italy, serving SEC Servizi’s existing customers, including Intesa Sanpaolo and other mid-sized financial institutions in Italy.

After slowly recovering from the financial crisis, Italian banks are now looking at modernizing their technology infrastructure and are increasingly relying on digital resources to remain competitive in the market and align their services to the digital-savvy customer. An indication of this is the drop in the number of branches at the end of 2017, which was 20 per cent lower than in 2008. Banks such as Unicredit, Intesa Sanpaolo, Monte dei Paschi, Mediobanca and Banca Carige are leading the way with digitalization initiatives ranging from contactless payments and virtual reality branches to robo-advisory services.

For Accenture, this presents an opportune time to enhance its core banking technology services with the acquisition of SEC Servizi Spa.


The spreadsheet challenge as banks move processes to European financial centres in preparation for Brexit

Henry Umney, CEO, ClusterSeven

Uncertainty around Brexit continues, but practical preparations have begun – many banks are now well in the throes of duplicating or moving systems and business processes from London to other financial hubs.

Extricating processes isn’t going to be an easy task. There are two aspects to this separation process: formal IT-supported enterprise systems and grey IT (end-user-supported IT systems). Most banks have the understanding and the ability to effectively disentangle the core enterprise systems. Where banks are likely to come unstuck is wherever end-user-supported IT, commonly Microsoft Excel spreadsheet-based processes, is deeply linked with the rest of the banking group’s enterprise systems.

If a bank is required to set up a separate entity in the UK, all the data residing in ancillary spreadsheets that feed data into the various systems pertaining to this jurisdiction will need to be delinked/duplicated and housed separately too. For instance, as banks separate their Treasury operations, there will likely be certain processes that heavily rely on common Bloomberg and Reuters market feeds that are owned by or have deep linkages to the banking group’s systems. Similar issues will arise for capital modelling-related processes. While previously a bank might be evaluating business risk based on its aggregated position across its European operation, post-Brexit, determining the UK entity’s risk position will require the financial institution to disconnect and separate the relevant data for this jurisdiction.

Essentially, as banks duplicate their enterprise systems for specific jurisdictions, they need to do the same for the spreadsheet-based application landscape that they rely on operationally.

Disentangling these unstructured but business-critical processes manually will prove near-impossible and eye-wateringly costly. Typically, spreadsheets surround the core systems such as accounting, risk management, trading, compliance, tax and more. Complete visibility of the spreadsheet-based process landscape is essential to identify the ones that need to be duplicated or extricated for the new entity, but due to the uncontrolled nature of spreadsheet usage, there will potentially be thousands of such interconnected applications and no inventory of these processes.

Banks should consider adopting an automated approach to safely extricating their spreadsheet-based processes. Spreadsheet management technologies can scan and inventory the entire spreadsheet landscape based on very specific complexity rules and criteria. The technology can expose the data lineages of individual files across the spreadsheet environment to accurately reveal the data sources and relationships between the applications.
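
As a simplified sketch of that inventory-and-lineage idea (not ClusterSeven's actual product), the script below walks a file share and records which workbooks pull data from which other workbooks, using the square-bracket notation Excel uses for external references:

```python
# Sketch: build a crude lineage inventory of a spreadsheet estate.
# External references in formulas look like =[FX.xlsx]Rates!B2, so a
# bracket scan recovers workbook-to-workbook dependencies. The share
# path is a placeholder; commercial tools go much further than this.
import pathlib
import re
import openpyxl

EXTERNAL_REF = re.compile(r"\[([^\]]+\.xlsx?)\]", re.IGNORECASE)

def inventory(root):
    lineage = {}  # workbook name -> set of workbooks it pulls data from
    for path in pathlib.Path(root).rglob("*.xlsx"):
        wb = openpyxl.load_workbook(path, read_only=True, data_only=False)
        sources = set()
        for sheet in wb.worksheets:
            for row in sheet.iter_rows():
                for cell in row:
                    if isinstance(cell.value, str) and cell.value.startswith("="):
                        sources.update(EXTERNAL_REF.findall(cell.value))
        lineage[path.name] = sources
    return lineage

for name, deps in inventory("//fileshare/treasury").items():  # placeholder path
    print(name, "<-", sorted(deps) or "no external links")
```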

This approach is already proven in M&A-type operational transformation situations, which to some extent resemble the Brexit scenario. Aberdeen Asset Management adopted it to separate Scottish Widows Investment Partnership (SWIP) when it bought the business from Lloyds Banking Group. Due to the number of convolutedly connected spreadsheets across SWIP’s vast spreadsheet landscape and the complexity of the business processes residing in this environment, manually understanding the lay of the land was unfeasible. Utilising spreadsheet management technology, SWIP inventoried the spreadsheet landscape, identified the business-critical processes and pinpointed the files that required remediation. Simultaneously, the technology helped expose the data lineage for all the individual files, revealing their data sources and relationships with other spreadsheets. SWIP was able to securely migrate the relevant business processes to Aberdeen Asset Management and, where necessary, decommission redundant processes.

Post-Brexit too, banks have a lot more to gain from automated spreadsheet management. Spreadsheets will likely be used to set up temporary business processes and solutions for the new operations. Spreadsheet management will embed best-practice use of these tools across the lifecycle of such applications – from creation through to remediation and decommissioning into formal IT-supported applications – encompassing spreadsheets and their unique data flows. It will also offer banks an in-depth understanding of their data landscape. This will help institute data controls and spreadsheet change management processes so that there is complete transparency and an audit trail, tangibly reducing the operational, financial and regulatory risks caused by spreadsheet error.

 

By Henry Umney, CEO, ClusterSeven

 

About the author

Henry Umney is CEO of ClusterSeven. He joined the company in 2006 and for over 10 years was responsible for the commercial operations of ClusterSeven, overseeing globally all Sales and Client activity as well as Partner engagements. In July 2017, he was appointed CEO and is strongly positioned to take the business forward. 


The Danish startup putting the killing blow into key encryption technology

Danish encryption specialist Sepior was founded in 2014 on the back of ground-breaking encryption projects and the support of the EU’s Horizon 2020 programme. In discussion with IBS Intelligence it revealed that it has lots more surprises up its Fairisle jumper

Sepior’s big break came with the EU’s Horizon 2020 initiative, an irony not lost on CEO Ahmet Tuncay – as we spoke to him, the chaos which is Brexit continued to engulf Europe.

Ahmet Tuncay, Sepior CEO, said: “Yes, we’re a truly Danish company and found our footing within the Horizon programme, which deals mostly with small to medium enterprise projects, or SMEs. For companies with promising technologies, the EU economic commission provides grants for the ones they believe will become a commercial success. But there’s a fairly high bar for them to grant you this money; you have to commit to specific milestones and strict targets. The commitment our founders made was: ‘If you give us these funds and support, we’re going to create economic activity within the EU, which means hiring people and growing the company’.”

He continued: “Our obligation was really to take that money and create a piece of commercially viable technology. At the early stages, specific use cases aren’t as important as the foundational technology and broad market appeal. Once the foundation was created, we wanted to be able to acquire institutional funding to go and build a business. In the long term our obligation is to create jobs, insofar as the EU is concerned, but now we have commitments to our shareholders, so it’s not just jobs that matter today.”

Tuncay says: “If you just look at the size of the market for encryption key management, you’re not going to be impressed by the number; it’s only around a $1 billion market. But if you take the same technology, repurpose it and apply it to commercial asset exchanges, which is a $300 billion market, and find a way to participate in a revenue-sharing opportunity, you’ve moved yourself from a $1 billion market to a $300 billion market. You then have to figure out how to extract your fair share from that activity.”

The numbers are certainly impressive if you consider the dollars that brokers and exchanges collect in fees – a vast amount, certainly more than the $1 billion market for encryption key management. At several hundred billion dollars, it is super lucrative, and it’s a great market to be in because few companies are good enough to offer a differentiated service that captures new customers.

Tuncay says: “Our investors recognised that the big pain of cryptocurrency activity is that if you lose the coins, they’re gone forever. So that turns up the need for novel security solutions more than ever. The digital wallet containing the cryptocurrency assets must be hosted in trusted custody and the transactions involving the wallet must be protected against malicious or incompetent brokers and clients. The need for a higher level of security means having multiple signatures and multiple approvers, which is obviously more secure than having just one. When you have multiple approvers using our ThresholdSig technology versus MultiSig, or multiple signature, technologies, we can deliver very high levels of security and trust along with some operational benefits for the exchange, because the administration of the security policies – adding people, removing people, replacing lost devices, deciding who can participate in those signatures – is all done off-chain, and it’s simple.”

The alternative approach is to use MultiSig, which is all on-chain, so when you change the policies you have to broadcast them, telling everyone who the approvers are and what the policies say, which is not really good for security. You may also have to reissue or generate new keys. There is a lot of administrative bureaucracy that goes with that approach. Until recently MultiSig has been the gold standard for threshold approval of cryptocurrency transactions, but ThresholdSig provides an equal or higher level of security in a more flexible, lower-administration environment, and it also promises to reduce the size of the recorded transaction on the blockchain. With MultiSig, the blocks actually contain the multiple signatures that have signed off on a transaction, which of course increases block sizes.
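
To see the t-of-n threshold idea in isolation: Sepior’s ThresholdSig is built on multi-party computation, in which a complete private key never exists anywhere. The sketch below instead uses classic Shamir secret sharing, which does reconstruct the key, so treat it purely as an illustration of “any 2 of 3 approvers suffice”, not as Sepior’s protocol.

```python
# Illustrative 2-of-3 Shamir secret sharing over a prime field.
# Requires Python 3.8+ for pow(x, -1, mod). Educational only: unlike
# MPC-based threshold signing, this reconstructs the full key.
import random

PRIME = 2**127 - 1  # field modulus for the demo (a Mersenne prime)

def make_shares(secret, threshold=2, n=3):
    # Random polynomial of degree threshold-1 whose constant term is the
    # secret; each share is a point (x, f(x)) on that polynomial.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

signing_key = random.randrange(PRIME)
shares = make_shares(signing_key, threshold=2, n=3)
assert reconstruct(shares[:2]) == signing_key  # any 2 of the 3 approvers
assert reconstruct(shares[1:]) == signing_key  # ...suffice to recover it
```

In an MPC-based scheme the same threshold property holds, but the servers jointly produce a signature without their shares ever being combined in one place.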

Tuncay says: “With ThresholdSig there’s only one signature that goes on the ledger, so it actually reduces the amount of data on the ledger. It turns out these signatures are a substantial portion of the total transaction size.  So, there’s this kind of tertiary benefit that could end up being quite material, because it means that the blocks can contain more transactions. Blocks are typically fixed in size, so if the transactions are smaller you get more of them onto the chain.  In some of the currencies, like Bitcoin, it’s already hitting capacity on processing.  So, if you can have the highest level of security and smaller transaction sizes it’s going to maximise throughput.”

There is the hope that ThresholdSig transactions will also have lower transaction fees than MultiSig. ThresholdSig transactions appear as a single signature transaction on the blockchain. Historically, single signature transactions are the smallest in size, allowing for maximum transactions per block and typically have the lowest mining transaction fees. Our expectation is that the exchange could end up with lower transaction fees, with higher security and lower administrative overhead. So, there are some very compelling reasons why this technology is going to be relevant to a far wider audience than up to now.

Sepior’s investors were on the front edge of recognising threshold schemes – the cryptography approach with multiparty computation – and how that technology could bring real benefits in this use case. As Tuncay says: “We’re focusing on the implementation around cryptocurrency exchanges and hot wallets, but this technology is applicable to a much wider range of applications. So next month we’re going to be making some announcements around more blockchain-generic solutions, to provide more privacy on private blockchains in general. There are a whole series of problems with using distributed ledger technology for business, and one of these is scalability. How do you support – for example, in the case of logistics tracking operations – a container being loaded and shipped from a point in China to a destination in Los Angeles? Sometimes there are 35 or 40 different parties involved in that transaction. These parties don’t necessarily need to know everything on the blockchain. Effectively all the transactions are on the chain, so all parties that are participating in the chain can validate and see their own transactions but need not see the confidential data of other parties. One strategy for this has been to create virtual blockchains called channels, as used in Hyperledger Fabric, but their use creates a messy scalability problem.”

Tuncay says: “If I were to generalise it further, while a blockchain is supposed to contain transactions that are immutable because everybody on the chain can validate them, the downside is that everybody on the chain can also see everything on the chain. So how do you create an application like logistics tracking where there are 30 parties on the chain and you want every party to have a different view of it? Our solution to this – and existing solutions have proven to be unscalable – is based on an access control policy that relies on encryption to make only the intended parts of the chain available to users, based on their permissions.

“There is nothing magical about this, we’re just using our underlying key management system and fabric. But once we make this available, it will also enable the creation of privacy-preserving chains that are massively more scalable than what is possible today. We think there’s value there; again, this is something that we’re going to go and test out, and we’re involved in activity with several large companies to validate this. We think that it’s worthwhile.”

Fundamentally, Sepior is providing fine-grained control over who has visibility of what on the blockchain. The key words here are ‘threshold cryptography’. Sepior is pioneering and leading the industry in applying these key management concepts in a manner that is more scalable and works in distributed environments with a high degree of efficiency. In the case of a crypto wallet, the threshold aspect means you might have four parties who can approve a transaction, with a threshold that says if any three are available, a transaction will be accepted as valid. You can therefore define a threshold so that if somebody loses their phone, or a device gets hacked and is no longer trusted, it can be excluded while the remaining parties continue to transact and do business.

Tuncay says: “When you move into the blockchain application the threshold aspect is more around signing key availability and management. What we’ve done here is to take the key management function and distribute it using multi-party computation (MPC).  We’re able to distribute the key generation and management functions across multiple virtual servers, if you will, in the cloud, such that no individual server has a full key that could be hacked or stolen.  But collectively maybe two out of three of these virtual servers can provide keys for all the users that require access to the content on that blockchain.  This threshold aspect gives a high degree of availability, reliability and integrity of both the encryption and the availability of key management.”

For this Danish company, it looks like blockchain will get The Killing it deserves.


The rise of the KYC Utility: How to plan for success

Successful compliance and risk management programmes within financial institutions depend on effective Know Your Customer (KYC) processes, yet most have found these processes to be increasingly onerous, both in terms of time and cost.

This is further complicated by an ever-changing and progressively stringent regulatory landscape. As a result, consortiums of banks, governments and vendors have explored the possibility of reducing costs and improving customer service through the establishment of industry-wide KYC processors or Utilities, with the intended aim of standardising KYC processes.

The latest region to look into adopting this approach is the Nordics, following a series of high-profile money laundering cases there. Leading banks DNB Bank, Danske Bank, Nordea Bank AB, Svenska Handelsbanken and Skandinaviska Enskilda Banken banded together in May last year to announce their intention to develop “an efficient, common, secure and cost-effective Nordic KYC infrastructure” called Nordic KYC Utility.

If designed and executed properly, the potential benefits of a shared infrastructure are clear to see, not least raising KYC compliance standards across the financial industry as a whole. However, there are challenges, as demonstrated by the Monetary Authority of Singapore shelving its plans for a centralised Utility late last year due to spiralling costs. Here, we look at the objectives of a Utility and the key factors to be considered to ensure success.

Objectives of a shared KYC Utility

First and foremost, a Utility should provide benefits to ALL its stakeholders – from the participating banks and their customers to regulatory authorities.

For banks, the Utility should streamline KYC and customer onboarding, reduce costs and enhance KYC standards and auditability. Meanwhile, their end customers should benefit from a smoother customer onboarding journey and reduced friction.

From the perspective of the regulators, the primary objective of a successful Utility should be to raise confidence – among themselves and society as a whole – that banks are working to the highest KYC standards to more effectively combat financial crime, money laundering and terrorist financing.

Considerations for success when designing and implementing a Utility

The failure of the Singapore KYC Utility has highlighted areas of caution to be considered in other jurisdictions. In a report published by the Association of Banks in Singapore (ABS) after the project halted, it was revealed that “the overall margins at a systemic level did not allow for a viable business case in a projected term, and the proposed solution was going to cost more than the savings that banks would get out of it.” So, what can be learnt?

Approach the project with a thorough understanding of participant needs

It’s important to recognise that, despite all the participants operating under the same legislative and compliance regulations, their interpretations and – importantly – risk appetite will vary immensely. It is therefore crucial to establish a deep level of understanding of the needs of each participant in terms of existing KYC processes, risk methodologies, data and technology requirements.

The reality is one size never fits all. Participating institutions may vary significantly in their need to access various specific sources or adapt certain processes by jurisdiction or customer type based on their established compliance policies. This places extra demand on the Utility operator to select and deploy highly configurable best-of-breed technology, data and processes in the early stages.

Build design with flexibility at front of mind

The report highlights the importance of the core design, stating that “significant priority was given to design choices which represented a highly ambitious ideal” and that “more agility in governing the interaction between design and cost could have helped”. This emphasises the need for flexibility and adaptability, since initial requirements often evolve and so an agile approach and design thinking are essential to ensure the Utility truly delivers value based on the actual needs of the participating parties.

Ultimately, it was the cost of integrating an inflexible solution into a bank’s established compliance processes that proved to be the Singapore Utility’s primary downfall. The ABS report noted that the banks are “all at various stages of sophistication and evolution in terms of client data systems and KYC workflow systems”, and acknowledged that integration costs would account for over a third of the project costs. Therefore, ensuring there is flexibility in the design and technology used to build a Utility is critical in managing costs and ensuring participants realise maximum value today and into the future.

A Utility simply won’t work if its infrastructure is not future-proofed against advancements in regulation, technology or user experience expectations. For instance, here at encompass we recently launched global biometric identity verification (IDV), and this is the type of feature that Utility participants may well want to use in the future, so the infrastructure has got to be able to adapt for this.

Are Utilities the way forward?

Given the amount of time and money financial firms currently spend on KYC and customer onboarding, it isn’t surprising that the Utility concept is gaining such traction. While there are certainly questions that remain to be answered – such as whether one Utility can realistically meet all the requirements of multiple, diverse institutions – the arguments in favour are certainly persuasive.

Industry pundits have debated the likely success of a regional KYC Utility, and time will tell whether the proposed Nordic KYC Utility will achieve its desired outcomes. However, it is an encouraging move by the main Nordic banks to up the ante in the fight against financial crime and rebuild their standing in the eyes of both regulators and customers. With a high level of engagement among key stakeholders, there is no reason why the project should not succeed, provided it stays true to its primary objectives and recognises that design thinking based on inherent flexibility is absolutely critical.

By Wayne Johnson, Co-Founder and CEO at encompass corporation
