
Assuring good customer outcomes in a digital world – the five key risks of digital

James Nethercott, Group Head of Marketing at Regulatory Finance Solutions

Online banking is fast becoming the norm and brings with it many benefits. However, it is not without risk. How can firms ensure that customers are being best served by these new ways of transacting?

Digital banking has many benefits. Customers can instantly manage their finances from any location using an ‘always-on’ service. Firms can scale, gain reach, save costs, capture data more easily and build loyalty. However, digital is not without risk. The same risks of mis-selling, poor servicing and inadequate complaint management are still present, albeit in different forms. Control frameworks need to be in tune with these new ways of interacting with customers. The FCA is clear that good customer outcomes should always result, regardless of the channel.

Research from Forrester indicates that rather than undergo a re-design, many products have simply been migrated online. Products designed for sale in branch or by telephone may not be suited to online. Digital demands an alternative way of thinking. When using an electronic interface, customers behave differently than when talking to an adviser. Natural cognitive biases go unchecked and people may be prone to making rushed and less optimal decisions.

Digital readiness demands more

On the provider side, digital typically tends toward a preoccupation with optimising conversion rates, and less attention can be given to end-to-end service design and compliance. Digital readiness means more than having high-performing front-end interfaces. It also demands the right back-end processes, policies and controls. Without these, good customer outcomes can easily be compromised.

As with most sectors, omnichannel experiences are standard. Customers will switch from one channel to another throughout their journey and firms need to ensure continuity. Typically, this demands good CRM processes so that customers are treated consistently and appropriately at all touchpoints.

Given the risks inherent in digital, a thorough testing programme is recommended. This provides assurance that each channel is working and, where it is not, gives the insight needed to put things right.

The five key risks of digital

Risks in digital may manifest in different ways to other channels. Here are the most critical areas where good customer outcomes need to be assured.

  1. Buying the right product

Without an advisor to carry out a thorough needs assessment, and then recommend products, customers may select products that are not best suited. Online journeys need to guide customers through a process that is easy to follow and provides them with a good match to their needs and circumstances.

  2. Disclosure

Effective disclosure is particularly problematic in digital journeys. Customers may overlook important information and be prone to over-confidence in financial decision making. It is important that digital journeys provide clear, unambiguous and impartial information. Firms need to be sure that customers fully understand the risks, and this understanding needs to be explicit and tested.

  3. Decision making

The data that customers provide needs to be adequate, appropriate and verified. In addition, the decision-making processes used need to be made clear. This is so customers understand how their information is being used and the terms by which they have been approved, or denied, at any stage.

  4. Product servicing

During the life of the product, service must be effective. Documentation, account servicing, complaints, cancellations and renewals all need to be readily available and compliant. There also needs to be integration with other channels, so where need be, customers can rely on human advice to help them achieve good outcomes.

  5. Vulnerable customers

Firms need to ensure that vulnerable customers are supported and neither disadvantaged nor marginalised by digital. Some customers are unable to access online services, or to use them effectively. The same levels of service must be available offline, either for the whole or part of the customer journey. In addition, firms need to consider how vulnerability is identified in an online environment and then provide appropriate treatment to ensure good outcomes.

Technology vs. humans in a digital world

The industry is already speculating on how technology can be used to improve compliance. The first steps are simply to optimise existing sources of data so that it can be used for analysing compliance performance. More sophisticated approaches, such as applying voice recognition and semantic technology, will only be a matter of time. However, humans are far from redundant in this.

Humans can spot patterns and anomalies in ways that have not yet been coded, and humans are also capable of moral and ethical judgements that machines are not. Machines also need to be taught, calibrated and checked, a task that needs ‘real’ input and intervention.

FCA concerns over robo-advice show that we may have gone too far in handing entire processes over to machines. Instead, a balance is needed that incorporates the best of technology and the best of people.

For the time being, at least, people still have a place in ensuring good customer outcomes.

By James Nethercott, Group Head of Marketing at Regulatory Finance Solutions


Visa outage highlights IT maintenance challenges – and the promise of predictability

Evan Kenty, Managing Director EMEA, Park Place Technologies

In June, Visa started rejecting one in 10 financial transactions across the U.K. and Europe – a problem lasting 10 hours and affecting 1.7 million cardholders. Even in an IT environment designed to support 24,000 transactions per second, a hardware failure crashed the system. The incident was a wake-up call for an industry reluctant to suspend services for scheduled, expensive repairs. Could predictive maintenance have prevented the crisis?

Predictive maintenance draws on machine learning, neural networking, and artificial intelligence. Commonly used in marketing, learning technologies improve with use: every time you search Google, its accuracy improves.

Yet while AI can predict preference, it is still learning how to factor in context. Nirvana for marketers will be when technology shows my car purchase is followed by a caffeine urge, with my coffee advertised accordingly. It’s the search for the unforeseeable yet real relationship that can only be found with a deep data dive. We’re not there yet, but we’re on the way.

Maintenance that informs itself

The same neural networking technologies are being applied to hardware and networks. A data centre generates vast amounts of data, and just as marketers want to utilise all the information available, so do data centre managers. The promise of machine learning is the ability to examine the full range of performance data in real time to detect patterns indicative of “faults-in-the-making”, uncovering relationships no human engineer would spot, like cars and caffeine.

This application of AI algorithms to data centre maintenance underpins our ParkView advanced monitoring system, which contextualises patterns to “understand” infrastructure behaviours. This means instant fault identification and fewer false alarms. Future predictive systems will prevent the types of issues Visa experienced.
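To make the idea concrete, here is a minimal sketch of the kind of pattern detection such systems build on: flagging telemetry readings that deviate sharply from their recent trend. It is illustrative only; production predictive-maintenance platforms apply far richer models, and the thresholds and data below are invented.

```python
# Illustrative only: a toy anomaly detector for data centre telemetry.
# Real predictive-maintenance systems are far more sophisticated; the
# window, threshold and readings here are invented.
from statistics import mean, stdev

def detect_faults(readings, window=20, threshold=3.0):
    """Flag readings that deviate sharply from the recent trend."""
    alerts = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            alerts.append((i, readings[i]))  # potential fault-in-the-making
    return alerts

# Example: a drive temperature series with a developing fault at the end.
temps = [38.0 + 0.1 * (i % 5) for i in range(100)] + [44.5, 46.2]
print(detect_faults(temps))  # flags the final two readings
```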

The next stage: predictive maintenance taps IoT

In the Tom Cruise sci-fi movie, Minority Report, police use “psychic technology” to prevent crimes before they happen. The twist comes when the crime-solver is accused of the future murder of a man he hasn’t yet met.

There is a parallel with data centres. Human error causes an estimated 75 per cent of downtime, which is partly why data centres are becoming less populated. The perimeter has security staff, but the interiors are becoming vast and lonely server expanses, where the electric hum is rarely broken by the sound of footsteps. The downside is the lack of human detection of things like temperature changes and dripping water.

That’s where the IoT and the Industry 4.0 playbook developed in heavy industry come in: remote monitoring enables smart, predictive maintenance. A good example here is fixing a data centre air-conditioning system based on its predicted performance in relation to its surrounding environment. This concept can be applied across the entirety of a data centre and its cooling, power, networking, compute, storage, and other equipment. Emerging dynamic and largely automated predictive maintenance management will transform the data centres we know today into self-monitoring, self-healing technology hubs, enabling reliability as we move computing to the edge to support the IoT applications of tomorrow.

Evidence indicates a move from a reactive/corrective stance, still dominant in many data centres, to more preventative maintenance delivering average savings of up to 18%. The next leap towards predictive maintenance drops spending about 12% further. In fact, Google used such strategies to drive a 15% drop in overall energy overhead.
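As a rough worked example of what those percentages mean in practice (the baseline spend is an invented figure):

```python
# A worked example of the savings figures quoted above (illustrative numbers).
baseline = 1_000_000                    # annual maintenance spend, reactive/corrective
preventative = baseline * (1 - 0.18)    # up to 18% average savings
predictive = preventative * (1 - 0.12)  # a further ~12% reduction

print(f"Preventative: {preventative:,.0f}")  # 820,000
print(f"Predictive:   {predictive:,.0f}")    # 721,600
# Roughly a 28% total reduction from the reactive baseline.
```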

Combating downtime with predictive technology

Enterprises must integrate predictive maintenance. Downtime kills reputations, profits, and customer relationships. Most organisations like Visa can recover from unplanned outages, but reducing unscheduled maintenance is always preferable.

IT leaders must make hardware and facilities as downtime-proof as possible. This means using machine learning and AI to return a pound of ROI on every ounce of prevention possible. Banks are investing in AI for a range of purposes, from contract scanning to fighting fraud. It’s essential that the new technology is used to fix problems in advance.

By Evan Kenty, Managing Director EMEA, Park Place Technologies


Sit tight, modern APIs will soon take banks on a fast ride  

Hans Tesselaar, BIAN

The world of banking today is like a race car on the grid preparing for the inevitable green light. There is a lot of noise before the ‘go’ signal: vehicles revving their engines, pundits in commentary boxes speculating on the race outcome, and spectators cheering on favourites from the grandstands. When the lights go out and the race begins, a plume of dust and smoke is left behind as the vehicles speed off across the track. The winner is yet to be decided…

In banking, the race is just starting. Amidst the noise, speculation and fanfare, success in this industry will come down to one key thing: open APIs. Those that can harness them correctly will take the top spot on the podium. 

Shifting up a gear 

Modernisation in retail banking is largely being driven by customers, who have come to expect a level of digitalisation consistent with what they experience in other areas of their lives. Simply compare well-known consumer tech innovations such as the Amazon Echo, or Google’s impressive AI-enabled search function, to understand why people expect more from those who handle their money.  

This is not to say banks have neglected innovation. Flashier, more convenient services for customers have been introduced. But in the face of ongoing political, legacy, technological, competitive and regulatory challenges, the ‘from scratch’ development of advanced Google or Amazon-style services remains an uphill struggle.  

Even in light of the recent technological advancements permitted by open banking, the issues outlined above have prevented many banks from properly grasping the opportunities of technology and the disintermediation of data.  

Opening the throttle 

Open banking is accelerating the banking industry into the future, with APIs acting as the fuel to power the innovation ahead. But successful development and implementation of API-based technology is a lengthy and costly task for banks to undertake alone. To combat this, some banks have started acquiring fintech businesses to quickly bolster their own service offerings. However, for maximum benefit, industry-wide collaboration around innovation is needed.

This will require banks to shift from a historically closed-off, competitive mentality to recognising the advantages of pooling knowledge and raising standards of industry innovation together. BIAN, the organisation that I am proud to head up, has spent a decade promoting this ideology. Our global organisation brings together some of the biggest, most innovative banks and technology vendors to build a common IT architecture, or ‘how-to guide’, to streamline the inevitable move to modern, high-quality, customer-oriented services.

A large part of how to create a modern IT architecture for banks involves utilising a library of definitions for popular APIs, to avoid unnecessary duplication of time, money and effort. BIAN’s current banking architecture contains 26 new API definitions, including ones that instruct banks how to build automated customer onboarding processes. These API definitions comply with the ISO 20022 standardisation approach used by SWIFT, making them universally compatible.
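To give a flavour of why shared definitions matter, here is a purely illustrative sketch of a customer-onboarding call against a standardised API. The endpoint, field names and payload are invented for illustration; BIAN publishes the actual definitions, which follow the ISO 20022 data model.

```python
# Purely illustrative: what a standardised customer-onboarding request might
# look like once every bank implements the same API definition. The endpoint
# and field names below are invented, not taken from BIAN's library.
import json
import urllib.request

onboarding_request = {
    "customerReference": "CR-2018-000123",
    "partyName": "Jane Smith",
    "dateOfBirth": "1985-04-12",
    "identityDocuments": [{"type": "passport", "number": "123456789"}],
}

req = urllib.request.Request(
    "https://api.examplebank.com/customer-onboarding/initiate",  # hypothetical endpoint
    data=json.dumps(onboarding_request).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# response = urllib.request.urlopen(req)  # would return a standardised status
```

Because every participating bank would expose the same definition, a fintech could integrate once and reuse the work across institutions, which is the duplication-saving point made above.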

Miles ahead 

Adopting a common IT framework would allow the banking industry to launch services faster, and better meet customer demands for smarter and more transparent services. As time goes on, more complex API functionalities will be built, allowing banks to not just incorporate more exciting services into their offering (e.g. WhatsApp payment), but also establish novel ways to maximise new and previously untapped revenue streams. Naturally, modern and streamlined services can reduce operational costs by eliminating outdated back and middle-office processes.  

Looking ahead, the next phase of API development will focus on ‘microservices’ – that is, API-first banking capabilities which run independently from core banking systems. Microservices will allow banks to take a “pick-and-mix” approach to their offerings, aligning them more closely with their customer base. In time, such a model could renew the core banking system and change the banking IT function forever.

First place 

The introduction of a common IT framework will be of massive benefit to the banking industry, helping major players to address customers’ demands for modern banking solutions more effectively. As higher standards for global banking services take hold, the industry will eventually move away from competing on service offerings to competing on brand value. As we have seen in the retail industry, the winners in banking will be those that provide the right mix of innovative offerings and premium customer service.

By Hans Tesselaar, Executive Director at BIAN 


E-invoicing: How digital networks are helping to eradicate decades-old processes

Chris Rauen, Senior Manager, Solutions Marketing at SAP Ariba

If you have an electronic invoice system that just about meets the needs of the accounts team, but operates in complete isolation from the rest of the company, is that a system that provides much value?

It might do — if you’re doing business in the 1990s. Since then, a plethora of electronic invoicing systems have entered a crowded marketplace, all looking to streamline the complex way of processing invoices globally.

In today’s digital economy, new business value comes from linking invoice data to contracts, purchase orders, service entry sheets, and goods receipts for automated matching. Furthermore, automation of the invoice management process must extend beyond enterprise operations to include suppliers. Yet few platforms enable this. By treating accounts payable as an isolated department, many e-invoicing systems fall short of their potential.

So, how can linking electronic invoicing with a company’s other operational systems, and to suppliers, unlock this value? It turns out that an interconnected approach to invoice management in a digital age reduces costly errors, strengthens compliance, and facilitates collaboration both within the organisation and among trading partners.

A cloud-based network can assess trading partners against hundreds of criteria, from whether they can root out forced labour in their supply chains, to how well they document the use of natural resources, to whether they give work to minority suppliers. Of course, while software alone cannot ensure compliance with the ever-changing policies that continue to come into effect, it remains a powerful tool in the effort to achieve it. Compliance, once a tedious task, can now be managed from a dashboard.

To reduce invoice errors effectively, a digital network must rely on intelligence — not just the human kind, but through smart invoicing rules that are essential to a business network. These rules effectively validate invoices before posting for payment to streamline processing, reduce operating costs, lower overpayment and fraud risk, and maximise opportunities for early payment discounts.
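As a concrete illustration of what such rules look like, here is a minimal sketch of rule-based invoice validation. The rules and field names are invented; real networks apply hundreds of configurable checks before an invoice is posted for payment.

```python
# A minimal sketch of rule-based invoice validation (field names invented).

def validate_invoice(invoice, purchase_orders, seen_invoice_ids):
    """Return a list of rule violations; an empty list means 'clean'."""
    errors = []
    po = purchase_orders.get(invoice["po_number"])
    if po is None:
        errors.append("No matching purchase order")
    elif invoice["amount"] > po["approved_amount"]:
        errors.append("Invoice exceeds approved PO amount")  # overpayment risk
    if invoice["invoice_id"] in seen_invoice_ids:
        errors.append("Duplicate invoice number")  # fraud/overpayment risk
    if not invoice.get("tax_id"):
        errors.append("Missing supplier tax ID")  # compliance check
    return errors

pos = {"PO-1001": {"approved_amount": 5000.00}}
invoice = {"invoice_id": "INV-77", "po_number": "PO-1001",
           "amount": 5250.00, "tax_id": "GB123456789"}
print(validate_invoice(invoice, pos, seen_invoice_ids=set()))
# -> ['Invoice exceeds approved PO amount']
```

Catching an exception like this before posting is what lets the network streamline processing while still protecting early-payment discounts.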

By enabling real-time collaboration between buyers and suppliers, digital networks not only bridge the information gap that can delay invoice processing, but they also reduce the complexity often associated with compliance. That includes effectively screening suppliers and monitoring business policies automatically before a transaction takes place.

However, perhaps the greatest advantage of digital networks is collaboration. Issuing an invoice, even when accurate and on time, can be a one-way, asynchronous conversation. A buyer receives an agreed-upon product or service from a supplier, who at a later date sends out an invoice and, at an even later date, receives payment. This scenario has been the same for decades. But digital networks challenge that. The immediacy of network communications begs the question: should electronic invoicing merely replicate the age-old process that postal mail once facilitated, or should it improve upon it?

We continue to see chief procurement officers choosing the latter. Through their day-to-day experience with digital networks, they have come to view invoice processing as just one part of the wider exchange of information among trading partners. An electronic invoice reflects a snapshot of the multi-party collaboration that networks enable, and — through intelligent business rules — alerts of potential errors or exceptions relating to the transaction. As we move forward in the digital age, and buyers and suppliers extend their relationship to include product design, innovation and product delivery, they are able to expand the scope of electronic invoicing to capture up-to-the-minute progress reports on the teamwork within and across organisations.

Ultimately, your electronic invoicing system shouldn’t focus only on accounts payable; it should give open visibility into the rest of your operations and even who you do business with – so that mutual growth can be achieved and positive collaboration can flourish.

The author is Chris Rauen, Senior Manager, Solutions Marketing at SAP Ariba, the company behind the world’s largest business network, linking together buyers and suppliers from more than 3.4 million companies in 190 countries


The Death of the PIN

David Orme, SVP, IDEX

Personal identification numbers (PINs) are everywhere. These numeric versions of the password have been at the heart of data security for decades, but time moves on and it is becoming evident that the PIN is no longer fit for purpose. It is too insecure and is leaving consumers exposed to fraud. 

Why bin the PIN?

In a world that is increasingly reliant on technology to complete even the most security-sensitive tasks, PIN usage is ludicrously insecure. People do silly things with their PINs; they write them down (often on the back of the very card they are supposed to protect), share them and use predictable number combinations (such as birth or wedding dates) that can easily be discovered via social media or other means. And this is entirely understandable: PINs must be both memorable and obscure, unforgettable to the owner but difficult for others to work out. This puts PIN users — all of us, basically — between the proverbial rock and a hard place.

Previous research has shown that when people were asked about their bank card usage, more than half (53%) shared their PIN with another person, 34% of those who used a PIN for more than one application used the same PIN for all of them, and more than a third (34%) of respondents used their banking PIN for unrelated purposes, such as voicemail codes and internet passwords, as well. In the same study, not only survey respondents but also leaked and aggregated PIN data from other sources revealed that the use of dates as PINs is astonishingly common [1].

But if the PIN has had its day, what are we going to replace it with?

Biometrics

Biometrics may seem to be the obvious response to this problem: fingerprint sensors, iris recognition and voice recognition have all been rolled out in various contexts, including financial services, over the past decade or so and have worked extremely well. In fact, wherever security is absolutely crucial, you are almost certain to find a biometric sensor — passports, government ID and telephone banking are all applications in which biometric authentication has proven highly successful.

However, PINs are used to authenticate any credit or debit card transaction, and therein lies the problem. For biometric authentication to work, there has to be a correct (reference) version of the voice, iris or fingerprint stored, and this requires a sensor.

It is one thing to build a sensor into a smartphone or door lock, but quite another to attach it to a flexible plastic payment card. Add to that the fact that cards are routinely left in handbags or pockets and used day in and day out, and it becomes clear why the search for a flexible, lightweight, but resilient, fingerprint sensor that is also straightforward enough for the general public to use, has been the holy grail of payment card security for quite some time.

Another key advantage of fingerprint sensors for payment cards is that the security data is much less easy to hack, particularly from remote locations, than is the case with PINs. Not only are fingerprints very difficult to forge, once registered they are only recorded on the card and not kept in a central data repository in the way that PINs often are – making them inaccessible to anyone who is not physically present with the card. In short, they cannot be ‘hacked’.

Your newly flexible friend

Fortunately, the long-held ambition to add biometrics to cashless transactions has now been achieved, with the production and trials of an extremely thin, flexible and durable fingerprint sensor suitable for use with payment cards. The technology developed behind the sensor makes it very straightforward for the user to record their fingerprint; the reference fingerprint can easily be uploaded to the card by the user, at home, and once that is done they can use the card over existing secure payment infrastructures — including both chip-and-PIN and contactless card readers — in the usual way.

Once it is registered and in use, the resolution of the sensor and the quality of image handling is so great that it can recognise prints from wet or dry fingers and knows the difference between the fingerprint and image ‘noise’ (smears, smudging etc.) that is often found alongside fingerprints. The result is a very flexible, durable sensor that provides fast and accurate authentication.

The PIN is dead, long live the sensor

Trials of payment cards using fingerprint sensor technology are now complete or underway in multiple markets, including Bulgaria, the US, Mexico, Cyprus, Japan, the Middle East and South Africa. Financial giants including Visa and Mastercard have already expressed their commitment to biometric cards with fingerprint sensors, and some are set to begin roll-out from the latter half of 2018. Mastercard, in particular, has specified remote enrolment as a ‘must have’ on its biometric cards, not only for user convenience but also as a means to ensure that biometrics replace the PIN swiftly, easily and in large volumes [2].

And so, with the biometric card revolution now well underway, it is time to say farewell to the PIN (if customers can still remember it, that is) and look forward to an upsurge in biometric payment card adoption in the very near future. Our financial futures, it seems, are at our fingertips.

By David Orme, SVP, IDEX Biometrics

References

[1] Bonneau J, Preibusch S and Anderson R, “A birthday present every eleven wallets? The security of customer-chosen banking PINs”: https://www.cl.cam.ac.uk/~rja14/Papers/BPA12-FC-banking_pin_security.pdf

[2] Mastercard announces remote enrolment on biometric credit cards: https://mobileidworld.com/mastercard-remote-enrollment-biometric-credit-cards-905021/


BofE rate rise: the unintended trading cost consequences for banks

Kerril Burke, CEO of Meritsoft

Does anyone long for a return to more benign economic times? A time when a rise in the base rate simply led to immediate benefits for savers. Well, be prepared for a continued long wait, as last week’s decision from the Bank of England (BofE) signals anything but a move to more conventional times.

In fact, this rise, albeit small, has much wider knock-on effects than simply “what does this mean for my mortgage repayments”? Similarly, it obviously increases the costs for anyone trading the capital markets in terms of funding. Even with interest rates at historically low levels, some of the biggest players have been losing double digit millions in unrecovered failed funding costs. And with more hikes down the road, there are further implications of the BofE rate increase for the cost of trading.

As of last Thursday, the cost of fail funding for trades in Sterling shot up 50%. Any trader looking to borrow, say, £1 million to finance a trade now faces an extra 0.25% per annum in funding costs. One of the main strategies traders use to minimise funding is buying and selling for the same contractual settlement date, paying for purchases from the proceeds of sales. Take the example of a trader selling Sainsbury’s stock in order to fund a purchase of Tesco shares, both for the same agreed settlement date. The trader expects the cash from the Sainsbury’s trade in order to settle the Tesco transaction. There is just one small issue – he hasn’t received the money for his stake in Sainsbury’s. In this, let’s face it, not untypical scenario, the only way to pay for the Tesco shares is to borrow the money. The trader in question now has to take on an additional funding cost to borrow the funds to settle the Tesco trade. If the fail in the Sainsbury’s shares was down to the counterparty, it hardly seems fair that the trader is forced to bear this additional cost, does it?
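As a rough worked example of the arithmetic, using the figures above (simple interest on an ACT/365 basis; actual desk-level funding calculations vary):

```python
# Extra funding cost caused by the 0.25 percentage point rate rise on a
# failed trade, using simple interest on an ACT/365 basis (illustrative).

def fail_funding_cost(notional, annual_rate, days_outstanding):
    return notional * annual_rate * days_outstanding / 365

# £1m borrowed to cover a receipt that fails for five days, costed at the
# extra 0.25% per annum introduced by the hike:
extra = fail_funding_cost(1_000_000, 0.0025, days_outstanding=5)
print(f"Extra cost of a five-day fail: £{extra:.2f}")  # ~£34.25

# At full funding rates and the larger notionals typical of major trading
# firms, the same arithmetic quickly runs into thousands per trade.
```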

Market sentiment

But hey, perhaps it doesn’t cost much? The cost will obviously vary with the amount of cash open and the length of time it is outstanding, but it could run into USD thousands per trade! And the major trading firms can have thousands of securities, FX, equity and commodity derivatives fails every day. This may have been hidden because rates have been, and largely still are, at record lows. But the trend and market sentiment is now unmistakably upwards. However, this is only part of the problem.

There are costs and capital implications for market participants in the wide range of receivables on their balance sheets. These balances, at least the ones in Sterling, are now half a percent more expensive to fund. So the cost of failing to settle these transactions is now far higher than it would have been before the hike. A bank is now at a distinct disadvantage, particularly if it does not have a way to identify, optimise and recover where it is incurring funding and capital costs through no fault of its own. Essentially, by having receivable items open while waiting for money to come in, it will be borrowing cash to cover itself. If a trade fails to settle for, say, five days, then that is a whole week of extra funding costs that a bank needs to cough up. And not being able to track additional funding costs due to late settlements is not the only issue. Many banks are still not even identifying the direct cost impact of a trade actually failing. If a bank can’t work out the cost implications of not receiving funds when a trade fails, then how on earth can it identify whether or not it can claim money back from its counterparties?

Trying to work out the many effects of the BofE’s latest monetary policy decision is difficult, but, like those with a variable mortgage, trading desks are impacted. Late settlement means extra funding, and higher rates mean that extra funding costs more. Preparing now to handle the trading cost impact of this small rise and the upward trend is exactly what’s needed to ensure banks are ahead of the curve whenever the BofE or other central banks decide to hike rates again in the future.

By Kerril Burke, CEO of Meritsoft


The Need for Effective Third-Party Risk Management in Financial Services

In the last few years, we have seen the frequency and severity of third-party cyberattacks against global financial institutions continue to increase. One of the biggest reported attacks against financial organisations occurred in early 2016 when $81 million was taken from accounts at Bangladesh Bank. Unknown hackers used SWIFT credentials of Bangladesh Central Bank employees to send more than three dozen fraudulent money transfer requests to the Federal Reserve Bank of New York asking the bank to transfer millions of Bangladesh Bank’s funds to bank accounts in the Philippines, Sri Lanka and other parts of Asia. Bangladesh Bank managed to halt $850 million in other transactions, and a typo made by the hackers raised suspicions that prevented them from stealing the full $1 billion they were after.

Landscape

The Financial Conduct Authority (FCA) reported 69 attacks in 2017 compared to 38 reported in 2016, a rise of more than 80% in the last year. We saw two main trends last year. First, there was a continuation of cyber attacks targeting systems running SWIFT — a fundamental part of the world’s financial ecosystem. Because SWIFT software is unified and used by almost all the major players in the financial market, attackers were able to use malware to manipulate applications responsible for cross-border transactions, making it possible to withdraw money from any financial organisation in the world. Victims of these attacks included several banks in more than 10 countries around the world. Second, we saw the range of financial organisations that cybercriminals have been trying to penetrate expand significantly. Different cybercriminal groups attacked bank infrastructure, e-money systems, cryptocurrency exchanges and capital management funds. Their main goal was to withdraw very large sums of money.

With the evolving risk landscape and the challenges of new potential risks including third party risks, companies within financial services need a set of management procedures and a framework for identifying, assessing and mitigating the risks these challenges present. Effective risk management offers sound judgement in making decisions about what is the appropriate resource allocation to minimise and mitigate risk exposure.

Risk management lifecycle

The basic principle of a risk management lifecycle is to mitigate risk, transfer risk and accept/monitor risk. This involves identification, assessment, treatment, monitoring and reporting.

In order to mitigate risk, an organisation must measure cyber risk performance and incentivise critical third-party vendors to address security issues through vendor collaboration.

In terms of identification, you can’t manage your risks if you don’t know what they are, or if they exist. The first step is to uncover the risks and define them in a detailed, structured format. You need to identify the potential events that would most influence your ability to achieve your objectives, then define them and assign ownership.

Once the risks are identified they need to be examined in terms of likelihood and impact, also known as assessment. It is important to assess the probability of a risk and its consequences. This will help identify which risks are priorities and require the most attention. You need to have some way of comparing risks relative to each other and deciding which are acceptable and which require further management. In this way, you establish your organisation’s risk appetite.
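A minimal sketch of that prioritisation step might look like the following, where each risk is scored as likelihood times impact and compared against a defined appetite threshold (the scales, example risks and threshold are illustrative):

```python
# A simple likelihood x impact scoring sketch for prioritising risks
# against a defined risk appetite (all figures illustrative).

risks = [
    {"name": "Third-party SWIFT access compromise", "likelihood": 2, "impact": 5},
    {"name": "Unpatched vendor portal",             "likelihood": 4, "impact": 3},
    {"name": "Lost encrypted laptop",               "likelihood": 3, "impact": 1},
]

RISK_APPETITE = 6  # scores above this require a treatment plan

for risk in sorted(risks, key=lambda r: r["likelihood"] * r["impact"], reverse=True):
    score = risk["likelihood"] * risk["impact"]
    action = "treat" if score > RISK_APPETITE else "accept/monitor"
    print(f"{risk['name']}: score {score} -> {action}")
```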

To transfer risk, an organisation is advised to influence vendors to purchase cyber insurance to transfer risk in the event of a cyber event.

Once the risk has been assessed, an approach for the treatment of each risk must be defined. After assessment, some risks may require no action beyond continuous monitoring, but those that are seen as unacceptable will require an action or mitigation plan to prevent, reduce, or transfer that risk.

To accept and monitor risk, the organisation must understand potential security gaps and may need to accept certain risks due to business drivers or resource scarcity.

Once the risk is identified, assessed and a treatment process defined, it must be continuously monitored. Risk is evolutionary and can always change. The review process is essential for proactive risk management.

Reporting at each stage is a core part of driving decision-making in effective risk management. Therefore, the reporting framework should be defined at an early point in the risk management process, by focusing on report content, format and frequency of production.

Managing with risk transfer

Risk transfer is a strategy that enterprises are considering more and more. It mitigates potential risks and complies with cybersecurity standards. As cybercrime rises, an insurer’s view of cybersecurity has changed from being a pure IT risk to one that requires board-level attention. Insurance is now viewed as fundamental in offsetting the effects of a cyber attack on a financial institution. However, insurers will want to know that appropriate and audited measures are in place to prevent an attack in the first place and respond correctly when cybersecurity does fail. An organisation’s risk management responsibility now extends down the supply chain and insurers will want to know the organisation’s strategies to monitor and mitigate third-party vendor risk.

Simplifying risk management and the transfer of risk can also be accomplished by measuring your organisation’s security rating. This is a similar approach to credit ratings for calculating risk. Ratings provide insight into the security posture of third parties as well as your own organisation. The measurement of ratings offers cost saving, transparency, validation and governance to organisations willing to undertake this model.
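To illustrate the idea only (rating methodologies such as BitSight’s are proprietary; the categories, weights and scale below are invented), a rating can be thought of as a weighted roll-up of observed security signals onto a credit-style scale:

```python
# Illustrative only: real security-rating methodologies are proprietary.
# This sketch just shows the idea of rolling observed signals into a
# single credit-style rating (weights, categories and scale invented).

WEIGHTS = {"compromised_systems": 0.5, "diligence": 0.3, "user_behaviour": 0.2}

def security_rating(scores):
    """Map weighted category scores (0-100) onto a 250-900 rating scale."""
    weighted = sum(scores[c] * w for c, w in WEIGHTS.items())
    return round(250 + weighted / 100 * (900 - 250))

vendor = {"compromised_systems": 70, "diligence": 85, "user_behaviour": 60}
print(security_rating(vendor))  # -> 721
```

A single comparable number like this is what makes the data-driven vendor conversations described above possible.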

The benefits of security ratings will be as critical as credit ratings and other factors considered in business partnership decisions in the very near future. The ratings model within risk management can help organisations collaborate and have productive data-driven conversations with regards to risk and security, where they may not have been able to previously.

Long-term potential

This year we will see a continuation of third-party cyberattacks targeting systems running SWIFT, allowing attackers to use malware in financial institutions to manipulate applications responsible for cross-border transactions across the world. Banks generally have more robust cyber defences than other sectors, because of the sensitive nature of their industry and to meet regulatory requirements. However, once breached, financial services organisations’ greatest fear is copycat attacks. This is where an effective risk management strategy can enable better cost management and risk visibility related to business operational activities. This leads to better management of marketplace, competitive and economic conditions, and increases leverage and consolidation of different risk management functions.

By Tom Turner, CEO, BitSight


Four Reasons to Use Security Ratings Before Your Next Acquisition

Tom Turner, CEO, BitSight

For years, cybersecurity was considered a “check-the-box” discussion during the merger and acquisition (M&A) process. It was almost always examined to ensure there weren’t any glaring issues or major red flags — but due to limited time and resources, or the difficulty of separating qualitative responses given during M&A from real performance, there wasn’t a great deal of importance placed on it. Even today, very few transactions are blocked on account of cybersecurity practices; however, each M&A does require a financial business case, which may be as simple as assessing integration costs.

You are probably aware of the security breach at luxury retailers Saks Fifth Avenue and Lord & Taylor that compromised payment card information for over 5 million customers. As a result, Hudson’s Bay Company (HBC), which acquired Saks and brought the retail chain to Canada five years ago, suffered a 6.2% drop in shares the following day. Although HBC was able to recover quickly, history has shown that a lack of due diligence on cybersecurity during or after the acquisition process can be devastating to the acquiring organisation.

The reduction in the price of Yahoo following its acquisition by Verizon is a clear demonstration of the business impact. Following two major Yahoo data breaches, Verizon announced in February 2017 that it had reached new acquisition terms. After the breaches slowed the progress of the acquisition, Verizon lowered its purchase price for Yahoo by $350 million, down to $4.48 billion.

Until recently, cybersecurity due diligence consisted of a set of questions that the acquiring firm presented to the target firm, perhaps alongside an on-site visit or a phone call. Today, security is a boardroom issue, and the implications associated with it can seriously diminish the value of a future organisation, especially with regard to sensitive data and intellectual property. These have a direct impact on your ability to do business and, as a result, on the valuation of the deal (Yahoo lost $350 million in purchase price value after disclosure).

Typically, assessments carried out to measure cyber risk have been point-in-time exercises, such as audits, questionnaires, penetration tests and so on. However, these only provide a snapshot of the true security posture. Businesses that rely on this type of reporting, especially during the M&A process, should consider moving towards continuous monitoring of the business they intend to acquire, and of its third-party ecosystem, in order to mitigate any risk that could flow into their organisation upon acquisition.

Luckily, there are security rating tools available that can help you understand the true cybersecurity posture of your acquisition. Security ratings are much like credit ratings in that they measure an organisation’s security posture. These are objective tools that deliver a standardised method of reporting risk to the board in a meaningful way.

Below is an information security due-diligence checklist, highlighting the four reasons you should consider using security ratings before, during, and after any merger or acquisition.

  1. It saves you money in the immediate future.

You likely remember the newsworthy fiasco between Canadian-based TIO Networks and PayPal: the payment processing company was acquired by PayPal in July 2017 for $238 million. Just a few months following the acquisition, TIO Networks revealed that as many as 1.6 million of its customers may have had personal information stolen in a data breach.

Companies that conduct thorough due diligence of the security posture of acquisition targets using security ratings review historical security data and can use that information to better structure M&A deals. If their acquisition target has a long or constant history of security issues they may be able to negotiate a lower sale price to counteract potential cyber risks. More importantly, acquiring companies may also be able to help targets improve their security posture, thereby reducing the level of risk incurred as a result of the transaction.

  2. It saves you money in the long term.

While some companies have been breached during a merger or acquisition transaction, others have been breached well after the deal has gone through. A prime example is TripAdvisor’s 2014 purchase of Viator, a tour-booking company. Just a few weeks after the completed transaction, Viator’s payment card service provider announced that unauthorised charges occurred on many of its customers’ credit cards. The breach affected 1.4 million users and led to a 4% drop in TripAdvisor’s stock price.

Security ratings can help. Security ratings are correlated to the likelihood of a breach, so if the rating of an acquisition target indicates they are at risk for a future cyberattack, that risk is inherited by the acquiring company as part of the deal.

  3. It aids collaboration between the acquiring company and their target.

Since acquiring companies inherit the digital footprint of the organisations they buy, security and risk departments at both organisations need a simple and effective way to collaborate and plan appropriate integration investment. Here is how BitSight Security Ratings can help with this process:

  • Acquiring organisations can invite any target company to take a look at their own digital infrastructure and security posture free of charge.
  • Target companies can then use the platform to review their own digital infrastructure, including any owned IP addresses and domains. This is a very important step as many companies often own IP space they may not have accounted for. The acquiring organisation needs to know precisely what is being consolidated, because once the deal is finalised, the acquiring company has a much larger attack surface—so they must be aware if there are any infections or issues so they can monitor adequately going forward.
  4. It gives you a competitive business advantage.

Today, cybersecurity is a business differentiator, and organisations who have a good security rating may use it as a selling point. For example, a highly-rated law firm would be considered more trustworthy. The same idea can be applied to acquisitions. Acquiring a company with a good security posture could be a strategic move, as it could either reinforce or enhance your company’s own security posture and strategy.

In a nutshell, using security ratings is a critical step to continuously monitor your acquisition before, during, and after an M&A deal. Without this real-time look at your target’s security posture and performance, you could end up acquiring vulnerabilities that could cause major damage if exploited. Indeed, analyst firm Gartner issued an M&A report earlier this year stating how important cybersecurity is in the due diligence process. Not only will this save your organisation money immediately, but it also prevents the future risk of financial losses, aids your collaboration with the target company and improves your business prospects.

By Tom Turner, CEO, BitSight


How can banks compete with the tech disruptors?

Digital disruption in the banking industry is something that’s gradually been gathering pace in recent years, but it’s about to get much more prevalent. Enter the GAFAMs. Google, Apple, Facebook, Amazon and Microsoft – the big five global tech companies that have made their presence known by expanding their customer offering and disrupting multiple industries in recent years. In the world of finance, Amazon has just made headlines following the announcement it’s investing in a digital insurer, while Facebook has secured an electronic money license in Ireland.

Banks beware. PSD2 has allowed GAFAMs to access customer data, with customers’ permission, and use it to provide innovative solutions to their needs and the issues they face when it comes to banking. The GAFAMs have enviable digital prowess and knowledge, not to mention near-limitless funds. Combine this with data-rich customer insight and they could easily change the face of banking forever. So how will this affect the industry as it stands?

Could challenger banks be the underdog?

Challenger banks have been quietly but effectively shaking things up in the industry, in particular looking at ways customers interact with their bank and providing a more seamless, convenient alternative. The initial Open Banking fears that challenger banks would immediately start stealing vast amounts of market share from high-street banks have been quashed for now, but they have certainly raised standards across the board when it comes to providing a slick customer experience.

So much so that Paul Riseborough, CCO of Metro Bank, has stated that it will take a while before Open Banking starts to get exciting, with real innovation approaching in “about three to five years’ time”. In contrast, however, PwC research last year revealed that 88 per cent of the financial industry is worried it will lose revenue to disruptive innovators. While there is uncertainty regarding challenger banks, it’s more likely that GAFAMs will have more power and influence when it comes to innovation and changing how customers engage with the banking industry.

Finance and tech crossing over

The lines between financial organisations and technology platforms are becoming increasingly blurred, as China’s WeChat app has proven. Launched in 2011 with an initial concept similar to that of WhatsApp, it has since evolved into a much broader service that allows its one billion users around the world to do everything from ordering a taxi to arranging a doctor’s appointment, as well as making money transfers and other banking transactions.

Given that the GAFAMs are all heavily tech-led, if they were to establish a presence in the financial industry and introduce a similar all-encompassing product, retail banks face a further risk of falling behind in customer engagement and losing market share.

Investing wisely

Amidst the uncertainty and potential threats brought about by GAFAMs, there is opportunity for banks to improve their innovation strategies using information they already have on their customers. McKinsey recently said in a report that banks may be at an advantage compared to the industry’s disruptors, as “customers would not find it attractive to provide third parties access to their data or accounts.” If banks can harness their data in the correct way before the tech goliaths come into view, they could strengthen their customer retention.

RBS is staying ahead of the curve, announcing earlier this year that it plans to launch a digital-only bank to compete with existing challenger banks such as Monzo and Starling. On a more international scale, a survey by PwC shows that 84 per cent of Indonesian banks are likely to invest in technology transformation over the next 18 months.

Partnerships and collaboration are also key and fast-becoming a growing trend. Software developers are being encouraged to use existing APIs to build platforms that allow financial organisations to improve both the internal and customer-facing elements of their businesses. Avaloq is a good example; its developer portal aimed at freelancers, fintechs and large banks currently has more than 1,000 developers collaborating and sharing insight with the global financial sector to drive innovation. For retail banks, it’s certainly worth taking advantage of the tech and insight on offer from external parties.

Going above and beyond

The disruptors and challengers which have already made a mark on the financial services industry have done so by going above and beyond the perceived limits of retail banking. It’s something that retail banks need to take a step back and look at to learn from.

Many are already making strides, such as a group of big banks including Bank of America, Citi and Wells Fargo reacting to newcomer Venmo marking its territory on instant transfers. They’ve partnered with P2P payments app Zelle to integrate directly with their own apps.

Instant transfers follow a wider trend of convenience that consumers have come to expect from all industries. Banks can go even further by looking at non-banking services which ensure they are making a more positive impact on their customers’ lives. Whether it be the introduction of lifestyle benefits such as high-street discounts, or helping customers to simplify their monthly bills, offering add-ons that increase convenience or reward the customer is likely to make them want to stay. In fact, our ‘Connected Customer’ report shows businesses that offer three or more additional products have considerably higher customer engagement scores, resulting in customers staying longer and spending more.

Planning ahead

With PSD2 and Open Banking making an impact, it’s all change in the banking industry and as GAFAMs enter the market, banks and fintechs need to plan ahead to maintain their presence and stay relevant to customers.

Innovation and collaboration are the two key ingredients to improve their offering and position. The introduction of GAFAMs and other new players is a healthy addition to the financial sector, as it drives positive change and competition, while customers will reap the benefits.

By Karen Wheeler, Vice President and Country Manager UK, Affinion


Five keys to achieving a hyperscale data centre without a hyperscale budget

Kevin Deierling, vice president marketing, Mellanox Technologies

Don’t be daunted by the overwhelming technological resources of today’s market leaders, says Kevin Deierling, vice president marketing, Mellanox Technologies. Times are changing and that exclusive hyperscale architecture is now within reach of any large enterprise.

“How to tame the tech titans”, asked a January 18th Economist headline on competition in the digital age. A more recent article (“American tech giants are making life tough for startups”) outlines the problems of startups in the tech giants’ “kill-zone” – where investors will shy away from any company that might appear to be entering the big boys’ territory.

You do not have to be either a startup or a direct competitor to the likes of the Super 7 – Amazon, Facebook, Google, Microsoft, Baidu, Alibaba, and Tencent – to feel daunted by their sheer market presence and technological dominance. Then there are the second-tier “unicorns” like LinkedIn, Twitter and Instagram, which share the giants’ secret of building massive network infrastructures to achieve unprecedented power to mine data and automate business processes for super-efficiency. How can the average enterprise survive in a commercial environment that is dominated by such giants?

There are two keys to their market dominance. The first is to have exceptional reach – not millions of customers, but hundreds of millions or even billions. But the real advantage is to have “hyperscale” data centres specifically designed to accommodate and work with such a massive customer base.

Hyperscale

“Hyperscale” describes a data centre architecture that is designed to scale quickly and seamlessly to a massive and expanding population of users and customers, while maintaining reliability, performance and flexibility for ongoing development. Until recently there was nothing available that could deliver such a service, so those giants went ahead to design and build their own hardware and software so they could control every detail and achieve unmatched efficiency. This required teams of computer scientists and specialist skills to manipulate every configurable element – something that could not be achieved using off-the-shelf solutions.

By the end of last year there were nearly 400 such hyperscale data centres in the world, nearly half of them in the USA. There was also a growing number of specialist providers of smart interconnect solutions specifically designed for exceptional performance and minimal latency in order to serve this market.

What has changed is that those same providers now have their eyes on a very exciting opportunity: to apply their experience and advanced technology to simplify the deployment and lower the cost of hyperscaling to bring it within reach of medium to large enterprises. This is wonderful news for thousands of enterprises that will benefit enormously from hyperscaling. For the providers, it also opens up a far larger market.

There are five key factors that must be considered to take advantage of this opportunity.

Key 1 – High Performance

The faster the data travels through a complex system, the more responsive the system and the quicker the benefits. The leading solution providers have been supplying an end-to-end portfolio of 25G, 50G, and 100G adapters, cables, and switches to these hyperscale data centres, and the resulting intelligence, efficiency and high performance is now well proven. Your own business might not yet need 100G performance, but it no longer makes sense to buy 10G now that the cost of 25G is on a par with it.

Key 2 – Open Networking

In a traditional static network environment, the one-stop-shop approach is efficient and reassuring. But today’s business environment demands agility and an infrastructure that can be extended and optimised to meet less predictable changes. Sometimes that means choosing best-of-breed solutions, sometimes the most cost-efficient ones. An open and fully disaggregated networking platform is now vital for scalability and flexibility, as well as for achieving operational efficiency.

Key 3 – Converged Networks on an Ethernet Storage Fabric

A fully converged network will support compute, communications, and storage on a single integrated fabric. To grow a traditional network it was necessary to scale it “up” through the disruptive process of installing further resources into the existing fabric. This is like growing a business by recruiting, training and accommodating extra staff, whereas in today’s business environment it is often more efficient to outsource skills to meet sudden demand. Hyperscale networks are designed to scale “out” with disaggregated hardware, so you can add units of CPU, memory and storage independently – and an integrated, scalable, and high-performance network is the key to achieving this.

Key 4 – Software Defined Everything and Virtual Network Acceleration

The hardware required for a converged network (Key 3) is fully integrated with software to orchestrate a virtual environment optimised for the needs of each specific application. The software controller enables the system to be managed from a single screen, and software automation removes most or all of the burden of manual commissioning and ongoing management.

Software defined networking, storage, and virtualization – or software defined everything (SDX) – transforms what would have been an impossibly complex aggregate into an intelligent and responsive whole.

Key 5 – Cloud Software Integration

It goes without saying that you will want your new hyperscale network to be fully integrated with popular cloud platforms such as OpenStack, vSphere, and Azure Stack. It should also support advanced software defined storage solutions such as Ceph, Gluster, Storage Spaces Direct, and VSAN.

One integrated whole

These five key factors show that we have come a long way from a bank’s traditional static datacenter – and this is the way to go. The “Super 7” may be way ahead of anything most enterprises can even dream about, but many more companies will be facing similar pressures for flexible and efficient scalability. A retail or food chain going international could be taking on millions of new customers. There are numerous IoT initiatives that will manipulate terabytes of data flooding into their systems and a company needs massive in-house capability to run and evolve new algorithms. The result could be disastrous unless the systems are designed to scale to meet the needs of the business, while maintaining performance and reliability.

A recent example was provided by Vault Systems, a company that delivers ASD-certified Government Cloud to Australian federal, state and local government agencies and their partners – managing sensitive data at the highest levels of security. The company wanted an open, flexible 100GbE network that would at the same time maintain its high level of security. It chose a supplier of hyperscale network solutions to the tech giants, but one that also serves high-performance computing, enterprise data centres, cloud, storage and financial services customers that do not have a hyperscale budget or resources. In the words of Vault Systems’ CEO and founder, the resulting system has “contributed to the high performance of our cloud and also given us the confidence and peace of mind that our network is the fastest and most resilient available in market today. We couldn’t be happier with the results we have seen so far.”

Conclusion

All the five keys listed above are bread and butter to the companies that supply those “tech titans”. But don’t be daunted by the thought of asking advice from a company whose customers include giants like Netflix. As a more normal size enterprise you represent their next, even bigger, market opportunity. They will be keen to prove that they can build you hyperscale networking – without a hyperscale budget.
