
More than a third (38%) of IT decision-makers across the UK financial sector believe it has become more difficult over the past five years to find staff with the right skills and experience. Over a third (34%) believe the problem is going to worsen in the coming five years. This is according to a survey across a range of financial and banking sector organisations, including retail and investment banking, asset management, hedge funds and clearing houses.

The survey, commissioned by software vendor InterSystems, found a shortage across a variety of roles. Almost a fifth (18%) of respondents cited a lack of data scientists, followed by 17% who revealed a shortage of security consultants/specialists, while 16% referenced application developers and 12% mentioned financial analysts.

“IT skills shortages are clearly a major concern for banking and financial services firms across the UK and this is only likely to escalate in the future,” says Graeme Dillane, financial services manager, InterSystems. “Skills shortages are a barrier to innovation in the banking and financial services sector. And as firms upgrade their legacy systems and look to innovate to meet the latest wave of regulations, that represents an increasingly serious concern.”

When respondents were asked to name the key qualities that technology can bring to help mitigate the negative effect of skills shortages within businesses today, 44% said ‘simplicity of use’, 42% cited ‘ease of implementation’ and 36% ‘high performance’.

The study also found that skills shortages are one of the biggest barriers to innovation, cited by 35% of respondents, behind only cost (41%), while compliance was referenced by 31%.

“These findings match with our experience in talking to customers and prospects across the sector,” added Dillane. “IT employees with the skills that banks and financial services companies are looking for are in short supply. Knowledge transfer is therefore increasingly key, alongside solutions which combine ease of development, simplicity of use, high performance and intuitive workflow transfer.”

(Source: InterSystems)

You’ve seen a lot of content, articles, warnings and advice on cybersecurity, with hundreds of firms trying to sell you next-level cyber protection. So, before you do anything else, you need to know what exactly it is you’re protecting yourself against. Below Suid Adeyanju, Managing Director of RiverSafe, lists the key threats you need to be aware of.

In early July, IBM Security and the Ponemon Institute released a new report titled ‘Cost of a Data Breach Study’. The study reported that the global average cost of a data breach and the average cost of lost or stolen information both increased: the former is up 6.4% to £2.94 million, while the latter increased by 4.8% year over year to $112.57. This shows that cyberattacks on enterprises continue to rise. In particular, over the last two years there has been a continual stream of concerning data security breaches.

One of the ways that organisations can defend against attacks is to ensure staff understand and are educated about the cyber threat landscape.

Understanding Threats to your Business

Getting the right technology, services, and security professionals is only a part of tackling the cyber security problem. It is also important that companies get a clear understanding of the cyber threat landscape. This means knowing where these types of attacks can come from and in turn, who is leading the attack (whether it be an individual or group). Often, knowing the answer to these types of questions leads to an understanding of the motive and makes countering the attacks easier. So, in this article, I wanted to highlight the areas of the cyber threat landscape that enterprises should be aware of.

  1. Nation State: This kind of hacking is often government versus government. It is often functionally indistinguishable from cyber terrorism, but the defining trait is that the attack is officially sanctioned by a country’s government. These attacks can involve not only hacking but the use of more traditional spying as well.
  2. Insider Threat: This is the area where many businesses least expect a threat to come from: inside the business itself. A report from A10 Networks revealed that employee negligence is a major cause of cyber attacks, with employees unknowingly allowing hackers into the business through unauthorised apps. And, on the rare occasion, a disgruntled employee could try to bring the business down in revenge, so it is always important to investigate who has access, because there is every chance that the threat could come from the inside.
  3. Individual Attackers: When you think of the stereotypical hacker, most thoughts turn to a hooded youth sitting alone in their room. This is the individual attacker, whose motive is often simply curiosity and learning. They want to see if they can hack a system rather than attempt anything malicious. This is the most neutral cyber threat.
  4. Industrial Espionage: Sometimes carried out by an unrelated group and other times by a rival business, attacks involving industrial espionage aim to create problems for your business. The most common reason for industrial espionage is to discover the secrets of a rival business, often through spying. However, it could also involve destroying valuable data or, with some IoT devices, physically breaking the technology; anything that can give one business an edge over a competitor.
  5. Cybercriminals: Much like the individual attackers, cybercriminals are an all-encompassing cyber threat. Almost all hackers are criminals in some way, and the motives can vary from demanding money, to setting up crypto-mining, to damaging company property. Whatever they do, it won’t be a good thing.
  6. Phishing and Ransomware: These are some of the most common types of attack you’ll find cybercriminals performing. These attacks are motivated purely by money and exist either to scam a business out of funds or to hold valuable company data to ransom. Sometimes this can be a distraction to hide something more nefarious, so organisations need to make sure they are prepared for any escalation.
  7. Ethical Hackers: An ethical hacker is the opposite of a cybercriminal, as the term ‘ethical’ implies. These engagements are often undertaken for the sake of a company, and have frequently been paid for by the business itself to see whether its own servers can be hacked. These hackers test the security resilience of a business and locate areas that are vulnerable, before an ‘unethical’ hacker comes along.
  8. Hacktivists: A hacktivist is a sub-set of cybercriminal whose motives are more ideological. As the name suggests, a hacktivist is essentially a cyber activist: they use hacking purely to push an agenda, whether political, religious or otherwise, rather than for financial gain. A hacktivist attack can range from something as simple as changing the text on a company website to a more nefarious act that interferes with the day-to-day running of the business.
  9. Cyber Terrorism: While hacktivists don’t always cause damage, a cyber-terrorist will. Just like real terrorism, cyber terrorism exists to bring terror to your business, country and customers. Examples include the attacks on the NHS last year, which aimed to bring down systems in hospitals and cause chaos and fear.

Understanding all the different types of attack in the cyber threat landscape can help you build your cyber defence: identifying a motive makes it possible to trace what kind of opponent your business is facing, and whether the attack is aimed primarily at an individual, at an organisation, or is a national-level threat where the solution is to work with other companies to stop the attack as a team.

If the recent software failures in the financial industry are anything to go by, then disruption to payment systems is becoming the ‘new normal’. This week David O Riordan, Principal Technical Engineer, SQS Group, delves into the benefits of blockchain, particularly in the aftermath of a software disaster.

The VISA card payment outages, Faster Payments issues and disruption to card payments at BP petrol garages, all within the first half of 2018, have caused many to question the regulatory environment around financial institutions. And with the Bank of England and FCA requesting banks to report on how prepared they are for IT meltdowns, stating that any outages should be limited to just 48 hours, the finance industry is under real scrutiny when it comes to technology.

Corporations are now expected to have a Disaster Recovery (DR) and business continuity plan in place to avoid falling victim to software failures. Nevertheless, what business leaders need to understand is that no IT solution is completely foolproof and any system will likely go down from time to time; the key is knowing how a potential internal failure can be mitigated without affecting overall performance. This can only be achieved with a well-practised DR plan that is second nature to the responsible parties and can be executed in the desired timeline. However, this can be both costly and time-consuming to set up. How can such incidents be minimised, or potentially eliminated, in the future? Blockchain is an alternative technology solution business leaders should consider, as it has fraud protection built in and is highly resistant to all types of attacks and failures.

Blockchain for Business Continuity

Built-in Fraud Protection:

Blockchain is a de-centralised platform, where every node in the network works in concert to administer the network and no single node can be compromised to bring down the entire system. It is a form of distributed ledger where each participant maintains, calculates and updates new entries into the database. All nodes work together to ensure they are all coming to the same conclusions, providing in-built security for the network.

Most centralised databases keep only the information that is up to date at a particular moment, whereas blockchain databases keep the information that is relevant now as well as all the historical information that came before. It is the expense required to compromise or change these databases that has led people to call a blockchain database indisputable. It is also where one can start to see the evolution of the database into a system of record. In the case of VISA and other payment systems, this can be used as an audit trail to track the state of transactions at all stages.
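
To illustrate the audit-trail point, here is a minimal sketch (illustrative only, not any vendor's actual implementation) of how hash-linking each entry to its predecessor makes historical records tamper-evident: change an earlier entry and every later link breaks. The transaction payloads are invented examples.

```python
# Minimal sketch of a hash-linked audit trail; payloads are invented examples.
import hashlib
import json
import time

def make_block(payload: dict, prev_hash: str) -> dict:
    """Create a block whose hash covers its payload and its predecessor's hash."""
    block = {"timestamp": time.time(), "payload": payload, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def verify_chain(chain: list) -> bool:
    """Recompute every hash; any edit to an earlier block breaks the later links."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Build a tiny audit trail of hypothetical payment state changes.
chain = [make_block({"txn": "T1", "state": "authorised"}, prev_hash="0" * 64)]
chain.append(make_block({"txn": "T1", "state": "settled"}, prev_hash=chain[-1]["hash"]))
print(verify_chain(chain))               # True: history is intact
chain[0]["payload"]["state"] = "voided"  # tamper with an earlier record
print(verify_chain(chain))               # False: the change is detectable
```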

Ingrained Resiliency:

Additionally, blockchain removes the need for a centralised infrastructure as the distributed ledger automatically synchronises and runs across all nodes in the network by design. As a result, Disaster Recovery (DR) is essentially built in, eliminating the need for a synchronised DR plan. The inability to alter entries in the ledger also contributes to the overall security of the blockchain, improving resilience against malicious attacks.

This is unlike traditional large centralised systems, where resilience is provided by failover within a cluster, as well as site-to-site Disaster Recovery at a higher level. Disaster Recovery plans and procedures can be costly due to the large amount of hardware and data replication required. Furthermore, many businesses do not rehearse them, so when disaster strikes, corporations are not prepared to deal with the aftermath, as seen with VISA's outage problems.

The Downside of Decentralised Blockchain Technology

Performance:

While blockchains can be used as a system of record, and are ideal as transaction platforms, they are slow compared to traditional database systems. The distributed networks employed in blockchain technology mean that the nodes do not share and compound processing power like traditional centralised systems. Instead, they each independently service the network, then compare the results of their work with the rest of the network until there is agreement that an event has happened.

Confidentiality:

In its default form, a blockchain is an open database: anyone can write a new block into the chain and anyone can read it. Private blockchains, hybrid limited-access blockchains or ‘consortium’ blockchains can all be created, so that only those with the appropriate access can write to or read them. If confidentiality is the only goal, then blockchain databases offer no benefit over traditional centralised databases. Securing information on a blockchain network requires a lot of cryptography, and a related computational burden for all the nodes in the network. A traditional database avoids such overhead and can be implemented ‘offline’ to make it even more secure.

Blockchain for Disaster-Relief?

As an emerging digital disruptor technology, no one can say for sure where blockchain technology will ultimately lead. While many have disregarded this technology, the potential is certainly there to attempt to solve some of the most common problems in the digital space.

However, with customer demands within financial services on the increase, and with the combination of a widespread network and substantial cost pressures, IT outages will continue to impact consumer experience. Businesses can minimise potential damage by managing communication effectively and dealing with the technical nature of the outage quickly. With a comprehensive and well-rehearsed disaster recovery plan, a business can not only mitigate outages but maintain standards of service too, encouraging customer retention, loyalty and growth. Blockchain should therefore be considered, as it has built-in checks and balances to ensure a set of colluding computers can’t ‘game’ the system, and the network is virtually impossible to crack. As blockchain processing efficiency improves, it will increasingly become a more viable proposition, potentially making traditional disaster recovery unnecessary in the future.

With the future looking more cashless by the day, the future of cybersecurity looks even more risk heavy. Below Nick Hammond, Lead Advisor for Financial Services at World Wide Technology, discusses with Finance Monthly how banks/financial services firms can ensure a high level of cyber security as we move towards a cashless society.

Debit card payments have overtaken cash use for the first time in the UK. A total of 13.2 billion debit card payments were made in the last year and an estimated 3.4 million people hardly use cash at all, according to banking trade body UK Finance.[1] But with more people in the UK shunning cash in favour of new payments technology, including wearable devices and payment apps as well as debit and credit cards, the effects of IT outages could be more crippling than ever.

Take Visa’s recent crash, for example, which left people unable to buy things or complete transactions. Ultimately, payment providers were unable to receive or send money, causing serious disruption for users. And all because of one hardware issue. Finding new ways to mitigate the risk of system outages is a growing area of focus for financial services firms.

Application Assurance

At a typical bank, there will be around 3,500 software applications which help the bank to deliver all of its services. Of these, about 50-60 are absolutely mission critical. If any of these critical applications goes down, it could result in serious financial, commercial and often regulatory impact.

If the payments processing system goes down, for instance, even for as little as two hours in a whole year, there will be a serious impact on the organisation and its customers. The more payments systems change to adapt to new payments technology, the more firms focus their efforts on ensuring that their applications are healthy and functioning properly. As Visa’s recent hardware problems show, much of this work to assure critical applications must lead firms back to the infrastructure that their software runs on.

Having a high level of assurance requires financial services firms to ensure that applications, such as credit card payment systems, are in good health and platformed on modern, standardised infrastructure. Things become tricky when shiny new applications are still tied into creaking legacy systems. For example, if a firm has an application which is running on Windows 2000, or is taking data from an old database elsewhere within the system, it can be difficult for banks to map how they interweave. Consequently, it then becomes difficult to confidently and accurately map all of the system interdependencies which must be understood before attempting to move or upgrade applications.

Protecting the Crown Jewels

Changes to the way financial services firms use technology means that information cannot simply be kept on a closed system and protected from external threats by a firewall. Following the enforcement of Open Banking in January 2018, financial services firms are now required to facilitate third party access to their customers’ accounts via an open Application Programming Interface (API). The software intermediary provides a standardised platform and acts as a gateway to the data, making it essential that banks, financial institutions, and fintechs have the appropriate technology in place.

In addition, data gets stored on employee and customer devices due to the rise of online banking and bring-your-own-device schemes. The proliferation of online and mobile banking, cloud computing, third-party data storage and apps is a double-edged sword: while enabling innovative advances, they have also blurred the perimeter around which firms used to be able to build a firewall. It is no longer possible to draw a perimeter around the whole system, so firms are now taking the approach of protecting each application individually, ensuring that each is only allowed to share data with the other applications that need it.
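
As a rough illustration of this per-application approach, the sketch below models a deny-by-default allow-list of which applications may exchange data. The application names and rules are hypothetical, and a real deployment would enforce such a policy at the network or platform layer rather than in a script.

```python
# Illustrative sketch only: a toy allow-list mimicking per-application
# segmentation policy. Application names and rules are hypothetical.
ALLOWED_FLOWS = {
    # application     -> applications it may exchange data with
    "card-payments":  {"fraud-screening", "ledger"},
    "online-banking": {"ledger", "notifications"},
    "loan-approval":  {"ledger"},
}

def flow_permitted(source: str, destination: str) -> bool:
    """Deny by default; permit only flows explicitly declared for the source app."""
    return destination in ALLOWED_FLOWS.get(source, set())

assert flow_permitted("card-payments", "fraud-screening")
assert not flow_permitted("online-banking", "card-payments")  # not declared, so blocked
```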

Financial services firms are increasingly moving away from a product-centric approach to cyber-security. In order to protect their crown jewels, they are focusing on compartmentalising and individually securing their critical applications, such as credit card payment systems, in order to prevent a domino effect if one area comes under attack. But due to archaic legacy infrastructure, it can be difficult for financial institutions to gauge how applications are built into the network and how they communicate with each other in real time.

To make matters more difficult, documentation about how pieces of the architecture have been built over the years often no longer exists within the organisation. What began as relatively simple structures twenty years ago has been patched and re-patched in various ways and stitched together. The teams who set up the original systems have often moved on from the firm, and their knowledge of the original build has gone with them.

The Next Steps

So how can this problem be overcome? Understanding how applications are built into the system and how they speak to one another is a crucial first step when it comes to writing security policies for individual applications. Companies are trying to gain a clear insight into infrastructure, and to create a real-time picture of the entire network.

As our society moves further away from cash payments and more towards payments technology, banks need the confidence to know that their payment systems are running, available and secure at all times. To ensure this, companies can install applications on a replica production network before installing them on the live system. This involves creating a test environment that emulates the “real” network as closely as possible. Financial players can create a software testing environment that is cost-effective and scalable by using virtualisation software to install multiple instances of the same or different operating systems on the same physical machine.

As their network grows, additional physical machines can be added to grow the test environment. This will continue to simulate the production network and allow for the avoidance of costly mistakes in deploying new operating systems and applications, or making big configuration changes to the software or network infrastructure.
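
As a rough sketch of how such an environment might be stood up, the example below uses the Docker SDK for Python to start a handful of containers that stand in for production services. The image names, service names and labels are placeholders, not any bank's actual stack, and the same approach applies to full virtual machines.

```python
# Minimal sketch of a throwaway pre-production environment built with containers
# via the Docker SDK for Python (pip install docker). Images and names are
# placeholders; a Docker daemon is assumed to be available.
import docker

client = docker.from_env()

SERVICES = {
    "test-core-db": {"image": "postgres:15", "environment": {"POSTGRES_PASSWORD": "test"}},
    "test-cache":   {"image": "redis:7"},
    "test-gateway": {"image": "nginx:1.25"},
}

containers = []
for name, spec in SERVICES.items():
    containers.append(
        client.containers.run(
            spec["image"],
            name=name,
            detach=True,
            environment=spec.get("environment"),
            labels={"env": "pre-production"},  # tag the sandbox so it is easy to find and tear down
        )
    )

# ... rehearse migrations or configuration changes against the sandbox here ...

for c in containers:  # tear the environment down once the rehearsal is finished
    c.stop()
    c.remove()
```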

Due to the growth in payments data, application owners and compliance officers need to be open to talking about infrastructure, and get a clear sense of whether their critical applications are healthy, so that they can assure them and wrap security policies around them. An in-depth understanding of the existing systems will enable financial services firms to then upgrade current processes, complete documentation and implement standards to mitigate risk.

[1] http://uk.businessinsider.com/card-payments-overtake-cash-in-uk-first-time-2018-6

Below Jonathan Bennun, product strategist at OneLogin and ex-hacker discusses the current IT sphere, cyber security progress and the open vulnerabilities of today’s tech.

As a hacker, I found vulnerabilities like easy-to-guess passwords made my work much easier. If that attack vector didn't pan out, I could usually get around the authentication flow, or gain basic privileges and escalate them for admin access. We must accept that these vulnerabilities – imperfect authentication and passwords – are not going away anytime soon, and businesses must take steps to strengthen their security posture with this in mind.

A key challenge in eliminating passwords is that too many SaaS providers still don’t offer token-based sign-in such as with SAML or OpenID Connect. On top of that, many enterprises still have dozens - if not hundreds - of legacy applications that require passwords. It will take some enterprises a long while to migrate off these legacy apps which use application-specific passwords, and do not support requirements such as password complexity or password expiration.
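
For readers wondering what token-based sign-in involves on the application side, here is a minimal sketch of validating an OpenID Connect ID token with the PyJWT library. The issuer URL, JWKS endpoint and client ID are placeholders for whatever identity provider is actually in use.

```python
# Minimal sketch of relying-party ID token validation with PyJWT
# (pip install "pyjwt[crypto]"). Issuer, JWKS URL and client ID are placeholders.
import jwt
from jwt import PyJWKClient

ISSUER = "https://idp.example.com"           # hypothetical OpenID Connect provider
JWKS_URL = f"{ISSUER}/.well-known/jwks.json" # hypothetical JWKS endpoint
CLIENT_ID = "my-finance-app"                 # hypothetical relying-party client ID

def verify_id_token(id_token: str) -> dict:
    """Validate signature, issuer, audience and expiry of an OIDC ID token."""
    signing_key = PyJWKClient(JWKS_URL).get_signing_key_from_jwt(id_token)
    return jwt.decode(
        id_token,
        signing_key.key,
        algorithms=["RS256"],
        audience=CLIENT_ID,
        issuer=ISSUER,
    )

# claims = verify_id_token(token_from_login_redirect)
# print(claims["sub"], claims["email"])
```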

In addition, passwords make up only a small part of a strong security posture. Security is only as strong as its weakest link, and on some systems passwords may be a good attack vector, but real-world attackers are more likely to use alternate attack vectors to get around passwords altogether.

Being a true password champion means applying password best practices while having a modern approach to access management that is more holistic than a password management tool or a password education campaign.

Here’s what businesses are doing wrong and how they can fix it. To illustrate, let’s use the classic security triangle: People, Process, and Technology.

People

Enterprises invest in education, such as compliance training, but often overlook enabling people with self-service for password reset and self-registration of MFA. In addition, companies combat shadow IT but don’t offer an alternative such as faster onboarding of business apps. For example, your employees need to use LinkedIn and Twitter for business, so provide them with a safe way to manage passwords for those personal apps.

Process

Think marathon, not sprint. Some SaaS providers still don’t offer token-based sign-in such as SAML-enabled login. Enterprises need to gradually consolidate passwords, ideally to a single set of corporate credentials for apps, networks, and devices. Similarly, access management should be unified and holistic across the entire organization with user information and privileges.

Technology

Password best practices are not hard to follow and apply, and they are an important part of your security practice. Having said that, don't stop there, and don't look for a silver bullet. Look for a platform, not a tool, for the wide variety of use cases and for supporting complete authentication and access management scenarios across the enterprise. For example, a single platform can make it much easier to provide password reset self-service to your entire user base.

In summary, being a true password champion goes well beyond password best practices. Enterprises that fail to deploy today’s front-line access management solutions across their orgs - enabling people, planning for a continuous effort, and seeking a full platform solution - are at serious risk and will lose out.

Determined CFOs need to stay ahead of the game if they are to make an impact in an ever-changing market landscape, says Philippe Henriette, SVP of Finance, Processes and IT for Volvo Construction Equipment. Below Philippe discusses the drive that’s needed to push finance into the digital age.

The finance function has expanded from a laser beam focus on reporting, budgeting and control to include a more overarching strategic role. At Volvo CE we are no different to any other organization in our ambitions to allocate more funds to IT development and innovation. The market is changing and finance should have a clear view on how the digital spend turns into value for our customers. And to operate at its high-performing best, finance needs to have an overview of the 'big picture' and be prepared to invest in new technologies even without the promise of an immediate payback. The use of big data and predictive analytics to identify these new trends is a vital tool in this future focused approach.

We live in a fast-moving environment where digitalization is disrupting industries the world over, yet construction is a relatively conservative sector. At Volvo CE we have to think about how our industry might look further down the line and how we can adopt new technologies and new ways of working to shake up our traditional business model. After all, the demands of a customer today might be radically different tomorrow. And finance has a vital role to play.

Interpreting changing customer needs

We looked to the wider economy for inspiration to see how companies like Uber redefined the way people buy and access services – a way of spending that is beginning to filter through into other industries. Owning an asset is becoming less important to customers, who are shifting to a value-buying spending model. So if our business is to sell a construction machine, and its relevant parts and services, how can we adapt for the future? With the emergence of electrification and other technologies, shouldn’t rental services be generalized? Should we be selling our services by the hour? It is already happening. This was the impetus behind us introducing a ‘power by the hour’ scheme for one of our key accounts. Our customer wanted the construction job done, but instead of purchasing our machines, they pay only for the hours and the value the machines create. If this is the future construction business model, then finance cannot stand still. We need to be ready to support the business transformation from generating revenue on machines and parts to selling services.

Data-driven culture shift

Our aim is always to simplify things for our customers, and to do this we have to have a deep understanding of their needs and stay steps ahead of those demands. Shifting from a product-centric to a data-driven culture plays a key role. By putting data analytics at the heart of our research and development and turning customer and product information into insight, we can be confident we are staying ahead of the game.

Equally, if we are going to provide the flexibility our customers require, we need to be brave when it comes to fixing a price point for our new services. I have learnt that we cannot test the waters by bringing new services to market without understanding how much they are worth: by doing so we would make it impossible to set a price when the service proves a success. Instead we do our due diligence through data analytics so that we can be confident we are setting the right price from the very start. With this data-driven culture comes a huge responsibility on the part of the CFO to handle this information appropriately. We do this by ensuring we have proper systems in place to protect the data we use – an issue that is becoming increasingly important as digital technology leaps into the future.

ROI for a new digital era

Having an eye for future trends – and the risks and rewards that go with them – is one thing, but how can CFOs be assured of a profitable return on investment on these new innovations in the years to come? New digital offerings may not lend themselves to standard ROI calculations, so the right set of measurements must be developed to monitor their progress. It is essential therefore to adopt non-financial metrics alongside the usual measurements of cash generation and profit, so that we have the big picture we need to drive the company through this new digital era.

We are working in a vastly different corporate landscape today than we were 20, 10, even 5 years ago. The finance function has navigated choppy waters during the economic downturn and is now learning to adapt to customer demand and increased innovation. This puts us in a unique position to act as a driving force for the digital revolution. The world is changing and it’s up to every CFO in every industry to stay ahead of the curve.

Take a look at the inner workings of any modern enterprise, and there’s a good chance you’ll find IT silos - islands of departmental data only loosely connected across the organisation. Such isolation presents a potential regulatory risk and undermines the rich productivity gains that digitisation should be driving across commerce, and yet these silos are becoming ever more commonplace.

Whereas ten years ago the primary cause for disjointed IT was the existence of outdated legacy systems within operations, now it is the advent of hosted independently-sourced solutions that is driving compartmentalisation across the IT landscape. With some options coming out of operational, rather than capital, expenditure, departmental heads have empowered themselves to take the matter of updating their processes and software into their own hands.

This empowerment has bred productivity gains, as departments have acquired best-of-breed functionality from systems to support their specific needs. Front and back office operations - from finance and business development to HR, logistics and marketing - have been invigorated by the introduction of solutions specifically implemented to fill operational gaps; address deficiencies and bottlenecks; and allow functionality which had been on managers’ wish-lists for a decade.

Unfortunately, these upgrades have often been made without consideration for the rest of the organisation. This narrow-minded piecemeal approach will return to haunt organisations across most sectors in the years to come, if the issue is not addressed on a company-wide basis.

The dangers represented by such silos are already becoming apparent within many firms: reliability of data, in particular, is becoming ever more important for both regulatory and operational reasons. But if customer information is stored separately by each department that needs it, the numerous versions which a company possesses can gradually diverge. In the case of a financial services organisation, for example, a loan approval department may end up holding a different set of data on a client than the online banking platform. The eventual outcome could range from frustrating or embarrassing the customer, to incurring bad debt and regulatory sanction.

At the very least, such a situation is highly inefficient from a business perspective, and an obstacle to good customer service. There are also cost implications in time and money: Time, because it is harder for employees who require data to access it; and money because the charges for storing and processing data are not inconsiderable, particularly given increasing regulatory and security requirements.

Therefore, as digital transformation is helping businesses to address individual operational problems, the time has come to reassess the approach and ensure that the entire information ecosystem is supporting the greater demands of internal and external customers.

Executive leadership must acknowledge that digitisation alone will not enhance information flow, innovation and productivity, unless there is a clear enterprise strategy to ensure information is made available and can be freely interchanged. Without this, content fragmentation is likely to accelerate, creating further challenges to aggregating, connecting and managing the flow of digital content.

There are inherent challenges for businesses looking to safeguard efficient and secure access to enterprise-wide information while retaining the benefits of a distributed approach to technology. One approach that is working well for an insurance client currently in a process of change and growth is to encourage departments to first seek a solution to any IT need they have from one of a ‘family’ of trusted providers.

In this scenario, it is crucial to work with partners who are committed to ensuring the best for your company: whereas some IT providers will be inclined to make a sale of their own software at all costs, others will be happy to recommend a ‘friend’ from the trusted business family, where they feel that their rival can provide a more suitable product.

At the same time, this ‘friends and family’ approach encourages supplier firms to work together on inter-operability and connectivity issues, and to adapt their own products, where necessary, to ensure a solution that is both bespoke and easily integrated into a wider corporate system. With such an approach, all the core systems can be hosted under a single roof - our client works with five core suppliers - and the momentum is towards further integration, not divergence, as each new application is added.

However, even with such practices, institutions of any size can end up running hundreds of applications. It is essential to link those data repositories and ensure that they are accessible to all potential users, with as much ease as possible. This can be accomplished with an enterprise information hub: a unified information platform, which facilitates an end-to-end view of the organisation’s entire ecosystem.

Such a hub is a valuable tool for management and a driver of innovation, as it is used to speed feedback times and analyse data on whole-company performance. It is also invaluable when it comes to increasing efficiency and diligence at the ‘coal face’, by allowing all documents to be viewed on a single platform or device.

As digitisation drives further changes in years to come - some not yet conceived or planned for - the ability to integrate new systems and view operations holistically will be crucial if organisations are to fully realise potential gains and remain efficient.

 

Website: www.hyland.com

It has emerged that TSB could be facing £16 million in fines for the catastrophic meltdown of its online banking software which prevented customers from accessing their bank accounts and using their debit cards. On the back of our Your Thoughts this week, Yaron Morgenstern, CEO at Glassbox Digital, discusses the important lessons we can learn from this ordeal.

Almost a month after the crisis emerged, mortgage account holders are still unable to access accounts online, while business customers continue to face problems making online payments.

TSB’s response to its customers’ fury is more revealing, with customers unable to get through to customer service teams, even after fraudsters have drained their accounts. Any financial organisation that truly values its customers can learn a number of lessons from this meltdown. Providing a positive and consistent customer experience is vital in today’s digital environment – and this is likely to get even more important as your clients move away from human interactions, such as in bank branches and via call centres.

In the aftermath of TSB’s IT disaster, the question is: how can organisations create digital engagements that are responsive to clients’ needs and at least as successful as human engagements?

Be ethical

A digital footprint is the only way to understand the issues your clients are experiencing, whether they are on a similar scale to the TSB crisis or as tiny as a minor frustration. However, the Cambridge Analytica scandal has reminded businesses of the importance of ethical data collection when measuring your customers’ experiences.

These recent events, and the distrust that surrounds tech giants and data collection, have shown that financial organisations must inform their online users how their data is collected, stored and used. More importantly, it must be remembered that customer data is on loan to businesses for a given period of time and is not owned by the organisation. As such, the data collected must be relevant to the individual customer and be able to offer them a distinct advantage in the customer experience.

Be helpful

In light of this mistrust, it’s more important than ever that you demonstrate the advantage your processes offer to customers and clients. We are now in a world where all kinds of service users, devices and operating systems are at work in the financial services environment. This landscape will only become more complicated as the number of IoT-enabled devices continues to increase. How organisations connect with customers will also evolve in line with these technological advances.

Digital mapping allows businesses to know precisely which browser, device and operating system each online user is using, and therefore to know more about the experiences users are having than ever before. The upshot for customers is that these organisations can offer an improved digital journey at every touchpoint in return.

Be responsive

In this digitally-enabled world, organisations should be more capable of staying in touch with their customers. Digital processes need to identify customer pain-points and solve these problems before they begin to mount up like they did at TSB. And instead of operating in complete silos, IT and customer service teams must work together. When considering the TSB disaster, you cannot help but wonder how prepared other parts of the business were for the back office switch.

How can you react immediately to any issues that emerge? Customisable alerts can be set up to go out to IT, customer service, marketing and web development departments, warning about problems on the website and app. With these alerts in place, all teams have full visibility of digital problems and there are no nasty surprises. Similarly, if a user then approaches a customer service representative with a problem, the handler of the complaint should be able to effortlessly tap into the online session data and identify what the issue is and where it lies.
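
As a simple illustration, the sketch below fans a single incident notification out to several team webhooks so that IT, customer service and web development all see the same alert at the same time. The webhook URLs and payload format are placeholders for whatever chat or ticketing tools the teams actually use.

```python
# Minimal sketch of a customisable alert fan-out; webhook URLs are placeholders.
import json
import urllib.request

TEAM_WEBHOOKS = {
    "it-operations":    "https://hooks.example.com/it-ops",
    "customer-service": "https://hooks.example.com/cust-service",
    "web-development":  "https://hooks.example.com/web-dev",
}

def raise_alert(message: str, severity: str = "high") -> None:
    """Send the same incident payload to every subscribed team channel."""
    payload = json.dumps({"severity": severity, "message": message}).encode()
    for team, url in TEAM_WEBHOOKS.items():
        req = urllib.request.Request(
            url, data=payload, headers={"Content-Type": "application/json"}
        )
        urllib.request.urlopen(req)  # a production version would add retries and timeouts

# raise_alert("Error rate on the payments API above 5% for 10 minutes")
```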

Be pre-emptive

The TSB fire was stoked by Sabadell’s development team, who before the IT crash were publicly toasting what they thought was a successful migration of customers to a new platform. Whilst this is a PR disaster, it also demonstrates how little they understood about the potential pitfalls they were facing. With such a heavy reliance on online experiences, it’s important your teams consistently prepare for failures, in order to best react.

Financial services firms must put in place processes that prevent online glitches (however small these may be). If they do so, businesses will enjoy increased customer loyalty and retention. Rather than simply employing digital mapping when moving legacy systems over or updating a customer portal, it should be engaged day-to-day.

Can you do it?

The finance industry is more reliant on the online experience to retain and win customers than ever before. Despite this, not all banks and insurers are doing it well. Making sure that your IT and business processes are ethical, ongoing and integrated will help guarantee customer loyalty and retention. This approach will insulate businesses from IT disasters like the TSB fiasco – or at least allow them to respond properly in the event of a crisis.

The ongoing TSB IT meltdown has been strong evidence of the risks and challenges financial institutions face daily. It has caused mass uproar from customers and severely tarnished the bank’s overall reputation.

TSB started a long-planned move of 1.3 billion customer records from its former parent company, Lloyds Banking Group, to Proteo4, a platform built by TSB’s Spanish owner, Banco Sabadell. The change-over, which started on Friday 20 April, was supposed to be completed over the weekend by 18:00 on Sunday. But on Monday morning millions of customers were unable to use online or mobile banking or had been given access to other people’s accounts.

Error messages and glitches meant paydays and company salaries were turned upside down across the UK. This has understandably caused a chain of problems across many sectors. TSB’s overall response has not been appreciated by the public and its customer service methods have been hugely questioned.

Below Finance Monthly lists some of Your Thoughts on TSB’s IT failure and its customer service approach.

Mark Hipperson, CTO, Centtrip:

Looking more closely at what happened and how the events evolved, it appears that some key IT best practices might have been omitted, such as:

  1. Production system access: it appears developers had access and were making live fixes to production. This is a big no-no in software development even in an ultra-agile DevOps environment.
  2. Rollback plan: when it all went wrong, it appeared there was no contingency plan or option to revert.
  3. Incremental proving: it would have been more appropriate to first validate each change to ensure it was successful before moving to the next.
  4. Testing: It is pivotal to confirm all changes have been implemented successfully and work well. There are many different types of testing: user, operational, data migration, technical, unit and functional, which would have helped identify any issues before customers did.
  5. Early Live Support: it is crucial to make sure sufficient highly skilled staff are available immediately after the release in case things still go wrong.

And last but not least are proofs of concept (PoCs), which would have revealed any tech and planning errors. TSB should have run PoCs on test accounts, or even staff accounts, before the full release.
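
To illustrate how incremental proving and a rollback plan can work together, the sketch below applies a migration in small steps, validates each one before moving on, and reverts completed work if any check fails. The step names and checks are invented stand-ins for real batch copies and reconciliations.

```python
# Illustrative sketch only: incremental proving plus rollback in a migration script.
def migrate_in_steps(steps):
    """Apply each step, validate it, and undo completed work if anything fails."""
    completed = []
    for step in steps:
        step["apply"]()
        if not step["validate"]():                 # prove the step before moving on
            print(f"Step '{step['name']}' failed validation, rolling back")
            for done in reversed(completed + [step]):
                done["rollback"]()                 # contingency plan: revert in reverse order
            return False
        completed.append(step)
    return True

# Example usage with dummy steps (stand-ins for real batch copy and reconciliation logic).
steps = [
    {"name": "copy-accounts",
     "apply":    lambda: print("copying account records"),
     "validate": lambda: True,
     "rollback": lambda: print("removing copied account records")},
    {"name": "copy-transactions",
     "apply":    lambda: print("copying transaction records"),
     "validate": lambda: False,                    # simulate a failed reconciliation
     "rollback": lambda: print("removing copied transaction records")},
]
migrate_in_steps(steps)
```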

Alastair Graham, spokesperson, PIF:

Small business customers have reached a nadir in their relationship with traditional banking partners. Branch closures and the move of services online have meant that few now receive any active guidance or support from their bank in helping to grow their business.

At the same time, many feel that even basic banking services aren’t meeting their expectations. Even without issues such as the recent TSB banking crisis, businesses would like improvements to be made. Whether that is quicker account opening processes, simple lending or transparent and fair charges, the demand for alternatives is growing.

Tech innovations, combined with legislative changes such as Open Banking, mean that more products and services are being launched, designed specifically to meet the needs of small business customers. SMEs have already shown they will trust other providers when their banks fail to provide adequate services. This has been particularly evident where prepaid platforms offer more versatility, while still being a safe, secure and flexible method to transfer money.

Yaron Morgenstern, CEO, Glassbox Digital:

In today’s digital age, customer experience is more important than ever. This banking app drama has revealed how important it is to measure your consumer’s experience with complete visibility of any problems. This should really be an ongoing effort, and not just when you plan large scale back office migration. There are three fundamental tenets to an effective customer experience: observation of the customer journey via touchpoints, reshaping customer interactions, and rewiring the company’s services to align with customer expectations.

It is only through advanced digital analytics and AI technology that organisations can understand what is going through their customers’ minds. These are powerful tools for mapping out customers’ digital journeys from the moment they visit a website. This all goes to the heart of improving conversion in the digital customer journey.

Fabian Libeau, EMEA VP, RiskIQ:

The fact that TSB’s IT meltdown dragged on for such a long time meant that customers were locked out of their accounts for extended periods. It also made them vulnerable to digital fraud in the form of phishing. TSB itself has warned more than five million customers that fraudsters have been attempting to take advantage of its IT breakdown to trick people into handing over information that could enable them to steal their money. Criminals exploiting brands to defraud stakeholders in this way is nothing new, and we know that financial institutions are a much-loved target for hackers, given the highly sensitive and valuable information they have been entrusted with – it is therefore no wonder that cybercriminals are queuing up for an opportunity to impersonate the bank online.

Andy Barratt, UK Managing Director, Coalfire:

In the grand scheme of things, the TSB incident is perhaps not as significant an event as a nation-state hack like last year's WannaCry. But it has still left many, including the ICO, concerned that a major 'data breach' occurred just weeks away from the implementation of the EU’s General Data Protection Regulation.

The power to hand out major fines that GDPR affords the regulator means that the price of poor data protection is about to become far easier to quantify. When the regulation comes into force at the end of the month, a breach like TSB’s would certainly require a Data Protection Impact Assessment and measures put in place to ensure a similar incident doesn’t happen in the future. At the very least, TSB will have put themselves on the ICO’s radar as ‘one to watch’ when GDPR comes into effect.

While the share price of Banco Sabadell, TSB's Spanish parent, wasn’t overly affected by the incident, there could still be a significant financial consequence for the bank. We now know that a large number of customers are affected, so the cost of rolling back any mistaken transactions, as well as offering support and potentially refunds, is likely to eat up a lot of operational resource. This event should be a reminder that data protection and the safeguarding of personal information has to be a top priority for financial institutions.

Andy Barr, Founder, www.10Yetis.co.uk:

The best thing you can say about the TSB approach to public relations throughout its issues is that it is going to become the modern benchmark for university lecturers on how not to approach crisis communications.

From the very outset, TSB has failed in its approach to handling this ongoing crisis. Its messages have been wrong, even from its highest-level member of staff, the CEO. He has repeatedly issued statements that have been incorrect and that he has had to retract and apologise for.

TSB’s brand reputation is now circling the plughole and its Spanish owners could very well be forced down the route of a re-brand in the mid to longer term in order to try and recover their reputation. I fully expect a classic crisis communications recovery plan 101 to be rolled out, once this all dies down. Step one; apologise (usually full page ads), step two; announce an independent investigation, step three; a member of the C-Suite gets the Spanish Archer (El-bow), and then step four; another apology before trying to move on.

Whatever the final outcome, this has been a public relations disaster for TSB and they are very lucky that at the time that it happened there was so much other “hard news” going on such as Brexit, rail company re-nationalisation and, of course, Big Don, over the pond, constantly feeding the 24-hour news agenda.

Danny Bluestone, Founder & CEO, Cyber-Duck:

The TSB fiasco shows that many organisations vastly underestimate data migrations. Moving data on such a scale from an incumbent system to a different one is an inherently complex task. There are several steps to follow for a successful migration.

First and foremost, it begins with a considered strategy for structural changes that ensures no legacy data is made unusable and new functionality is accounted for. Banks like Monzo test new features within alpha and beta modes, so new pieces of functionality are tried and tested before a mass general public release. TSB would have been wise to utilise test scripts and automated testing to auto-test thousands of permutations from login to usage of the system. Relevant applications that monitor errors could have then detected issues early on.
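
As a rough illustration of that kind of automated testing, the sketch below uses pytest to generate every combination of account type, channel and device and check each login path. The login function and the values used are hypothetical stand-ins for calls against a pre-production environment.

```python
# Minimal sketch of auto-testing many login/usage permutations before a release.
# The `login` function and the test values are hypothetical placeholders.
import itertools
import pytest

def login(account_type, channel, device):
    """Placeholder for a call against the pre-production environment."""
    return {"status": "ok"}

ACCOUNT_TYPES = ["current", "mortgage", "business"]
CHANNELS = ["web", "mobile-app"]
DEVICES = ["ios", "android", "desktop"]

@pytest.mark.parametrize("account_type,channel,device",
                         itertools.product(ACCOUNT_TYPES, CHANNELS, DEVICES))
def test_login_permutation(account_type, channel, device):
    result = login(account_type, channel, device)
    assert result["status"] == "ok"  # customers must see their own data, with no errors
```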

TSB could have also used a run-book for deployment so that all steps of the deployment were documented. When an error was detected, TSB could then have rolled back without data loss. Problems could also have arisen if TSB failed to use a testing environment that was identical to the production environment: if there is even a slight difference, the user experience can break.

With regards to the application hosting, TSB should have an active engineering team monitoring performance 24/7. In our experience at Cyber-Duck – from working with numerous institutions including redesigning the Bank of England’s digital website – there really is no excuse for users to suffer. Complex data migrations can be dealt with in a secure and efficient manner if best practice methodology is followed.

Adam Alton, Senior Developer, Potato:

Software is difficult; Microsoft still hasn't finished Windows. Trying to write a new piece of software or create a new system, and then migrate everything over to it in one go is likely to go badly. The chances of it working are incredibly slim. Instead, a migration in several parts would be better. Release small, release often. When Mark Zuckerberg said "move fast and break things", you could interpret that as "you're going to break things, so do frequent and small releases in order that you break as little as possible before you get a chance to fix it". The problems with TSB's migration appear to be multiple and disparate: error messages, slowness and capacity problems, users shown the wrong data. It seems unlikely that these stem from a single cause or single bug, so it would seem that they tried to do too much at once.

Coerced optimism: when under pressure to get something to work, it's easy for a team of developers to wishfully believe that something is finished and working because they can't see any problems, even though their experience tells them that the complexity of the system and the rushed job they've done means that it's extremely unlikely to be free of issues. I wouldn't be surprised if IT workers at TSB fell into this trap, leading to the premature announcements that the problems were resolved.

Denying that you have a problem is always a bad idea. Amazon Web Services (AWS) provide a detailed status dashboard giving a continuous and transparent view of any issues on their systems. They don't deny that they occasionally hit problems but instead have a process in place for actively updating their customers with as much information as possible. This transparency and openness clearly win them a huge amount of customer trust.

Senthil Ravindran, EVP & Global Head, xLabs, Virtusa:

Fortunately for all involved, it seems as if the worst of TSB’s IT debacle is now behind it. But its botched migration led to more than 40,000 customer complaints in what was arguably the most high-profile banking error we’ve seen this year. Worse still, the technology itself isn’t to blame here – both previous owner Lloyds and the Proteo4UK system used by new owner Banco Sabadell have a good record in handling data. Instead, the responsibility here rests solely with TSB.

It mostly boils down to a lack of proper preparation on TSB’s part. Banks carry out small data migrations regularly, but a large-scale migration such as this typically calls for months of preparation. Actually moving the data isn’t the tricky bit; drawing the data from the siloes it’s stored in across the business and knowing how it’ll fit within the target system is the real challenge. This is why banks are increasingly looking to ‘sandbox’ the testing process: creating a synthetic environment with the data they hold to gauge how it’s likely to fit within a new system of record. Granted, this approach to testing doesn’t happen overnight, but when applied properly, it reassures banks that the actual migration will run smoothly.

This method would likely have spared TSB the disaster it has faced. Yet in reality, we’ll likely see similar high-profile stories appear over the coming months thanks to the combined pressures of GDPR and open banking. The former is forcing banks to bolster their data handling practices in order to avoid hefty financial penalties, while the latter is forcing banks to expose their data to all manner of third parties. Both initiatives are incredibly difficult for banks reliant on decades-old legacy IT systems to manage (indeed, it’s likely that the GDPR deadline this month may have added pressure on TSB to rush the migration through), and as the reality of this new banking environment begins to set in, expect to see other examples along the same lines as TSB’s.

We would also love to hear more of Your Thoughts on this, so feel free to comment below and tell us what you think!

The private sector outsourcing market soared to a three-year high in 2017 as businesses signed contracts worth £4.93 billion, according to the Arvato UK Outsourcing Index.

The research, compiled by business outsourcing partner Arvato and industry analyst NelsonHall, found that the total value of contracts signed by UK companies rose 36% year-on-year, from £3.62 billion in 2016 and £1.84 billion in 2015.

Overall, the UK outsourcing market saw an increase of 9% year-on-year in 2017, with contracts worth £6.74 billion agreed by the public and private sectors over the period.

A surge in technology investment was behind the strong performance in the private sector, according to the findings. Businesses spent £3.82 billion procuring IT outsourcing (ITO) agreements in 2017, more than double the value of deals agreed in 2016 (£1.73 billion).

The analysis shows that companies focused their spending on securing multi-process IT deals, which included new hosting services, equipment, network infrastructure, data centres and application management.

Customer services accounted for almost half (46%) of business process outsourcing (BPO) agreements signed by companies last year. Firms spent a total of £508 million as they looked to deliver improvements in customer experience across traditional and digital channels, according to the findings.

Debra Maxwell, CEO, CRM Solutions UK & Ireland, Arvato, said: “The private sector is increasingly outsourcing more sophisticated work, with firms turning to external partners to introduce new technology and enhance the customer experience.

“This shift towards greater complexity is contributing to more outsourced services being delivered here in the UK. Just 2% of private sector deals procured last year will be delivered offshore, compared to 12% in 2016, as outsourcing continues to move up the value chain.”

Overall, fewer deals were agreed across the UK outsourcing market last year, with 98 procured compared to 165 in the 12 months previous, according to the research.

The rise in spending in the private sector market comes as activity across the government market fell year-on-year. Central government departments and councils signed contracts worth £1.82 billion in 2017 compared with £2.59 billion in 2016 – a 30% drop.

Excluding work procured for healthcare, the data shows that the average value of deals signed across government was down 42% year-on-year in 2017.

Debra Maxwell added, “In line with calls for a review of the government outsourcing model, the findings show the public sector is already moving away from procuring long-term, high value outsourcing contracts.

“Councils and central government departments are now accessing the technology and expertise they need to deliver a range of functions, from digital service transformation to cyber security, through smaller contracts for productised services.”

Financial services leads private sector growth

The analysis shows that a sharp rise in the value of outsourcing contracts procured by financial services businesses was behind the growth in private sector spend last year.

Companies across financial services agreed deals worth £3.26 billion in 2017, more than treble the total value of contracts agreed in the previous year (£829 million).

According to the research, the growth can be attributed to a sharp increase in ITO spending as firms turned their attention to deals in application management, application hosting and end user computing. The findings show ITO contracts worth £2.70 billion were signed across the sector last year, up from £208 million in 2016.

Pat Quinn, CEO of Arvato Financial Solutions UK & Ireland, said: “Financial services businesses are under pressure to transform, particularly in the wake of high profile security threats and the upcoming GDPR obligations.

“The findings show that a growing number of companies see outsourcing as key to addressing the challenge, delivering the resilient infrastructure and architecture they need to protect against cyber-attacks, keep their data safe and comply with new privacy legislation.”

Alongside financial services, telecoms & media and energy & utilities were the most active sectors in the UK outsourcing market, procuring deals worth £1.08 billion and £279 million respectively, according to the findings.

The research showed that the average value of contracts signed across the private sector more than doubled to £91 million in 2017, from £36 million in the previous year.

(Source: Arvato UK & Ireland)

Headquartered in Montreal, Canada, Interfacing Technologies Corporation (Interfacing) has been developing award-winning software solutions for over two decades, serving both large and small organisations across the world. Recognised as both a pioneer and a leader in the field (within Gartner® Enterprise Business Process Analysis, Business Operating Systems & Operational Intelligence reports), the company provides quality management technology solutions to document, analyse, improve and govern process, risk and performance data.

This month we caught up with Scott Armstrong, Managing Partner and one of four owners of Interfacing, who discussed the company’s commitment to continuously innovate and redefine the future of business software solutions.

 

Interfacing Technologies dates back to the early 80s – can you tell us more about the history of the company and its commitment to process? How have Interfacing’s solutions evolved? What are the company’s mission and values today?

Originally, Interfacing was a small IT consulting company. In the early 90s, the company was part of a large research and development MRP project which was funded by the National Research Council of Canada and included some big players such as Nortel at the time. As a result of the company’s participation in the project, in 1994 Interfacing released one of the first process modelling and simulation tools.

With the advancement of the World Wide Web and as Business Process Reengineering (BPR) evolved into Business Process Management (BPM) in the early 2000s, Interfacing launched one of the first web-based collaborative centralised repository solutions – the Enterprise Process Center® (EPC). To date, this is our flagship product and we are currently on version 10 (a.k.a. EPC X) – the revolutionised next generation of the platform – fully cloud-based, mobile-ready and architected on the latest highly scalable technology stack. Interfacing has been a player for many years; however, the change of ownership and management team four years ago is what catapulted the company forward and continues to fuel its growth today. Interfacing’s mission is to empower organisations to efficiently govern business complexity and transformation through process-based quality, continuous improvement and compliance management solutions. I’m proud to share that the vision, team, and technology are stronger now than ever before!

 

What is the current state of the market and how does your solution align?

The need for automation now goes well beyond increasing productivity: the digital age has forced companies to re-evaluate their core product and service offerings. The need to reassess what they do as a business and adapt to the digitisation of their industry is a big driver of our growth. Every industry is undergoing massive changes connected to digitisation, and with such large-scale digital transformation projects comes the need for major business transformation.

Our repository-based knowledge management platform is designed to assist with requirements gathering, system configuration and deployment training. The platform provides the organisation with a blueprint of the current and future state and a tool to support ongoing change management – every user can see the previous process versus the current process, what has changed, why it has changed, where they do their new work and how to do it. In effect, it offers searchable, mobile digital Standard Operating Procedures (SOPs) with video work instructions.

With digitisation and the fast pace of technology’s ongoing evolution came the need for increased agility. The Enterprise Process Center®’s Rapid Application Development (RAD) module supports low-code development and workflow automation. The EPC RAD platform is unique in the BPMS market because it supports not only process and case management event-based automation but transactional and data-driven application types as well. With graphical editors for data modelling, web forms design, process mapping, rules configuration and dashboard creation, the EPC RAD solution helps companies create and continuously improve an application with relatively little time and effort.
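To make the idea of rules configuration and workflow routing more concrete, here is a minimal, hand-rolled sketch of the kind of logic a low-code platform typically generates behind its graphical editors. It is purely illustrative: the rule names, form fields and routing steps are hypothetical and are not drawn from the EPC’s actual API or data model.

```python
# Illustrative sketch only: a generic approximation of rule-based case routing,
# the sort of logic a low-code BPM tool builds behind its graphical editors.
# All names here (rules, form fields, workflow steps) are hypothetical.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]   # evaluated against the submitted form data
    next_step: str                      # workflow step to route to when the rule fires

# Hypothetical routing rules for a loan-application case
RULES = [
    Rule("high_value_review", lambda form: form["amount"] > 50_000, "senior_underwriter"),
    Rule("missing_documents", lambda form: not form["documents_complete"], "request_documents"),
    Rule("standard_path", lambda form: True, "automated_scoring"),
]

def route_case(form_data: dict) -> str:
    """Return the first workflow step whose rule matches the form data."""
    for rule in RULES:
        if rule.condition(form_data):
            return rule.next_step
    return "manual_triage"  # fallback if no rule matches

if __name__ == "__main__":
    print(route_case({"amount": 75_000, "documents_complete": True}))  # senior_underwriter
```

In a low-code environment, the equivalent of the RULES list above would be configured visually rather than written by hand, which is what allows business users to change routing behaviour without a development cycle.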

 

What trends do you expect in digital transformation in 2018 and how will these affect the products that Interfacing offers?

The main trend that we’re seeing is the role of Artificial Intelligence (AI) and robotics in automation. How we plan to leverage AI in our technology base depends on the module. Our KPI module, for example, could leverage AI by providing predictive analytical insight to help our customers foresee issues before they arise. Within automation, the software could provide more information at the point of decision-making based on trends in historical actions and, with the addition of robotics, even remove the need for human intervention in certain use cases.

A progressive trend in the process analysis space is customer journey mapping. Before the internet and social media, customer experience didn’t have such a direct impact and influence on a customer’s buying patterns. In today’s digital age, the customer is spoilt for options when it comes to finding a specific service or product and can easily compare these options with quick research online. Add to this the fact that a customer is more likely to make the effort to post a review about a negative experience than a positive one, and the heightened importance of mapping your customers’ journey becomes clear. Historically, businesses focused on improving their internal processes to reduce cost, duration and delay, without paying much attention to how this affects the customer’s experience. What is truly remarkable about our tool is that it not only allows businesses to map their customers’ journey, but also lets them connect the customers’ touch points to their internal operations to visualise the handoffs within a single diagram. This transparency provides clarity about the areas that need to be evaluated, revised and improved in real time, in turn increasing corporate agility. It’s important that we continue to invest in this area, as intelligence will be increasingly available to consumers and more and more services will become online-only.
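As a loose illustration of what connecting touchpoints to internal operations can look like in data terms, the sketch below maps hypothetical journey stages to the internal processes and teams behind them, then derives the handoffs between teams. The stage, process and team names are invented for illustration and are not taken from the EPC.

```python
# Illustrative sketch only: linking customer-journey touchpoints to the internal
# processes (and owning teams) that serve them, so cross-team handoffs become
# visible. All names below are hypothetical examples.

JOURNEY = {
    "browse_products": [("publish_catalogue", "Marketing")],
    "open_account":    [("identity_check", "Compliance"), ("account_setup", "Operations")],
    "raise_complaint": [("log_complaint", "Customer Service"), ("root_cause_review", "Quality")],
}

def handoffs(touchpoint: str) -> list[tuple[str, str]]:
    """Return the (previous_team, next_team) handoffs hidden behind a touchpoint."""
    teams = [team for _, team in JOURNEY.get(touchpoint, [])]
    return list(zip(teams, teams[1:]))

print(handoffs("open_account"))     # [('Compliance', 'Operations')]
print(handoffs("raise_complaint"))  # [('Customer Service', 'Quality')]
```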

 

A big portion of our readers are in finance – can you tell us a bit about Interfacing’s Governance, Risk Management and Compliance (GRC) solution?

Interfacing’s Enterprise Process Center® has evolved beyond process analysis and automation to offer a comprehensive solution for GRC management. The ability to bridge the gap between risk, compliance, audit and continuous improvement teams is another major differentiator of the EPC. With constant new legislation forcing organisations to regularly adapt their policies and rules to ensure compliance, professionals are looking for a way to better understand downstream impacts and manage change. To achieve this, corporations and other organisations must leverage technology to minimise the impact and amount of manual work needed to comply. Companies’ number one priority should always be their vision, objectives and performance, not compliance!

Our tool offers policy, rule and requirement management that simplifies companies’ effort to adapt. If a law changes, users can immediately see all the potential implications – how the change will impact specific policies, rules, processes, risks, controls, roles and systems. Additionally, customers can break their corporate high-level risks (ERM, enterprise risk management) down to the operational level and assess them against the controls that mitigate them. All risks and controls are reusable, allowing customers to reassess each risk by process as well. Our platform not only documents risks and controls, it also monitors key risk and control indicators – something that a lot of other products don’t do. This gives companies the ability to track their risk mitigation strategies on an ongoing basis and react before going into the red, instead of having to wait for an end-of-quarter audit to understand what went wrong. The EPC supports auditors as well, though – from planning, to scheduling, to executing, through to reporting, the EPC offers a complete audit management solution. Finally, governance is at the core of our system: the EPC manages versions, tracks all audit trails, enforces security and ensures all stakeholders’ approvals are received before publishing a change.
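To ground the idea of monitoring key risk indicators against thresholds rather than waiting for a quarterly audit, here is a minimal, generic sketch of red/amber/green (RAG) classification of key risk indicators. The indicator names and thresholds are hypothetical assumptions for illustration and are not taken from the EPC’s GRC module.

```python
# Illustrative sketch only: classifying key risk indicators (KRIs) against
# amber/red thresholds so issues surface continuously rather than at a
# quarterly audit. Indicator names and thresholds are hypothetical.

from dataclasses import dataclass

@dataclass
class KeyRiskIndicator:
    name: str
    amber: float   # early-warning threshold
    red: float     # breach threshold (assumes a higher value means more risk)

def rag_status(kri: KeyRiskIndicator, value: float) -> str:
    """Classify the latest observed value as green, amber or red."""
    if value >= kri.red:
        return "red"
    if value >= kri.amber:
        return "amber"
    return "green"

INDICATORS = [
    KeyRiskIndicator("failed_payments_per_day", amber=20, red=50),
    KeyRiskIndicator("unresolved_audit_findings", amber=5, red=10),
]

latest = {"failed_payments_per_day": 27, "unresolved_audit_findings": 3}

for kri in INDICATORS:
    status = rag_status(kri, latest[kri.name])
    if status != "green":
        print(f"ALERT: {kri.name} is {status} at {latest[kri.name]}")
```

The point of continuous monitoring of this kind is simply that the alert fires as soon as an indicator crosses its early-warning threshold, giving risk owners time to act before a control failure is formally recorded at audit.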

 

Interfacing has been labelled a “Game Changer” by our team of analysts – what makes Interfacing stand apart from the rest?

Beyond our deep roots and understanding of the market, what differentiates Interfacing’s Enterprise Process Center® from its competitors is the value it brings to a diverse number of groups within an organisation and the high level of adoption by end-user employees. We noticed a big gap in the market whereby processes were being used by multiple teams within an organisation but usually not consolidated into one solution and never truly rolled out to every employee across the entire organisation. For example, within a company you may have business analysts leveraging such a tool for process improvement, the audit department mapping for compliance and risk assessment, or IT documenting requirements for a system deployment and automation project; however, the reach tended to be confined and limited. At the end of the day, it becomes difficult to gain real value from the effort of documenting processes, procedures, roles, risks, rules, controls, KPIs and so on if they are not rolled out to the people who require this knowledge to complete their daily tasks. Thus, our entire focus when redesigning our flagship platform was the end-user. Today, we offer a truly corporate-wide software platform that provides transparency at all levels of the organisation and gives every employee a voice. Within the current digital landscape and fast pace of technological change, agility is a must to remain competitive. The need to collect feedback and continuously improve one’s processes, products and services on an ongoing basis is more important now than ever before.

 

For more information, please go to: https://www.interfacing.com/ or https://www.linkedin.com/in/solution/

Research carried out by Altodigital has revealed that two thirds (66%) of SMB IT executives admit that they have significant IT challenges within their business. In comparison, an overwhelming 97% of IT bosses working in larger organisations indicated having ongoing issues, suggesting very different attitudes to technology between small and larger firms.

The research also explored the differing priorities of these two business types and found that ‘maintaining existing IT infrastructure’ was a top priority for 40% of corporates, while 32%, unsurprisingly, listed ‘security and compliance’ as a top concern. It was also interesting to note that 25% of respondents cited ‘finding skilled staff’ as a big worry.

Among SMB organisations, 26% of IT executives listed ‘security and compliance’ as a major concern, while budgetary constraints were close behind at 23% – something scarcely acknowledged by corporate respondents.

The poll, organised by office technology solutions provider Altodigital, comprised two individual studies: one polled 100 IT decision-makers from corporate UK companies with over 500 employees, while the second surveyed firms with fewer than 500 employees.

Alistair Millar, Group Marketing Manager at Altodigital, said: “It is worrying that such a high proportion of SMB IT executives feel they do not have any IT issues, because it is likely that they are missing a trick, especially when the issue of security and compliance is something that requires continual upgrades in technology.”

The survey also indicated cultural differences when it came to technology, with 58% of SMBs revealing that they simply didn’t see the need for a bring-your-own-device policy, whereas 72% of corporates listed it as a major concern. These contrasting opinions were also clear when it came to print policies: an overwhelming 78% of SMB IT managers admitted that they had no policy in place, while 57% of corporates said that they review their print strategy every year or less.

Within these results, a quarter of respondents in large firms said that their printing plan was reviewed more frequently than every six months and 15% reported once a year.

“It is very surprising to see that a large majority of SMBs fail to have a print policy in place, because managed print services are widely known to provide benefits for both small and large enterprises. SMBs must regularly consider what services might help improve business efficiency and productivity – a point clearly understood by large corporations, who routinely review operations such as their print strategy,” added Millar.

(Source: AltoDigital)
