Insolvency Oracle

Developments in UK insolvency by Michelle Butler



InsS Annual Review, part 3: less carrot, more stick?

The Insolvency Service’s September 2018 report pulled no punches in expressing dissatisfaction over some monitoring outcomes: we want fewer promises to do better and more disciplinary penalties, seemed to be the tone.  Has this message already changed the face of monitoring?

The Insolvency Service’s September 2018 Report can be found at www.gov.uk/government/publications/review-of-the-monitoring-and-regulation-of-insolvency-practitioners and its Annual Review of IP Regulation is at www.gov.uk/government/publications/insolvency-practitioner-regulation-process-review-2018.

In this article, I explore the following:

  • On average, a quarter of all IPs were visited last year
  • But is there a 3-yearly monitoring cycle any longer?
  • 2018 saw the fewest targeted visits on record
  • …but more targeted visits are expected in 2019
  • No RPB ordered any plans for improvement
  • Instead, monitoring penalties/referrals for disciplinary/investigation action doubled
  • Is this a sign that the Insolvency Service’s big stick is hitting its target?
  • IPs had a 1 in 10 chance of receiving a monitoring or complaints sanction last year

 

How frequently are IPs being visited?

With the exception of Chartered Accountants Ireland (which is not surprising given their bumper year in 2017), all RPBs visited around a quarter of their IPs last year.  It’s good to see the RPBs operating this consistently, but how does this square with the apparent 3-yearly standard cycle?

Firstly, I find it odd that coverage of ACCA-licensed IPs seems to have dropped significantly.  After receiving a fair amount of criticism from the InsS over its monitoring practices, the ACCA handed the regulation of its licensed IPs over to the IPA in October 2016.  Yet the proportion of ACCA IPs visited since that time has dropped from c.100% to 79%.

Another factor that I had overlooked in previous analyses is the effect of monitoring the volume IVA providers (“VIPs”).  At least since 2014, the Insolvency Service’s principles for monitoring VIPs have required at least annual visits.  Drawing on TDX’s figures for the 2018 market shares in IVAs, the IPA licensed all of the IPs in the firms that fall within the InsS’ definition of a VIP.  On the assumption that each of these received an annual visit, excluding these visits would bring the IPA’s coverage over the past 3 years to 56% of the rest of their IPs.  Of course, there are many reasons why this figure could be misleading, including that I do not know how many VIP IPs any of the RPBs had licensed in 2016 or 2017.
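The adjustment described above boils down to simple arithmetic.  A minimal sketch follows; every figure in it is an invented placeholder for illustration only, not an actual RPB number:

```python
# Illustrative sketch of the VIP adjustment described above.
# All figures are invented placeholders, NOT actual RPB data.

def adjusted_coverage(total_visits_3yr: int, vip_ips: int, total_ips: int) -> float:
    """3-year visit coverage of non-VIP IPs, assuming each VIP IP
    received one visit per year (i.e. three visits over the period)."""
    non_vip_visits = total_visits_3yr - 3 * vip_ips
    non_vip_ips = total_ips - vip_ips
    return non_vip_visits / non_vip_ips

# e.g. 260 visits over 3 years, 20 hypothetical VIP IPs among 350 appointment-takers
print(f"{adjusted_coverage(260, 20, 350):.0%}")
```

The point of the exercise is that stripping out the annual VIP visits can move the headline coverage figure materially, which is why the unadjusted average tells us so little.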

The ICAEW’s 64% may also reflect its different approach to visits to IPs in the largest firms: the ICAEW visits the firm annually (to cover the work of some of their IPs), but, because of the large number of IPs in the firm, the gap between visits to each IP within the firm is up to 6 years.  I cannot attempt to adjust the ICAEW’s figure to exclude these less frequently visited IPs, but suffice it to say that, if they were excluded, I suspect we might see something approaching a standard c.3-yearly visit cycle for all non-large-firm ICAEW-licensed IPs.

These variances in the 3-year monitoring cycle standard, which cannot be calculated (by me at least) with any accuracy, mean that there is very little that can be gleaned from this graph.  Unfortunately, the average is no longer much of an indication to IPs of when they might expect to receive their next monitoring visit.

 

The IPA’s new approach to monitoring

In addition to its up-to-4-visits-per-year shift for VIPs, at its annual conference earlier this year, the IPA announced that it would also be departing from the 3-yearly norm for other IPs.

The IPA has published few details about its new approach.  All that I have seen is that the frequency of monitoring visits is on a risk-assessment basis (which, I have to say, it was in my days there, albeit that the InsS used to insist on a 3-year max. gap) and that it is a “1-6 year monitoring cycle – tailored visits to types of firm” (the IPA’s 2018/19 annual report).

In light of this vagueness, I asked a member of the IPA secretariat for some more details: was the plan only to extend the period for those in the largest firms, as the ICAEW has done, or at least only for those practices with robust in-house compliance teams and a proven track record?  The answer was no, it could apply to smaller firms too.  He gave the example of a small-firm IP who only does CVLs: if the IPA were happy that the IP could do CVLs well and her bond schedules showed that she wasn’t diversifying into other case types, she would likely be put on an extended monitoring cycle.  He saw remote monitoring as the key for the future, saying that there is much that can be gleaned from a review of documents filed at Companies House.  He explained, however, that IPs would not know what cycle length they had been marked up for.

While I do not wish to throw cold water on this development, as I have long supported risk-based monitoring, it does seem a peculiar move, especially at a time when questions are being asked about the current regulatory regime: if a present concern is that the regulators are not adequately discouraging bad behaviour and are not expediting the removal of the “bad apples”, then it is curious that the monitoring grip is being loosened now.

Also, now that I visit clients on an annual basis, I realise just how much damage can be done in a short period of time.  It only takes a few misunderstandings of the legislation, a rogue staff member or a hard-to-manage peak in activity (or an unplanned trough in staff resources) to result in some real howlers.  How much damage could be done in 6 years, especially if an IP were less than honest?  Desk-top monitoring can achieve only so much.

What this means for my analysis of the annual reports, however, is that the 3-year benchmark for monitoring visits – or one third of IPs being monitored per year – is no longer relevant ☹ But it will still be interesting to see how the averages vary in the coming years.

 

Targeted visits drop to an all-time low

Only 10 targeted visits were carried out last year – the lowest number since the InsS started reporting them – and it seems that all RPBs are avoiding them in equal measure.

But 2019 may show a different picture, as several targeted visits have been ordered from 2018 monitoring visits…

 

Are the Insolvency Service’s criticisms bearing fruit?

I was particularly alarmed by the overall tone of the Insolvency Service’s “review of the monitoring and regulation of insolvency practitioners” published in September 2018.  In several places in the report, the InsS expressed dissatisfaction over some of the outcomes of monitoring visits.

I got the feeling that the Service disliked the focus on continuous improvement that, I think, has been a strength of the monitoring regime.  Instead, the Service expected to see more investigations and disciplinary actions arising from monitoring visit findings.  The report singled out apparently poor advice to debtors and apparently unfair or unreasonable fees or disbursements as matters requiring a disciplinary file to be opened with the aim of remedies being ordered.  It does seem that the focus of the InsS criticisms is squarely on activity in the VIPs, but the report left me worried that the criticisms could change the face of monitoring for everyone.

2018 is the first year (in the period analysed) in which no monitoring visit resulted in a plan for improvement.  On the other hand, the number of penalties/referrals for disciplinary/investigation action doubled.

Could the InsS’ report be responsible for this shift?  Ok, the report was published quite late in 2018, in September, but I am certain that the RPBs had a rough idea of what the report would contain long before then.  Or perhaps the Single Regulator debate has tempted some within the RPBs/committees to be seen to be taking a tougher line?  Or you might think that these kinds of actions are long overdue?

I think that the RPBs have tried hard over the last decade or so to overcome the negativity of the JIMU-style approach to monitoring.  In more recent years, monitoring has become constructive and there has been some commendably open and honest communication between RPB and IP.  This has helped to raise standards and to focus on how firms can improve for the future, rather than spending everyone’s time and effort analysing and accounting for the past.  It concerns me that the InsS seems to want to remove this collaborative approach and make monitoring more like a complaints process.  In my view, such a shift may result in many IPs automatically taking a more defensive stance in monitoring visits and challenging many more findings.  Such a shift will not improve standards and will take up much more time from all parties.

Getting back to the graph, of course a referral for an investigation might not result in a sanction at all, so this does not necessarily mean that the IPA has issued more sanctions as a consequence of monitoring visits.  Also, the IPA’s apparent enthusiasm for this tool may simply reflect the IPA’s (past) committee structure whereby the committee that considered monitoring reports did not have the power to issue a disciplinary penalty, but could only pass it on to the Investigation Committee.  As this was dealt with as an internal “complaint”, I suspect that any such penalty arising from this referral would have featured, not in the IPA’s monitoring visit outcomes, but in complaint outcomes.

So how do the RPBs compare as regards complaints sanctions?

 

Complaints sanctions fall by a quarter

Although the IPA issued relatively fewer sanctions last year, I suspect that the monitoring visit referrals will take some time to work their way through to sanction stage, so it is unlikely that this demonstrates that the monitoring visit referrals led to a “no case to answer”.

What this and the previous graph show quite dramatically, though, is that last year the ICAEW seemed to issue far fewer sanctions per IP than the IPA.  As mentioned in my last blog, the IPA does license a large majority of the VIP IPs and there were more complaints last year about IVAs than about all the other case types put together.  One third of the published sanctions also were found against VIP IPs.

 

Likelihood of being sanctioned is unchanged from a decade ago

In 2018, you had a 1 in c.10 chance of receiving an RPB sanction, which was the same probability as in 2008…

I find it interesting to see the IPA’s and the ACCA’s results converge: given that the IPA now deals with both RPBs’ regulatory processes, I would expect this, were it not for the suspected VIP impact.

There’s not a lot that can be surmised from the number of sanctions issued by the other two RPBs: they’re a bit spiky, but it does seem that, on the whole, the ICAEW and ICAS have issued far fewer sanctions.  It seems from this that, at least for last year, you were c.half as likely to receive a sanction if you were ICAEW- or ICAS-licensed as you were if you were IPA- or ACCA-licensed.

 

Is a Single Regulator the answer to bringing consistency?

True, these graphs do seem to indicate that different regulatory approaches are implemented by different RPBs.  However, I do think that some of that variation is due to the different make-up of their regulated populations.  There is no doubt that the IVA specialists do require a different approach.  To a lesser degree, I think that a different approach is also merited when an RPB monitors practices with robust internal compliance teams; it is so much more difficult to have your work critiqued and challenged on a daily basis when you work in a 1-2 IP practice.

Differences in approach can also be a good thing.  Seeing other RPBs do things differently can force an RPB to challenge what they themselves are doing and to innovate.  My main concern with the idea of a single regulator is the loss of this advantage of the multi-regulator structure.

Perhaps a Single Regulator could bring in more consistency, but it would never result in perfectly consistent outcomes.  I’m sure I’m not the only one who remembers an exercise a certain JIEB tutor ran: all us students were given the same exam answer to mark against the same marking guide.  The results varied wildly.  This demonstrated to me that, as long as humans are involved in the process, different outcomes will always emerge.

 



The stats of IP Regulation – Part 2: Monitoring

 

As promised, here are my thoughts on the RPBs’ 2017 monitoring activities, as reported by the Insolvency Service:

  • The InsS goes quiet on RPBs’ individual performances
  • Two RPBs appear to have drifted away from 3-yearly visits
  • The RPBs diverge in their use of different monitoring tools
  • On average, ICAEW visits were over three times more likely to result in a negative outcome than IPA visits
  • On average, every fourth visit resulted in one negative outcome
  • But averages can be deceptive…

As a reminder, the Insolvency Service’s report on 2017 monitoring can be found at: https://tinyurl.com/ycndjuxz

The picture becomes cloudy

As can be seen on the Insolvency Service’s dedicated RPB-monitoring web-page – https://www.gov.uk/government/collections/monitoring-activity-reports-of-insolvency-practitioner-authorising-bodies – their efforts to review systematically each RPB’s regulatory activities seemed to grind to a halt a year ago.  The Service did report last year that their “future monitoring schedule” would be “determined by risk assessment and desktop monitoring” and they gave the impression that their focus would shift from on-site visits to “themed reviews”.  Although their annual report indicates that such reviews have not always been confined to the desk-top, their comments are much more generic with no explanation as to how specific RPBs are performing – a step backwards, I think.

 

Themed review on fees

An example of this opacity is the Service’s account of their themed review “into the activities, and effectiveness, of the regulatory regime in monitoring fees charged by IPs”.

After gathering and reviewing information from the RPBs, the InsS reports: “RPBs responses indicate that they have provided guidance to members on fee matters and that through their regulatory monitoring; fee-related misconduct has been identified and reported for further consideration”.

For this project, the InsS also gathered information from the Complaints Gateway and has reported: “Initial findings indicate that fee related matters are being reported to the IP Complaints Gateway and, where appropriate, being referred to the RPBs”.

Ohhhkay, so that describes the “activities” of the regulatory regime (tell us something we don’t know!), but how exactly does the Service expect to review their effectiveness?  The report states that their work is ongoing.

Don’t get me wrong, it’s not that I necessarily want the Service to dig deeper.  For example, if the Service’s view is that successful regulation of pre-packs is achieved by scrutinising SIP16 Statements for technical compliance with the minutiae of the disclosure checklist, I dread to think how they envisage tackling any abusive fee-charging.  It’s just that, if the Service thinks that they are really getting under the skin of issues, personally I hope they are doing far more behind the scenes… especially as the Service is surely beginning to gather threads on the question of whether the world would be a better place with a single regulator.

So let’s look at the stats…

 

How frequently are you receiving monitoring visits?

There is a general feeling that every IP will receive a monitoring visit every three years.  But is this the reality?

This shows quite a variation, doesn’t it?  For two years in a row, significantly less than one third of all IPs were visited in the year.  Does this mean the RPBs have been slipping from the Principles for Monitoring’s 3-year norm?

The spiky CAI line in particular demonstrates how an RPB’s visiting cycle can make the number of visits per year fluctuate wildly; nevertheless, the CAI’s routine 3-yearly peaks and troughs suggest that, in general, that RPB is following a 3-yearly schedule.  So what picture do we see if we iron out the annual fluctuations?

This looks more reasonable, doesn’t it?  As we would expect, most RPBs are visiting not-far-off 100% of their IPs over three years… with the clear exceptions of CAI, which seems to be oddly enthusiastic, and the ICAEW, which seems to be consistently ploughing its own furrow.  This may be the result of the ICAEW’s style of monitoring large firms with many IPs, where each year some IPs are the subject of a visit, but this may not mean that all IPs receive a visit in three years.  Alternatively, could it mean they are following a risk-based monitoring programme…?

There are benefits to routine, regular and relatively frequent monitoring visits for everyone, almost irrespective of the firm’s risk profile: it reduces the risk that a serious error may be repeated unwittingly (or even deliberately).  However, this model isn’t an indicator of Better Regulation (see, for example, the Regulators’ Compliance Code at https://www.gov.uk/government/publications/regulators-compliance-code-for-insolvency-practitioners).  With the InsS revisiting their MoU (and presumably also the Principles for Monitoring) with the RPBs, I wonder if we will see a change.

 

Focussing on the Low-Achievers?

The alternative to the one-visit-every-three-years-irrespective-of-your-risk-profile model is to take a more risk-based approach: to spend one’s monitoring efforts on those that appear to be the highest risk.  This makes sense to me.  If a firm/IP has proven that they are more than capable of self-regulation – they keep up with legislative changes, keep informed even of the non-legislative twists and turns, don’t leave it solely to the RPBs to examine whether their systems and processes are working, and take steps quickly to resolve issues on specific cases and across entire portfolios and systems – why should licence fees be spent on 3-yearly RPB monitoring visits, which pick up non-material non-compliances at best?  Should not more effort go towards monitoring those who seem consistently and materially to fail to meet required standards or to adapt to new ones?

But perhaps that’s what’s being done already.  Are many targeted visits being carried out?

It seems that for several years few targeted visits have been conducted, although perhaps the tide is turning in Scotland and Ireland.  The ACCA also performed a number of them, although now that the IPA team is carrying out monitoring visits on ACCA-licensed IPs, I’m not surprised to see the number drop.

It seems that targeted visits have never really been the ICAEW’s weapon of choice.  At first glance, I was a little surprised at this, considering that their monitoring schedule seems less rigidly 3-yearly than the other RPBs’.  Aren’t targeted visits a good way to monitor progress outside the routine visit schedule?  Evidently, the ICAEW is not using targeted visits to focus effort on low-achievers.  Perhaps they are tackling them in another way…

 

Wielding Different Sticks

I think this demonstrates that the ICAEW isn’t lightening up: they may be carrying out less frequent monitoring visits on some IPs, but their post-visit actions are by no means infrequent.  So perhaps this indicates that the ICAEW is focusing its efforts on those seriously missing the mark.

The ICAEW’s preference seems to be for requiring their IPs to carry out ICRs.  Jo’s and my experience is that the ICAEW often requires those ICRs to be carried out by an external reviewer, with a copy of the reviewer’s report to be sent to the ICAEW.  They also make more use than the other RPBs of requiring IPs to undertake/confirm that action will be taken.  I suspect that these are often required in combination with ICR requests so that the ICAEW can monitor how the IP is measuring up to their commitments.

And in case you’re wondering, external ICRs cost less than an IPA targeted visit (well, the Compliance Alliance’s do, anyway) and I like to think that we hold generally to the same standards, so external ICRs are better for everyone.

In contrast, the IPA appears to prefer referring IPs for disciplinary consideration or for further investigation (the IPA’s constitution means that technically no penalties can arise from monitoring visits unless they are first referred to the IPA’s Investigation Committee).  However, the IPA makes comparatively fewer post-visit demands of its IPs.  But isn’t that an unfair comparison, because of course the ICAEW carried out more monitoring visits in 2017?  What’s the picture per visit?

 

No better and no worse?

Hmm… I’m not sure this graph helps us much.  Inevitably, the negative outcomes from monitoring visits are spiky.  We’re not talking about vast numbers of RPB slaps here (that’s why I’ve excluded the smaller RPBs – sorry guys, nothing personal!) and the “All” line (which does include the other RPBs) is smoother overall.  But the graph does suggest that ICAEW-licensed IPs are over three times as likely to receive a negative outcome from a monitoring visit as IPA-licensed IPs.

Before you all get worried about your impending or just-gone RPB visit, you should remember that a single monitoring visit can lead to more than one negative outcome.  For example, as I mentioned above, the RPB could instruct an ICR or targeted visit as well as requiring the IP to make certain undertakings.  One would hope that much less than 25% of all IPs visited last year had a negative outcome!
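The distinction matters arithmetically: negative outcomes per visit is not the same thing as the share of visits with at least one negative outcome.  A toy illustration with invented numbers (not RPB figures) makes the gap plain:

```python
# Toy illustration: negative outcomes per visit vs visits with any negative
# outcome.  Each entry is the number of negative outcomes arising from one
# hypothetical monitoring visit (invented data, not RPB figures).
outcomes_per_visit = [0, 0, 0, 0, 0, 0, 2, 3]

total_outcomes = sum(outcomes_per_visit)                  # 5 outcomes in total
visits_hit = sum(1 for n in outcomes_per_visit if n > 0)  # but only 2 visits

print(f"outcomes per visit:       {total_outcomes / len(outcomes_per_visit):.1%}")
print(f"visits with any negative: {visits_hit / len(outcomes_per_visit):.1%}")
```

Here the outcomes-per-visit ratio is 62.5% even though only a quarter of the hypothetical visits produced any black mark at all, which is exactly the doubling-up effect described above.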

This doubling-up of outcomes may be behind the disparity between the RPBs: perhaps the ICAEW is using multiple tools to address a single IP’s problems more often than the other two RPBs… although why should this be?  Alternatively, perhaps the ICAEW’s record again suggests that the ICAEW is focusing their efforts on the most wayward IPs.

 

Choose Your Poison

I observed in my last blog (https://tinyurl.com/y8b4cgp7) that the complaints outcomes indicated that the IPA was far more likely to sanction its IPs over complaints than the ICAEW was.  I suggested that maybe this was because the IPA licenses more than its fair share of IVA specialists.  Nevertheless, I find it interesting that the monitoring outcomes indicate the opposite: that the ICAEW is far more likely to sanction on the back of a visit than the IPA is.

Personally, I prefer a regime that focuses more heavily on monitoring than on complaints.  Complaints are too capricious: to a large extent, it is pot luck whether someone (a) spots misconduct and (b) takes the effort to complain.  As I mentioned in the previous blog, the subjects of some complaints decisions are technical breaches… and which IP can say hand-on-heart that they’ve never committed similar?

Also, by their nature, complaints are historic – sometimes very historic – but it might not matter whether an IP has since changed their ways or whether the issue was a one-off: if the complaint is founded, the decision will be made; the IP’s later actions may just help to reduce the penalty.

In my view, the monitoring regime is far more forward-looking and much fairer.  Monitors look at fresh material; they consider whether the problem was a one-off incident or systemic and whether the IP has since made changes.  The monitoring process also generally doesn’t penalise IPs for past actions: rather, what’s important are the steps an IP takes to rectify issues and to reduce the risks of recurrence.  The process enables the RPBs to keep an eye on whether, when and how an IP makes systems- or culture-based changes – interests that are usually absent from the complaints process.

 

Next blog: SIP16, pre-packs and other RPB pointers.

 



Annual review: IPs, complaints and visits down, but sanctions up

The Insolvency Service’s 2016 Review of IP Regulation always makes for interesting reading. This year, the headlines include:

  • The number of IPs falls again
  • Regulatory sanctions generally increase and for one RPB in particular
  • Complaints handled by the RPBs drop by 28%… although 17% of all complaints seem to be held in the Gateway
  • Apparent missing of the mark for 3-yearly visits
  • Current regulatory priorities include IVAs and fees, whereas routine monitoring appears less popular

The report can be found at https://goo.gl/Jkwz19.

 

IP number falls again

The Review reveals another drop in the number of appointment-taking IPs. In fact, there was the same number on 1 January 2017 as there was on the same day in 2009: 1,303.

Is it a surprise that the number of appointment-taking IPs has dropped again? The 2016 insolvency statistics show modest increases in the numbers of CVLs and IVAs compared with 2015 and of course there was a bumper crop of MVLs in early 2016. Why is it that fewer IPs seem to be responsible for more cases?

My hunch is that the complexity of cases in general is decreasing. I also suspect that the additional hurdles put in place as regards fees have encouraged IPs to look at efficiencies, to create slicker processes, and to be more risk-averse and less inclined to go out on a limb, with the result that some cases are despatched more swiftly and require less IP input.

I also suspect the IP number for next January will show another drop. The expense and effort to adapt to the 2016 Rules will make some think again, won’t it?

Does the presence of the regulators breathing down one’s neck erode IPs’ keenness to remain in the profession? How worried should IPs be about the risk of a regulatory sanction?

 

Regulatory actions on the increase

The RPBs seem to have shown varying degrees of enthusiasm when it comes to taking regulatory action.

To me, this hints at regulatory scrutiny of a different kind. Is it coincidental that the ACCA issued proportionately far more sanctions than any other RPB last year? Could the Insolvency Service’s repeated monitoring visits to the ACCA over 2015 and 2016 have had anything to do with this spike?

What is behind these sanctions? Are they generated from the RPBs’ monitoring visits or from complaints?

 

Monitoring v complaints sanctions return to normality

Last year, I observed that for the first time RPBs’ investigations into complaints had generated more sanctions than their monitoring visits. Regulatory actions in 2016 returned to a more typical pattern.

Does this reflect a shifting RPB behaviour or is it more a result of the number of complaints received and/or the number of monitoring visits undertaken?

 

Dramatic fall in complaints

Well, no wonder there were fewer disciplinary actions on the back of complaints: the RPBs received 28% fewer complaints in 2016 than they did in 2015.

Why is this? Is it because fewer complaints were made? Undoubtedly, IVAs have generated a flood of complaints in recent years not least because of the issues surrounding ownership of PPI claims, but those issues were still live in 2016, weren’t they?

Perhaps we can explore this by looking at the complaint profile by case type:

Yes, it looks like IVAs continued to be contentious last year, although perhaps the worst is over. It seems, however, that the most significant drop has been felt in complaints relating to bankruptcies and liquidations. The reduction in bankruptcy complaints is understandable, as the numbers of bankruptcies have dropped enormously over the past few years, but liquidation numbers have kept reasonably steady, so I am not sure what is going on there.

But are fewer people really complaining or is there something else behind these figures?

 

An effective Complaints Gateway sift?

When the Complaints Gateway was set up in 2014, it was acknowledged that the Insolvency Service would ensure that complaints met some simple criteria before they were referred to the RPBs. There must be an indication of a breach of legislation, SIP or the Code of Ethics and the allegations should be capable of being supported with evidence. Where this is not immediately apparent, the Service seeks additional information from the complainant.

The graphs above are based on the complaints referred to the RPBs, so what is the picture as regards complaints received before the sifting process occurs?

This shows that the Complaints Gateway sifted out more complaints last year: the percentage rejected rose from 25% in 2014, to 27% in 2015, to 29% in 2016.

The Insolvency Service’s review explains that in 2016 a new criterion was added: “Complainants are now required in the vast majority of cases to have raised the matter of concern with the insolvency practitioner in the first instance before the complaint will be considered by the Gateway”. This is a welcome development, but it did not affect the numbers much: it resulted in only 13 complaints being turned away for this reason.

But this rejected pile is not the whole story. The graph also demonstrates that a significant number of complaints – 144 (17%) – were neither rejected nor referred last year, which is a much larger proportion than in previous years. Presumably these complaints are being held pending further exchanges between the Service and the complainant. Personally, I am comforted by this demonstration of the Service’s diligence in managing the Gateway, but I hope that this does not hint at a system that is beginning to get snarled up.

 

How many complaints led to sanctions?

When I looked at the Insolvency Service’s review last year, I noted that the IPA’s sanctions record appeared out of kilter to the other RPBs. It is interesting to note that 2016 appears to have been a more “normal” year for the IPA, but instead the ACCA seems to have had an exceptional year. As mentioned above, I wonder if the Insolvency Service’s focus on the ACCA has had anything to do with this unusual activity (I appreciate that 2010 was another exceptional year… and I wonder if the fact that 2010 was the year that the Insolvency Service got heavy with its SIP16-reviewing exercise had anything to do with that particular flurry).

The obvious conclusion to draw from this graph might be that an ACCA-licensed IP has a 1 in 3 chance that any complaint will result in a sanction. However, perhaps these IPs can rest a little easier, given that the ACCA’s complaints-handling is now being dealt with by the IPA.

What about sanctions arising from monitoring visits? How do the RPBs compare on that front?

 

All but one RPB reported an increase in monitoring sanctions

These percentages look rather spectacular, don’t they? It gives the impression that on average almost one third of all monitoring visits result in some kind of negative outcome… and it appears that 90% of all the CAI’s monitoring visits gave rise to a negative outcome! Well, not quite. It is likely that some monitoring visits led to more than one black mark, say a plan for improvement and a targeted visit to review how those plans had been implemented.

Nevertheless, it is interesting to note that almost all RPBs recorded increases in the number of negative outcomes from monitoring visits over the previous year. I am not sure why the IPA seems to have bucked the trend. It will be interesting to see how the populations of ACCA and IPA-licensed IPs fare this year, as they are now being monitored and judged by the same teams and Committees.

 

How frequently are visits being undertaken?

The Principles for Monitoring, which form part of a memorandum of understanding (“MoU”) between the Insolvency Service and the RPBs, state that the period between monitoring visits “is not expected to significantly exceed three years but may, where satisfactory risk assessment measures are employed, extend to a period not exceeding six years”. However, most if not all the RPBs publicise that their monitoring programmes are generally on a 3-yearly cycle.

The following graph shows that the RPBs are not quite meeting this timescale:

If we look at each RPB’s visits for the past 3 years as a percentage of their appointment-taking licence-holders, how far off the 100% mark were they..?

That the ICAEW misses the mark is not surprising, given that they publicise that their IPs in the larger practices are on 6-year cycles. At the other end of the spectrum is the ACCA, which managed to visit all their IPs over the past 3 years and then some. However, as we know, the ACCA has relinquished its monitoring function to the IPA, so it seems unlikely that this will continue.

 

What is the future for monitoring visits?

The Insolvency Service’s 2015 review hinted that the MoU’s days might be numbered. Their 2016 review strengthens this message:

“We propose to withdraw the MoU as soon as is reasonably feasible, subject to working through some final details”.

The review goes on to explain that the Service will be adding to their existing guidance (https://goo.gl/wDHElg). As it currently stands, prescriptive requirements such as the frequency of monitoring visits are conspicuously absent from this guidance. Instead, it is largely outcomes-based and reflects the Regulators’ Code, to which the Insolvency Service itself is subject and which emphasises the targeting of monitoring resources where they should be most effective at addressing priority risks. The Service itself seems to be lightening up on its own monitoring visits: the review states that, having completed their round of full monitoring visits to the RPBs, they are now moving towards a number of risk-based themed reviews. If this approach filters through to the RPBs’ monitoring visits, will we see a removal of the 3-yearly standard cycle?

 

Current priorities for the regulators

Does the 2016 review reveal any priorities for this year?

Unsurprisingly, given one particularly high profile failure, IVAs feature heavily. The review refers to “general concerns around the volume IVA business model and developments in practice” and continues:

“The Insolvency Service is working with the profession to tackle some of these concerns; for example, through changes to guidance on monitoring and protections for client funds, and also a review of insurance arrangements. We are also engaging with stakeholder groups to better understand their concerns and how these may be tackled. We expect that this will be a key focus of our work for the coming year.”

Other projects mentioned in the review include:

  • Possible legislative changes to the bonding regime – consultation later this year;
  • Progression of the Insolvency Service’s recommendation that the RPBs introduce a compensation mechanism for complainants who have suffered inconvenience, loss or distress;
  • Publication of the Insolvency Service’s review into the RPBs’ monitoring and regulation processes, including consistency of outcomes, the extent of independence between the membership and regulatory functions, and the RPBs’ financial capabilities – report to be released within 12 months;
  • Progress on a review into the RPBs’ approach to the regulatory objective to encourage a profession which delivers services at a fair and reasonable cost, including how they are assessing compliance with the Oct-15 fee estimate regime – report to be released by the end of the year; and
  • A consultation on revisions to the Code of Ethics – expected in the spring.

 



Monitoring the monitors: targeting consistency and transparency


The Insolvency Service’s 2014 Review had the target of transparency at its core. This time, the Insolvency Service has added consistency.  Do the Annual Reviews reveal a picture of consistency between the RPBs?

My second post on the Insolvency Service’s 2015 Annual Review of IP regulation looks at the following:

  • Are the RPBs sticking to a 3-year visit cycle?
  • How likely is it that a monitoring visit will result in some kind of regulatory action?
  • What action are the RPBs likely to take and is there much difference between the RPBs?
  • What can we learn from 6 years of SIP16 monitoring?
  • How have the RPBs been faring in their own monitoring visits conducted by the Insolvency Service?
  • What have the Service set in their sights for 2016?

 

RPBs converge on a 3-yearly visit cycle

The graph of the percentages of IPs that had a monitoring visit last year gives me the impression that a 3-yearly visit cycle has most definitely become the norm:

Graph7

(Note: because the number of SoS IPs dropped so significantly during the year – from 40 to 28 – all the graphs in this article reflect a 2015 mid-point of SoS-authorised IPs of 34.)

Does this mean that IPs can predict the timing of their next routine visit? I’m not sure.  It seems to me that some standard text is slipping into the Insolvency Service’s reports on their monitoring visits to the RPBs.  The words: “[RPB] operates a 3-year cycle of rolling monitoring visits to its insolvency practitioners. The nature and timing of visits is determined annually on a risk-assessment basis” have appeared in more than one InsS report.

What do these words mean: that every IP is visited once in three years, but some are moved up or down the list depending on their risk profile? Personally, this doesn’t make sense to me: either visits are timed according to a risk assessment or they are carried out on a 3-year cycle; I don’t see how you can achieve both. If visit timings are sensitive to risk, then some IPs are going to receive more than one visit in a 3-year period and, unless the RPB records more than 33% of their IPs as having a visit every year (which the graph above shows is generally not the case), the corollary is that some IPs won’t be visited in a 3-year period.
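For what it’s worth, the arithmetic behind that corollary can be sketched in a few lines of Python (the 25% annual rate is my assumption for illustration, not any RPB’s actual record):

```python
# Illustrative sketch of the coverage arithmetic above.
# Assumed figure: an RPB visits 25% of its IPs each year.
annual_visit_rate = 0.25

# Over a rolling 3-year window, coverage cannot exceed 3x the annual rate,
# and it reaches that ceiling only if no IP is visited twice in the window.
max_three_year_coverage = min(annual_visit_rate * 3, 1.0)
print(max_three_year_coverage)  # 0.75

# Any repeat, risk-triggered visits push actual coverage below this
# ceiling, so even more IPs go unvisited within the 3 years.
```

So unless an RPB records more than a third of its IPs as visited each year, covering every IP on a strict 3-year cycle becomes arithmetically impossible as soon as any repeat visits occur.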

My perception on the outside is that, generally, the timing of visits is pretty predictable and is now pretty-much 3-yearly. I’ve seen no early parachuting-in on the basis of risk assessments, although I accept that my field of vision is very narrow.

 

Most RPBs report reductions in negative outcomes from monitoring visits

The following illustrates the percentage of monitoring visits that resulted in a “negative outcome” (my phrase):

Graph8

As you can see, most RPBs are clocking up between c.10% and 20% of monitoring visits leading to some form of negative consequence and, although individual records have fluctuated considerably in the past, the overall trend across all the regulatory bodies has fallen from 30% in 2008 to 20%.

However, two bodies seem to be bucking the trend: CARB and the SoS.

Last year, I didn’t include CARB (the regulatory body for members of the Institute of Chartered Accountants in Ireland), because its membership was relatively small. It still licenses only 41 appointment-taking IPs – only 3% of the population – but, with the exit of SoS authorisations, I thought it was worth adding them to the mix.

I am sure that CARB’s apparently erratic history is a consequence of its small population of licensed IPs, and this may well explain why it is still recording a much greater percentage of negative outcomes than the other RPBs. Nevertheless, CARB does seem to have recorded exceptionally high levels for the past few years.

The high SoS percentage is a little surprising: 50% of all 2015 visits resulted in some form of negative outcome – these were all “plans for improvement”. CARB’s were a mixture of targeted visits, undertakings and one penalty/referral for disciplinary consideration.

So what kind of negative outcomes are being recorded by the other RPBs? Are there any preferred strategies for dealing with IPs falling short of expected standards?

 

What responses are popular for unsatisfactory visits?

The following illustrates the actions taken by the top three RPBs over the last 4 years:

Graph9

* The figures for ICR/self certifications requested and further visits should be read with caution. These categories do not appear in every annual review, but, for example, it is clear that RPBs have been conducting targeted visits, so this graph probably does not show the whole picture for the 2012 and 2013 outcomes.  In addition, of course the ICAEW requires all IPs to carry out annual ICRs, so it is perhaps not surprising that this category has rarely featured.

I think that all this graph suggests is that there is no trend in outcome types! I find this comforting: it might be difficult to predict what outcome to expect, but it suggests to me that the RPBs are flexible in their approaches – they will implement whatever tool they think best fits the task.

 

Looking back on 6 years of SIP16 monitoring

We all remember how, over the years, so many people seemed to get hot under the collar about pre-packs, and we recall some appallingly misleading headlines suggesting that around one third of IPs were failing to comply with regulations. Where have the 6 years of InsS monitoring of SIP16 Statements got us? I will dodge that question and simply illustrate the statistics:

Graph10

Note: several years are “estimates” because the InsS did not always review all the SIP16 Statements they received. Also, the Service ended its monitoring in October 2015. Therefore, in these cases I have taken the stats and pro-rated them up to a full year’s worth.
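The pro-rating mentioned in that note is just simple scaling; a minimal sketch in Python, using made-up figures rather than the Service’s actual counts:

```python
def pro_rate_to_full_year(count, months_covered):
    """Scale a part-year count up to a 12-month equivalent."""
    return count * 12 / months_covered

# Hypothetical example: 450 SIP16 Statements reviewed in the 10 months
# to October 2015 would be reported as a full-year estimate of 540.
print(pro_rate_to_full_year(450, 10))  # 540.0
```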

Does the graph above suggest that a consequence of SIP16 monitoring has been to discourage pre-packs? Well, have a look at this one…

Graph11

As you can see, the dropping number of SIP16s has more to do with the drop in Administrations. In fact, the percentage of pre-packs has not changed much: it peaked at 31% of all Administrations in 2012 and was at its lowest, 24%, in 2014.

I guess it could still be argued that the SIP16 scrutiny has persuaded some to sell businesses/assets in the pre (or immediately post) liquidation period, rather than use Administration.  I’m not sure how to test that particular theory.

So, back to SIP16 compliance, the graph-but-one above shows that the percentage of Statements that were compliant has increased. It might be easier to see from the following:

Graph12

Unequivocal improvements in SIP16 compliance – there’s a good news story!

A hidden downside of all this focus on improving SIP16 compliance, I think, is the costs involved in drafting a SIP16 Statement and then, as often happens, in getting someone fairly senior in the practice to double-check the Statement to make sure that it ticks every last SIP16 box.  Is this effort a good use of resources and of estate funds?

Now that the Insolvency Service has dropped SIP16 monitoring, does that mean we can all relax a bit? I think this would be unwise.  The Service’s report states that it “will review the outcome of the RPBs’ consideration of SIP16 compliance and will continue to report details in the Annual Review”, so I think we can expect SIP16 to remain a hot regulatory topic for some time to come.

 

The changing profile of pre-packs

The Service’s reports on SIP16 Statements suggest other pre-pack trends:

Graph13

Personally, I’m surprised at the number of SIP16 Statements that disclose that the business/assets were marketed by the Administrator: last year it was 56%. I’m not sure if that’s because some SIP16 Statements are explaining that the company was behind some marketing activities, but, if that’s not the reason, then 56% seems very low to me.  It would be interesting to see if the revised SIP16, which introduced the “marketing essentials”, makes a difference to this rate.

 

Have some pity for the RPBs!

The Service claimed to have delivered on their commitments in 2015 (incidentally, one of their 2014 expectations was that the new Rules would be made in the autumn of 2015 and they would come into force in April 2016 – I’m not complaining that the Rules are still being drafted, but I do think it’s a bit rich for the Executive Foreword to report pleasure in having met all the 2014 “commitments”).

The Foreword states that the reduction in authorising bodies is “a welcome step”. With now only 5 RPBs to monitor and the savings made in dropping SIP16 monitoring (which was the reported reason for the levy hike in 2009), personally I struggle to see the Service’s justification for increasing the levy this year.  The report states that it was required in view of the Service’s “enhanced role as oversight regulator”, but I thought that the Service did not expect to have to flex its new regulatory muscles as regards taking formal actions against RPBs or directly against IPs.

However, the tone of the 2015 Review does suggest a tightening of the thumb-screws. The Service refers to the power to introduce a single regulator and states that this power will “significantly shape” the Service’s work to come.

In 2015, the Service carried out full monitoring visits to the ICAEW, ICAS and CARB, and a follow-up visit to the ACCA. This is certainly more visits than in previous years, but personally I question whether the visits are effective. Of course, I am sure that the published visit reports do not tell the full stories – at least, I hope that they don’t – but it does seem to me that the Service is making mountains out of some molehills, and their reports give me the sense that they are concerned with processes ticking the Principles for Monitoring boxes, rather than being effective and focussing on good principles of regulation.

For example, here are some of the molehill weaknesses identified in the Service’s visits that were resisted at least in part by some of the RPBs – to which I say “bravo!”:

  • Pre-visit information requested from the IPs did not include details of complaints received by the IP. The ICAEW responded that it was not convinced of the merits of asking for this on all visits but agreed to “consider whether it might be appropriate on a visit by visit basis”.
  • Closing meeting notes did not detail the scope of the visit. The ICAEW believed that it is important for the closing meeting notes to clearly set out the areas that the IP needs to address (which they do) and it did not think it was helpful to include generic information… although it seems that, by the time of the follow-up visit to the ICAEW in February 2016, this had been actioned.
  • The Service remains “concerned” that complainants are not provided with details of the independent assessor on their case. “ACCA regrets it must continue to reject this recommendation as ACCA does not believe naming assessors will add any real value to the process… There is also the risk of assessors being harassed by complainants where their decision is not favourable to them.”
  • Late bordereaux were only being chased at the start of the following month. The Service wanted procedures put in place to “ensure that cover schedules are provided within the statutory timescale of the 20th of each month and [to] follow up any outstanding returns on 21st or the next working day of each month”. Actually, CARB agreed to do this, but it’s just a personal bug-bear of mine. The Service’s report to the ICAEW went on about the “vital importance” of bonding – with which I agree, of course – but it does not follow that any bordereaux sent by IPs to their RPB “demonstrate that they have sufficient security for the performance of their functions”. It simply demonstrates that the IP can submit a schedule on time every month. I very much suspect that bordereaux are not checked on receipt by the RPBs – what are they going to do: cross-check bordereaux against Gazette notices? – so simply enforcing a zero tolerance attitude to meeting the statutory timescale is missing the point and seems a waste of valuable resources, doesn’t it?

 

Future Focus?

The Annual Review describes the following on the Insolvency Service’s to-do list:

  • Complaint-handling: in 2015, the Service explored the RPBs’ complaint-handling processes and application of the Common Sanctions Guidance. The Service has made a number of recommendations to improve the complaints process and is in discussion with the RPBs. They expect to publish a full report on this subject “shortly”.
  • Debt advice: also in 2015, they carried out a high-level review of how the RPBs are monitoring IPs’ provision of debt advice and they are currently considering recommendations for discussion with the RPBs.
  • Future themed reviews: The Service is planning themed reviews (which usually mean topic-focussed questionnaires to all RPBs) over 2016 and 2017 covering: IP monitoring; the fees rules; and pre-packs.
  • Bonding: the Service has been examining “the type and level of cover offered by bonds and considering both the legislative and regulatory arrangements to see if they remain fit for purpose”. They are cagey about the outcomes but do state that they “will work with the industry to effect any regulatory changes that may be necessary” and they refer to “any legislative change” being subject to consultation.
  • Relationship with RPBs: the Service is contemplating whether the Memorandum of Understanding (“MoU”) with the RPBs is still needed, now that there are statutory regulatory objectives in place. The MoU is a strange animal – https://goo.gl/J6wmuN. I think that it reads like a lot of the SIPs: a mixture of principles and prescription (e.g. a 10-day acknowledgement of complaints); and a mixture of important standards and apparent OTT trivia. It would be interesting to see how the Service approaches monitoring visits to the RPBs if the MoU is removed: they will have to become smarter, I think.
  • Ethics? The apparent focus on ethical issues seems to have fallen from the list this year. In 2015, breaches of ethics moved from third to second place in the list of complaints received by subject matter (21% in 2014 and 27% in 2015), but reference to the JIC’s work on revising the Ethics Code has not been repeated in this year’s Review. Presumably the work is ongoing… although there is certainly more than enough other tasks to keep the regulators busy!

 

 



The Insolvency Service’s labours for transparency produce fruits


The Insolvency Service has been busy over the past months producing plenty of documents other than the consultations. Here, I review the following:

  • First newsletter;
  • Report on its visit to the SoS-IP monitoring unit;
  • Summary of its oversight function of the RPBs;
  • IVA Standing Committee minutes; and
  • Complaints Gateway report.

The Insolvency Service’s first newsletter

http://content.govdelivery.com/accounts/UKIS/bulletins/d469cc

Although this is a bit of a PR statement, a couple of crafty comments have been slipped in.

The newsletter explains that the Service’s “IP regulation function has been strengthened and we have raised the bar on our expectations of authorising bodies”. I started off sceptical but to be fair the Service’s summary of how it carries out its oversight function of the authorising bodies – https://www.gov.uk/government/publications/insolvency-practitioner-regulation-oversight-and-monitoring-of-authorising-bodies – does convey a more intensive Big Brother sense than the Principles for Monitoring alone had done previously.  This document puts more emphasis on their risk-based assessments, desk-top monitoring and themed reviews, as well as targeting topical areas of concern, which can only help to provide a better framework in which their physical monitoring visits to the RPBs can sit.

I commend the Service for establishing more intelligent regulatory processes, but two sentences of the newsletter stick in the throat: “We saw the impact that our changing expectations had in a few areas. Things deemed acceptable a few years ago were now being picked up as areas for improvement.” This is a reference to its report on the visit to its own people who monitor SoS-authorised IPs, the Insolvency Practitioner Services (“IPS”): https://www.gov.uk/government/publications/monitoring-activity-reports-of-insolvency-practitioner-authorising-bodies.  Having worked in the IPA’s regulatory department from 2005 to 2012, I would like to assure readers that many of the items identified in the Service’s report on IPS have been unacceptable for many years – at least to the IPA during my time and most probably to the other RPBs (I am as certain as I can be of that without having worked at the RPBs myself).

I am aghast at the Service’s apparent suggestion that the following recent discoveries at the IPS were acceptable a few years ago:

  • A 5-year visit cycle with insufficient risk assessment to justify a gap longer than 3 years;
  • Visits to new appointment-takers not carried out within 12 months and no evidence of risk assessment to justify this;
  • No evidence that one IP’s receipt of more than 1,000 complaints in the previous year (as disclosed in the pre-visit questionnaire) was raised during the visit, nor was it considered in any detail in the report;
  • No evidence of website checks (which the Service demanded of the RPBs many years ago);
  • “Little evidence that compliance with SIP16 is being considered”;
  • “No evidence that relevant ethical checklists and initial meeting notes from cases had been considered”; and
  • “Once a final report has been sent to the IP, there does not appear to be any process whereby the findings of the report are considered further by IPS”.

Still, that’s enough of the past. The Service has now thrown down the gauntlet.  I shall be pleased if they now prove they can parry and thrust with intelligence and effectiveness.

Worthy of note is that the newsletter explains that, in future, sanctions handed down to IPs by the RPBs will be published on the Service’s website (presumably more contemporaneously than within its annual reviews).

IVA Standing Committee Minutes 17 July 2014

https://www.gov.uk/government/publications/minutes-from-the-iva-standing-committee-july-2014

“Standardised Format”

The minutes report that the IPA will have a final version – of what? Presumably a statutory annual report template? – within “a couple of weeks” and that two Committee members will draft a Dear IP article (there’s a novelty!) to explain that use of the standard is not mandatory.

Income and Expenditure Assessments

The minutes recorded that the Money Advice Service had been preparing a draft I&E statement for consultation – which seems to be an amalgam of the CFS and the StepChange budget – with the plan that it will be used for all, or at least a number of, debt solutions. The consultation opened on 16 October: https://www.moneyadviceservice.org.uk/en/static/standard-financial-statement-consultation

IVA Protocol Equity Clause

As a consequence of concerns raised by an adviser about the equity clause, DRF has agreed to “draft a response” – it seems this is only intended to go to the adviser who had written in, although it would seem to me to have wider interest – “to clarify the position, which is that a person will not be expected to go to a subprime lender and the importance of independent financial advice”. It is good to have that assurance, but what exactly does the IVA Protocol require debtors to do in relation to equity?  Does the Protocol clause need revising, I wonder.

Resistance to refunding dividends when set-off applied

I see the issue: a creditor receives dividends and then sets off mis-sold PPI compensation against their remaining debt. Consequently, it could be argued that the creditor has been overpaid a dividend and should return (some of) it.  The minutes state that “it is a complicated issue and different opinions prevail” (well, there’s a revelation!), although it has been raised with the FCA.

Variations

It seems that the Committee has only just cottoned on to the fact that the Protocol does not allow the supervisor to decide whether a variation meeting should be called, so they are to look at re-wording the standard terms to “give supervisor discretion as to whether variation is appropriate so when one is called it is genuine and in these instances the supervisor will be entitled to get paid”.

I’m sorry if I sound a little despairing at this, not least because of course the cynic may see this as yet another avenue for IPs to make some easy money! It was something that I’d heard about when I was at the IPA – that some IPs were struggling with IVA debtors who wanted, say, to offer a full and final settlement to the creditors that the IP was confident would be rejected by creditors, but under the Protocol terms it seemed that they had no choice but to pass the offer to creditors.  I’m just surprised that this issue has not yet been resolved.

Recent pension changes

The minutes simply state: “InsS to enquire with colleagues as to how it is planned to treat these in bankruptcy and feed back”. About time too!  Shortly after the April proposals had been first announced, I’d read articles questioning whether the government had thought about how any lump sum – which from next April could be the whole pension pot – would be treated in a bankruptcy.  Presumably, legislation will be drafted to protect this pot from a Trustee’s hands, but that depends on the drafter getting it right.  The lesson of Raithatha v Williamson comes to mind…

Well, I’m assuming that this is what the Committee minutes refer to, anyway.

Report on the First Year of the Complaints Gateway

https://www.gov.uk/government/publications/insolvency-practitioner-complaints-gateway-report-august-2014

Aha, so Dr Judge has been able to spin an increased number of complaints as evidence that the gateway “is meeting the aim of making the complaints process easier to understand and use”! I wonder if, had the number of complaints decreased, his message might have been that insolvency regulation had played a part in raising standards so that there were fewer causes for complaint.

The report mentions that the Service is “continuing dialogue” with the SRA and Law Society of Scotland to try to get them to adopt the gateway.

The Service still seems to be hung up about the effectiveness of the Insolvency Code of Ethics (as I’d mentioned in an earlier post, http://wp.me/p2FU2Z-6I) and have reported their “findings” to the JIC “to assist with its review into this area”.

The Service also seems to have got heavy with the RPBs about complaints on delayed IVA closures due to ongoing PPI refunds. The ICAEW and the IPA “have agreed to take forward all cases for investigation” – because, of course, some complaints are closed at assessment stage on the basis that the complaints reviewer has concluded that there is no case to answer (i.e. it is not that these complaints do not get considered at all) – “where the delay in closing the IVA exceeds six months from the debtor’s final payment”.  Does this mean that the general regulator view is that any delay under 6 months is acceptable?  Hopefully, this typical Service measure of setting unprincipled boundaries will not result in a formulaic approach to dealing with all complaints about delayed closure of IVAs.  And, although the other RPBs may license a smaller proportion of IVA-providing IPs, I wonder what their practices are…

The report also explains that the Service has persuaded the ICAEW to modify its approach a little in relation to complaints resolved by conciliation. Now, such a complaint will still be considered in the context of any regulatory breaches committed by the IP.  Years ago, the Service urged the RPBs to consider whether they could make greater use of financial compensation (or even simply requiring an IP to write an apology) in their complaints processes, but there was some resistance because it seemed that the key objective of the regulatory complaints process – to pick up IPs failing to meet standards – was at risk of getting lost: might some IPs be persuaded to agree a swift end to a complaint, if it meant that less attention would be paid to it?  To be fair, this has always been an IP’s option: he can always satisfy the complainant before they ever approach the regulator.  However, now settling a complaint after it has started on the Gateway path may not be the end of it for the IP, whichever RPB licenses him.

The Statistics

I think that the stats have been more than adequately covered by other commentaries. In any event, I found it difficult to draw any real conclusions from them in isolation, but they also don’t add much to the picture presented in the Insolvency Service’s 2013 annual review.  That’s not to say, however, that this report has no use; at the very least, it will serve as a reference point for the future.

Ok, the complaints number has increased, but it does seem that the delayed IVA closure due to PPI refunds is an exceptional issue at the moment. Given that the IPA licenses the majority of IPs who carry out IVAs, it is not surprising that the IPA has the largest referred-complaints-per-IP figure: 0.63, compared to 0.54 over all the authorising bodies (although the SoS is barely a whisker behind at 0.62).  My personal expectation, however, is that the Insolvency Service being seen as more involved in the complaints process via the Gateway may, on its own, sustain slightly higher levels of complaints in the longer term, as perceived victims may not be so quick to assume that the RPB/IP relationship stacks the odds heavily against them receiving a fair hearing.



A Closer Look at Six Years of Insolvency Regulation


Have you ever wanted evidence-based answers to the following..?

• Which RPB issues the most – and which the least – sanctions?
• What are the chances that a monitoring visit by your authorising body will result in a sanction or a targeted visit?
• How frequent are monitoring visits and is there much difference between the authorising bodies?
• Do you receive more or less than the average number of complaints?
• Are there more complaints now than in recent years?

Of course, there are lies, damned lies, and statistics, but a review of the past six years of Insolvency Service reports on IP regulation provides food for thought.

The Insolvency Service’s reports can be found at: http://www.bis.gov.uk/insolvency/insolvency-profession/Regulation/review-of-IP-regulation-annual-regulation-reports and my observations follow. Please note that I have excluded from my graphs the three RPBs with the smallest number of IPs, although their results have been included in the results for all the authorising bodies combined. In addition, when I talk about IPs, I am looking only at appointment-taking IPs.

Regrettably, I haven’t worked out how to embed my graphs within the text, so they can be found here. Alternatively, if you click on full article, you will be able to read the text along with the graphs.

Monitoring Visits

How frequently can IPs expect to be monitored and does it differ much depending on their authorising body?

The Principles for Monitoring set out a standard of once every three years, although this can stretch to up to six years provided there are satisfactory risk assessment processes. The stated policy of most RPBs is to make 3-yearly visits to their IPs. But what is it in reality and how has it changed over time? Take a look at graph (i) here.

This graph shows that last year all RPBs fell short of visiting one third of their IPs. However, the Secretary of State fell disastrously short, visiting only 8% of their IPs last year. I appreciate that the Secretary of State expects to relinquish all authorisations as a consequence of the Deregulation Bill, but this gives me the impression that they have given up already. Personally, I would expect the oversight regulator to set a better example!

Generally speaking, all the RPBs are pretty much in the same range, although the recent downward trend in monitoring visits for all of them is interesting; perhaps it illustrates that last year the RPBs’ monitoring teams’ time was diverted elsewhere. Fortunately, the longer term trend is still on the up.

What outcomes can be expected? The Insolvency Service reports detail the various sanctions ranging from recommendations for improvements to licence withdrawals. I have amalgamated the figures for all these sanctions for graph (ii) here.

Hmm… I’m not sure that helps much. How about comparing the sanctions to the number of IPs (graph (iii) here).
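The comparison here is simply sanctions divided by the number of appointment-taking IPs per body. As a rough sketch of the arithmetic (the figures below are invented for illustration; the real numbers are in the Insolvency Service’s annual reports):

```python
# Hypothetical illustration of normalising sanction counts by IP population.
# All counts below are assumed, not the Insolvency Service's actual figures.
sanctions = {"IPA": 12, "ICAEW": 9, "ICAS": 3}   # assumed sanction counts
ips = {"IPA": 500, "ICAEW": 650, "ICAS": 130}    # assumed appointment-taking IPs

# Sanctions as a proportion of each body's licensed IPs
rates = {body: sanctions[body] / ips[body] for body in sanctions}
for body, rate in sorted(rates.items(), key=lambda kv: -kv[1]):
    print(f"{body}: {rate:.1%} of IPs sanctioned")
```

The point of the normalisation is that a body with more IPs would be expected to issue more sanctions in absolute terms, so only the per-IP rate supports a fair comparison.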

That’s not a lot better. Oh well.

Firstly, I notice that the IPA has bucked the recent downward trend of sanctions issued by all the other licensing bodies, although the longer term trend for the bodies combined is remarkably steady. I thought it was a bit misleading for the Service’s report to state that “the only sanction available to the SoS is to withdraw an authorisation”, as that certainly hadn’t been the case in previous years: as this shows, in fact the SoS gave out proportionately more sanctions (mostly plans for improvements) than any of the RPBs in 2009, 2010 and 2011. Although ACCA and ICAS haven’t conducted a large number of visits (30 and 25 respectively in 2013), it is still a little surprising to see that their sanctions, like the SoS’, have dropped to nil.

However, the above graphs don’t include targeted visits. These are shown on graph (iv) here.

Ahh, so this is where those bodies’ efforts seem to be targeted. Even so, the SoS’ activities seem quite singular: are they using targeted visits as a way of compensating for the absence of power to impose other sanctions?

Complaints

The Insolvency Service’s report includes a graph illustrating that the number of complaints received has increased by 45% over the past three years, with 33% of that increase occurring over the past year. My first thought was that perhaps the Insolvency Service’s Complaints Gateway was admitting more complaints into the process, but the report had mentioned that 22% had been turned away, which I thought demonstrated that the Service’s filtering process was working reasonably well.

Therefore, I decided to look at the longer term trend (note that the number of IPs has crept up only slightly over these six years: from a minimum of 1,275 in 2008 to a maximum of 1,355 in 2014). Take a look at graph (v) here.
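Because the IP population is so flat, the raw complaint counts and the per-IP rates tell much the same story. A quick sketch, using the IP figures above but with complaint counts invented for illustration:

```python
# IP population figures come from the Insolvency Service reports;
# the complaint counts are hypothetical, purely to show the calculation.
ips = {2008: 1275, 2014: 1355}        # appointment-taking IPs (from the reports)
complaints = {2008: 600, 2014: 640}   # assumed complaint counts

# Complaints per IP: with the denominator nearly constant, the trend in the
# per-IP rate closely tracks the trend in raw complaint numbers.
rates = {year: complaints[year] / ips[year] for year in ips}
for year in sorted(rates):
    print(f"{year}: {rates[year]:.2f} complaints per IP")
```

In other words, dividing by the IP count here barely changes the shape of the graph; it simply rescales it.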

So the current level of complaints isn’t unprecedented, although why they should be so high at present (or indeed in 2008), I’m not sure. It also appears from this that the IPA has more than its fair share, although the number of IPA-licensed IPs has been growing also. Let’s look at the spread of complaints over the authorising bodies when compared with their share of IPs (graph (vi) here).

Interesting, don’t you think? SoS IPs have consistently recorded proportionately more complaints. Given that the SoS has no power to sanction as a consequence of complaints, I wonder if this illustrates the deterrent value of sanctions. Of further interest is that the proportion of complaints against IPA-licensed IPs has caught up with the SoS’ rate this last year – strange…

Moving on to complaints outcomes: how many complaints have resulted in a sanction and have the RPBs “performed” differently? Have a look at graph (vii) here.

At first glance, I thought that this peak reflected the fact that fewer complaints had been received – maybe the actual number of sanctions has remained constant? – so I thought I would look at the actual numbers (graph (viii) here).

Hmm… no, it really does look like the number of sanctions increased in years when fewer complaints were lodged. However, I’m sceptical of this apparent link: given the time it takes to get a complaint through the system, it may well be the case that the 2012/13 drop in sanctions flowed from the 2010/11 reduction in complaints lodged. I shall be interested to see if the number of sanctions picks up again in 2014.

Going back to the previous graph, personally I am reassured by the knowledge that in 2013 the RPBs generally reported a similar percentage of sanctions… well, at least closer than they were in 2010 when they ranged from 2% (ICAEW) to 38% (ICAS).

The ICAEW’s record of complaints sanctions seems to have kept to a consistently low level. However, let’s see what happens when we combine all sanctions – those arising from complaints and monitoring visits, as well as the ordering of targeted visits (graph (ix) here).

Hmm… that evens out some of the variation. Even the SoS now falls within the range! Of course, this doesn’t attribute any weights to the variety of sanctions, but I think it helps answer those who allege that some authorising bodies are a “lighter touch” than others, although I guess the sceptic could counter by saying that this illustrates that IPs are still more than twice as likely to receive a sanction from the IPA as from ICAS. Ho hum.

Overview

To round things off, here is a summary of all the sanctions handed out by all the authorising bodies over the years (graph (x) here).

This suggests to me that targeted visits seem to have gone out of fashion, despite monitoring visits generally giving rise to more sanctions than complaints do… but, with the hike in complaints lodged last year, perhaps I should not speak too soon.