Insolvency Oracle

Developments in UK insolvency by Michelle Butler



Monitoring the monitors: targeting consistency and transparency


The Insolvency Service’s 2014 Review had the target of transparency at its core. This time, the Insolvency Service has added consistency.  Do the Annual Reviews reveal a picture of consistency between the RPBs?

My second post on the Insolvency Service’s 2015 Annual Review of IP regulation looks at the following:

  • Are the RPBs sticking to a 3-year visit cycle?
  • How likely is it that a monitoring visit will result in some kind of regulatory action?
  • What action are the RPBs likely to take and is there much difference between the RPBs?
  • What can we learn from 6 years of SIP16 monitoring?
  • How have the RPBs been faring in their own monitoring visits conducted by the Insolvency Service?
  • What has the Service set in its sights for 2016?

 

RPBs converge on a 3-yearly visit cycle

The graph of the percentages of IPs that had a monitoring visit last year gives me the impression that a 3-yearly visit cycle has most definitely become the norm:

Graph7

(Note: because the number of SoS IPs dropped so significantly during the year – from 40 to 28 – all the graphs in this article reflect a 2015 mid-point of SoS-authorised IPs of 34.)

Does this mean that IPs can predict the timing of their next routine visit? I’m not sure.  It seems to me that some standard text is slipping into the Insolvency Service’s reports on their monitoring visits to the RPBs.  The words: “[RPB] operates a 3-year cycle of rolling monitoring visits to its insolvency practitioners. The nature and timing of visits is determined annually on a risk-assessment basis” have appeared in more than one InsS report.

What do these words mean: that every IP is visited once in three years, but some are moved up or down the list depending on their risk profile? Personally, this doesn’t make sense to me: either visits are timed according to a risk assessment or they are carried out on a 3-year cycle; I don’t see how you can achieve both.  If visit timings are sensitive to risk, then some IPs are going to receive more than one visit in a 3-year period and, unless the RPB records >33% of their IP number as having a visit every year (which the graph above shows is generally not the case), the corollary will be that some IPs won’t be visited in a 3-year period.
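To put a number on that corollary, here is a rough sketch (with entirely illustrative figures of my own – nothing here comes from the Annual Reviews) of how risk-triggered re-visits eat into a fixed visiting capacity of one third of the IP population per year:

```python
# A minimal model: each year an RPB has visit slots for a third of its IPs
# (a strict 3-year cycle); any risk-based re-visit consumes a slot that
# would otherwise have gone to an IP not yet seen in the window.
def distinct_ips_visited(total_ips, annual_slots, risk_revisits_per_year):
    """How many distinct IPs get a visit within a rolling 3-year window."""
    slots_over_window = annual_slots * 3
    repeat_visits = risk_revisits_per_year * 3
    return min(total_ips, slots_over_window - repeat_visits)

# With no risk-based re-visits, a 3-year cycle covers everyone...
print(distinct_ips_visited(300, 100, 0))   # 300
# ...but 10 re-visits a year leave 30 IPs unvisited within 3 years.
print(distinct_ips_visited(300, 100, 10))  # 270
```

In other words, unless visiting capacity rises above that 33% line, every risk-based re-visit displaces someone off the 3-year rota.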

My perception on the outside is that, generally, the timing of visits is pretty predictable and is now pretty-much 3-yearly. I’ve seen no early parachuting-in on the basis of risk assessments, although I accept that my field of vision is very narrow.

 

Most RPBs report reductions in negative outcomes from monitoring visits

The following illustrates the percentage of monitoring visits that resulted in a “negative outcome” (my phrase):

Graph8

As you can see, most RPBs are clocking up between c.10% and 20% of monitoring visits leading to some form of negative consequence and, although individual records have fluctuated considerably in the past, the overall trend across all the regulatory bodies has fallen from 30% in 2008 to 20%.

However, two bodies seem to be bucking the trend: CARB and the SoS.

Last year, I didn’t include CARB (the regulatory body for members of the Institute of Chartered Accountants in Ireland), because its membership was relatively small. It still licenses only 41 appointment-taking IPs – only 3% of the population – but, with the exit of SoS authorisations, I thought it was worth adding them to the mix.

I am sure that CARB’s apparently erratic history is a consequence of its small population of licensed IPs, and this may well explain why it is still recording a much greater percentage of negative outcomes than the other RPBs. Nevertheless, CARB does seem to have recorded exceptionally high levels for the past few years.

The high SoS percentage is a little surprising: 50% of all 2015 visits resulted in some form of negative outcome – these were all “plans for improvement”. CARB’s were a mixture of targeted visits, undertakings and one penalty/referral for disciplinary consideration.

So what kind of negative outcomes are being recorded by the other RPBs? Are there any preferred strategies for dealing with IPs falling short of expected standards?

 

What responses are popular for unsatisfactory visits?

The following illustrates the actions taken by the top three RPBs over the last 4 years:

Graph9

* The figures for ICR/self certifications requested and further visits should be read with caution. These categories do not appear in every annual review, but, for example, it is clear that RPBs have been conducting targeted visits, so this graph probably does not show the whole picture for the 2012 and 2013 outcomes.  In addition, of course the ICAEW requires all IPs to carry out annual ICRs, so it is perhaps not surprising that this category has rarely featured.

I think that all this graph suggests is that there is no trend in outcome types!  I find this comforting: it might be difficult to predict what outcome to expect, but it suggests to me that the RPBs are flexible in their approaches and will implement whatever tool they think best fits the task.

 

Looking back on 6 years of SIP16 monitoring

We all remember how, over the years, so many people got hot under the collar about pre-packs, and we recall some appallingly misleading headlines suggesting that around one third of IPs were failing to comply with regulations. Where have the 6 years of InsS monitoring of SIP16 Statements got us?  I will dodge that question and simply illustrate the statistics:

Graph10

Note: several years are “estimates” because the InsS did not always review all the SIP16 Statements it received. Also, the Service ended its monitoring in October 2015.  Therefore, in these cases I have taken the stats and pro-rated them up to a full year’s worth.
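For what it’s worth, the pro-rating is just this (the 250 Statements over 10 months below are a hypothetical example, not the Service’s actual count):

```python
# The pro-rating applied to the partial-year SIP16 figures: scale a count
# observed over N months up to a 12-month estimate.
def pro_rate_to_full_year(observed_count, months_covered):
    """Scale a partial-year count to an estimated 12-month figure."""
    return observed_count * 12 / months_covered

# E.g. monitoring that stopped in October covers 10 months of the year:
print(pro_rate_to_full_year(250, 10))  # 300.0 – the full-year estimate
```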

Does the graph above suggest that a consequence of SIP16 monitoring has been to discourage pre-packs? Well, have a look at this one…

Graph11

As you can see, the dropping number of SIP16s has more to do with the drop in Administrations. In fact, the percentage of pre-packs has not changed much: it peaked at 31% of all Administrations in 2012 and was at its lowest, 24%, in 2014.

I guess it could still be argued that the SIP16 scrutiny has persuaded some to sell businesses/assets in the pre (or immediately post) liquidation period, rather than use Administration.  I’m not sure how to test that particular theory.

So, back to SIP16 compliance, the graph-but-one above shows that the percentage of Statements that were compliant has increased. It might be easier to see from the following:

Graph12

Unequivocal improvements in SIP16 compliance – there’s a good news story!

A hidden downside of all this focus on improving SIP16 compliance, I think, is the costs involved in drafting a SIP16 Statement and then, as often happens, in getting someone fairly senior in the practice to double-check the Statement to make sure that it ticks every last SIP16 box.  Is this effort a good use of resources and of estate funds?

Now that the Insolvency Service has dropped SIP16 monitoring, does that mean we can all relax a bit? I think this would be unwise.  The Service’s report states that it “will review the outcome of the RPBs’ consideration of SIP16 compliance and will continue to report details in the Annual Review”, so I think we can expect SIP16 to remain a hot regulatory topic for some time to come.

 

The changing profile of pre-packs

The Service’s reports on SIP16 Statements suggest other pre-pack trends:

Graph13

Personally, I’m surprised at the number of SIP16 Statements that disclose that the business/assets were marketed by the Administrator: last year it was 56%. I’m not sure if that’s because some SIP16 Statements are explaining that the company was behind some marketing activities, but, if that’s not the reason, then 56% seems very low to me.  It would be interesting to see if the revised SIP16, which introduced the “marketing essentials”, makes a difference to this rate.

 

Have some pity for the RPBs!

The Service claimed to have delivered on their commitments in 2015 (incidentally, one of their 2014 expectations was that the new Rules would be made in the autumn of 2015 and they would come into force in April 2016 – I’m not complaining that the Rules are still being drafted, but I do think it’s a bit rich for the Executive Foreword to report pleasure in having met all the 2014 “commitments”).

The Foreword states that the reduction in authorising bodies is “a welcome step”. With now only 5 RPBs to monitor and the savings made in dropping SIP16 monitoring (which was the reported reason for the levy hike in 2009), personally I struggle to see the Service’s justification for increasing the levy this year.  The report states that it was required in view of the Service’s “enhanced role as oversight regulator”, but I thought that the Service did not expect to have to flex its new regulatory muscles as regards taking formal actions against RPBs or directly against IPs.

However, the tone of the 2015 Review does suggest a polishing of the thumb-screws. The Service refers to the power to introduce a single regulator and states that this power will “significantly shape” the Service’s work to come.

In 2015, the Service carried out full monitoring visits to the ICAEW, ICAS and CARB, and a follow-up visit to the ACCA. This is certainly more visits than in previous years, but personally I question whether the visits are effective.  Of course, I am sure that the published visit reports do not tell the full stories – at least, I hope that they don’t – but it does seem to me that the Service is making mountains out of some molehills and their reports do give me the sense that they’re concerned with processes ticking the Principles for Monitoring boxes, rather than being effective and focussing on good principles of regulation.

For example, here are some of the molehill weaknesses identified in the Service’s visits that were resisted at least in part by some of the RPBs – to which I say “bravo!”:

  • Pre-visit information requested from the IPs did not include details of complaints received by the IP. The ICAEW responded that it was not convinced of the merits of asking for this on all visits but agreed to “consider whether it might be appropriate on a visit by visit basis”.
  • Closing meeting notes did not detail the scope of the visit. The ICAEW believed that it is important for the closing meeting notes to clearly set out the areas that the IP needs to address (which they do) and it did not think it was helpful to include generic information… although it seems that, by the time of the follow-up visit to the ICAEW in February 2016, this had been actioned.
  • The Service remains “concerned” that complainants are not provided with details of the independent assessor on their case. “ACCA regrets it must continue to reject this recommendation as ACCA does not believe naming assessors will add any real value to the process… There is also the risk of assessors being harassed by complainants where their decision is not favourable to them.”
  • Late bordereaux were only being chased at the start of the following month. The Service wanted procedures put in place to “ensure that cover schedules are provided within the statutory timescale of the 20th of each month and [to] follow up any outstanding returns on 21st or the next working day of each month”. Actually, CARB agreed to do this, but it’s just a personal bugbear of mine. The Service’s report to the ICAEW went on about the “vital importance” of bonding – with which I agree, of course – but it does not follow that any bordereaux sent by IPs to their RPB “demonstrate that they have sufficient security for the performance of their functions”. It simply demonstrates that the IP can submit a schedule on time every month. I very much suspect that bordereaux are not checked on receipt by the RPBs – what are they going to do: cross-check bordereaux against Gazette notices? – so simply enforcing a zero-tolerance attitude to meeting the statutory timescale is missing the point and seems a waste of valuable resources, doesn’t it?

 

Future Focus?

The Annual Review describes the following on the Insolvency Service’s to-do list:

  • Complaint-handling: in 2015, the Service explored the RPBs’ complaint-handling processes and application of the Common Sanctions Guidance. The Service has made a number of recommendations to improve the complaints process and is in discussion with the RPBs. They expect to publish a full report on this subject “shortly”.
  • Debt advice: also in 2015, they carried out a high-level review of how the RPBs are monitoring IPs’ provision of debt advice and they are currently considering recommendations for discussion with the RPBs.
  • Future themed reviews: The Service is planning themed reviews (which usually mean topic-focussed questionnaires to all RPBs) over 2016 and 2017 covering: IP monitoring; the fees rules; and pre-packs.
  • Bonding: the Service has been examining “the type and level of cover offered by bonds and considering both the legislative and regulatory arrangements to see if they remain fit for purpose”. They are cagey about the outcomes but do state that they “will work with the industry to effect any regulatory changes that may be necessary” and they refer to “any legislative change” being subject to consultation.
  • Relationship with RPBs: the Service is contemplating whether the Memorandum of Understanding (“MoU”) with the RPBs is still needed, now that there are statutory regulatory objectives in place. The MoU is a strange animal – https://goo.gl/J6wmuN. I think that it reads like a lot of the SIPs: a mixture of principles and prescription (e.g. a 10-day acknowledgement of complaints); and a mixture of important standards and apparent OTT trivia. It would be interesting to see how the Service approaches monitoring visits to the RPBs if the MoU is removed: they will have to become smarter, I think.
  • Ethics? The apparent focus on ethical issues seems to have fallen from the list this year. In 2015, breaches of ethics moved from third to second place in the list of complaints received by subject matter (21% in 2014 and 27% in 2015), but reference to the JIC’s work on revising the Ethics Code has not been repeated in this year’s Review. Presumably the work is ongoing… although there are certainly more than enough other tasks to keep the regulators busy!

 

 



Is the IP regulation system fair?


The Insolvency Service’s 2015 review of IP regulation was released in March and, as usual, I’ve dug around the statistics in comparison with previous years.

They indicate that complaint sanctions have increased (despite complaint numbers dropping), but monitoring sanctions have fallen. Why is this?  And why was one RPB alone responsible for 93% of all complaints sanctions?

The Insolvency Service’s report can be found at https://goo.gl/HlATlf.

I honestly had no idea that the R3 member survey issued earlier today was going to ask about the effectiveness of the regulatory system. I would encourage R3 members to respond to the survey (but don’t let this blog post influence you!).

IP number falls to 6-year low

I guess it was inevitable: no IP welcomes the hassle of switching authorising body and word on the street has always been that being authorised by the SoS is a far different experience to being licensed by an RPB. Therefore, I think that the withdrawal from authorising by the SoS (even with a run-off period) courtesy of the Deregulation Act 2015 and the Law Societies was likely to affect the IP numbers.

Here is how the landscape has shifted:

Graph1

As you can see, the remaining RPBs have not gained all that the SoS and Law Societies have lost and ACCA’s and CARB’s numbers have dropped since last year. It is also a shame to note that, not only has the IP number fallen for the first time in 4 years, it has also dropped to below the 2010 total.

Personally, I expect the number to drop further during 2016: I am sure that the prospect of having to adapt to the new Insolvency Rules 2016 along with the enduring fatigue of struggling to get in new (fee-paying) work and of taking the continual flak from regulators and government will persuade some to hang up their boots. I also don’t see that the industry is attracting sufficient new joiners who are willing and able to take up the responsibility, regardless of the government’s partial licence initiative that has finally got off the ground.

Maybe this next graph will make us feel a bit better…

Number of regulatory sanctions falls

Graph2

Although the numbers are spiky, I guess there is some comfort to be had in seeing that the regulatory bodies issued fewer sanctions against IPs in 2015. [To try to put 2010’s numbers into context, you’ll remember that 1 January 2009 was the start of the Insolvency Service’s monitoring of the revised SIP16, which led to a number of referrals to the RPBs, although I cannot be certain that this was behind the unusual 2010 peak in sanctions.]

But what interests me is that the number of sanctions in 2015 arising from complaints far outstripped those arising from monitoring visits, which seems quite a departure from the picture of previous years. What is behind this?  Is it simply a consequence of our growing complaint-focussed society?

Complaints on the decrease

Graph3

Well actually, as you can see here, it seems that fewer complaints were registered last year… by quite a margin.

I confess that some of these years are not like-for-like comparisons: before the Complaints Gateway, the RPBs were responsible for reporting to the Insolvency Service how many complaints they had received and it is very likely that they incorporated some kind of filter – as the Service does – to deal with communications received that were not truly complaints. However, it cannot be said for certain that the RPBs’ pre-Gateway filters worked in the same way as the Service’s does now.  Nevertheless, what this graph does show is that 2015’s complaints referred to the regulatory bodies were fewer than 2014’s (which was c.half a Gateway year – the “Gateway (adj.)” column represents a pro-rated full 12 months of Gateway operation based on the partial 2014 Gateway number).

It is also noteworthy that the Insolvency Service is chalking up a similar year-on-year percentage of complaints filtered out: in 2014, this ran at 24.5% of the complaints received, and in 2015, it was 26.5%.

So, if there were fewer complaints lodged, then why have complaints sanctions increased?

How long does it take to process complaints?

The correlation between complaints lodged and complaint sanctions is an interesting one:

Graph4

Is it too great a stretch of the imagination to suggest that complaint sanctions take somewhere around 2 years to emerge? I suggest this because, as you can see, the 2010/11 sanction peak coincided with a complaints-lodged trough and the 2013 sanctions trough coincided with a complaints lodged peak – the pattern seems to show a 2-year shift, doesn’t it..?

I am conscious, however, that this could simply be a coincidence: why should sanctions form a constant percentage of all complaints?  Perhaps the sanctions simply have formed a bit of a random cluster in otherwise quiet years.

Could there be another reason for the increased complaints sanctions in 2015?

One RPB breaks away from the pack

Graph5

How strange! Why has the IPA issued so many complaints sanctions when compared with the other RPBs?

I have heard more than one IP suggest that the IPA licenses more than its fair share of IPs who fall short of acceptable standards of practice. Personally, I don’t buy this.  Also, more sanctions don’t necessarily mean that more sanctionable offences are going on.  It reminds me of the debates that often surround the statistics on crime: does an increase in convictions mean that more crimes are being committed or does it mean that the police are getting better at dealing with them?

Nevertheless, the suggestion that the IPA’s licensed population is different might help explain the IPA peak in sanctions, mightn’t it? To test this out, perhaps we should compare the number of complaints received by each RPB.

Graph6

Ok, so yes, IPA-licensed IPs have received more complaints than other RPBs (although SoS-authorised IPs came out on top again this past year).  If the complaints were shared evenly, then 58% of all IPA-licensed IPs would have received a complaint last year, compared to only 43% of those licensed by the other three largest RPBs.  I hasten to add that, personally, I don’t think this indicates differing standards of practice depending on an IP’s licensing body: it could indicate that IPA-licensed (and perhaps also SoS-authorised) IPs work in a more complaints-heavy environment, as I mention further below.

Nevertheless, let’s see how these complaints-received numbers would flow through to sanctions, if there were a direct correlation. For simplicity’s sake, I will assume that a complaint lodged in 2013 concluded in 2015 – although I think this is highly unlikely to be the average, I think it could well be so for the tricky complaints that lead to sanctions.  This would mean that, across all the RPBs (excluding the Insolvency Service, which has no power to sanction SoS-authorised IPs in respect of complaints), 12% of all complaints led to sanctions.  On this basis, the IPA might be expected to issue 36 complaint-led sanctions, so this doesn’t get us much closer to explaining the 76 sanctions issued by the IPA.
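For anyone who wants to check my arithmetic, the back-of-envelope calculation runs like this (the 12% rate and the 76 actual sanctions are from the figures above; the IPA complaint count is simply backed out from my 36-sanction estimate, so treat that input as illustrative only):

```python
# The 12% sanction rate and the 76 actual IPA sanctions come from the
# figures above; the 300 IPA complaints are implied by the 36-sanction
# estimate (0.12 * 300 = 36), so that input is illustrative, not sourced.
overall_sanction_rate = 0.12       # complaints-to-sanctions rate, all RPBs
ipa_complaints_2013 = 300          # hypothetical, backed out from the estimate
actual_ipa_sanctions_2015 = 76

expected = overall_sanction_rate * ipa_complaints_2013
print(round(expected))                        # 36 expected sanctions
print(actual_ipa_sanctions_2015 / expected)   # more than double the expectation
```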

I can suggest some factors that might be behind the increase in the number of complaints sanctions granted by the IPA:

  • The IPA licenses the majority of IVA-specialising IPs, which do seem to have attracted more than the average number of sanctions: last year, two IPs alone were issued with seven reprimands for IVA/debtor issues.
  • The IPA’s process is that matters identified on a monitoring visit that are considered worthy of disciplinary action are passed from the Membership & Authorisation Committee to the Investigation Committee as internal complaints. Therefore, I think this may lead to some IPA “complaint” sanctions actually originating from monitoring visits. However, analysis of the sanctions arising from monitoring visits (which I will cover in another blog) indicates that the IPA sits in the middle of the RPB pack, so it doesn’t look like this is a material factor.
  • Connected to the above, the IPA’s policy is that any incidence of unauthorised remuneration spotted on monitoring visits is referred to the Investigation Committee for consideration for disciplinary action. Given that it seems that such incidences include failures that have already been rectified (as explained in the IPA’s September 2015 newsletter) and that unauthorised remuneration can arise from a vast range of seemingly inconspicuous technical faults, I would not be surprised if this practice were to result in more than a few unpublished warnings and undertakings.

But this cannot be the whole story, can it? The IPA issued 93% of all complaints sanctions last year, despite only licensing 35% of all appointment-takers.  The previous year followed a similar pattern: the IPA issued 82% of all complaints sanctions.

To put it another way, over the past two years the IPA issued 111 complaints sanctions, whilst all the other RPBs put together issued only 14 sanctions.

What is going on? It is difficult to tell from the outside, because the vast majority of the sanctions are not published.  Don’t get me wrong, I’m not complaining about that.  If the sanctions were evenly spread, I could not believe that c.16% of all IPA-licensed IPs conducted themselves so improperly that they merited the punitive publicity that .gov.uk metes out to IPs (what other individual professionals are flogged so publicly?!).

The Regulators’ objective to ensure fairness

This incongruence, however, makes me question the fairness of the RPBs’ processes.  It cannot be fair for IPs to endure different treatment depending on their licensing body.

You might say: what’s the damage, when the majority of sanctions went unpublished? I have witnessed the anguish that IPs go through when a disciplinary committee is considering their case, especially if that process takes years to conclude.  It hung like a sword of Damocles over many of my conversations with the IPs.  The apparent disparity in treatment also does not help those (myself included) who argue that a multiple-regulator system can work well.

One of the new regulatory objectives introduced by the Small Business Enterprise & Employment Act 2015 was to secure “fair treatment for persons affected by [IPs’] acts and omissions”, but what about fair treatment for IPs?  In addition, isn’t it possible that any unfair treatment of IPs will trickle down to those affected by their acts and omissions?

The Insolvency Service has sight of all the RPBs’ activities and conducts monitoring visits on them regularly. Therefore, it seems to me that the Service is best placed to explore what’s going on and to ensure that the RPBs’ processes achieve consistent and fair outcomes.

 

In my next blog, I will examine the Service’s monitoring of the RPBs as well as take a closer look at the 2015 statistics on the RPBs’ monitoring of IPs.