Insolvency Oracle

Developments in UK insolvency by Michelle Butler

Monitoring the monitors: targeting consistency and transparency



The Insolvency Service’s 2014 Review had the target of transparency at its core. This time, the Insolvency Service has added consistency.  Do the Annual Reviews reveal a picture of consistency between the RPBs?

My second post on the Insolvency Service’s 2015 Annual Review of IP regulation looks at the following:

  • Are the RPBs sticking to a 3-year visit cycle?
  • How likely is it that a monitoring visit will result in some kind of regulatory action?
  • What action are the RPBs likely to take and is there much difference between the RPBs?
  • What can we learn from 6 years of SIP16 monitoring?
  • How have the RPBs been faring in their own monitoring visits conducted by the Insolvency Service?
  • What have the Service set in their sights for 2016?

 

RPBs converge on a 3-yearly visit cycle

The graph of the percentages of IPs that had a monitoring visit last year gives me the impression that a 3-yearly visit cycle has most definitely become the norm:

[Graph 7: percentage of IPs receiving a monitoring visit in the year, by RPB]

(Note: because the number of SoS IPs dropped so significantly during the year – from 40 to 28 – all the graphs in this article use a 2015 mid-point figure of 34 SoS-authorised IPs.)
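For clarity, the mid-point is simply the average of the start- and end-of-year counts – a minimal sketch, assuming that is how the figure was derived:

    # Sketch of the 2015 mid-point figure for SoS-authorised IPs
    # 40 and 28 are the start- and end-of-year counts quoted above;
    # taking their simple average is my assumption about how the mid-point was calculated
    sos_ips_start, sos_ips_end = 40, 28
    midpoint = (sos_ips_start + sos_ips_end) / 2
    print(f"2015 mid-point of SoS-authorised IPs: {midpoint:.0f}")   # prints 34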

Does this mean that IPs can predict the timing of their next routine visit? I’m not sure.  It seems to me that some standard text is slipping into the Insolvency Service’s reports on their monitoring visits to the RPBs.  The words: “[RPB] operates a 3-year cycle of rolling monitoring visits to its insolvency practitioners. The nature and timing of visits is determined annually on a risk-assessment basis” have appeared in more than one InsS report.

What do these words mean: that every IP is visited once in three years, but some are moved up or down the list depending on their risk profile? Personally, this doesn’t make sense to me: either visits are timed according to a risk assessment or they are carried out on a 3-year cycle; I don’t see how you can achieve both.  If visit timings are sensitive to risk, then some IPs are going to receive more than one visit in a 3-year period and, unless the RPB records more than 33% of its IP number as having a visit every year (which the graph above shows is generally not the case), the corollary will be that some IPs won’t be visited within a 3-year period.
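To put some purely hypothetical numbers on that point (an RPB of 100 IPs, a visiting capacity of about a third of them a year, and a handful of risk-driven repeat visits – none of these figures come from the Review):

    # Illustrative arithmetic only – all of these numbers are hypothetical, not taken from the Review
    ips_licensed = 100        # an RPB licensing 100 appointment-taking IPs
    annual_visits = 33        # roughly a third of IPs visited each year, the apparent ceiling in the graph
    risk_repeat_visits = 5    # early, risk-driven visits to IPs already seen within the last 3 years

    routine_visits_per_year = annual_visits - risk_repeat_visits
    ips_covered_in_cycle = routine_visits_per_year * 3
    print(f"IPs receiving a routine visit within 3 years: {ips_covered_in_cycle} of {ips_licensed}")

If the risk-driven visits take up part of a fixed annual capacity, the routine cycle can only reach 84 of the 100 IPs within three years – which is the corollary described above.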

My perception from the outside is that, generally, the timing of visits is pretty predictable and is now pretty much 3-yearly. I’ve seen no early parachuting-in on the basis of risk assessments, although I accept that my field of vision is very narrow.

 

Most RPBs report reductions in negative outcomes from monitoring visits

The following illustrates the percentage of monitoring visits that resulted in a “negative outcome” (my phrase):

[Graph 8: percentage of monitoring visits resulting in a “negative outcome”, by RPB]

As you can see, most RPBs record some form of negative consequence from roughly 10% to 20% of monitoring visits and, although individual records have fluctuated considerably in the past, the overall figure across all the regulatory bodies has fallen from 30% in 2008 to 20%.

However, two bodies seem to be bucking the trend: CARB and the SoS.

Last year, I didn’t include CARB (the regulatory body for members of the Institute of Chartered Accountants in Ireland), because its membership was relatively small. It still licenses only 41 appointment-taking IPs – around 3% of the population – but, with the exit of SoS authorisations, I thought it was worth adding it to the mix.

I am sure that CARB’s apparently erratic history is a consequence of its small population of licensed IPs, and this may well explain why it is still recording a much greater percentage of negative outcomes than the other RPBs. Nevertheless, CARB does seem to have recorded exceptionally high levels for the past few years.

The high SoS percentage is a little surprising: 50% of all 2015 visits resulted in some form of negative outcome – these were all “plans for improvement”. CARB’s were a mixture of targeted visits, undertakings and one penalty/referral for disciplinary consideration.

So what kind of negative outcomes are being recorded by the other RPBs? Are there any preferred strategies for dealing with IPs falling short of expected standards?

 

What responses are popular for unsatisfactory visits?

The following illustrates the actions taken by the top three RPBs over the last 4 years:

[Graph 9: actions taken by the top three RPBs over the last 4 years]

* The figures for ICRs/self-certifications requested and further visits should be read with caution. These categories do not appear in every annual review but, for example, it is clear that RPBs have been conducting targeted visits, so this graph probably does not show the whole picture for the 2012 and 2013 outcomes.  In addition, of course, the ICAEW requires all IPs to carry out annual ICRs, so it is perhaps not surprising that this category has rarely featured.

I think that all this graph suggests is that there is no trend in outcome types!  I find this comforting: it might be difficult to predict what outcome to expect, but it suggests to me that the RPBs are flexible in their approaches and will implement whatever tool they think is best fitted for the task.

 

Looking back on 6 years of SIP16 monitoring

We all remember how, over the years, so many people seemed to get hot under the collar about pre-packs, and we recall some appallingly misleading headlines suggesting that around one third of IPs were failing to comply with regulations. Where have the 6 years of InsS monitoring of SIP16 Statements got us?  I will dodge that question and simply illustrate the statistics:

[Graph 10: 6 years of SIP16 monitoring statistics]

Note: several years are “estimates” because the InsS did not always review all the SIP16 Statements they received. Also, the Service ended its monitoring in October 2015.  Therefore, in those cases I have taken the stats and pro-rated them up to a full year’s worth.
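In other words, the partial-year figures have been scaled up in proportion to the period covered. A minimal sketch of that pro-rating (the statement count below is purely illustrative, not the InsS’s actual figure):

    # Minimal sketch of annualising partial-year SIP16 figures (illustrative numbers only)
    statements_reviewed = 500   # hypothetical count of SIP16 Statements reviewed in the period
    months_covered = 10         # monitoring ran to October 2015, i.e. 10 months of the year

    annualised_estimate = statements_reviewed * 12 / months_covered
    print(f"Pro-rated full-year estimate: {annualised_estimate:.0f} Statements")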

Does the graph above suggest that a consequence of SIP16 monitoring has been to discourage pre-packs? Well, have a look at this one…

[Graph 11: numbers of SIP16 Statements and Administrations]

As you can see, the dropping number of SIP16s has more to do with the drop in Administrations. In fact, the percentage of pre-packs has not changed much: it peaked at 31% of all Administrations in 2012 and was at its lowest, 24%, in 2014.

I guess it could still be argued that the SIP16 scrutiny has persuaded some to sell businesses/assets in the pre- (or immediately post-) liquidation period, rather than use Administration.  I’m not sure how to test that particular theory.

So, back to SIP16 compliance: the graph-but-one above shows that the percentage of compliant Statements has increased. It might be easier to see from the following:

[Graph 12: percentage of compliant SIP16 Statements]

Unequivocal improvements in SIP16 compliance – there’s a good news story!

A hidden downside of all this focus on improving SIP16 compliance, I think, is the cost involved in drafting a SIP16 Statement and then, as often happens, in getting someone fairly senior in the practice to double-check the Statement to make sure that it ticks every last SIP16 box.  Is this effort a good use of resources and of estate funds?

Now that the Insolvency Service has dropped SIP16 monitoring, does that mean we can all relax a bit? I think this would be unwise.  The Service’s report states that it “will review the outcome of the RPBs’ consideration of SIP16 compliance and will continue to report details in the Annual Review”, so I think we can expect SIP16 to remain a hot regulatory topic for some time to come.

 

The changing profile of pre-packs

The Service’s reports on SIP16 Statements suggest other pre-pack trends:

[Graph 13: pre-pack trends disclosed in SIP16 Statements]

Personally, I’m surprised at the proportion of SIP16 Statements that disclose that the business/assets were marketed by the Administrator: last year it was 56%. I’m not sure if that’s because some SIP16 Statements explain that the company was behind some marketing activities, but, if that’s not the reason, then 56% seems very low to me.  It would be interesting to see if the revised SIP16, which introduced the “marketing essentials”, makes a difference to this rate.

 

Have some pity for the RPBs!

The Service claimed to have delivered on their commitments in 2015 (incidentally, one of their 2014 expectations was that the new Rules would be made in the autumn of 2015 and they would come into force in April 2016 – I’m not complaining that the Rules are still being drafted, but I do think it’s a bit rich for the Executive Foreword to report pleasure in having met all the 2014 “commitments”).

The Foreword states that the reduction in authorising bodies is “a welcome step”. With only 5 RPBs now to monitor and the savings made from dropping SIP16 monitoring (which was the reported reason for the levy hike in 2009), I personally struggle to see the Service’s justification for increasing the levy this year.  The report states that it was required in view of the Service’s “enhanced role as oversight regulator”, but I thought that the Service did not expect to have to flex its new regulatory muscles as regards taking formal actions against RPBs or directly against IPs.

However, the tone of the 2015 Review does suggest a polishing of the thumb-screws. The Service refers to the power to introduce a single regulator and states that this power will “significantly shape” the Service’s work to come.

In 2015, the Service carried out full monitoring visits to the ICAEW, ICAS and CARB, and a follow-up visit to the ACCA. This is certainly more visits than in previous years, but personally I question whether the visits are effective.  Of course, I am sure that the published visit reports do not tell the full stories – at least, I hope that they don’t – but it does seem to me that the Service is making mountains out of some molehills and their reports do give me the sense that they’re concerned with processes ticking the Principles for Monitoring boxes, rather than being effective and focussing on good principles of regulation.

For example, here are some of the molehill weaknesses identified in the Service’s visits that were resisted at least in part by some of the RPBs – to which I say “bravo!”:

  • Pre-visit information requested from the IPs did not include details of complaints received by the IP. The ICAEW responded that it was not convinced of the merits of asking for this on all visits but agreed to “consider whether it might be appropriate on a visit by visit basis”.
  • Closing meeting notes did not detail the scope of the visit. The ICAEW believed that it is important for the closing meeting notes to clearly set out the areas that the IP needs to address (which they do) and it did not think it was helpful to include generic information… although it seems that, by the time of the follow-up visit to the ICAEW in February 2016, this had been actioned.
  • The Service remains “concerned” that complainants are not provided with details of the independent assessor on their case. “ACCA regrets it must continue to reject this recommendation as ACCA does not believe naming assessors will add any real value to the process… There is also the risk of assessors being harassed by complainants where their decision is not favourable to them.”
  • Late bordereaux were only being chased at the start of the following month. The Service wanted procedures put in place to “ensure that cover schedules are provided within the statutory timescale of the 20th of each month and [to] follow up any outstanding returns on 21st or the next working day of each month”. Actually, CARB agreed to do this, but it’s just a personal bug-bear of mine. The Service’s report to the ICAEW went on about the “vital importance” of bonding – with which I agree, of course – but it does not follow that any bordereaux sent by IPs to their RPB “demonstrate that they have sufficient security for the performance of their functions”. It simply demonstrates that the IP can submit a schedule on time every month. I very much suspect that bordereaux are not checked on receipt by the RPBs – what are they going to do: cross-check bordereaux against Gazette notices? – so simply enforcing a zero tolerance attitude to meeting the statutory timescale is missing the point and seems a waste of valuable resources, doesn’t it?

 

Future Focus?

The Annual Review describes the following on the Insolvency Service’s to-do list:

  • Complaint-handling: in 2015, the Service explored the RPBs’ complaint-handling processes and application of the Common Sanctions Guidance. The Service has made a number of recommendations to improve the complaints process and is in discussion with the RPBs. They expect to publish a full report on this subject “shortly”.
  • Debt advice: also in 2015, they carried out a high-level review of how the RPBs are monitoring IPs’ provision of debt advice and they are currently considering recommendations for discussion with the RPBs.
  • Future themed reviews: The Service is planning themed reviews (which usually mean topic-focussed questionnaires to all RPBs) over 2016 and 2017 covering: IP monitoring; the fees rules; and pre-packs.
  • Bonding: the Service has been examining “the type and level of cover offered by bonds and considering both the legislative and regulatory arrangements to see if they remain fit for purpose”. They are cagey about the outcomes but do state that they “will work with the industry to effect any regulatory changes that may be necessary” and they refer to “any legislative change” being subject to consultation.
  • Relationship with RPBs: the Service is contemplating whether the Memorandum of Understanding (“MoU”) with the RPBs is still needed, now that there are statutory regulatory objectives in place. The MoU is a strange animal – https://goo.gl/J6wmuN. I think that it reads like a lot of the SIPs: a mixture of principles and prescription (e.g. a 10-day acknowledgement of complaints); and a mixture of important standards and apparent OTT trivia. It would be interesting to see how the Service approaches monitoring visits to the RPBs if the MoU is removed: they will have to become smarter, I think.
  • Ethics? The apparent focus on ethical issues seems to have fallen from the list this year. In 2015, breaches of ethics moved from third to second place in the list of complaints received by subject matter (21% in 2014 and 27% in 2015), but reference to the JIC’s work on revising the Ethics Code has not been repeated in this year’s Review. Presumably the work is ongoing… although there are certainly more than enough other tasks to keep the regulators busy!

 

 

Author: insolvencyoracle

In working life, I am a partner of the Compliance Alliance, providing compliance services to insolvency practitioners in the UK. I started blogging as Insolvency Oracle in 2012 after leaving the IPA and on realising that I was now free to express my personal opinions in public.

