In this post, I analyse the Insolvency Service’s annual review of IP regulation, asking the following questions:
- Are the regulators visiting their IPs once every three years?
- How likely is it that a monitoring visit will result in some kind of negative outcome?
- How likely is a targeted visit?
- Has the Complaints Gateway led to more complaints?
- What are the chances of an IP receiving a complaint?
- How likely is it that a complaint will result in a sanction?
The Insolvency Service’s reports can be found at: http://goo.gl/MZHeHK. As I did last year (http://wp.me/p2FU2Z-6C), I have focussed only on the authorising bodies with the largest numbers of IPs (although the figures for “all” include stats for the others) and only on appointment-taking IPs. Again, regrettably, I don’t see how I can embed the graphs into this page, so they can be found at: Graphs 23-04-15. You might find it easier to read the full article along with the graphs here(2).
Monitoring Visits
- Are the regulators visiting their IPs once every three years?
Graph (i) (here(2)) looks at how much of each regulator’s population has been visited each year:
Is it a coincidence that the two regulators that were visited by the Service last year – the ACCA and the Service’s own monitoring team – have both reported huge changes in monitoring visit numbers? Of course, this graph also shows that those two regulators carried out significantly fewer monitoring visits in 2013, so perhaps they were already conscious that they had some catching-up to do.
I’m not convinced that it was the Service’s visit that prompted ACCA’s increase in inspections: the Service’s February 2015 report on its 2014 visits to the ACCA did not disclose any concerns regarding the visit cycle and I think it is noteworthy that ACCA had a lull in visits in 2010, so perhaps the 2013 trough simply reflects the natural cycle. Good on the Insolvency Service, though, for exerting real efforts, it seems, to get through lots of monitoring visits in 2014!
The trend line is interesting and reflects, I think, the shifting expectations. The Service’s Principles for Monitoring continue to set the standard of a monitoring visit once every three years with a long-stop date of six years if the regulator employs satisfactory risk assessment processes. However, I think most regulators now profess to carry out 3-yearly visits as the norm and most seem to be achieving something near this.
The ICAEW seems a little out-of-step with the other regulators, though. At their 2014 rate, it would take 4½ years to get around all their IPs. The report does explain, however, that the ICAEW also carried out 32 other reviews, most of which were “phone reviews” to new appointment-taking IPs. The Service hasn’t counted these in the stats as true visits, so neither have I.
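The visit-cycle arithmetic here is straightforward: population divided by annual visits. As a minimal sketch, using purely illustrative figures rather than the report’s actual numbers:

```python
# Years needed to visit every IP once at the current annual visit rate.
# The figures below are illustrative assumptions, not the report's numbers.
def visit_cycle_years(population: int, visits_per_year: int) -> float:
    return population / visits_per_year

# e.g. a regulator with 135 appointment-taking IPs inspecting 30 of them a year:
print(round(visit_cycle_years(135, 30), 1))  # 4.5 - well adrift of a 3-year cycle
```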
- How likely is it that a monitoring visit will result in some kind of negative outcome?
Graph (ii) (here(2)) lumps together all the negative outcomes arising from monitoring visits: further visits ordered; undertakings and confirmations; penalties; referrals for disciplinary consideration; plans for improvement; compliance/self-certification reviews requested; and licence withdrawals (3 in 2014).
It’s spiky, but you can see that, overall, around 1 in 4 visits in 2014 ended with some kind of action being needed.
Above this line, ACCA and ICAEW reported the most negative outcomes. Most of the ACCA’s negative outcomes related to the ordering of a further visit (20% of their visits). The majority of ICAEW’s negative outcomes related to the request for a compliance review (16% of their visits). Of course, ICAEW IPs are required to carry out compliance reviews every year in any event. I understand that this category involves the ICAEW specifically asking to see and consider the following year’s compliance review and/or requiring that the review be carried out by an external provider, where weaknesses in the IP’s internal review system have been identified.
I find ICAS’ flat-line rather interesting: for two years now, they have not reported any negative outcome from monitoring visits. The Service had scheduled a visit to ICAS in April this year, so I’ll be interested to see the results of that.
- How likely is a targeted visit?
Let’s take a closer look at ACCA’s ordering of further visits (graph (iii) here(2)): is this a new behaviour?
The 2015 estimated figures are based on the outcomes reported for the 2014 visits, although of course some could already have occurred in 2014.
ACCA seems to be treading a path all its own: the other RPBs – and now even the Service – don’t seem to favour targeted visits.
Complaints
- Has the Complaints Gateway led to more complaints?
It’s hard to tell. The Service’s first-year report on the Complaints Gateway said that, as it had received 941 complaints in its first 12 months – and by comparison, 748 and 578 complaints were made direct to the regulators in 2013 and 2012 respectively – “it may be that this increase in complaints reflects the improvement in accessibility and increased confidence in the simplification of the complaints process”.
However, did the pre-Gateway figures reflect all complaints received by each regulator or only those that made it through the front-line filter? If it is the latter, then the Gateway comparison figure is 699, not 941, which means that fewer complaints were received via the Gateway than previously (or at least for 2013), as this graph (iv) (here(2)) demonstrates.
The stats for 2013 are a mixture: for half of the year, the regulators were receiving the complaints direct and for the second half of the year the Gateway was in operation. It seems to me that the Service has changed its reporting methodology: for the 2013 report, the stats were the total complaints made per regulator, but the 2014 report refers to the complaints referred to each regulator.
Therefore, I don’t think we can draw any conclusions, as we don’t know on what basis the regulators were reporting complaints before the Gateway. We cannot even say with confidence that the number of complaints received in 2013/14 is significantly higher than in 2012 and earlier, as this graph suggests, because it may be that the regulators were filtering out more complaints than the Gateway is currently.
About all we can say is that marginally fewer complaints were referred from the Gateway for the second half of 2014 than for the first half.
- What are the chances of an IP receiving a complaint?
Of course, complaints aren’t something that can be spread evenly across the IP population: some IPs work in more contentious fields, others on high-profile cases, which may attract more attention. The Service’s report mentioned that the IPA is still dealing with 34 complaints from 2012/2013 that relate to the same IVA practice.
However, graph (v) (here(2)) may give you an idea of where you sit.
This illustrates that, if complaints were spread evenly, half of all IPs would receive one complaint each year – and this figure hasn’t changed a great deal over the past few years.
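The “half of all IPs” reading is simply the year’s complaints divided by the IP population. As a sketch, with assumed round figures rather than the report’s own:

```python
# Complaints per appointment-taking IP if complaints were spread evenly.
# Assumed round figures for illustration, not the report's actual stats.
def complaints_per_ip(complaints: int, ips: int) -> float:
    return complaints / ips

# e.g. 700 complaints across a population of 1,400 appointment-taking IPs:
print(complaints_per_ip(700, 1400))  # 0.5 - one complaint a year for half of all IPs
```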
As I mentioned last year, I do wonder if this graph illustrates the deterrent value of RPB sanctions: given that the Service has no power to order disciplinary sanctions on the back of complaints, perhaps it is not surprising that, year after year, SoS-authorised IPs have clocked up the most complaints. I believe that the IPA’s 2013 peak may have had something to do with the delayed IVA completion issue (as I understand that the IPA licenses the majority of IPs specialising in IVAs). It’s good to see that this is on the way down.
I am also interested in the low number of complaints recorded against ICAS-licensed IPs: maybe this supports their flat-lined monitoring visit outcomes explained above – maybe their IPs are just better behaved! Or does it reflect that individuals involved in Scottish insolvency procedures may have somewhere else to go with their complaints: the Accountant in Bankruptcy? Although the AiB website refers complainants to the RPB (shouldn’t this be to the Gateway?), it also states that they can write to the AiB, and it seems to me that the AiB’s statutory supervisory role could create a fuzzy line.
- How likely is it that a complaint will result in a sanction?
Although at first glance this graph (vi) (here(2)) appears to show that the RPBs “perform” similarly when it comes to deciding on sanctions, it does show that, on average, the IPA issues sanctions on almost twice the proportion of complaints as the RPBs taken as a whole. Also, it seems that IPA-licensed IPs are seven times more likely to be sanctioned on the back of a complaint than ICAEW-licensed IPs. The ACCA figure seems odd: no sanctions at all were reported for 2014.
Of the 43 complaints sanctions reported in 2014, 35 were issued by the IPA: that’s just over 81% of all sanctions. That’s a hefty proportion, considering that the IPA licenses only 34% of all appointment-taking IPs. It is no wonder that, at last week’s IPA conference, David Kerr commented on the complaints sanction stats and stressed the need for the RPBs to be working, and disclosing, consistently on complaints-handling.
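The disproportionality can be put in one line: the IPA’s share of sanctions divided by its share of licensed IPs. Using the figures quoted above (35 of 43 sanctions, and a 34% licence share):

```python
# How far the IPA's share of complaint sanctions outstrips its share of IPs.
# Sanction counts and the 34% licence share are as quoted in the post.
ipa_sanctions, total_sanctions = 35, 43
ipa_licence_share = 0.34

sanction_share = ipa_sanctions / total_sanctions
print(f"{sanction_share:.0%}")                       # 81% of all sanctions
print(round(sanction_share / ipa_licence_share, 1))  # 2.4x its share of IPs
```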
Overview
Finally, let’s look at the negative outcomes from monitoring visits and complaints sanctions together (graph (vii) here(2)).
Of course, this doesn’t reflect the severity of the outcomes: included here is anything from an unpublicised warning (when the RPB discloses them to the Service) to a licence withdrawal. And, despite what I said earlier about the timing of the Service’s visit to the ACCA, I am still tempted to suggest that perhaps the Service’s visits have pushed the regulators – the Insolvency Service’s monitoring team and ACCA – into action, as those two regulators have recorded significant jumps in activity over the past year.
The Service has a busy year planned: full monitoring visits to the ICAEW, ICAS, CARB, LSS and SRA (although that may be scaled back given the SRA’s decision to pull out of IP-licensing), and a follow-up visit to the ACCA. No visit planned to the IPA? Perhaps that suggests that the Service is looking as closely at these stats as I am.