CX Matters Podcast S1EP15 Launch of the Call Centre Rankings Reports

Launch of the ACXPA Call Centre Rankings Report

In this episode of the CX Matters Podcast, host and ACXPA CEO Justin Tippett speaks to the new General Manager of Quality Insights about the launch of the Call Centre Rankings report.

It’s a great insight into the passion and experience behind the new reports, along with a high-level overview of how the assessment model works.

Listen now to the Podcast

Just click play below to listen right here in your browser. If you’d rather listen on your favourite podcasting platform, you’ll find the links at the bottom of the page.

CX Matters Podcast

November 29, 2023 · Season 1 · Episode 15

Launch of Australian Call Centre Rankings Report

15 Min, 45 Sec · By ACXPA


Watch the CX Matters Podcast live recording

Launch of the Call Centre Rankings Report CX Matters Podcast
Read the Transcript from this CX Matters Podcast

G’day everyone, and welcome to the CX Matters Podcast. My name is Justin Tippett. It’s been a while since I’ve spoken to you all, but there’s a good reason for that. We’ve been flat out launching the Call Centre Rankings report, and you may have already seen some of the results bandied around on LinkedIn, etc.

I thought, well, you know what, we haven’t actually informed you, the listeners, of what it’s all about. I could tell you myself, but why don’t we get the expert, the man who’s actually behind it all? So I’m bringing in our General Manager of Quality Insights, Simon Blair. Welcome, Simon.

Hey, Justin. Thanks for having me. This is going to be a bit rough. I haven’t done a podcast in a while, so Jesus. We’ll wing it, mate.

So, look, obviously, really excited to get you on board as part of the ACXPA journey. We’ve known each other and worked together on and off for years anyway, through our CX Skills training side of things. But to go down the benchmarking path is bloody exciting. You bring literally nearly 30 years of experience to the table. So I thought the first thing to do is maybe just share with the listeners how the hell you got into benchmarking in the first place?

Well, like most of us, it’s a natural, organic sort of process; you end up doing certain things. For me, it’s just the natural evolution of the work I’ve done over 30 years in the contact centre space. I started at the front line, like a lot of us, taking calls. I worked for Telstra for the first seven or eight years of my career, a big call centre operation, and cut my teeth at the coalface as a consultant. I was a team leader for years, did training, and then moved into the quality management, quality monitoring and assessment space, working for companies like Telstra, ANZ, an outsource provider, and a few others along the way.

As a result of that experience working for companies producing core quality models, in addition to the training and the coaching, it was a natural evolution. Then I managed to get a job with a relatively new organisation, Global Reviews, who were a benchmarking company. They’d built their business on the back of website benchmarking, and they had clients basically crying out, “What about our biggest channel, the call centre?” But of course, they didn’t have any expertise there; they were website guys. They reached out to the market, and I responded. It was perfect timing, like everything when it works well. Timing is everything, and I found that role off the back of the work I was doing at ANZ.

I thought it was a good time to step into the consultancy space, and I got to basically build the contact centre benchmarking business for them, with the models I’d refined, developed and tweaked over many years internally. Since then, really, that’s been the second half of my career: as a consultant in the benchmarking space, initially for a good five years with Global Reviews, and then with my own business, 5 Degrees. Like CX Skills and ACXPA, it’s always been a combination of the training and development side and, now obviously, the benchmarking and call assessment services.

Yeah, fantastic. It really excites me in a number of different ways because I think we’re cut from the same cloth. If you’re going to be doing things like quality assessments and quality monitoring, or whatever you want to call it, there’s got to be a reason why you’re doing it. It’s not to whack people over the head; you want to improve performance. The model that we’ve got at ACXPA now is really refined in terms of helping people be able to do that.

We’ll go into the depth of the model itself in some future podcasts. But at a really high level, for people that are listening, we measure things across two key metrics. One is what we call accessibility: how easy was it to connect to a live call centre agent? The other is the quality of the experience: the interaction with the call centre agent themselves. In total, there are 48 different metrics that we capture across both of those.
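
To picture that two-pillar structure, here’s a minimal Python sketch of how an accessibility score and a quality score could roll up into an overall result. The accessibility metric names, the 0-to-1 scale and the equal weighting are illustrative assumptions, not the actual ACXPA Call Centre Rankings model; the quality keys simply reuse the five competencies mentioned later in the episode.

```python
from dataclasses import dataclass, field


@dataclass
class CallAssessment:
    # Each dict maps a metric name to a score between 0.0 and 1.0 (assumed scale).
    accessibility_metrics: dict[str, float] = field(default_factory=dict)
    quality_metrics: dict[str, float] = field(default_factory=dict)

    @staticmethod
    def _average(metrics: dict[str, float]) -> float:
        # Simple unweighted average; the real model's weightings aren't shown here.
        return sum(metrics.values()) / len(metrics) if metrics else 0.0

    def accessibility_score(self) -> float:
        """How easy it was to reach a live agent (IVR, wait time, audio, etc.)."""
        return self._average(self.accessibility_metrics)

    def quality_score(self) -> float:
        """Quality of the conversation with the agent."""
        return self._average(self.quality_metrics)

    def overall_score(self) -> float:
        """Equal-weight blend of the two pillars (purely illustrative)."""
        return 0.5 * self.accessibility_score() + 0.5 * self.quality_score()


call = CallAssessment(
    accessibility_metrics={"ivr_clarity": 0.8, "wait_time": 0.6, "audio_quality": 0.9},
    quality_metrics={"engage": 0.7, "discover": 0.5, "educate": 0.8, "close": 0.6, "energy": 0.9},
)
print(f"Accessibility {call.accessibility_score():.2f}, "
      f"Quality {call.quality_score():.2f}, Overall {call.overall_score():.2f}")
```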

If I start with accessibility, just at a high level, as we said, it’s about how easy or difficult it is for people to get through. What are some of the things that we’re looking for within that? Ease of use and speed. Let’s face it, as consumers we hate waiting and having to spend time navigating through menus. The quicker, simpler and easier, the better. The benchmark for accessibility is built around that, certainly rewarding those companies that do make it quick and easy.

But we still want to get to the right area. We certainly appreciate having an IVR and the options there, certainly for the bigger operations. At the same time, we want the options to be very simple, very obvious, very straightforward, especially when we’re calling as prospective customers for sales. That’s a big one, isn’t it? Absolutely. Just get us through. We’re a new customer; get us speaking to someone as quickly as we can. Have the right options on the IVR, low wait times, make sure the audio is nice and clear, etc. All of those things are wrapped into that accessibility score.

Now, of course, things go wrong sometimes. We’d like to think the world’s perfect; we know it’s not, especially in the call centre world. So there are also some deductions that come off those scores for things that can seriously impact the customer experience. One example is from the most recent review we put out, around some car insurance results, where for one organisation it was very clear and obvious from their recorded messaging before and after the menu system. On two calls there were four messages, and on another call there were five, which basically said, “Go to our website, go online. You should go online, or we can send you a text message so you can go online.” That was before we even got to the queue, and it’s like, well, I’ve picked up the phone for a reason. Stop trying to push me away from the channel I’ve chosen. It wasn’t a great first impression for a prospective customer looking to switch providers. It certainly wasn’t. Without dating this podcast too much (it’ll be published today anyway), yesterday I filmed a story for Channel 7, which I think is airing tonight or tomorrow, somewhere around there.

As part of the filming, I actually rang that particular company on speakerphone and had that very same experience. I don’t know if that’ll end up on the cutting room floor or not, but I had to do some fake reactions like, “Ah, this is crazy.” It wasn’t a great experience, put it that way. So we’re looking for those things around accessibility as part of that metric. The other one, of course, is around quality. We’ve got five different competencies. Simon, off the top of your head? Engage, discover, educate, close, energy. Look at that. See, because if I said it, I would have stuffed one up, I’m sure. I live and breathe this stuff. We’ll unpack those in a later episode, but the whole idea is that they’re applicable across the board.

I guess maybe this is a question that people are wanting to know: my call centre is a little bit different, or sales is different from service, etc. When you’re doing this judging or rating, is it really fair? Does it apply to my call centre? How would you respond to someone that said that?

Everyone likes to think they’re special, and I don’t criticise that because, of course, they are. But the things that they think are important, and the things they need to focus on operationally, often just aren’t the things that customers care about or need. Customers want to, in the case of sales, make a buying decision, and in the case of service, get a resolution; they’re focused on their issue. The beauty of the quality benchmark is that it’s independent of product, service and industry. What makes a great experience when you’re selling widgets also makes a great experience, in terms of call handling, if you’re selling ice to the Eskimos, for example. Regardless of the product or service, how well do you represent whatever you need to represent, your key features, your key services, your key benefits, in a way that is matched and personalised to the needs of that individual caller? That’s the heart and soul of our benchmarks and those competencies I mentioned earlier: a great start, a great finish, and what you do in the middle in terms of discovering needs, tailoring a solution, and doing it with energy. Energy: don’t be boring. That’s the number one rule I give in all my training courses, don’t be boring. With audio over the phone, that’s very important. Like us right now, we don’t talk like this in real life, do we, with this much energy? But energy is so critical for phone communication, like it is for a podcast, because you’re in the world of audio. All of those things come into play.

Yeah, I love it. We do tweak some of the criteria depending on whether it’s for sales or service, but the majority of them apply to either call. Having been a part of this for a while now, it’s on the money, as far as I’m concerned. I haven’t sort of… We’ll unpack the depth around it in another podcast. The one thing I like, Simon, is with our assessors, who you’ve trained and who’ve all got lots of experience doing the calls. As well as doing all the formal scoring, we actually get the assessors to just do the pub test score, so to speak. We just ask them to give a score out of 10 for what they thought the call was like as a customer. It’s amazing just how well that tends to align with the actual formal scores that we get.

To pull you up on something there, Justin: we have assessors who assess the calls, and I’m one of those myself; we do a portion of those. But we have separate shoppers whose sole focus is just to role-play that customer effectively and react based on the experience. As you said, with that rating, they do get to give the call a score out of 10.

When I’m assessing a call, I’ve already got that rating.

It’s a good calibration exercise to see what score gets produced and how it compares to that pure customer reaction. Yeah, it’s been a key part of our assessment training and calibration activity, whether it’s with clients or just internally with ourselves, to keep ensuring our assessment decisions are in line with the benchmark itself and producing results that are in line with the customer’s perception. Thank you for pulling me up on that. It’s an important distinction.
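
As a rough illustration of that calibration idea, the sketch below compares formal assessment scores with the shoppers’ pub-test ratings out of 10. The numbers and the simple averaging approach are assumptions for illustration, not ACXPA’s actual calibration process.

```python
from statistics import mean

# Hypothetical data: formal assessment scores vs the shoppers' gut-feel
# "pub test" ratings, both on a 0-10 scale (made-up numbers).
formal_scores = [7.2, 5.8, 8.1]
shopper_ratings = [7.0, 6.5, 8.0]

# Average absolute gap between the two; a small gap suggests the formal
# assessments line up with the customer's perception.
gaps = [abs(f - s) for f, s in zip(formal_scores, shopper_ratings)]
print(f"Average gap: {mean(gaps):.2f} points out of 10")
```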

Well, Justin, that’s why you’re doing the Channel 7 news stories, and I’m sitting in the background actually doing all the work. Yeah, I know, right? I’m normally the detail guy, so I’m loving not having to worry about the detail. This is great.

The other thing with the program, and I think again it’s just important to clarify for people, is the way we assess quality. The metrics that we use for quality, and there are 18 different ones, have been refined over literally decades, so they’re pretty robust. But we’re not looking for things around compliance in relation to your products and services, because we don’t know your products and services. If part of your scripting is that you have to say, “Welcome to ACXPA, this is Justin,” or whatever, we’re not prescribing what your greeting should be. We’re not prescribing that you have to do an ID check. We’re not prescribing all those things that might be important to your business; our quality criteria work around that. Have I explained that well, Simon?

You have. You have a future in this. A good example, again just using car insurance because it’s very heavy on those compliance recordings that we often get played, and the legals that get read out, the same as in banking: you’re never penalised for doing that, but it’s about how you do it. Do you make sure that, before and after, you manage expectations? The call shouldn’t feel transactional, even if the nature of the service is transactional. You’re still dealing with a human being, so you never want to be a walking, talking website. You’ll hear me say that phrase a lot. That’s just one example of how to make the experience less of a chore, less of a process, and more engaging and interactive. You’re never penalised for the compliance itself, and that’s a key consideration. Terrific.

The reason for splitting accessibility and quality, the whole idea, as we said right at the start, is to make sure you’ve got feedback to improve. If you’ve got a low quality score, it’s like, well, which one of those five competencies is the cause of it? Because what we want to do is help you improve, whether it’s coming to some of our symposiums or the training courses that Simon runs, which actually teach some of the fundamentals behind how to get good scores and how to give a good experience, whether it’s sales or service. We would love to see an improvement right across the industry. We’re only a couple of months in, and it’s exciting. We’re doing the third month of results at the moment, and car insurance, as we said, is a new sector that’s just about to drop in the next couple of days.
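
As a simple illustration of using the competency breakdown as feedback, the snippet below picks out the lowest-scoring competency as a coaching focus. The scores shown are made up for illustration, not real results.

```python
# Made-up competency scores on a 0-1 scale for one assessed call.
quality_metrics = {"engage": 0.7, "discover": 0.4, "educate": 0.8, "close": 0.6, "energy": 0.9}

# The competency dragging the quality score down becomes the coaching focus.
weakest = min(quality_metrics, key=quality_metrics.get)
print(f"Lowest-scoring competency: {weakest} ({quality_metrics[weakest]:.2f})")
```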

Next year, we might look at adding a couple more sectors. We’ll unpack all the other stuff around it in future episodes. If you’ve got any questions at all, you know where to find us: just drop us an email or a message on LinkedIn, or pop in a comment, and we’re happy to respond. We’ll also be looking at doing a couple of LinkedIn Lives so you can join in and ask us some questions. That’s it from us. Simon, thank you, mate, for a bit of your time. I know you’re flat out doing assessments at the moment. I’ll get back and do the real work; you just keep doing the TV thing, being the star of the show and taking all the glory. I’m fine with that. Ouch! Awesome. Well, thank you, everyone, for listening. Look forward to catching you again on a podcast or on YouTube very soon. Otherwise, bye for now. See you!

ACXPA Resources

ACXPA is packed full of great resources for people working in contact centres, customer experience, digital service and customer service, including a range of exclusive member-only benefits.

Learn more about ACXPA
ACXPA Supplier Directory
Learn more about our Memberships
View upcoming courses & events

Don’t miss another CX Matters episode! Subscribe now >

Listen to CX Matters Podcast on Spotify
Listen to CX Matters Podcast on Apple Podcasts
Listen to CX Matters Podcast on Google Podcasts
