Next 360 degree feedback seminar

We’re scheduling our next 360 degree feedback seminar for June.  We’ll confirm the exact date and location – which will be London based – in due course, but please register now if you’d like us to keep you informed as we get everything organised.  The seminar concentrates on how to effectively implement 360 degree feedback in your organisation.  We focus particularly on how to handle the debrief sessions.

Brendan

Great 360 degree feedback seminar

A note of thanks to those people who attended the seminar yesterday and to Davenport Lyons for hosting.  360 degree feedback always throws up interesting issues and ideas and yesterday was no exception.  John and I had a great time and thoroughly enjoyed everyone’s contributions.

We are starting our plans for the next seminar, so if you are interested, get in touch by selecting the webinar/seminar option from our blog or website and we’ll let you know the date and venue of the next one.  (We’re currently switching over the advertising, so if you register for a past seminar we’ll still know what you intend!)

Brendan

What is the average colour of a traffic light?

How this links to 360 degree feedback will follow!

Let’s assume we have a basic traffic light system, and we find that it shows the following distribution:

  • Red 50% of the time
  • Amber 10% of the time
  • Green 40% of the time

And someone wants to know what colour it is on average.  What to do?

Average requires numbers.  

So, let’s give Red the number 1, Amber the number 2, and Green the number 3.  A bit of maths will find the average: (50% × 1) + (10% × 2) + (40% × 3) … 1.9 is the answer.

So our average is 1.9, which is nearest to Amber (which we gave the number 2).  On average, then, the colour of the traffic light is Amber … somewhere in the middle.
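
For anyone who wants to check the arithmetic, here is a minimal sketch of that calculation in Python (the colour-to-number coding is just the arbitrary one chosen above):

    # Observed distribution of the traffic light
    distribution = {"Red": 0.50, "Amber": 0.10, "Green": 0.40}

    # The arbitrary numeric coding: Red = 1, Amber = 2, Green = 3
    coding = {"Red": 1, "Amber": 2, "Green": 3}

    # Weighted average of the coded values
    average = sum(share * coding[colour] for colour, share in distribution.items())
    print(round(average, 2))  # 1.9

    # Map the average back to the nearest colour
    nearest = min(coding, key=lambda colour: abs(coding[colour] - average))
    print(nearest)  # Amber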

We know this is wrong – the light is on amber least of all – but it was an attractive solution somehow. 

Even more tempting is to ask people to respond to a question with responses that can be

  • Strongly Agree
  • Agree
  • Neither agree nor disagree
  • Disagree
  • Strongly Disagree

and then give someone a score out of 5 based on how people answered on average.

Let’s say we have done that and the average is 3.1.  What does that mean?  We’re going to say that on average the respondents roughly "neither agree nor disagree" with the statement.  But go back to our traffic light example: the colour the traffic light shows least often is amber, so what’s to say the same hasn’t happened here?
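
To make that concrete, here is a sketch with a hypothetical (made-up) set of responses that averages exactly 3.1 even though hardly anyone picked the middle option – the 1-to-5 coding is an assumption for illustration, not a prescription:

    # Hypothetical response shares (invented for illustration, not real survey data)
    # Assumed coding: 5 = Strongly Agree ... 1 = Strongly Disagree
    responses = {
        "Strongly Agree": (5, 0.40),
        "Agree": (4, 0.05),
        "Neither agree nor disagree": (3, 0.05),
        "Disagree": (2, 0.25),
        "Strongly Disagree": (1, 0.25),
    }

    # Weighted average of the coded responses
    average = sum(score * share for score, share in responses.values())
    print(round(average, 2))  # 3.1 -- yet only 5% actually sat on the fence

The "average" respondent looks neutral, when in fact the group is split into two strongly opposed camps.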

Statisticians will tell you that the underlying problem here is that you are treating categorical data as if it were numerical data; if you are very unlucky, you may find yourself in a debate about Likert scales and the polytomous Rasch model.  What I want to highlight is that this superficially simple concept of giving each point a number and averaging the responses is not that simple and may lead to inaccurate conclusions.

Ways round it?  Either don’t use averages in your report, or make it clear that you are using the scale to reflect a mark out of 5.  That at least gives your average credibility, even if it doesn’t get around the problem of the average obscuring the underlying scoring.

Note, this doesn’t stop you using numbers or charts in your report – I’ll discuss that in another post.

This article forms part of the structure of a new white paper I am writing on reporting in 360 degree appraisals (I promise this is the heavily statistical bit and the rest is more down to what we see as best practice!).  If you sign up to our white papers, then you will receive that document as it is completed.  Click here to sign up.

Brendan

Transparent 360 degree feedback within Google

A brief note to highlight a recent article featuring an interview with Google Europe boss John Herlihy, in which he describes, amongst other things that make Google work well, how they take their people through 360 degree feedback every six months.

It is clear that they have a passion for attracting, recruiting, developing and retaining the best people, and this comes through in most articles written about Google.  What is also interesting here is how the 360 degree feedback results are shared with the whole company, providing a transparent process which presumably serves to foster a more open culture.

Certainly not for everyone, but then when did Google ever follow the crowd…?

John

360 degree feedback: throwing the baby out with the bathwater…

Here is a blog post I came across recently which lambasts a few management practices, one of these being 360 degree feedback processes.

As with many articles of this nature, I often find myself agreeing with some of what is said; poorly executed management practices, such as a badly implemented 360 degree appraisal process, can do more harm than good – so if people have a ‘bad experience’, it can colour their view about such practices permanently.

However, as with most things in life, this isn’t a ‘black & white’ situation.  While there is poor practice in evidence, there is also (certainly in our own experience) very good practice around, which suggests there is a danger of throwing the proverbial ‘baby out with the bathwater’ as one looks to kick against bad practice.

360 degree feedback should complement the whole myriad of management practices, tools and processes out there.  It isn’t a complete substitute for open, honest and regular communication between bosses and direct reports, peer to peer, etc., which should most definitely be encouraged, but it certainly adds value as organisations seek to create this kind of transparent culture, which can take time to take root.

John

Extracting wisdom from 360 degree feedback

Hal Varian, Google’s chief economist, is quoted as saying that "Data are widely available, what is scarce is the ability to extract wisdom from them".  I’m focusing heavily at the moment on debriefing 360 degree feedback and the 360 degree feedback report.  For both an upcoming seminar and a new white paper, I’m looking to fine-tune our thoughts around how we make best use of a 360 feedback process.

The current line of thinking is to consider how data becomes information becomes knowledge/wisdom.  The 360 feedback questionnaire generates data.  Our challenge is to take that data and produce information from which the recipient gains knowledge.  Along the way we have to avoid the dangers of losing information or of forming unwise conclusions.

The report and the conversation around the report is where the transformation happens and where best practice can lead to the best knowledge outcomes.

You can register for our 360 feedback seminar by clicking here.  If you are interested in our white papers then subscribe here – you would then automatically receive the white paper described above as it is produced.

Brendan

Data, data, everywhere – what to pay attention to

360 degree feedback deliberately generates data from a range of sources – it creates more data than a standard performance appraisal.  Annual performance appraisals are also increasingly seeking information from a range of sources, reflecting a move to more networked organisations and less structured boss-to-subordinate relationships.

That all makes sense to me – it is a sensible growth in data.  But when it comes to the annual performance appraisal, working out what is worth measuring is important.  I’ve been thinking about this a lot recently as I watch my wife develop a new business promoting deals and discounts for days out in the UK.

Her website has Google Analytics, which tells her how many people visit the site, which pages are popular, and so on.  The blog http://blog.topdogdays.com tells her how many people have subscribed to the blog.  Her Twitter account http://www.twitter.com/topdogdays tells her how many people are following her.  Amazon tell her how many people have bought a book having visited the site, and Google tell her how many people have clicked on an advert on the site.  Data, data, everywhere.  Eventually all of this data can distract from the purpose of the business and from managing it.  But it is highly seductive and, of course, in the early days it is great feedback.

For all of us, when reviewing performance – or setting the targets for next year – it is critical to boil the measures down to the key performance indicators, a term that makes a lot of sense but is often abused.  We need to watch the key performance indicators – not all of them.  In a previous life, I ran a call centre operation.  We had stats coming at us from all directions – all that really mattered was 1) did we answer the calls and 2) did we provide a great service when we did.  Ring time, call duration, "not ready" time and hundreds of other numbers were indicators, but not key indicators.
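
As an illustration only (the numbers below are invented and the calculation is a deliberately simplified sketch), the idea is to boil a pile of operational stats down to the two indicators that actually mattered:

    # Invented daily stats -- most of these are indicators, not key indicators
    daily_stats = {
        "calls_offered": 1200,
        "calls_answered": 1140,
        "avg_ring_time_secs": 14.0,
        "avg_call_duration_secs": 210.0,
        "not_ready_time_mins": 95.0,
        "customers_rating_service_good": 1015,
        "customers_surveyed": 1100,
    }

    # Key indicator 1: did we answer the calls?
    answer_rate = daily_stats["calls_answered"] / daily_stats["calls_offered"]

    # Key indicator 2: did we provide a great service when we did?
    service_rating = daily_stats["customers_rating_service_good"] / daily_stats["customers_surveyed"]

    print(f"Answered {answer_rate:.0%} of calls; {service_rating:.0%} rated the service good")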

Brendan
