Too much 360 feedback – time to be selective

I was listening to Radio 5 in the UK.  There was a debate about class – a standard fixation.  Initially a couple of experts in the field discussed the issues around class and social mobility, with the presenter facilitating the debate.  All good so far and I was learning stuff.  Then "Chris from Manchester" called in.  It is a feature of many news outlets now that we are all invited to contribute.  Chris wasn't an expert; he was a perfectly fine person with an opinion, and I was now listening to it.  It wouldn't be long before another 20 people called in with their opinions, and I began to learn less and less.

In 360 degree feedback we seek feedback – opinion – from a variety of people.  We sometimes see a desire to garner feedback from many, many people.  The concern is that if we don't ask everyone possible then 1) someone will feel left out and 2) we will miss a piece of valuable feedback.  At times, for certain clients, these are valid concerns and the solution is to get feedback from 20-30 people.  But usually, getting feedback from too many people causes two problems.

First, we get the "Chris from Manchester" problem.  Chris feels the need to offer an opinion even though he's not an expert (read: he doesn't actually know this person that well).  Chris writes a lot of commentary, and his scoring gets equal weight with people who spend every day with the 360 feedback recipient.

Second, when we get the 360 feedback report back, we can't see the wood for the trees.  It's like spending the day on Twitter: you know something important must be in there, but you can't find it because of the sheer volume of data coming at you.

So – unless you have specific reasons (normally cultural concerns) that you have to be aware of – be selective in who gives feedback.  The quality of your knowledge will improve.

Brendan

SMARTER objectives for performance appraisals

I was reading a book on long-distance cycle training.  The author referred to SMARTER objectives.  The acronym was used differently, although the SMART part was very similar to the HR usage.

Specific
Measurable
Agreed (he referred to sharing your goal with someone else)
Realistic
Time-phased
Exciting
Recorded

The latter two are new to me.  Recorded is fine – it just makes sense – but exciting catches the eye.  Does an exciting objective in a performance appraisal sound far-fetched?  But why not?  And if exciting is too far, surely interesting is something we could look for?

Too often you see dry goals that are unlikely to drive someone to higher performance or to offer any personal reward for achieving them.

I’ve started training to complete a cycle ride from Land’s End to John O’Groats in the UK (about 900 miles) in 9 days.  A wonderfully SMART objective – although I have dark moments where realistic is in doubt!  And I find the idea so exciting that I’m motivated for training, buying books to learn about how I can improve endurance, and putting my own milestones in place to make sure I’m on track.

Worth a thought – a SMARTER goal in an annual performance review form could lead to considerable improvement.

Brendan

Performance related pay and the annual appraisal

One author wrote that if you want to make performance appraisals really difficult then link the individual’s pay to their numerical rating.

Without judgement, we take the position that some organisations wish to use the performance appraisal process to help them determine the level of remuneration – salary or bonus – of individuals. If that is the case, then how should the performance appraisal process be run to best achieve this?

First, let us consider what is a good outcome. We would argue that a good outcome for the advocates of performance related pay is

  • Individuals motivated to achieve targets that will improve the organisation and meet the organisational strategy
  • The correct people getting the correct rewards
  • An efficient performance review process that delivers the benefits without using those benefits up in administrative burden
  • A robust process that stands scrutiny from external parties, particularly on equality

When you consider the list above you are immediately struck by the need to get the start right. It is not so much the system of calculating rewards that matters – more it is a matter of ensuring that the measures are generated well. Better that our grading structure is simple than that we skip past the step of generating fair targets.

So, first and foremost, if you are looking to implement performance related pay and are using performance appraisals to support that implementation, spend a lot of time thinking about how to get the measures right. Continuing our humble theme of not knowing what is right for you, let us describe some options that we have seen work.

  • Weighted objectives, agreed between manager and employee and cascaded from the organisational strategy and graded for achievement.
  • Value statements derived from the company values and graded for compliance.
  • KPIs developed in consultation with employees
  • Monthly targets, adjusted each month, against which employees are graded/scored
  • Team/organisation objectives against which whole teams are measured
  • Survey based data – e.g. customer satisfaction scores, against which individuals and teams are reviewed
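To make the first option concrete, here is a minimal sketch of how weighted, graded objectives might roll up into a single score.  The objective names, weights and five-point achievement scale are illustrative assumptions, not part of any particular appraisal system.

```python
# Sketch: roll weighted objective grades (1-5 achievement scale) into a
# single score.  Weights are agreed between manager and employee,
# cascaded from the strategy, and should sum to 1.0.
def weighted_grade(objectives):
    """objectives: list of (weight, achievement_score) pairs."""
    total_weight = sum(weight for weight, _ in objectives)
    if abs(total_weight - 1.0) > 1e-9:
        raise ValueError("objective weights must sum to 1.0")
    return sum(weight * score for weight, score in objectives)

# Hypothetical review: three objectives cascaded from the strategy.
review = [
    (0.5, 4),  # deliver project X on time (50% weight, scored 4/5)
    (0.3, 3),  # improve customer satisfaction (30% weight, 3/5)
    (0.2, 5),  # mentor two junior colleagues (20% weight, 5/5)
]
print(round(weighted_grade(review), 2))  # 3.9
```

The weight check matters in practice: a form that lets weights drift away from 100% quietly distorts every grade built on top of it.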

Before embarking on performance related pay we would advocate a thorough consideration of what you are looking to achieve.  If you decide that it is right for your organisation then I hope you find this note useful as a first step to delivering a robust process.

Brendan

The performance appraisal meeting

The appraisal meeting strikes fear into many managers. They fear its time-consuming nature and they fear the meeting itself. The former issue is often cultural. The time spent on performance appraisals is a fantastic investment for managers if the process is run well. A stitch in time saves nine.

The meeting itself is only feared by poorly trained managers who are uncertain how to handle it. Appraising an individual is an unnatural task for many managers, but it is a skill that can be trained.

The structure of our own training course for the performance appraisal meeting is as follows.

  • Understanding the purpose of performance management and the annual cycle
  • How to review performance in-year
  • How to conduct the end-year performance appraisal meeting
  • How to handle performance and behaviour problems
  • Use of coaching within appraisals: the GROW model
  • Core skills: listening, asking questions, giving feedback, confronting, supporting.

Contact us if you are interested in this training course, or if you would like the performance appraisal white paper that this blog post is an excerpt from.

Brendan

How often should you conduct a performance appraisal?

This is an excerpt from our performance appraisal white paper.

The common advice is that at the annual appraisal nothing should come as a surprise. Through regular feedback the manager should ensure that an employee is always aware of how things are going, where they stand, where their greatest achievements lie and where they need to develop. We agree with all of this. There remains a question of how often the organisation and the people involved want to formalise this process.

The benefit of formalising this is that it ensures no one slips under the radar, allows the organisation to gather data towards which it can direct training and other interventions, and supports company practices such as pay reviews and promotions.

Annual is too infrequent – too much changes in a year. But every structure beyond that is down to the individual organisation. Our view is that an interim six-month review is commonly what is needed for formal appraisal. A monthly meeting should also be scheduled as good practice, but keep it unbureaucratic.

Brendan

Personal development plan within annual appraisal

It may be worth pointing out at this stage that these excerpts from our performance appraisal white paper follow a suggested order of working rather than a suggested order of importance. Done properly, the personal development plan should be the most important part of the performance appraisal.  Assuming that one of the main intentions of this process is to have people improve, and so lead to improved organisational performance, then a development plan is the key.

The reason for the order is that a practical consideration is whether the development plan is part of the same annual appraisal meeting and form-completion process as the objectives, values, and scoring. Often it is not. Indeed, practical constraints aside, we would suggest that the development plan is kept separate from the appraisal form itself. It requires a slightly different mindset and lives in a different way.

In principle the development plan should describe the skills, knowledge, and behavioural changes that the individual is looking to develop over the coming time period. It generally follows that most of the development plan should flow out of the review of prior-year objectives/values and consideration of goals for the coming year. This is important and needs guidance for those completing the form. That an individual does not know Spanish and would like to learn the language is only relevant if 1) they need to know it for work or 2) the organisation has a value of broadening people's abilities.

The performance appraisal form is commonly a general HR domain, but the development plan must be produced in concert with the learning and development/training team. Much filling-in of Excel spreadsheets can be eliminated by a well-designed, online development plan.

This is an excerpt from our performance appraisal white paper.

Brendan

Performance grades in the annual performance appraisal

Your annual performance grade is something that sticks with you. It can overwhelm the whole process. But it is not the most important part of the process and indeed performance appraisal can work perfectly well without one.

If you are not looking to use the appraisal as a link to pay then consider long and hard whether you need this one line/number summary of the year.  We find it distracts the appraisal meeting and distracts the appraisal project itself – heavily influencing how objectives, values and 360s are designed and completed.

If you do need the grade (and most of our clients do) then let’s consider our options.

Commonly we see two sorts of grade: a numeric grade (e.g. 1, 2, 3, 4) or a narrative grade (e.g. Strong performer, Competent, Development required). Where we see less commonality is in how this grade is determined.

End of the form grade

Still the most common: a simple drop-down box of options that the manager selects from.

Calculated average

Seemingly growing in popularity, we see grades calculated from other ratings on the annual performance review form, or built from grades in interim reviews across the year. The most common is to grade how objectives have been completed.

Suggested calculated average with override

A late entrant, but an increasingly common request, is to calculate an average within the system and then give the manager an option to override the calculation – normally with a forced narrative field to explain the discrepancy.
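As a sketch of how this "suggested average with override" logic might look – the function and field names are hypothetical, and real systems will differ:

```python
# Sketch: a system-suggested average that a manager may override, but
# only if an explanatory narrative accompanies the override.
def final_grade(ratings, override=None, narrative=""):
    """ratings: list of numeric ratings from the review form."""
    suggested = sum(ratings) / len(ratings)
    if override is None:
        return suggested          # no override: the average stands
    if not narrative.strip():
        raise ValueError("an override must be explained with a narrative")
    return override               # manager's discretion, on the record

print(round(final_grade([3, 4, 4]), 2))                                 # 3.67
print(final_grade([3, 4, 4], override=5, narrative="Led the Q4 rescue"))  # 5
```

The forced-narrative check is the important design choice: the override remains possible, but it always leaves an auditable explanation behind it.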

Forced distribution

All of the above options can be subject to a forced distribution (e.g. 20% of people will be an A, 40% will be a B, etc.). Some form of scoring drives this distribution, which can be across the whole organisation or across departments.
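Mechanically, a forced distribution is simple: rank everyone by score and fill each band by quota.  This sketch assumes a three-band A/B/C scheme with 20/40/40 quotas; the names, scores and quotas are illustrative only.

```python
# Sketch: assign grades by ranking scores and filling band quotas in order.
def forced_distribution(scores, bands):
    """scores: {name: score}; bands: list of (grade, fraction), best first."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    grades, start = {}, 0
    for grade, fraction in bands:
        count = round(fraction * len(ranked))
        for name in ranked[start:start + count]:
            grades[name] = grade
        start += count
    for name in ranked[start:]:  # rounding leftovers fall into the last band
        grades[name] = bands[-1][0]
    return grades

scores = {"Ann": 4.2, "Ben": 3.1, "Cara": 3.8, "Dev": 2.5, "Eve": 3.5}
print(forced_distribution(scores, [("A", 0.2), ("B", 0.4), ("C", 0.4)]))
```

Note the rounding and tie-breaking decisions hidden even in this toy version – in a real scheme those edge cases (who just misses an A?) are exactly where disputes arise, which is one reason forced distributions are controversial.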

360 degree feedback influenced

The annual performance review grade or evaluation is influenced by, or calculated from, scoring on 360 feedback.  Take great care with this option.

I have to say our view is not set in stone here. In an ideal world I suspect we would avoid the annual grade – it can be distracting and it can be more controversial than it is useful. But if you are going to run performance related pay (a debate in itself) then a grade is likely to form an element of the review process. I believe then that, having followed a sensible process, managerial discretion is required on the grading. Whether that discretion is assisted – through averaging – isn't actually that important.

With such an important topic area, training is key to ensuring managers are able to apply the chosen process fairly and in a way that achieves the performance appraisal process objectives. Managers will need training and role play that ensures the gradings being presented are consistent across the organisation. This training can of course be included in training on how to handle the appraisal meeting.

The above is an excerpt from our performance appraisal white paper.

Brendan

Competency assessment as part of a performance appraisal

Competency assessments allow the organisation to set a standard set of statements against which employees are assessed. Generally those competencies are specific to particular roles.

We have clients who make extensive use of competency assessments – often with many banks of statements for a wide range of roles. At the other extreme we have clients with just one competency set for all people. It very much depends on your organisation and on the time you have to dedicate to the task.

For an organisation of any size with anything but the most homogeneous set of roles, we would recommend competencies that are role-specific, or at least job-group-specific.

We find competency assessments work particularly well in performance appraisals (as opposed to 360 degree feedback) where we are looking at roles with straightforward tasks. For example, a very effective tick-box competency assessment of a manual work role can be developed that is easy to complete and generates exactly the conversation and development plan you are looking for. Often, this competency assessment is better than an objectives section for these roles.

Take some care if you wish to introduce a 360 feedback element to this section.  Generally 360 works best as a development tool rather than an appraisal tool.  It also adds a significant burden to the process.  Gaining feedback from a range of sources can be done in a simpler manner than a full 360.

As a point of detail: the rating scale for a competency assessment suits a "strong to weak" rating rather than a frequency-based one. We suggest you use a set of words that is consistent with other areas of the form or other wordings used within the organisation.

This is an excerpt from our performance appraisal white paper.

Brendan

Free Seminar: how to successfully implement a 360 degree feedback process

Following the success of our last 360 degree feedback seminar in December, we are really pleased to announce a date for the next one in March.

Spaces will again be limited, so if you wish to register your interest then please click here.

"How to successfully implement a 360 degree feedback process"

Date: 24th March 2010
Time: 10am-12pm
Location: Davenport Lyons, London

What to expect

  • Understand the critical factors that will ensure success when introducing 360 into your business
  • Take away a checklist to help you work logically through the implementation process
  • Appreciate the key principles that will help you design a great questionnaire, communicate effectively to get company-wide 'buy-in' and facilitate face-to-face debriefs.

This seminar will be very interactive and allow plenty of opportunity to network with other delegates, discuss best practice and offer ample time for Q & A if you have specific issues to be addressed. Places will be free but limited, so if you would be interested in attending please register here and we will send you specific joining instructions in due course.

We hope to see some of you there.
John

Reviewing values in a performance appraisal

A performance appraisal offers the opportunity to discuss the values of the organisation and how the individual, in their day-to-day work, is promoting those values. A values review serves two key purposes:

  • Reminds all involved in the process of the values and the importance the organisation places on them
  • Highlights practices that are outside of the desired values

You may wish to consider who is best placed to comment on this.

Often the manager is not best placed. Colleagues and direct reports are more likely to see the actions of an individual, particularly in remote teams. 360 degree feedback is often used to gain insight into behavioural indicators – although be careful not to lose 360's key benefit of being developmental. A good manager should have enough contact with the team and colleagues to have this insight without 360 – but it is an option.

A values review can be very subjective. Compared to determining whether a well-formed objective has been achieved or partially achieved, discussing integrity or openness can be far more contentious.

The aim of the process here is generally to prompt an open conversation between the manager and the individual, so we need something simple. A clean, rating-based assessment with overall comments can offer a quick route for the manager (and perhaps the individual themselves) to give an overview and then prompt a conversation. As a matter of detail, a frequency rating scale often works well here. It is easier to answer "Displays integrity" with Often or Very Often than it is to say Good or Excellent.

The subjective nature of values reviews also presents a problem for using their scoring in an overall score or a link to performance related pay. The benefit of linking them to pay, for a number of organisations, is that it demonstrates their importance: the organisation is saying we don't only care whether you achieve the big goals, we also care how you go about the work. You have to balance the inherently difficult nature of scoring values against the benefit of demonstrating their importance.  Our instinct is not to link it to pay, but it's not a hard and fast rule.

This is an excerpt from our performance appraisal white paper.

Brendan

The future of 360 degree feedback…?

Just a brief pointer to a news story which caught my eye last week: the announcement of a new, soon-to-launch website, www.failin.gs, which offers the opportunity to get feedback from anybody who knows you.

With a slight air of whimsy, it allows users to sign up and request anonymous feedback from anyone they choose: friends, family, colleagues, etc.

Unsurprisingly, the comparison with 360 degree feedback in the workplace is made, and it naturally draws comments from psychologists and the like who question its usefulness and merit. Rightly so: they highlight that it has to be handled appropriately if it is being used for meaningful ends.

The idea of feedback from outside the workplace is potentially a good one for people who may wish to pinpoint changes they wish to make in their lives; it carries forward the idea that others can see our strengths and weaknesses with greater clarity than we sometimes can ourselves.

Or maybe it's just a chance to tell a friend they are a little mean when it comes to buying a drink after work…

John

360 Degree Feedback and Lessons Learnt

Having worked with Bowland and a number of other 360 tools over the last few years, my aim here is to share some lessons learnt and gain views from others.  As a bit of background, I work as a people development consultant with professional services and public sector clients.  In my former career, I was Head of Learning and Development for a 'top five' built asset consultancy that employed over 3,000 people internationally.  So, here are the five top lessons I've learnt over the years.

1.  Understand the context

Working as a consultant, I often find a 360 degree feedback process forms part of a client's desire to change and develop their people: as part of a leadership programme, as part of a performance appraisal process, or as an external manifestation of a desire to develop a coaching culture where 'open and honest feedback' becomes the norm.  If a client has experience of the 360 process, or is an HR/L&D expert, then this is often the case – i.e. you immediately understand the context in which you are approaching the process.

In my experience, however, many clients will launch into a 360 solution before really understanding the issue they wish to address, or the process itself.  I've had calls which simply say, "Sue, John needs a 360 report, he's got the self-awareness of a bull in a china shop and has no idea.  I need evidence.  Can you send him a link to a 360 thing so we can sort it out?"  The short answer is "no".  The long answer is "why?"

My view is that to really get the most from a 360 tool, you need to ask the right questions of the organisation upfront to understand what they want to achieve – both at an organisational level and for the individual.  Only once you know this can the planning, the communication and the design process begin.

2.   Select the right tool for the job

Only once I know what the client wants to achieve can I select the right tool.  Countless times I've had clients (SMEs and large organisations) ask me to "create something quickly", or fail to appreciate the amount of preparation, positioning and communication that needs to sit around a 360 tool for the process to be successful.  For very small audiences, this might include being asked to use Word or Excel to get feedback and co-ordinate it all via e-mail, or using a tool like Survey Monkey.

I have personally received feedback using processes like this and whilst the feedback itself was useful, I can be sure the time it took the administrator to co-ordinate the process and produce a half-decent report was money ill-spent. 

Just because you are already paying an administrator a salary, don't think it's time well spent to create a cheap 360 process in-house using Excel, Word and e-mail, or that the cost of using a proper 360 feedback tool is wasted.  Think of the quality of the output, the ease of the process and the benefits of letting a tool and an expert manage the process, so you can focus on the results.  Pay the money and let the tool manage the process.

3.  Position the process and manage expectations

Depending on the context of the 360 process, you need to design a communications strategy to manage the expectations of those being rated and the respondents.  This will differ depending on the circumstances, and might include meeting with line managers to gain their support, and ensuring those being rated (and those doing the rating) know why the process is happening, what it involves and how the data will be used.

I quite like the model of having a senior group of people going through the process before their team members, so that they can extol the virtues of the 360 feedback process to them and gain trust in the process.  This also means that those being rated may have also just rated their line manager.

Depending on the organisation's culture and purpose of the process, I might also suggest that the 360 feedback is kept entirely confidential – i.e. the results are only seen by the external consultants (me and my team) and the individual who has been rated.   Although I cannot categorically prove this alters the results and drives more honesty, I have seen it create trust in the process where there may otherwise have been scepticism.  Often the client sponsor may only receive a summary of trends and results.

4.  Keep focused on the outcome

This is something that I think is often lost on a 360 project.  Yes, you will want to align any questions and tools with your own organisational competencies.  Yes, you might want to brand your 360 feedback process to make it your own.  And yes, you might need to get sign off on all communications from your Board to ensure they align with your business strategy and goals. 

But remember, the purpose of the process is (probably) to give individuals feedback on their performance, to develop their skills in certain areas and to build their self-awareness.  So, whilst I completely understand that anything you do needs to align with your people strategy, when designing a 360 don't lose sight of the outcome, and react accordingly.

A client once said to me, "Sue, we can't issue the 360 until our leadership competency framework is complete and that's not being signed off for nine months."  My response was to understand the context of the 360 (in this case a leadership programme affecting a small proportion of the employees) and to ask the client what generic leadership competencies he thought the business might want to measure (Bowland has a 'vanilla 360' which lists out the common ones, which is useful).  And then we were off – giving feedback to individuals in the programme within weeks, not months.  As I've said, focus on the outcome for the individual and the organisation and make a decision accordingly.

5.  Manage the individuals receiving the feedback

I know from the recent Bowland talk I attended that Brendan is passionate about giving the feedback report to the individual in the feedback session and not before.  Having done this both ways, one big lesson learnt is to wholeheartedly agree.  However innocuous you may think the comments in a report are, the recipient may read them differently, given their perception of their environment.

You can easily end up with a defensive individual in the session on a witch hunt to find the person who says they 'always do X' when they think they only 'sometimes' do it – and this rather detracts from the process.   Likewise, you have no idea of the emotional state of the individual if you send them the report 'cold'.

And finally…

As a consultant I would always recommend repeating the process regularly (every 6-12 months), integrating it with other business processes and using it as a benchmark of performance improvement.  In my mind 360s are a great way for an L&D or HR leader to address the 'return on investment' issue of any project without having to convert the result into hard cash, which is often a tricky thing to prove.

A conversation which runs "50% of our people thought our line managers were poor in leadership skills a year ago, but now 85% of them think they are strong, and this is down to our leadership programme/coaching development/investment in X" is music to my ears.

Sue Miles is Director at Chaseville Consulting Ltd and works alongside clients as their extended arm to design and deliver people development projects.
www.chaseville.co.uk

360 Degree Feedback: calculating the environmental cost

We hear much today about the environmental cost of manufacturing, flying, food production and so on.  It struck me that there is a parallel here with 360 degree feedback, which provides a glimpse of the 'environmental cost' of our own behaviour.

It has become clear that it is not enough for companies to make vast profits for shareholders whilst dumping toxic waste in nearby rivers; the 'What' was being achieved but the 'How' was creating terrible fallout.

As employees, it is often the case that whilst people are achieving their goals or targets, how they go about it can come at a cost to their immediate environment: the office, their colleagues, their family.

360 degree feedback provides the ideal opportunity for respondents to indicate the fallout of the behaviours they see as a person goes about achieving the 'What'.

The 'How' becomes important, because without succeeding in both areas, you cannot have a sustainable model for success.

John
