Having worked with Bowland and a number of other 360 tools over the last few years, my aim here is to share some lessons learnt and gain views from others. As a bit of background, I work as a people development consultant with professional services and public sector clients. In my former career, I was Head of Learning and Development for a 'top five' Built Asset Consultancy that employed over 3000 people internationally. So, here are the five top lessons I've learnt over the years.
1. Understand the context
As a consultant, I often find that a 360 degree feedback process forms part of a client's desire to change and develop their people: it may sit within a leadership programme, form part of a performance appraisal process, or be an external manifestation of a desire to develop a coaching culture where 'open and honest feedback' becomes the norm. If a client has experience of the 360 process, or is an HR/L&D expert, the context is usually clear from the outset.
In my experience, however, many clients will launch into a 360 solution before really understanding the issue they wish to address, or the process itself. I've had calls which simply say, "Sue, John needs a 360 report. He's got the self-awareness of a bull in a china shop and has no idea. I need evidence. Can you send him a link to a 360 thing so we can sort it out?" The short answer is "no". The long answer is "why?"
My view is that to really get the most from a 360 tool, you need to ask the right questions of the organisation upfront to understand what they want to achieve – both at an organisational level and for the individual. Only once you know this can the planning, the communication and the design process begin.
2. Select the right tool for the job
Only once I know what the client wants to achieve can I select the right tool. Countless times I've had clients (SMEs and large organisations) ask me to "create something quickly", or fail to appreciate the amount of preparation, positioning and communication that needs to sit around a 360 tool for the process to be successful. For very small audiences, this might mean being asked to use Word or Excel to gather feedback and co-ordinate it all via email, or using a tool like Survey Monkey.
I have personally received feedback using processes like this and whilst the feedback itself was useful, I can be sure the time it took the administrator to co-ordinate the process and produce a half-decent report was money ill-spent.
Just because you are already paying an administrator a salary, don't assume that creating a cheap 360 process in house using Excel, Word and email is time well spent, or that the cost of a proper 360 feedback tool is money wasted. Think of the quality of the output, the ease of the process and the benefits of letting a tool and an expert manage it all, so you can focus on the results. Pay the money and let the tool manage the process.
3. Position the process and manage expectations
Depending on the context of the 360 process, you need to design a communications strategy to manage the expectations of those being rated and the respondents. This will vary with the circumstances and might include meeting with line managers to gain their support, and ensuring those being rated (and those doing the rating) understand why the process is happening, what data will be gathered and how it will be used.
I quite like the model of having a senior group of people going through the process before their team members, so that they can extol the virtues of the 360 feedback process to them and gain trust in the process. This also means that those being rated may have also just rated their line manager.
Depending on the organisation's culture and purpose of the process, I might also suggest that the 360 feedback is kept entirely confidential – i.e. the results are only seen by the external consultants (me and my team) and the individual who has been rated. Although I cannot categorically prove this alters the results and drives more honesty, I have seen it create trust in the process where there may otherwise have been scepticism. Often the client sponsor may only receive a summary of trends and results.
4. Keep focused on the outcome
This is something that I think is often lost on a 360 project. Yes, you will want to align any questions and tools with your own organisational competencies. Yes, you might want to brand your 360 feedback process to make it your own. And yes, you might need to get sign off on all communications from your Board to ensure they align with your business strategy and goals.
But remember, the purpose of the process is (probably) to give individuals feedback on their performance, to develop their skills in certain areas and to build their self-awareness. So, whilst I completely understand that anything you do needs to align with your people strategy, when designing a 360 don't lose sight of the outcome, and act accordingly.
A client once said to me, "Sue, we can't issue the 360 until our leadership competency framework is complete and that's not being signed off for nine months". My response was to understand the context of the 360 (in this case a leadership programme affecting a small proportion of the employees) and to ask the client what generic leadership competencies he thought the business might want to measure (Bowland has a 'vanilla 360' which lists out the common ones, which is useful). And then we were off – giving feedback to individuals in the programme within weeks, not months. As I've said, focus on the outcome for the individual and the organisation and make a decision accordingly.
5. Manage the individuals receiving the feedback
I know from the recent Bowland talk I attended that Brendan is passionate about giving the feedback report to the individual in the feedback session, and not before. Having done this both ways, I wholeheartedly agree. However innocuous you may consider the comments in a report, the recipient may read them differently, given their perception of their environment.
You can easily end up with a defensive individual in the session on a witch hunt to find the person who said they 'always do X' when they think they only 'sometimes' do it – and this rather detracts from the process. Likewise, if you send the report 'cold', you have no idea of the emotional state of the individual when they read it.
As a consultant I would always recommend repeating the process regularly (every 6-12 months), integrating it with other business processes and using it as a benchmark of performance improvement. In my mind, 360s are a great way for an L&D or HR leader to address the 'return on investment' question of any project without having to convert the result into hard cash, which is often tricky to prove.
A conversation which runs "50% of our people thought our line managers were poor in leadership skills a year ago, but now 85% of them think they are strong, and this is down to our leadership programme/coaching development/investment in X" is music to my ears.
Sue Miles is Director at Chaseville Consulting Ltd and works alongside clients as their extended arm to design and deliver people development projects.