
William Ronco, Ph.D., Biotech Leadership Institute

This second part of a two-part article describes how to analyze 360 survey results and convert insights into actions.

“I know these results are anonymous, but I know who wrote these comments.”
—A scientist beginning to review his 360 results

360 surveys have strong potential to improve scientists’ communications, but getting the most from the surveys requires thoughtful planning and analysis. This second part of a two-part blog describes how to analyze 360 survey results and convert your insights into thoughtful actions. (Part 1 outlined 360 survey potentials and problems, along with steps to plan and launch your survey.)


360 surveys can give you a fairly accurate, comprehensive, and calibrated sense of others’ perceptions of your communications effectiveness. [© Yuri Arcurs – Fotolia.com]

“I Know Who Wrote This”

Initially reviewing their 360 survey results, many people ignore the quantitative data, turn to the open-ended comments, quickly find the one or two less-than-wonderful phrases, and even more quickly conclude, “I know who wrote this.” Many others comb quickly through the data, focusing on, and then obsessing over, the few less-than-perfect response sets.

Both responses are understandable, but neither is a terribly productive strategy for getting the most from 360 survey results. Finding patterns and trends in responses generates more useful insights than attempting to identify the source of a rogue response. And the statisticians among our readers know very well what to do with the outliers in any dataset.

Scientists react with less than scientific responses to their 360 survey data because the data addresses personal issues. Despite our professional preferences, traces of emotion stubbornly remain. However much we claim to—and really do—want objective feedback about our communications effectiveness, reviewing data describing our actual performance tests the limits of our Spockian ideals.

As a scientist you will also probably quibble with the validity of 360 data, and your quibbles are valid. 360 survey data is subjective; your sample size is small; respondents completed the survey under less than perfect lab conditions. Still, the survey gives you a fairly accurate, comprehensive, and calibrated sense of others’ perceptions of your communications effectiveness.

Look Through The Johari Window

The Johari Window provides a useful perspective for interpreting 360 results. Named not, as it sounds, after a mystical Eastern philosophy, but for its inventors Joseph Luft and Harrington Ingham, it is a contingency table that sorts reactions to one’s 360 responses by what is known to oneself and what is known to others, yielding “Public,” “Blind,” “Hidden,” and “Unknown” blocks. Optimally, all of a survey recipient’s reactions to his or her 360 data would fit in the “Public” block; the recipient would be able to predict every response perfectly.

Working with thousands of scientists and their 360 data, we have not yet encountered anyone who perfectly predicted his or her 360 responses, or anyone who was completely surprised by them. A few people’s predictions closely approximate their actual results, and some miss broadly, but most are generally accurate. Nearly all miss a few things, and they say the survey results are helpful in pointing out specific items they need to work on.

Some of the things people need to work on reside in the “Blind” block: aspects of their communications that elicit reactions they were not aware of. Scientists’ “Blind” block often includes others’ reactions to their criticism. A scientist may not realize that criticism of others’ work he or she considered “crisp” and “rigorous but fair” was perceived by its recipients as “devastating,” “withering,” “over the top,” or “dehumanizing.”

Other things people need to work on fit more accurately in the “Hidden” block: aspects of their communications that they thought they had made clear but that others apparently have not seen. This often occurs when scientists think they are providing more than enough information about progress on key tasks, only to find that their 360 survey respondents want and need much more.

The distinction between “Blind” and “Hidden” responses is important because each requires a different action. Responding to “Blind” issues generally asks 360 survey recipients to withdraw, to pull back on behaviors that irritate, annoy, or anger others. Responding to “Hidden” issues, on the other hand, generally requires them to step up, push forward, and communicate more about matters that interest, involve, or impact others.


A Johari Window.
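
If you recorded your predictions before reviewing your results, a rough way to locate Johari “surprises” is to compare the rating you predicted for each item with the average rating respondents actually gave. The short Python sketch below is a minimal illustration with hypothetical items, ratings, and threshold, not a prescribed method; deciding whether a surprise belongs in the “Blind” or the “Hidden” block still requires reading the item itself.

```python
# Minimal sketch: flag 360 items where others' average ratings differ
# noticeably from the ratings you predicted for yourself.
# All data below is hypothetical; replace it with your own survey export.

SURPRISE_GAP = 1.0  # assumed threshold on a five-point scale

items = [
    # (item text, rating you predicted, average rating respondents gave)
    ("Gives criticism constructively",         4.5, 2.8),
    ("Keeps others informed of task progress", 4.0, 2.9),
    ("Listens without interrupting",           3.5, 3.6),
]

for text, predicted, actual in items:
    gap = predicted - actual
    if abs(gap) < SURPRISE_GAP:
        status = "Public: prediction roughly matches others' ratings"
    elif gap > 0:
        # Others see a problem you did not predict. Reading the item tells
        # you whether it is a 'Blind' issue (pull back on an irritating
        # behavior) or a 'Hidden' one (step up and communicate more).
        status = "Surprise: review as a possible Blind or Hidden issue"
    else:
        status = "Others rate you higher than you expected"
    print(f"{text}: predicted {predicted:.1f}, actual {actual:.1f} -> {status}")
```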

6 Tips To Help You Get The Most From Your 360 Results

1. Focus first on the quantitative responses, then look at the open-ended comments. The quantitative data is less colorful but far more useful for identifying patterns and themes in your communications effectiveness. Once you have a grasp of your overall response profile, move on to the open-ended comments for the “color” that helps you interpret the quantitative shades of grey. Try not to dwell on any single open-ended comment, no matter how interesting or upsetting it may be. Remember, the people writing these comments often don’t intend them to reflect their best thinking.

2. Begin with the “overall” question. Most people start reviewing their data with the first question on the survey and then move through each question in sequence. It’s more useful to skip to the “overall” summary questions that usually conclude the survey, because they provide context for all the other questions. It’s also useful to prioritize the questions, focusing most on those that address the issues most important to your overall effectiveness.

3. Aim for a majority of 5s on a five-point scale on the key questions. If you’re wondering what counts as an “excellent” set of 360 responses, aim for a majority of 5s on a five-point scale (a simple way to tally this is sketched after this list). 4s are fine, and 3s may be acceptable, but it’s worth aiming for 5s, especially on the questions that address the issues most important to your job success, for the same reasons you aim for excellence in your science. Yes, some respondents just don’t give 5s for anything, and yes, you can’t please everyone. But the 360 is about communicating effectively, not pleasing people.

4. Convert your insights to actions. You’re most likely to make improvements and convert your 360 survey insights into actions if you:

  • Keep the data in front of you. Some people tape printouts to their monitor or keep a view of the data open in their browser.
  • Schedule times when you specifically address the survey insights.

5. Discuss your survey results with respondents. Of course you shouldn’t ever ask respondents how they themselves responded to your survey. However, discussing your reactions, insights, and action plans with respondents usually elicits positive responses from them. It shows them you’re taking their responses seriously. Also—as long as you don’t attempt to defend or explain yourself—discussing your results with respondents usually generates additional insight that’s very useful for you.

6. Plan to re-survey in 6–12 months. Circumstances change, and you will have been working on the issues that surfaced in the initial survey. Getting into the habit of doing a 360 annually enables you to benchmark and chart your progress in leadership and communications effectiveness the same way you do for the technical aspects of your science.
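
For the quantitative review in tips 1 through 3, a few lines of code are enough to tally each question’s average rating and the share of 5s it received. The Python sketch below is a minimal example with hypothetical questions and ratings; adapt the data structure to whatever format your survey tool exports.

```python
# Minimal sketch: summarize quantitative 360 responses per question.
# 'responses' maps each question to the list of 1-5 ratings it received;
# the questions and numbers here are hypothetical placeholders.

responses = {
    "Communicates progress on key tasks":   [5, 5, 4, 5, 3, 5],
    "Gives criticism constructively":       [3, 4, 2, 4, 3, 5],
    "Overall communications effectiveness": [4, 5, 5, 4, 5, 5],
}

for question, ratings in responses.items():
    mean = sum(ratings) / len(ratings)
    share_of_fives = ratings.count(5) / len(ratings)
    flag = "" if share_of_fives > 0.5 else "  <- below a majority of 5s"
    print(f"{question}: mean {mean:.1f}, "
          f"{share_of_fives:.0%} rated 5{flag}")
```

Questions that fall below a majority of 5s, especially the “overall” and other high-priority items, are reasonable candidates for the action planning and follow-up discussions described in tips 4 and 5.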

Director of the Biotech Leadership Institute William Ronco, Ph.D. ([email protected]), consults on leadership, communications, team, and partnering performance in pharmaceutical, biotech, and science organizations.
