Monday, March 28, 2011

Do Something...Even if it is Wrong!

By: James Barlament

My dad has a lot of sayings. One of them he used to say to me in frustration: “Do something, even if it’s wrong.” I heard that one a lot on Saturday mornings when I’d rather be asleep than outside mowing the lawn. He was an Army man, and that’s an old Army adage. It basically means don’t freeze up in the face of pressure; doing nothing is much worse than making a mistake.

This mindset is not restricted to grunts in the trenches making snap decisions in the face of a real enemy. We see it in government policy every day. Americans do not elect politicians who choke. Those in charge must always seem to be moving forward, solving problems through action, even when those actions are blind and may or may not work.

I fundamentally disagree with my dad’s saying, and to be fair, he doesn’t agree with it either. After he retired from the Army, he became a financial manager and investor. I have a Master’s degree in history, and I’m now a researcher and program evaluator. I love spontaneity, but not when making decisions that will affect the future of an organization.

In my line of work, I come across a lot of highly motivated people who follow the mentality implied by my dad’s saying. They are doers, busy bees, go-getters, men and women of action. They are Mack trucks of industriousness, and I’m a roadblock or weigh station. I once gave a presentation on the relationship between evaluator and evaluatee, and one of the major conflicts between the two parties comes from the fact that we, the evaluators, slow the evaluated down. We keep them from doing.

A few years ago, my team was asked by a government agency to evaluate its largest organization, which had existed for 25 years with multiple moving parts. On the surface, it was performing at a high level, meeting at least 85% of its milestones and objectives annually; however, on close inspection, all of the benchmarks were predicated on meeting volume numbers: “We will perform x task for x number of people”; “We will distribute x number of fliers to x number of people”; “We will conduct x number of surveys with x number of people.”

I was encouraged by the surveys. I asked, “Have you ever analyzed this survey data to see if your programs are working?” “No,” was their answer. I asked where they stored this survey data, thinking that it would be in an Excel spreadsheet or, hope against hope, SPSS or SAS. A woman led me out to a storage unit behind their facility, where I was introduced to the most enormous set of filing cabinets I’d ever seen: twelve feet high, drawers upon drawers of pre- and post-surveys from every class, rally, training, or outreach event the organization had conducted since 1984. I was floored and immediately despondent. They had been doing and “actioning” for 25 years without pausing to reflect on the effectiveness of the programs. Evaluation tools were mere check marks on a milestone chart. If a new issue in the field came up, they invented a new program to address the problem and found a new drawer for its surveys.

My colleagues and I sat down to make an action plan, and we quickly came to a decision about the giant filing cabinets: no way were we or anyone else going to climb through that massive amount of data. We needed a baseline, so we requested surveys from the last three years of each program and converted them to SPSS files for analysis. What we found was not surprising: surveys from every individual program showed increases in knowledge, attitudes, and behaviors from pre-test to post-test; however, overall success had stagnated from year to year. This stagnation resulted from using the same surveys for 25 years without modification and from running programs that had never been pared down or rigorously examined for strengths and weaknesses.
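For readers curious what that kind of pre/post comparison looks like in practice, here is a minimal sketch in Python; the actual analysis was done in SPSS, and the file name and column names (survey_scores.csv, program, year, pre_score, post_score) are assumptions made only for illustration.

```python
import pandas as pd
from scipy import stats

# Hypothetical layout: one row per matched respondent, with the program name,
# the year the survey was given, and paired pre/post scores.
surveys = pd.read_csv("survey_scores.csv")  # columns: program, year, pre_score, post_score

for program, grp in surveys.groupby("program"):
    # Within each program, test whether post-test scores differ from pre-test scores.
    t_stat, p_value = stats.ttest_rel(grp["post_score"], grp["pre_score"])
    gain = (grp["post_score"] - grp["pre_score"]).mean()
    print(f"{program}: mean gain {gain:.2f} (paired t = {t_stat:.2f}, p = {p_value:.3f})")

    # Then look across years: if the average post-test score barely moves
    # from one year to the next, the program's overall results are stagnating
    # even though each individual cohort shows a pre-to-post gain.
    yearly = grp.groupby("year")["post_score"].mean()
    print(yearly.diff().dropna())  # year-over-year change in average post-test score
```

The point of the two views is exactly the contrast we saw: each program looks successful on its own pre/post gains, but the year-over-year trend is what reveals stagnation.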

Today, the organization's updated surveys are more applicable and are entered directly into a database, and plans for online surveys are in the works. Its programs are more streamlined and productive, with increasingly high survey success rates. Coincidentally, in the last two years, the organization has hired new employees who are dedicated to data-driven approaches to self-assessment, much to the delight of my evaluation team. The milestones and objectives are now based on judging the quality, not the quantity, of their programs.

“Do something, even if it’s wrong” might work on the battlefield. Sometimes the wrong decision works because the enemy won’t be expecting it. But if you are afforded the time to self-evaluate and think before acting, take it. Careful research, planning, and experimentation lead to good evaluation and good decisions.

========================================================================
This is a guest post by James Barlament, an HRCT preferred provider for Programs Analysis.

Mr. Barlament received two degrees from the University of Georgia, finishing a Master of Arts degree in the History of Science in 2005. He has authored and co-authored several articles, publications, and presentations and has presented his work at numerous state, national, and international conferences.
