
Someone's got the message -- evaluate, evaluate!!

 

There's nothing wrong with trying something new -- you've got no restrictions in private practice -- but don't forget that public health service managers are responsible for spending other people's money, amidst a sea of potentially competing priorities. Someone's got the message, though: if you're going to innovate, evaluate the success of what you do. See the story here. It's really encouraging to see this happening in the ever-so-budget-conscious UK National Health Service -- it was bad enough 15 years ago when I worked in the UK; the NHS service I worked in had no psychological tests of its own and used only (illegally) photocopied tests!!! And the situation has only got worse since. So it's especially heartwarming to see one of the latest developments reported here. While the story talks about a "mental health centre", we know many of those served will have a history of ACEs, and as we know, many of us with ACEs histories come from backgrounds where choice over what happens to us may have been severely limited -- so it's great to see that sometimes somebody does listen. And in the long run, it can work out cheaper, and more effective.

One thing I would encourage in those who wish to innovate, though, is to build evaluation into the planning and ongoing review of any service implemented -- here it looks like they've left it until the service was "somewhat established". Learn the difference between formative and summative evaluation; identify process and outcome goals; find measurable, objective-based measures; and build in opportunities to record those measures as you go along, making sure you're staying "on track".


Comments (9)

Dr. Cathy Anthofer-Fialon posted:

Russell,

Evaluation is essential. At Good Harbor Institute, when we work with an organization we begin by benchmarking where the organization is with regard to being "trauma informed". Then, throughout the year of programming we offer, we not only re-survey the participants but also conduct two rounds of semi-structured interviews and focus groups to add the understanding of lived experience to the data. Without evaluation we will not understand whether there is an impact and/or what that impact is. Additionally, trainings, programs, and organizations need information to understand what the next steps are in putting what they have learned into practice.

~ Cathy

Cathy, thank you -- that last sentence is the crux of it, so easily passed over if read quickly: where to next, having brought about the change? How to settle it into place and bring about the next step?


Karen Gross posted:

I agree that we need evaluation --- otherwise, we will not know the effect and effectiveness of programs and interventions.  Indeed, big data are increasing and so is data-informed decision-making. These are all important improvements and pathways toward programmatic success.  For far too long, we have proceeded without data collection and analysis.

One major caveat -- and this is despite my academic background and my own experience collecting and interpreting data: data and empiricism can be used as a pawn to prevent innovation. People comment: "we do not have enough data to proceed", or "we need data to proceed." I had a colleague who, every time he did not like a proposed idea, would say: "Where are the data?"

Here's the reality: data collection can take time and money -- especially longitudinal data. So, I worry that the quest for empiricism can be a tool NOT to make change but to deter change.  

So, sometimes, we need to proceed and act in the absence of complete data because waiting is more dangerous than proceeding. Consider drug trials. Sometimes, we need to cut the process short and give patients access to the new drugs because they will pass away if the full testing and data are completed.  Thus, a word of caution about empiricism: it matters; it is needed but it cannot and should not be an impediment to needed change in the shorter term. This is true in education, in healthcare and in other social fields.

Consider pilot programs and short-term testing such as focus groups and SurveyMonkey-style surveys. These can be a bridge between no empiricism and full-blown empirical data assessment.

Let's not throw the baby out with the bathwater. Data matter but data should not impede potentially valuable progress where short term data results can be garnered while we await longer term results.

 

Sorry, I can NOT agree with much of your comment. Funny that you bring up drug trials -- remember the unconscionable rush to market that happened with SSRIs, and think about how VERY much better off the world would be without them. What the world needed in order to deal with, and prevent, this problem was MORE data, not less.

If you're not aware of the problems with SSRIs, have a look at this Facebook site, and especially the contributions of psychiatrist Rob Purssey; see the attached; and check out the series of three videos available on the 'net, "Who Cares in Sweden".

And of course, many survivors of ACEs hopefully end up as survivors of the problems caused by prescribed drugs; but sadly, many will end their lives after getting fed up with those who prescribe these and other drugs.

But in terms of a simple way to start evaluating such peer-involved services -- one that is both timely and cost-efficient -- I would follow the recommendations of Lloyd-Evans (2014). See here.

Attachments

Last edited by Russell Wilson
Pamela Denise Long posted:

If an agency is going to exist, its leadership must embrace program evaluation as a central aspect of operations, so that both improvement and innovation are appropriate and effective. The emphasis on "co-producing" is a practical and collaborative way to operationalize client voice and choice (trauma-informed values) in a way that is substantive, developmental, and powerful. Hope to see more of this type of service planning!

Hear, Hear!!! 

There's nothing stopping ANY service provider from incorporating a sign-up sheet for research into their intake procedures; it's especially easy for those in private practice. This is why it's so frustrating not to see much more data on some forms of therapy -- if you don't research, you can't really understand what you are doing, or how it can be improved, not in the objective, rigorous way it needs to be done.

This old coot ain't totally silly, you know. Fortunately, he/I did this study (of methodology and evaluation) as part of his training ;-) though most might not have -- hence my talking about the difference between formative and summative evaluation (do a Google search and the first refs you'll find will be about these in education). But you can start off SMALL -- formative evaluation of some easily measured core objectives: "agency" / empowerment / social support?

Last edited by Russell Wilson


Russell:

What a great story. I LOVE reading about this and the co-producing phase. Some of the changes in approach seem so simple and common-sense, and some seem so radical. I love hearing about it all. I bet it will take whole new evaluation tools, though, to assess how effective new approaches are and how effectively they are delivered. Thanks for sharing this!

Cissy

Staff and guests – those who stay are not termed patients – join forces to cook, clean and tend the fruit and veg they then sit down to eat together at Gellinudd, which is the UK's first inpatient mental health centre to be designed by service users and their carers. "If you're a psychiatrist you'll still be expected to be in the kitchen chopping vegetables alongside everyone else," says the centre's director, Alison Guyatt.

Over three years, via consultation meetings attended by up to 50 people and annual general meetings attracting as many as 300, service users and carers who are also members of the Welsh charity Hafal, which runs the centre, have influenced everything from the policies and procedures to the decor, facilities and recovery-focused activities on offer.

β€œThey’re the experts,” says Guyatt. β€œThey can say how it feels to be on the receiving end of care, how anxious you would be, what your concerns would be. They have such powerful stories to tell.” The lack of privacy and dignity in hospital settings, together with old and decrepit buildings that provide little access to fresh air, were common themes among those who gave input. β€œA lot of them feel very clinical, rather than homely and welcoming,” Guyatt says.

Copyright © 2023, PACEsConnection. All rights reserved.