The systems thinking and training blog - Bryan Hopkins

Update on my forthcoming book

Published in Evaluation
This entry is a little different: it is an update on my new book, "Learning and Performance: A Systemic Model for Analysing Needs and Evaluating Training".

This is a practical guide for using systems thinking concepts such as boundary definitions, multiple perspectives and relationships in carrying out training needs analyses and programme evaluation.

It explains how to use techniques such as the Viable Systems Model and Soft Systems Methodology to explore areas of concern in organisational performance, in order to identify a holistic set of solutions which can improve performance. In the case of evaluating training, it uses these tools to provide a practical approach to evaluating both the learning and impact of training.

The book will be published by Routledge in late 2016 with an expected cover price of about £45. However, after some discussions with my publisher, we have decided to take pre-publication orders at the heavily discounted price of £20.00. That looks like a very good deal.

If you are interested, all you have to do to register a pre-publication order is click here to contact Jonathan at Routledge.

Why I like 70:20:10

Published in Informal learning
I have just finished reading the report "70+20+10=100: The Evidence Behind The Numbers", produced by Charles Jennings and Towards Maturity. A most interesting and worthwhile read.

For clarity, the 70:20:10 model refers to an observation that 70% of what people learn comes from real-life and on-the-job experience, 20% from working with other people and 10% from formal training. These figures came from observations on leadership in a largely male target group, and, as the report acknowledges, different figures have been derived for female groups. Other research, not referred to in the report, suggests that around 80% of what people learn comes through 'informal' means.

So there is some disagreement about the numbers, but this is largely because of the difficulty of actually defining what these categories of learning mean: what is the difference between 'on-the-job experience' and 'working with others'? What exactly is 'informal' learning?

But really, the numbers are not important. 70:20:10 is such a useful concept because it provides a simple model around which people can conceptualise the importance of integrating formal and informal learning, something the training industry has struggled with for many years. As the report says, its value is in helping people to realise that learning is a complex, multi-faceted activity, and that taking steps to facilitate non-formal learning opportunities and to integrate them with formal training can bring rich rewards to organisations.

As I read the report I felt a strong sense of vindication for my own systems-based approach to analysing performance issues and developing training strategies. Using a systems approach automatically means that we develop an understanding of each part of the 70:20:10 triad: the dynamics of the workplace, how people work with each other and share information, what barriers and enablers may exist to implementing new knowledge and skills, and so on. This can help us to design training which explicitly helps people to integrate their informal learning opportunities with formal training. Definitely a good thing!

The importance of boundary spanners

Published in Reflections
I have not been able to write any blog posts recently because all of my time has been taken up with finalising the manuscript for my new book. Provisionally entitled "Performance and evaluation: a systems thinking approach", I eventually managed to get everything packaged up and sent off to Routledge, my publisher, by the end of last week, so the book should hopefully now be published towards the end of 2016.

It has taken about two years to put the book together, pulling together experiences from work that I have done and from my ongoing studies with the Open University. Taking that amount of time has some advantages, but it also means that what you learn along the way makes sections written early on seem somewhat inadequate. There is also the problem of a writing style that changes as time goes by: what seemed a good way to put things in 2014 no longer did by 2016. These inconsistencies were exposed by the review process: I asked a number of trusted associates to look at the draft manuscript, and their comments were invaluable, pointing out how the logic of the structure could be improved and identifying weaknesses in words, sentences and paragraphs.

It made me realise that a fundamental problem with the book was bringing together two academic disciplines: the training world and the systems thinking world. A core theme within the book is learning as a networked activity, of people connected in groups. One of the topics I discuss is Social Network Analysis, a way of quantifying how people work together. Some early research in this field was by Mark Granovetter, whose article "The Strength of Weak Ties" looked at how working-class people relied on social networks to find information about job opportunities. He saw that people have both strong ties, with friends and family, and weak ties, with friends of friends and infrequent contacts. His research showed that the people who were most useful for identifying work opportunities were actually the weak ties, because these people were connected to other networks. They functioned as 'boundary spanners', a term coined by Michael Tushman to describe people who form connections between networks and play an important part in helping information and ideas to move from one network to another.
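Granovetter's finding can be made concrete with a small sketch in code (the people and ties below are entirely invented for illustration): two tightly knit clusters are joined only through a single boundary spanner, and a simple breadth-first search shows that without that one weak-tie contact, information cannot move between the networks at all.

```python
from collections import deque

# A toy social network: two tight clusters of strong ties, joined only
# through "dana", who has a single weak tie into each cluster.
ties = {
    "alice": {"bob", "carol"},
    "bob":   {"alice", "carol"},
    "carol": {"alice", "bob", "dana"},
    "dana":  {"carol", "erin"},          # the boundary spanner
    "erin":  {"dana", "frank", "gina"},
    "frank": {"erin", "gina"},
    "gina":  {"erin", "frank"},
}

def reachable(graph, start):
    """Everyone whom 'start' can reach through any chain of ties."""
    seen, queue = {start}, deque([start])
    while queue:
        person = queue.popleft()
        for contact in graph[person] - seen:
            seen.add(contact)
            queue.append(contact)
    return seen

def without(graph, person):
    """The same network with one person removed."""
    return {p: nbrs - {person} for p, nbrs in graph.items() if p != person}

# With dana present, information can flow across the whole network...
assert reachable(ties, "alice") == set(ties)

# ...but remove the boundary spanner and the two clusters are cut off:
# a job tip circulating in gina's cluster can no longer reach alice.
print(reachable(without(ties, "dana"), "alice"))
```

In a formal Social Network Analysis, measures such as betweenness centrality capture the same idea: boundary spanners are the people who sit on an unusually large share of the shortest paths between everyone else.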

Receiving feedback from both training professionals and systems people made me realise that my book will itself be operating as a boundary spanner, in that it tries to communicate training ideas to systems professionals and vice versa. Really, though, the target audience for the book is training professionals, so the priority will be for them to develop an understanding of how systems thinking ideas can be useful.

It remains to be seen how useful as a boundary spanner my book will become!

Evaluating the unintended or unexpected

Published in Evaluation
Last week I was asked for some advice about questions to ask when carrying out a Level 3 evaluation (in Kirkpatrick-speak). Conventional wisdom is that this is about behaviour change, whether or not people change their behaviour in line with the objectives of the training. And of course, when we are trying to do a Level 3 evaluation we are also interested in what impact any changes of behaviour have at a team or organisational level (the fabled Level 4).

All well and good, but a systemic perspective exposes one possible weakness in traditional approaches to training evaluation: because they start from the reference point of the learning objectives, evaluation against those objectives can end up being the only thing that is done, when, of course, training can have other consequences.

There are basically two sorts of other consequences. The first is at Level 3: what other behaviours have changed as a result of the training? For example, has there been any change in the way people interact with each other, perhaps talking about the training course, questioning how useful it has been, or discussing what implications it has for everyday practice? Has this improved what people do, whether in the subject of the training or in other areas? This is all about different aspects of knowledge management, about the impact of formal training on informal learning practices.

The second is at Level 4: what are the outcomes at the team or organisational level, and what impact do they have? If we do not think about what changes the training makes to a team, it becomes harder to look at the impact those changes might enable. For example, people who attend a training course may get to know each other better, develop trust and so become better team players in the future. This is an example of improving 'social capital', a somewhat nebulous but nevertheless very important factor in organisational effectiveness, one which may only become apparent when it is lacking! Of course, the implications of this may be much longer term than a straightforwardly observable output, such as the level of sales or of widgets manufactured, and it may also be harder to measure, but that does not make it any less important. If we do not recognise it as important, it is unlikely that we will even try to measure it.

For example, if training does lead to an increase in levels of widgets produced over a six-month period but leads to increased workforce dissatisfaction with potential longer-term implications, what is its true value?

Learning styles: serious tool or parlour game?

Published in Training design
I have recently been involved in looking at several different training of trainers events. Although the events have been for different target groups and in different sectors, in all cases some time in each course was spent on analysing (and subsequently referring back to) learning styles of different types, in particular those based around Kolb's experiential learning cycle.

Now, I've often done similar activities in my own training, and I know that participants seem to find this kind of self-analysis quite fun and interesting... but is it just a bit of fun, or is it really significant?

I've started to ask this question more since I have been looking at training and learning from a systems thinking perspective. Every system has to have its own environment with which it has some sort of relationship, and this relationship influences the functioning of the system in some way. What does this mean if 'learning' is the system?

Thinking particularly about Kolb, his work comes from a humanistic psychology perspective, which means that he considers how a whole being behaves rather than how that behaviour has come to be. This contrasts with more psychoanalytical approaches, which seek to understand how a person's history (i.e. their environment) has affected their behaviour. So his cycle of experiential learning describes how some free-floating individual makes sense of new information, which is fine as long as we, when using the idea, consider how what is going on around the individual might influence how it works.

However, quite a few writers have suggested that when we take Kolb's ideas further, saying that individuals have a preference for one or two of the stages in the learning cycle, the humanistic approach creates a problem by ignoring the effects of the environment. Learning styles questionnaires work by asking people to reflect on how they learn in different situations, and they then receive a summary identifying one or two 'preferred' learning styles. These writers contend that such an analysis is only valid for the situations considered in the questionnaire, and at that moment in time; in different situations, or at another time, the individual might respond quite differently. Which means that there may be no such thing as a person's permanently preferred learning style, only a preference at a given moment. Which makes the questionnaire a bit pointless...

For me I know that I approach new learning situations differently depending on various factors, such as what the situation is, how familiar I am with it as a general class, how much time I have, how well I need to be able to respond, and so on.

So I'm left feeling that learning styles might be a bit of fun to talk about, but that pinning "Activist" or "Reflector" badges on people might be at best a bit of a waste of time, or worse, misleading and perpetuating one of training's great myths.

Training needs analyses: do they exist?

Published in Reflections
When I was doing the research for my upcoming book on training needs analysis and systems thinking I came across an article in the Journal of Applied Psychology (see reference below) which summarised a meta-analysis looking at what factors seem to influence the success of training programmes. One statistic which caught my eye was that their data suggested that only 6% of training programmes were based on a training needs analysis. 6%, not many!

The authors of the study did point out that it was often not clear what a 'training needs analysis' constituted, and that their research looked at published studies, so it was possible that in the 'real world' organisations were indeed carrying out needs analysis activities. So, to try and get some different perspectives on this I asked the question in one of my LinkedIn groups: "Training needs analyses: do they exist?".

Very quickly the question attracted over 100 comments from many different people, and they are still coming, so clearly the question was of interest, and in general showed a lot of frustration with the current situation within organisations as far as conducting needs analyses is concerned.

With so many comments it is difficult to draw specific conclusions, but there were a number of common threads which appeared during the course of the conversation.

Essential but not happening. Many people agreed with my initial proposition that while TNAs are universally said to be essential, they are often not carried out in any significant way.

TNAs take too much time. Organisations want quick results and running a training course is a quick solution (although of course it does not guarantee quick results, which many people pointed out).

What is a TNA? Quite a few people discussed the difference between a training analysis and a performance analysis, seeing the performance analysis as something which came first, to identify what factors are affecting performance, followed up by the training analysis to decide how training can contribute. Interestingly, several of these comments mirrored what I have seen in the standard TNA literature, that these are sequential events, which, from a systems perspective, runs the risk of creating stand-alone solutions which do not necessarily integrate with each other.

The lack of clarity about what a TNA actually is seems to mean that all kinds of activities can fall within the definition of a TNA, ranging from gut reactions to systematic organisation-wide surveys.

Adult learning. Another thread was the common lack of understanding amongst non-training professionals as to how adults learn, leading to inappropriate solutions.

Developing baselines. The intimate relationship between a training needs analysis and an evaluation was also pointed out: how can you carry out an evaluation of effectiveness if you have no idea what the original problem was?

An interesting exercise in eliciting views, and one which highlights how far the training profession has to go in making organisations realise how important it is to really think about the reasons for embarking on training programmes.

For more information see: Arthur Jr, W., Bennett Jr, W., Edens, P. S. & Bell, S. T. (2003), "Effectiveness of training in organizations: a meta-analysis of design and evaluation features", Journal of Applied Psychology, 88(2), 234–245.

Training - event or part of a system?

Published in Reflections
In a few days' time I will click a Send button and my Master's dissertation will go off to meet its marker. That will be the culmination of 15 months of alternating periods of reflection, inaction and obsession, and I will not be sorry that it is all over. I have lost count of the number of nights when I have woken in the dark to start worrying about some area of research I have not yet explored but definitely must. And then woken blearily to the rising sun with little clear memory of those nocturnal moments of complete clarity.

Research such as this makes you focus on increasingly narrow subjects, and it becomes very difficult to see the bigger picture of what you are looking at. So when I came to a question in the dissertation template which asked me to include some observations about the wider implications of the research for my professional practice, I was somewhat taken aback.

It took me a few days to adjust my focus and think of an appropriate answer. I eventually realised that one way in which I would now be able to look at my professional practice differently was to stop seeing a workshop or an e-learning course as a single event, and instead always see it as part of what might be called 'a learning system'.

In an earlier post I talked about comments made by several people at OEB 2015 about the 'training as pizza' model: how long would you like the workshop to be? Too often, training is seen as the only solution needed to solve performance problems, overlooking the operational context of how people do their work: by experimenting and reflecting, by asking other people for help, by discussing things they don't understand, and so on.

By thinking systemically about how learning contributes to improved performance, we can see much more clearly the small part that single events such as a workshop play in the whole learning system: supporting the development of social learning networks, strengthening social capital and other intangible outcomes. Formal training should never be the only answer; it should always be designed as part of a systemic change process which strengthens learning and the ability to apply new skills.

Call centres, requisite variety and poor training

Published in Reflections
A few days ago my wife decided she needed to change her mobile phone provider (XXX for the purposes of this story). The main reason for this was the poor signal that we got where we live: Sheffield, like Rome, is built on seven hills, and this seems to make mobile phone and television signal reception quite problematic, so receiving calls on this network in our house has always been chancy.

So as she entered the last four weeks of her contract she rang the provider and said she wanted her PAC code. The person in the call centre checked her records and said that there would be a penalty, as her contract did not end until 27th May. "Oh no it doesn't!", said my wife, "I have the contract here, and it says 27th February."

This unfortunately did not impress the call handler, who said that their system definitely said 27th May and there was nothing that could be done about it. So my wife asked to speak to a manager, which seemed to be a surprising request to the call handler, who said there were none available, but that he would find one and ask them to ring her back immediately. Nobody rang.

The next day my wife called again, and spoke to another call handler, who seemed equally unable to understand that a data inputting error might possibly have led to a handwritten '2' becoming a '5' on the indisputably and unassailably correct computer system. However, she did suggest that my wife could go to a local XXX shop, show them the contract and see what they could do about it. Fortunately, there was such a shop about five minutes walk away, so she went off, spoke to a human being in person, who made a note on a computer system and told her to call XXX customer services again.

So she did as she was told, and now everything was okay and she was given her PAC code.

As I listened to this story unfold, I started to think about what it said from a systemic perspective. Back in 1956 W. Ross Ashby put forward his Law of Requisite Variety: for a system to remain viable with respect to its environment, it needs to be able to display at least as much control variety as the variety present in the environment.

What does this mean here? The call centre handlers at XXX are undoubtedly trained, and probably work off some sort of system which guides them through how to respond to customer issues. This provides them with a certain amount of control variety. However, my wife presented them with unexpected variety, a contradiction between the written contract and the information on the system, and clearly the call handlers were unable to deal with this new environmental variety. Fortunately, one person panicked and suggested a solution which actually worked, but which, nevertheless, showed that they themselves were unable to deal with this particular form of variety.
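Ashby's law can be sketched in a few lines of Python (the scripted scenarios and responses below are invented for this illustration): a handler whose repertoire covers only the anticipated issues has no matching response when the environment presents something new, so everything unanticipated falls through to the same catch-all.

```python
# A minimal sketch of the Law of Requisite Variety applied to a call
# centre: the handler's "control variety" is the set of responses in
# the script, and any issue outside that set cannot be regulated.

scripted_responses = {
    "billing query":    "explain the charge",
    "upgrade request":  "offer the standard upgrade",
    "pac code request": "issue the PAC code",
}

def handle(issue, repertoire):
    # One response per anticipated kind of issue; anything the script
    # did not anticipate falls through to the same catch-all, which in
    # the story above meant a promised call-back that never came.
    return repertoire.get(issue, "escalate and hope someone calls back")

print(handle("pac code request", scripted_responses))        # covered
print(handle("contract date disputed", scripted_responses))  # novel variety
```

The remedy the post argues for is not a longer script but more autonomy: handlers with the judgement and confidence to generate new responses are the ones whose control variety can actually match the environment's.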

To my mind that displays a weakness in the training that XXX's call handlers receive: they are unable to deal with all of the environmental variety they encounter, and do not have enough autonomy or confidence to make decisions about how to deal with new problems on their own. Instead, they pass the buck and hope that somebody else picks it up. This time it worked, but it is really not a great strategy, and we will certainly not be recommending XXX as a mobile phone provider to any of our friends.

Bored doctors or well-trained doctors - your choice

Published in Training design
A few days ago a friend of mine who works as a doctor in a local hospital called by for a cup of tea. She was just on her way home from a training course where she had been all day. Always interested in other people's experiences of training, I asked her how it had been. "Really boring", she said, "Just listening to somebody reading off slides all day."

I always find comments like this somewhat depressing. How is it that organisations these days can still think it cost-effective to take highly paid people, sit them in a room and make them listen to an expert talking all day long? The direct and opportunity costs of an event like that must have been considerable, and the overall effect was to bore the participants.

My experience in quite a few organisations who rely on this type of training delivery is that they still think that the information transfer model (or what I call 'information dumping') is the best or only way to communicate content. I guess that a major reason why this happens is that people are familiar with lectures from their university days, where a person with all the knowledge attempts to transfer this to people with very little knowledge, like pouring water into an empty vessel, as the analogy goes.

The difference when training experienced professionals is that they already know an awful lot, and really need to be able to integrate new information with what they already know, to refine their existing mental models.

Probably many of the people who are called on to deliver training of this sort will never have studied ideas about cognition, so concepts such as Kolb's learning cycle will be unfamiliar to them. The diagram below is a representation of Kolb's theory, and its familiarity to learning professionals means that it needs no explanation.

However, from a systems perspective, one of the things I have noticed about how this cycle is normally presented is that it is portrayed as an individual activity: each of the four stages is described as something which goes on inside one person's head. But this is not what really happens, because if it did there would soon be no learning: each of us would eventually run out of the energy and inspiration needed to reflect and develop new conceptualisations.

Instead, what happens is that as we work round and round the learning cycle we draw in ideas from the world around us, in particular from other people. This new energy coming into our learning cycle is what enables reflection so that each iteration of the cycle improves our mental modelling. We can therefore represent what is happening as a networking of learning cycles, as in this diagram.

This is essentially the idea of the social construction of understanding, an idea attributed to the Russian psychologist Vygotsky. He reasoned that children learn by conversation and negotiation, which leads to a shared understanding of how to behave and how to do things.

This, I think, is why trainers who can let go of the control of the PowerPoint presentation and let people talk about the material will usually get better results. I'm sure my friend would have had a much better learning experience had she been able to engage with the trainer and other participants in discussions where she could talk about things she did not understand, listen to other explanations and so end up with a much better understanding of the subject.

I would certainly feel more comfortable lying on a hospital bed feeling that the medical staff looking after me have had the chance to really get to understand their subject, rather than having spent days being bored stiff by PowerPoint presentations.

Why use systems thinking approaches in training?

Published in Reflections
My last blog post reflected on reflective practice and the power it can have in improving performance. One of my own forms of reflection-on-practice has been writing down what I do. I've always found this a very useful, if challenging, process: trying to capture in some systematic way what has often felt very ad hoc in the moment.

So it was that two years ago I started on the process of writing another book, this one on how to incorporate systems thinking approaches into training needs analysis and training evaluation activities. Writing a book is a long and lonely process, one which constantly makes you question whether what you are doing is worthwhile: will it help anyone, will anyone be interested, would it be better to focus on fee-paying work? The fundamental question is therefore why do it, and one answer comes from thinking about what value I personally see in using systems thinking in these areas: why does a systems thinking approach lead to better training solutions?

A fundamental reason has to be the importance that systems thinking places on context. The interaction between the operational environment and people in the workplace is crucial to high levels of performance, and systemic enquiry makes this interaction fundamental, by making us think about how much variety there is in the environment and how what people do has to be able to manage this variety. Often training programmes are about standard, centralised processes and procedures, and not enough attention is paid to the kinds of skills needed to be able to monitor what is happening in the environment and adapt to it.

This means that training should give people the analytical skills to monitor the environment and adapt as necessary. This is often overlooked in conventional approaches to needs analysis, which develop objectives for a set of skills that will enable some sort of standardised approach, but which may not help people to deal with micro-variety (day-to-day differences) or macro-variety (trends over time).

The trends over time point is important, because the environment changes, and a needs analysis carried out today may identify solutions which are not appropriate tomorrow because things have changed. Again, an awareness of the principles of systemic enquiry can make us sensitive to these potential problems.

And thirdly, systemic enquiry forces us to think about the role that informal learning plays in managing performance. Informal learning, such as unplanned coaching and discussions between colleagues, plays an important part in helping to manage variety, but if the needs analysis process fails to look at how existing channels for informal learning work, it is less likely that support mechanisms will be built into any training plans that are developed.

So I guess that is a reason why I have pushed on with the project to write the book. I am certainly relieved that it is almost finished now, and that I should be able to deliver it to my publisher on time!

Copyright 2015. All rights reserved.