Wednesday, 13 January 2016

The Quantification Conundrum - what are we learning?

I've been struggling with this as a post for so long that I'm not sure where to go with it.  We're so obsessed these days with quantifying things that we seem to be missing the main learnings.  I've alluded to it before, touched on some of the daft measures we use and levels of accuracy so artificial as to be ridiculous.  I've also talked a fair bit in the past about the learning world's obsession with discrete learning objectives, often, ironically enough, to the detriment of learning.  I know that there's a need for measures of sorts, and if you're strictly in the compliance world then it's all good (actually it's probably not, but that one will wait for another post), but have we got this all wrong?


When we put things in boxes we have to remember that we're making an approximation.  The world isn't black and white; in fact it isn't discrete shades of colour either, it's a truly analogue world.  What do I mean here?  As the world becomes increasingly digital it's really important to remember that reality isn't digital at all, there are no simple states - all models are inherently an approximation of reality.

For example, let's measure the temperature of some warm water.  We can use our hand and determine that it's 'warm'.  We can use a thermometer - let's say it reads about 40 degrees Celsius (or 104 F if you're that way inclined).  Want more accuracy?  Great, let's use a digital thermometer - it reads 39.7C - now we're getting more accurate, right?  Not really.  Even if the digital thermometer is very well calibrated, we're still converting temperature from an analogue (continuous) state to a digital one with discrete values - inherently an approximation.  If we looked closely at the analogue thermometer we'd see the line hovering around the 40 mark; with more marks we might pin that number down further.  Say we had 0.1 gradations and could see the line sitting around 39.7 - guess what, we're still reading against a digital scale and doing the analogue-to-digital conversion ourselves.  The temperature of the water is what it is, and no matter how many digits we put into the measurement we're adding precision, not accuracy.  Our initial assessment of 'warm' is in many ways no worse than the 39.74322 degrees we can get from our uber-expensive digital meter - the temperature is what it is, regardless of what scale we put it on.  The scale itself, be it Fahrenheit, Celsius or even Kelvin, is only an artificial measure we've created to make understanding temperature easier.
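If it helps to see that in code, here's a tiny sketch of the idea (entirely my own illustration - the 'true' temperature and the sensor offset are made-up numbers, and Python is just a convenient way to show the arithmetic): a reading that's off by a fixed calibration error doesn't get any closer to the truth just because we display more decimal places.

    # Toy illustration (invented numbers): more decimal places on the display
    # adds precision, not accuracy, if the instrument itself is slightly off.
    true_temp = 39.74322        # the water 'is what it is'
    sensor_offset = 0.4         # hypothetical calibration error in our meter
    reading = true_temp + sensor_offset   # what the meter actually senses

    for decimals in (0, 1, 5):
        displayed = round(reading, decimals)
        error = abs(displayed - true_temp)
        print(f"displayed to {decimals} d.p.: {displayed} (error {error:.5f})")

    # displayed to 0 d.p.: 40.0 (error 0.25678)
    # displayed to 1 d.p.: 40.1 (error 0.35678)
    # displayed to 5 d.p.: 40.14322 (error 0.40000)

In this made-up case the rounder number is actually closer to the truth, which rather neatly echoes the point that 'warm' was no worse a description.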

Come to think of it, all the classification we learned in biology is similar in that it's an approximation - useful to us, but inherently inaccurate.  Take the most basic classification we use for plants: fruits and vegetables.  When is a veggie really a fruit?  We know that tomatoes are officially a fruit, right?  What about cucumbers, peppers, avocados, string beans?  All fruits, officially, from a science perspective.  And these are examples with pretty black and white definitions, like whether or not they contain seeds (don't start with strawberries, they're apparently pretty controversial).  So rhubarb isn't a fruit, okay, it's a veggie of sorts - that is, if we could agree on the definition of what a vegetable is.  And there's the point.  Classification relies upon absolutes.  Classification is just black and white with a few more well-defined shades thrown in - a digital approach to an analogue world.
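To labour the point with another throwaway sketch (again my own toy example, nothing official): the same item can pick up a different label depending on which classification scheme you ask, because the label belongs to the scheme, not to the thing itself.

    # Toy example: the label depends on the classification scheme you choose.
    botanical = {"tomato": "fruit", "cucumber": "fruit", "rhubarb": "vegetable"}
    culinary  = {"tomato": "vegetable", "cucumber": "vegetable", "rhubarb": "fruit"}

    for item in ("tomato", "cucumber", "rhubarb"):
        print(f"{item}: botanically a {botanical[item]}, in the kitchen a {culinary[item]}")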

I'm sure some of you are aware of the observer effect in physics?  In layman's terms, it's when the act of measuring something changes it.  We see this plenty in the learning world: when we start to measure feedback or assess students it changes behaviours and has an effect on both teaching and learning.  This seems great for learning - knowing that by measuring what we're doing we're affecting it, perfect, measure away!  But there's more to it.  I've also touched on quantum mechanics before, and the ideas of things like Schrödinger's cat and, furthermore, Heisenberg's uncertainty principle.  In essence these tell us there's only so much we can really 'know'.  As we measure and observe one thing, something else changes, and each known unveils another unknown.  Combine this again with another high-level theory (relativity of sorts) and we get a system where, when we measure something, we change it; when we get that measurement we lose sight of another; and the 'fact' we have determined is only relative - true in that instant and surrounded by other unknowns!  It's probably fair to ask whether we know more at this point than we did before we measured.

Okay, too much physics.  Let's look at this in simpler, logical terms.  When we say 'by the end of this course you will learn XYZ', it means we have to accurately define XYZ and then measure against it at a point in time for all students.  What about the next week, the next month, year, decade?  What are we actually measuring?  Our simplistic quantification of learning is only true (even if it were properly and accurately measured) at the instant it was measured.  Now add to that the difficulty of measuring XYZ in the first place.  Writing effective assessments is a difficult task for the most experienced of learning designers, and tests that go beyond recall of 'knowledge' are probably in the minority.  So our learning objectives rely heavily, first, upon a classification that we've already seen is somewhat of an approximation - a digital version of the analogue world.  They tend to be badly designed and badly measured too - so we end up with an approximation, with dodgy measurements, that's only good for the moment it's measured.  How good do those learning objectives really look?

Now let's add some more of those oh-so-dodgy classification types into our learning to make things foggier still.  How about learning 'types' - remember when everyone was hot on auditory learners and the like?  Back to our earlier point about quantifying and classifying: there's no such thing as an auditory learner - it's a simplistic approximation of the way some people may learn best, and even that's been questioned in recent years.  How about those Myers-Briggs personality types - heck, I'm still unsure at 44 whether I'm an introvert or an extrovert, let alone the deeper analysis - and just because I was an extrovert yesterday doesn't mean I'll be one tomorrow, or that I'll hold that role consistently throughout an exercise.

You could also, loosely, twist the uncertainty principle in here again when we talk digital and analogue.  The more finely defined something becomes, the less confidence we can actually have in its accuracy.  For learning objectives, does that mean the more SMART we make them, the less smart they really are?  Maybe, but what it really means is that the deeper we try to classify something, the less sure we can be that we've got it in exactly the right place.

So that's kind of the conundrum we face: the more we quantify, the less confident we can be about what we've defined.  As a friend of mine once said in a drunken moment, 'the more you know, the less you know, aye, know what I mean?'

Friday, 8 January 2016

Scoring an Own-Goal in Life and Learning

Well, if ever there was a time of year to talk about goals, it inevitably seems to be the start of a new one... I've got issues with this and I'll come on to them shortly, but for now let's take a look at goals, what they are, and how to avoid putting the ball in the back of your own net (a football analogy - soccer to some of you).  These days it's pretty trendy to have goals all round, isn't it?  The look of horror and disgust on your friends' faces if you admit to drifting through an entire year without a well-defined and documented personal goal - let alone the work goals.

Now before I get too far into this, let me start by saying I've got nothing against goals - in football it's frustrating and sometimes even depressing sitting through 90 minutes without one - but I do believe we have a tendency to over-analyse and put too much detail into our goals.  We had a discussion on this on #pkmchat this week and I couldn't help but be drawn back to when I was a child and had my heart set on being a stunt-man like my childhood hero Lee Majors in The Fall Guy.  It was fairly well articulated, in that I was happy to discuss it with anyone who cared to listen, but I didn't have a ten-step plan and neither, as it turned out, was it terribly realistic.  Does that mean my 'goal' aged 7 was a bad one?  Probably, but maybe not for the reasons we might have it pegged as.

Firstly, if you really want to do your goals an injustice, make them really easy and, if possible, really uninspiringly dull.  If my goal at 7 had been to ride a bike or walk to the shops I would have achieved it, ticked it off and reaped the rewards.  Well, kinda.  I did those things, and sometimes little things were an achievement, both back then and now - but if I was always going to achieve them then setting them as goals seems pointless.  I might as well have made breathing in and out a goal (one I'm glad to still be achieving at this point at least).  If your work goal is to hit a target that you know you're going to make easily regardless of how much effort you put in, then, well, yes, it's a pointless exercise that really doesn't make you a better person or worker.  On the flip side, there's a certain futility in setting goals that you'll never achieve.  I'm a bit of a dreamer so I'm all for those big hairy audacious goals that will really push you, but if you set them in the realm of those you know in your heart you'll never do, then they're a de-motivator rather than a motivator.

Modern self-help literature may have told you that it's essential to write down your goal and tell as many people as possible to increase your chances of success.  This can definitely hold some truth, particularly if you often lack self-motivation, but it really does depend on what your goal is and how you want to achieve it.  I do think it's daft to make a goal, write it down, tell everyone about it and then just not bother to do much about it.  You'd be better off keeping it to yourself and achieving it than sharing and doing nothing.  In fact, while we're on the whole achieving-or-not thing, it's great to share what you're doing and how it's going, but don't just share when things go well; we learn and gain so much from what we don't achieve, particularly when we try our best and still don't meet the targets we set.  You may not achieve what you set out to do, but often the achieving of goals is less important than what we've learnt along the way... so if you're after that own-goal again, try making sure your goal is totally achievable (if not downright easy, as above) and that you brag like hell about it after you get there.

Make it complicated!  Everyone knows that the more complex and clever-sounding a thing is, the better it must be.  Don't go for simple goals like 'do a bungee jump'; make sure it's a specific type of jump involving a bunch of conditions - the right temperature, participating friends, the right location, before 15 December this year, the time of day, the colour of the rope, etc.  In fact, here we go, let's make it SMART - you all know that acronym, right?  Specific (tick), Measurable (tick), Achievable, Realistic and Time-bound.  Inwardly I groan at these types of things that we've been taught from day one, it seems.  Don't get me wrong, there are times we need to get specific and we need some conditions at times - but if I'd set this up and then done a different bungee jump on the wrong-coloured rope on 18 December, would my achievement of the goal be invalid?

You can only set goals in January.  Just remember that, just like resolutions or changes of any sort, we put aside a specific time for them and after that the window closes and you're doomed.  Don't try and give me that flexibility mumbo-jumbo - you either get ready for the 1st of Jan or forget it.

The end totally justifies the means - succeed at all costs.  Don't get drawn into the journey argument; you know the only important thing is that you can tick off the goal at the end of the day/week/month/year.  Your success will be measured totally on your ability to achieve the original goal you set, no matter what occurs.  Make sure your goal is set nicely in reinforced concrete - what good is a goal that changes with your life?  I should be judged purely on my 7-year-old goal (okay, it's closer to 40 years ago, but it was the goal I set back then) and it's pretty damn important that I make it as a stunt-man or my whole life has been wasted.

Okay, so tongue out of cheek: it's hopefully clear that we don't have to be super anal to set and use goals in life.  We have to remember a couple of things; firstly, that the idea of a goal is to inspire you to achieve something, and secondly (and perhaps more importantly) that the achievement of the goal itself isn't necessarily the measure of success.  As for my failed career as a stunt-man, I've actually enjoyed some of the learnings along the way and hopefully had some positive influence without breaking as many bones.

Oh, learning you say?  Well, here's the thing: read all of the above, replace the word 'goal' with 'learning objective', and ask yourself whether learning can take place without them.