Monday, 16 October 2017

A Transdisciplinary Synthesis for Educational Technology Theory

I've been very critical of the current state of theory in educational technology. When comparing the theoretical and scientific work on learning and technology today with that of the 1970s, it doesn't look very good.

A lot of today's e-learning academics are happy to promote rhetoric about what we should and shouldn't do (social media, yeah! open access, yeah! learning analytics, yawn!), but fall silent when faced with the really difficult questions about learning, consciousness, society, institutions, technology and systems.

Part of the problem is an unwarranted consensus about educational theory being "established": Critical pedagogy, constructivism, open education, etc. have all become 'real' things (ironically in the case of constructivism), alongside more pernicious real things like "modules" and "learning outcomes". Their effect is to provide an anchor for education academics. Each hides numerous implicit ontological assumptions which are never critiqued. On more than one occasion, I have attempted to challenge people who ought to know better with questions like "but what do you mean by...", only to be met with silence. I think this is indicative of professional insecurity rather than a general ignorance of the fundamental questions. There's plenty of insecurity out there.

I want to change this - but to change it means escaping the educational consensus. We have to do what the originators of our current consensus (Piaget, Vygotsky, von Glasersfeld, Freire) did - engage in a truly interdisciplinary inquiry which works at the forefront of current scientific, political, philosophical, technological and artistic knowledge.

What is happening in physics today which is relevant to learning? What is happening in biology? What about philosophy? Or maths? Or logic? Or systems? Or the arts?

This isn't just a trawl for new theory. We need a new accommodation between theory and experiment. Our empirical foundation in education is dreadful - "8 out of 10 learners preferred..." We must do better.
In any empirical enterprise we need:

  • A logic for expressing what we think might happen
  • A means of measuring what actually happens
  • A method for restructuring our logic in the light of experience.

Our logic depends fundamentally on mathematics. At the forefront of pure mathematics are inquiries about complex topologies, explored through techniques like category theory. In maths today, the very issue of "categorisation" is a question - perhaps the central question which is exercising minds. So what of our categories of "education" or "learning"? At the heart of these investigations is the pursuit of better ways of understanding recursion (our categories about most things - and certainly education - are recursive).

On measurement, perhaps we should look to the physicists exploring the properties of quantum mechanical systems, where their focus is on the measurement of uncertainty, symmetry and contingency. After all, it is these systems which will form the basis of our next generation of computers. Or we could look to biologists who are examining the ways in which cells organise themselves in their environment. At the forefront of research, the physicists and the biologists may be looking at the same thing, and often with similar tools taken from information theory.
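The information-theoretic tools which the physicists and biologists share can be sketched in a few lines. Here is a minimal Shannon entropy and redundancy calculation in Python (the sequence of readings is invented purely for illustration):

```python
from collections import Counter
from math import log2

def entropy(observations):
    """Shannon entropy (bits per observation) of a measured sequence."""
    counts = Counter(observations)
    n = len(observations)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def redundancy(observations, alphabet_size):
    """1 - H/Hmax: how far the measurements fall short of maximum uncertainty."""
    h_max = log2(alphabet_size)
    return 1 - entropy(observations) / h_max

# An imagined measurement record over a 4-symbol alphabet
# (it could equally be spin readings or cell-signalling states):
readings = ["A", "A", "A", "B", "A", "A", "A", "B"]
print(round(entropy(readings), 3))        # 0.811
print(round(redundancy(readings, 4), 3))  # 0.594
```

The same two functions serve both domains: what differs is only what is being observed.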

Finally, what of our method for adjusting our knowledge? This goes to the heart of a technological and organisational question. To change what we know is to change our structure. Where does the structure of an individual's knowledge end and the structure of the society in which the individual exists begin? Doing science entails social structural change. Doing uncertain science - which seems to be what we now need to do - entails doing this continuously. Our hierarchical social structures of education and science do not provide sufficient flexibility. Only heterarchical structures will be able to absorb the variety of the environment in which they operate. In its favour, OER is heterarchical.

What we have now in education is not scientific, but scientistic. Stupid applications of technology in education like Learning Analytics adopt the pretence of science to lend themselves kudos. The stupidity is upheld (and exacerbated) by institutional hierarchy. We need to move on all three fronts: the logical, the empirical and the structural. It is the structural which is currently our biggest problem - but one that can't be addressed without the other two.

Monday, 9 October 2017

An Ashby Growth Machine

Imagine studying the dynamics of Ashby's homeostat where each unit produces a string of numbers corresponding to the successive values of its dial. The machine comes to its solution when the entropies of the dials are each 0 (redundancy 1). At this moment, the machine 'dies' - there's nothing else left to do.

As the machine approaches its equilibrium, the constraint of each dial on every other can be explored through the relative entropies between the dials. If we wanted the machine to keep on searching (and living!) and not to settle, it's conceivable that we might add more dials into the mechanism as its relative entropy starts to approach 0. What would this do? It would maintain a counterpoint in the relative entropies within the ensemble.

So there's a kind of pattern: a machine with n dials gradually approaches equilibrium. An observer measuring the relative entropy of the machine adds new dials when the relative entropy approaches 0. So, say there are n+1 dials, and the process is repeated. But growth also entails the death of parts of the machine. Maybe the same observer looks at the relative entropies between sub-sections of components. Maybe they decide that some subsections can also be removed as a way of increasing the relative entropy of the ensemble.
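This growth pattern can be sketched as a toy simulation - not Ashby's actual electromechanical homeostat, just an invented caricature in Python. Dials drift towards consensus, and an observer adds a fresh dial whenever the ensemble's entropy (a simplification of the relative entropies discussed above) falls to a bit or less:

```python
import random
from collections import Counter
from math import log2

def entropy(values):
    """Shannon entropy (bits) of the distribution of dial readings."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def grow(steps=200, start_dials=3, seed=1):
    """Toy growth machine: dials step towards the ensemble mean; an
    observer adds a new random dial whenever the ensemble's entropy
    drops to one bit or less - i.e. redundancy is high, the machine
    is 'dying', and needs fresh variety to keep searching."""
    random.seed(seed)
    dials = [random.randint(0, 9) for _ in range(start_dials)]
    for _ in range(steps):
        mean = sum(dials) / len(dials)
        dials = [d + (1 if d < mean else -1 if d > mean else 0) for d in dials]
        if entropy(dials) <= 1.0:
            dials.append(random.randint(0, 9))  # keep it living
    return len(dials)

print(grow() > 3)  # the machine has grown beyond its starting dials: True
```

Removing redundant sub-sections (the 'death' of parts) could be added symmetrically: the observer would delete dials from any subset whose internal entropy stays at 0.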

But what about the entropy of the observer's actions in adding and taking away dials? Since this action is triggered by the relative entropy of the ensemble, the relative entropy between the relative entropy of the machine and the entropy of the observer should approach 0. What if we add observers? What would the relative entropy between the observers be?

Each observer might be called a "second-order" observer. Each dial in the homeostat is a first-order observer of the other dials. Each second-order observer sees the ensemble of first-order observers as if it was a single dial. A second second-order observer would also see this, and would see the other second-order observer. A third-order observer could add or remove a second-order observer. And so on. 

Does the growth of this Ashby organism display an emergent symmetry?

Saturday, 16 September 2017

Notation, Constraint and Logic

I’ve been doing some experiments with notation in music and video. I’ve got a great music writing app on my tablet called “StaffPad” which does handwriting recognition (so I can handwrite squiggly notes and the software converts them into proper typeset notes) – which is great, if a little bit fiddly. However, it also has a facility for simply drawing on the score. When the score plays back, it scrolls the music, so both drawing and notes appear.

To write a note on a score is to give an instruction. There is a question about whether the instruction is about exactly "what to do" or whether it is in fact about "what not to do". In other words, does the symbol on the score denote the sound, or does it contribute to the conditions within which a performer might act freely?

I squiggled some shapes on the score, and then I attempted to "play" it. I should have made my squiggles a bit easier to play! I did this a couple of times. The sounds that I produced can be considered as "alternative descriptions" of something. The symbols/squiggles on the score are also descriptions of the same thing. If there is any similarity between these different descriptions, it is in the fact that the graphical description and the sound descriptions have similar entropy: in other words, what counts as a surprise in one counts as a surprise in the other.
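The claim that two descriptions of the same thing share a profile of surprise can be made concrete. A sketch in Python, with invented sequences standing in for my squiggles and sounds:

```python
from collections import Counter
from math import log2

def entropy(symbols):
    """Bits of surprise per symbol in a description."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Two 'descriptions' of the same squiggle: one graphical, one sonic.
# (Invented sequences - the point is only that their entropies match.)
marks  = ["up", "up", "loop", "up", "dash", "loop", "up", "dash"]
sounds = ["C",  "C",  "trill", "C", "rest", "trill", "C", "rest"]

print(entropy(marks) == entropy(sounds))  # same profile of surprise: True
```

The symbols themselves differ entirely; what is preserved between the two descriptions is the pattern of expectation and surprise.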

Notation is obviously different from a recording. A recording is a faithful description of exactly what is done. Notation is an invitation to create multiple descriptions. The parameters as to what is permissible and what isn't are contained in the way the notation conveys the flow of entropy over time.

So what about other kinds of marks or notations which we use?

In logic, I can represent the statement “All humans are mortal” as ∀x:human(x) → mortal(x). What’s the difference between these? The variable x is an invitation to generate possibilities – alternative instantiations of the formula. They produce constraints on the imagination bounded by the ways in which the symbols might be manipulated. The meaning is not in the phrase “all humans are mortal”, or even in ∀x:human(x)→mortal(x), the meaning lies in the interplay between the different descriptions which are made in the light of the notation.
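This generative reading of the formula can be sketched in code: the variable x invites us to enumerate instantiations over a domain. A toy model in Python (the domain and predicates are invented for illustration):

```python
# The formula "forall x: human(x) -> mortal(x)" read not as a fixed
# denotation but as an invitation to generate instantiations over a
# small (invented) domain.
domain = ["socrates", "plato", "a_stone"]
human  = {"socrates", "plato"}
mortal = {"socrates", "plato", "a_stone"}

def implies(p, q):
    """Material implication: p -> q."""
    return (not p) or q

# Each element of the domain yields one instantiation of the formula:
instantiations = {x: implies(x in human, x in mortal) for x in domain}
print(instantiations)
print(all(instantiations.values()))  # the universal claim holds: True
```

The meaning, on the argument above, is not in any single line here, but in the interplay between the generated instantiations.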

We misunderstand formal logic as a denotation of reason. Really it's an invitation to generate multiple descriptions from which reason is connoted. This mistake is why attempts to prove computer software correct within formal systems have failed. If we understand the relationship between logic, notation and meaning differently, then we can find new applications for logic. Education is one of these.

Monday, 11 September 2017

Theory, Explanation and Prediction

The word “theory” means different things in different contexts.

Mathematics: Mathematicians use “theory” with reference to things like “number theory”, “set theory”, “group theory”, “category theory”: basically, different kinds of formal system whose properties can be explored and can often be mapped on to other formal systems: for example, category theory (which is much in vogue at the moment) presents ways of accounting for number theory, set theory, etc. Like those systems it accounts for, it is a self-enclosed formal system.

Physics: Physicists use theory to explain and predict physical events like gravitation or quantum entanglement. Physical theories and mathematical theories are closely related: calculus, for example, was developed as a way of describing the motion of planets. There is some argument as to how physical theories are constructed or discovered: classical science sees theory as the constructed result of the observation of event regularities in nature, for which communities of scientists agree causal explanations. Many of the classical arguments for the construction of theory have been challenged by relativity and quantum mechanics where observing becomes part of the scientific/methodological process, and bias and ego of the scientist, or the power dynamics of institutional science feed into theoretical claims.

Social theory: At its origin, social theory followed the classical scientific model: it was assumed that "event regularities" could be established in the social world through statistics. With statistical regularity, the same process of constructing explanations could be established. Today, we call this positivism, and it was in evidence in some of the early industrial improvement processes of Taylorism and Fordism. This has become the root of arguments about method. Contributions from phenomenology (which grew from mathematics through Husserl), psychoanalysis, philosophy and economics have led to conflicting views about the use of statistics in social science (Keynes, Hayek), subjectivity vs. objectivity in observation, value freedom (Weber), intersubjectivity (Husserl, Schutz), knowledge vs. action (Marx, Lewin), and realism vs. constructivism (von Glasersfeld, Archer). Education sits (partly) in this theoretical mess.

Psychological Theory: Like early social theory, psychological theory often pursues a classical science model. Experimental conditions are established, experiments are performed, events observed, regularities established through statistical analysis and causal explanations constructed. Like social theory and physics, questions about objectivity, bias, explanation, etc have divided psychologists between those who uphold an empirical model (often working in cognitive science) and those working in social psychology. Education is also caught up in these debates.

Political/Economic theory: Marxist theory presents perhaps the most coherent account of the connection between the material base of existence, social structures and human agency. Its explanatory success is directly connected to the practical effects on the development of social and economic policy from the late 19th century. It remains the best example of the power and importance of theory, and the connection between coherent explanation and social emancipation.

High level theories in Education: Observation of regularities in social life has led to various high-level categories of causal mechanisms in education. Buzzwords emerge whose definitions are often woolly: sociomateriality, semiotics, critical pedagogy, transformative learning, constructivism, etc. are high-level constructs whose provenance is obscure. Despite this lack of clarity (and maybe because of it), these terms get discussed a lot in the literature. Because the journal system rewards academics for establishing their impact and job security, popular terms tend to persist: they attract citations.

So at one level (e.g. maths) theory is well-defined. For most of physics it remains so, but where physics concerns very small, very fast, or very far-away things, theory bifurcates. In education the theoretical picture is very confused. Added to this is the fact that data analysis is now seen as a viable alternative to theory: prediction, which is one of the principal features of theory, can be achieved from simply crunching numbers (i.e. counting). In this process, explanation is deemed less important.

Having said all this, theory – or the building of explanations – is not something which only occurs in turgid textbooks. Everybody does it. We cannot not theorize. To deny the importance of theory is itself a theory - but a theory which doesn't explain or predict very much, so it is not a very good one. Holding multiple inconsistent or bad theories renders us confused.

The quest for a coherent theory of educational technology is a response to a range of questions:

  1. Can we explain (and predict?) the reaction of institutions and individuals to technologies? 
  2. Can we explain (and predict??) the development of students whose demonstrable skill increases with educational engagement? 
  3. Can we explain the reticence of some individuals, or the enthusiasm of others, to engage in technology? 
  4. Can we explain why so many learners (and teachers) seem to prefer face-to-face communication over online? 
  5. Can we explain how we feel when we engage in learning online? 
  6. Can we explain why status, accreditation, certification seem so important in education and society? 
  7. Can we explain why our existing explanations/theories do not explain much of what happens in education? 
  8. Can we explain the difference between university higher learning, school and kindergarten? 
  9. Can we explain curiosity? 
  10. Can we explain why YouTube is fab? Or why there’s so much porn on the internet? 
  11. Can we explain why so many are addicted to social media? 
  12. Can we explain why teachers want to teach? 
  13. Can we explain why scientists continue to publish their work in journals? 
  14. Can we explain why institutions exist? (and why dogs don’t have universities?) 

Because all these things are connected, many different and inconsistent descriptions can only produce confusion - confusion which is not only imprisoning, but which confounds our ability to develop technologies which make society better: without critical inquiry, the unforeseen consequences of technical development might take us to self-destruction.

It is worth noting that those political forces which demonstrate antipathy to deep critical inquiry are those now in control in the US, Turkey, Russia, North Korea and the UK. We need to think our way out of a very dangerous situation.

Monday, 4 September 2017

Vice Chancellors' and Footballers' salaries compared: HOT NEWS! VC Transfer Window Closing soon - Who'll get the Lukaku treatment?

The establishment is closing ranks on VC pay. After the crass "bling display" of George Holmes saying students want to be taught by rich professors, the VC of Oxford, Louise Richardson, has blamed politicians for stirring up the pay issue - using many of the same arguments as Holmes! He'll be flattered, I'm sure.

Interestingly, these high calibre and highly sought-after people can't seem to engage with the press without shooting themselves in the foot. Richardson has quite needlessly done this by defending homophobic lecturers - a gaffe which is in the same league as Holmes's miscalculation. What this all really tells us is that these people are just as confused about education as the rest of us. They try to defend their salaries by pretending that they are not confused by education, but then do or say something which reveals the crassness of their own intellectual position. There is no head of any university anywhere who is not hiding their confusion behind an enormous pay packet.

Here's a quote from Richardson's interview:
"My own salary is £350,000. That’s a very high salary compared to our academics who I think are, junior academics especially, very lowly paid. Compared to a footballer, it looks very different; compared to a banker if looks very different. But actually, we operate, as I keep saying, in a global marketplace,"
Three points to make about this (thanks to Oleg for much of this)

  1. Her "lowly paid academics" are lowly paid because she decides they should be.
  2. Footballers in the premier league earn vast sums of money. Footballers in League 2 earn about £40,000. Oxford is a premier league university. George Holmes's Bolton isn't. So why are all VCs paid the same? It looks like a cartel, doesn't it?
  3. And finally, the Marketplace. What's that, exactly? Is she saying there is a market for Vice Chancellors in the same way there is a market for footballers, or (more appropriately) football managers?

Universities, encouraged by the government, have convinced themselves that the environment in which they operate is a "market". What this means - certainly for places like Bolton - is that obeying the "will of the student who pays their fees" is the essential criterion for success. But Richardson, who would argue that Oxford "competes" for the brightest students, then says to students uncomfortable about homophobic professors,
"I'm sorry, but my job isn't to make you feel comfortable. Education is not about being comfortable. I'm interested in making you uncomfortable"
Weird market, eh?! The confusion here is that "the market" cannot possibly be the environment of the University; education's environment is society at large - past, present and future - not the "will of the student". Universities are in trouble because they don't know what environment they are really working in or have to adapt to. Misunderstanding their environment is leading to cruel managerial interventions (such as those at Manchester and the OU at the moment) and to this current pay scandal, which is stirring up greater political threats for them in the real environment. The mixed messages and confusion are compounded by the enormous sums of money these people lay claim to (not to mention their enormous pensions, which will bleed an already bleeding university pension system dry).

Our VCs think they are worth £220,000 or £350,000??? Let's put them in the "VC transfer market" and see what happens! Who is the Lukaku or Alex Ferguson of Vice-Chancellors? Holmes? Not likely! Richardson? Well, Oxford's a great "club" - comes top of the league tables... but... is that because of her? Did she score all the goals? Did she win the research contracts? But let's say she is really great - money talks, so the post currently held by Michael Crow at Arizona State University, which pays him $1,554,058, ought to be attractive to her. I'm sure they'd be willing to make an offer. So why doesn't she go?

And then, just for fun, who is the Alan Ball, described by the Guardian as a "ruthlessly efficient relegation machine"? Well, Bolton isn't exactly at the top of the table. But Ball was sacked. Holmes is still there!

These people are having a laugh at society's expense. They are not, however, as guilty as the bankers, who Richardson also mentions. We must deal with them both.

Sunday, 3 September 2017

What is notation? (in Maths and Music)

When we are taught notations, whether in music or mathematics, we are always taught what certain symbols denote. In music, students have to work out which note is meant to be played when, and all of these "instructions" are contained in the way the note is written. In maths, we are taught which symbol means what and how they can be combined, and how strings of symbols can be manipulated and related to other symbols. So we learn that 2 + 3 = 5 is a legitimate use of the symbols 2, 3, 5, +, =, but 2 + 3 = 6 is not. When we learn maths, we are conditioned to mistake the symbols for the meaning. The meaning we only learn by playing with the symbols and working out what is legitimate and what isn't. What does "legitimate" mean? It must be some kind of social expectation: mathematicians coordinate their "dances with symbols" with the dances of other mathematicians. Without this coordination, there is really no maths at all.

It's the same with music. In any notated music, we are told which notes to play, in what order, and in what time. There is much that we are not told. The symbols are really an attempt to convey the constraints within which one might express oneself freely in music so as to be coherent with others' expectations. The notation tells us what not to do.

Is the flow of logic a flow of constraint? What not to do at time t1 is not the same as what not to do at time t2. When solving a mathematical problem, or doing some kind of formal logical proof, there is a fluctuation in what not to do. To indicate these is to coordinate a common set of constraints between mathematicians.

Notation indicates constraints. But it produces its own constraints. Whatever the reality of number is, it lies in the common lifeworld which is experienced between people manipulating representations of number. But if notation is assumed to be real in itself, it will produce unexpected results which lead to confusion.

An example (from Lou Kauffman): Euler's identity gives

e^(iπ) = -1, and so i = e^(iπ/2)

which then means

i^i = (e^(iπ/2))^i = e^(-π/2) ≈ 0.208

How can an imaginary number equal a real number?
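Kauffman's example can be checked numerically - a quick sketch in Python, whose complex arithmetic happily produces the real answer:

```python
import math

# i raised to the power i: the result has no imaginary part.
i_to_the_i = (1j) ** (1j)

print(round(i_to_the_i.real, 4))                              # 0.2079
print(abs(i_to_the_i.imag) < 1e-12)                           # True: it is real
print(abs(i_to_the_i.real - math.exp(-math.pi / 2)) < 1e-12)  # True: equals e^(-pi/2)
```

The machine, of course, feels no confusion: the double-layer of expectations about number and notation belongs to us, not to the arithmetic.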

Musicians don't get caught in this. They coordinate their expectations at a deeper level. The mathematical example is produced because there is a double-layer constraint (much like a double-bind). There is constraint between the coordination of expectations about number at one level, and coordination of expectations about the notation at another.

Thursday, 31 August 2017

Medieval Logic, Cybernetics and the art of D.P. Henry

It's a curious thing that I was talking with a friend about curiosity yesterday, a couple of weeks after visiting the British Library and spotting in the display cases a copy of Boethius's "De Institutione Arithmetica" which contained a beautiful picture of the categorisation of number into arithmetic, geometry and harmony. With no apology, I would say that the "harmony" struck a chord with me! There's something about curiosity and "striking a chord" - or rather, looking for a chord to be struck.

I've recently been immersing myself in physics and symmetry, and was about to attend a conference which included contributions from physicists and cyberneticians. What I wasn't expecting was to be presented with very powerful alignments between medieval logic and cybernetics. The presentation by Dino Buzetti sent me off to look for the common patterns between Scotus, Ockham and George Spencer-Brown. What's the key? It's the obsession with what it is to make a distinction.

Dino's references also led me to seek out the work of D.P. Henry. Henry was one of the leading authorities on medieval logic. The epigraph he chose for his book on "Medieval Logic and Metaphysics" from St. Anselm could have been written by many cyberneticians (and particularly by Bateson):

We ought not to be
held back by the way
in which the improprieties
of speech hide the truth,
but should rather aspire
to the precision of the
truth which lies hidden
under the multiplicity
of ways of talking (from De Casu Diaboli)
I'm still digging into the book, but this statement from Anselm seems to me to also be about curiosity: it is a search for the multiplicity of ways of talking.

Henry is less famous today for medieval philosophy than he is for art. He was a champion of machine-generated art, and produced many beautiful images.

Sunday, 27 August 2017

Squander, Universities and George Holmes's Yacht

The recent car-crash interview given by George Holmes of the University of Bolton to the FT, repeated by other mainstream press, was deeply unsettling and unwelcome to many in the sector (for example, to Oxford's David Palfreyman). It was hard to work out whether Holmes intended to stick two fingers up at the establishment, or whether he wanted to parade his Bentley and yacht in front of them as a way of saying, "Well, it may only be in Bolton, but I can cut it with you posh lot!". He seems to flip-flop on this: on the one hand appointing minor royals to ceremonial positions in the University, whilst claiming that the old established universities like Oxford are "dinosaurs waiting to die". The latter comment he made recently in a TED talk (!) at a TED event organised at Bolton, at which three senior managers and a few professors (it's very hard to tell the difference between professors and senior management these days) gave the world the benefit of their wisdom. The talk is worth watching to gauge the intellectual clarity with which Holmes grasps his mission. It's not, I think, pedantic to note that the Robbins Report on Higher Education was published in 1963. The total self-confidence with which Holmes says it's 1966 and riffs on "route 66" says a lot about him: it's as if he's thinking, "if I say it loudly enough, I can make it true". It makes one question many of his other boasts, and indeed his judgement. We see this in the world a lot at the moment.

To the people of Bolton, the obvious point is that Holmes's "success" - his yacht, the Bentley, and his £960,000 house - has been paid for by the university's students (many of them the children of Bolton) with money they haven't yet earned, and with a debt which will be hanging over them long after his yacht has sunk and the Bentley is at the crushers. He will say (and has), "This is a multi-million pound business". But one might be forgiven for thinking "This is a multi-million pound racket" - the product of misguided government policy which turned universities into fiefdoms and put characters like Holmes at the helm without any checks and balances on their behaviour. The fact is, after numerous uncomfortable engagements with the press and previous bad behaviour, HE'S STILL THERE. Is it imaginable that the VC of Oxford would survive this kind of thing? I doubt it. Holmes has been able to arrange things to suit him - at the students' and staff's expense. How long will this last?

The FT interview was, by any measure, very poorly judged. It was on a Philip Green/Mike Ashley/Donald Trump level of "bringing the institution into disrepute". I suspect Holmes knows this - but for some reason he can't seem to stop himself. The weird thing here is that he knows he can get away with it. The interview was reckless, irresponsible. It was squanderous - as indeed were the Bentley, the yacht, the £100,000 awayday, the £960,000 house, the sacking of the UCU reps, and so on. It was like some drug and alcohol fuelled bender of the kind that would make de Sade blush. It carries the whiff of the thrill it probably gave him as he posed for the centrefold of the FT (ok, it wasn't the centrefold, but it could have been). Playboy next. Dangerous?! "Yeah, but I'll get away with it."

Amid the austerity agenda, squander doesn't get much of a look in. But it's everywhere. George Bataille wrote an entire economic theory around the concept of squander and waste: he argued that in human history, it was the most regular punctuating mark in civilisation: war and destruction, extravagant building, luxurious art, sexual excess, alcohol and drugs, all the way through to the human sacrifice of ancient civilisation. Bataille put it down to humans having absorbed "excess energy" from the sun, and needing to expend it in various ways. He based his ideas on the anthropological theory of Marcel Mauss who explored gift economies and the "potlatch".

There is a weird symmetry between the squander of Holmes and the squander by students going to his (or other) universities. Since fees were introduced, university funding has quickly revealed itself to be a "Veblen good" - one where demand increases with price. Veblen, I suspect, subscribed to a similar theory to Bataille's - he regarded the University as "atavistic", and had a notorious reputation for seducing the wives of Vice-Chancellors. For the students, University is squander which the government encourages. It's actually a form of Keynesianism: what Colin Crouch calls "Privatised Keynesianism". £50,000 of debt for a piece of paper! Wow! That feels fantastic!

All intellectual life has an aspect of squander. At its best it's like the squander of the artist, or perhaps the squander of the priest who lives in self-imposed poverty. As intellectual accomplishment also carries a social status (Veblen acerbically observes that "the standing of the savant in the mind of the altogether unlettered is in great measure rated in terms of intimacy with the occult forces"), there is always a 'marketing' opportunity to sell the "fairy dust". The market has taken all of these forms of squander and turned them into an economic dynamic where student squander is matched by the squander of the likes of Holmes, by the unpleasant corporations making a killing on student accommodation and other services, and by the banks who are ramping up interest on student debt.

Catherine Bennett touched on this in her excellent Guardian piece about Holmes, asking what would happen if it all doesn't work:
"What will motivate our young people, supposing we accept the Holmes analysis, if they do not see how good jobs translate into high-end vehicle choices? How else will leaders like him advertise their very successful careers? There can be only one possible compensation for this loss: more money" (see
Is this sustainable? Can it carry on growing? Let's see more waste from students, and more outrageous peacock displays by the likes of Holmes! But this is an age of "austerity"! How does any of this make sense?

It's not an age of austerity. It's an age of squander. It's an age where a few like Holmes squander on Bentleys and yachts, whilst students squander on education. In the middle are the teachers and academics - the people with no time, let alone the money, to squander. That's the other side of Holmes's squander: slashing staff, hourly paid contracts, no security. This is the students' future. Bataille might suggest that we've reinvented human sacrifice.

Thursday, 24 August 2017

Uncertainty, Universities and Yacht-owning Vice-Chancellors

One of the greatest sources of confusion concerns the impact of technology on social change. When we survey history, we see patterns which appear to suggest that steam power caused the industrial revolution, and the various social and political transformations which accompanied it. Obviously, online shopping needed the internet; the collapse of the high street could also be seen as collateral damage from this, as was highlighted in an interesting (and rather depressing) piece about Bolton this week. But this causal connection between technology and social change is what is called "technological determinism" - and it clearly isn't true. Yet because a causal link between technological innovation and social change is constructed, a mindset sets in within institutions like universities and government which sees technological innovation as the solution to social problems like housing, welfare, employment or health. But it tends to produce new problems rather than solutions.

We need a new theory which connects technological innovation to social change. I've begun to think that we look at the wrong things if we examine what new tools can do - what psychologists call their "affordances". Rather than doing this, there is a simpler starting point: all new tools provide new ways of doing things. That is, basically, the definition of a tool: it creates a new option for acting.

Today we are continually bombarded with new options for acting: new online communication services (FaceTube), new ways of cutting mobile phone bills (LebaraFone), new ways of getting about (UberLyft), and so on. Our daily conversations often go something like "I use x, it's new - have you tried it? Much better than y".

There are a number of things about options which need to be considered. Any option has to be selected. We use all kinds of techniques for making selections - habit is the most powerful one - but the effort of deciding between options is work. David Graeber calls this "imaginative labour". If you give somebody more options, you give them more work.

From a more scientific perspective, if the number of options for acting is increased, then the probability that any particular option is selected decreases. If the probability of selecting an option decreases, the chances of somebody else guessing the option you have chosen also decreases. If the other person is not familiar with the scale of options you are selecting from, there is no chance of them guessing. This can make communication more difficult. There is no point in communicating a message to a friend through Twitter if they are not on Twitter.

In order to communicate, it is important that the range of options available to somebody sending a message is the same as the options available to the person receiving it. If this isn't the case, then there is a chance that something will be selected at one end which cannot be selected at the other. The same thing applies to language: to understand a message, the receiver needs some idea of the inner machinery (psychology) of the person uttering it, so that they can understand how the selection of words was made. These basic principles grow from Ashby's Law of Requisite Variety in cybernetics: in order for system B to manage the complexity of system A, B must have an equal or greater amount of complexity (variety). One way of expressing this is through Shannon's information theory, which measures the complexity of communications in terms of their "surprisingness", or the degree of uncertainty associated with them.

If adding a new option reduces the probability of selecting any particular option, and with it the probability of guessing which option has been selected, then uncertainty is increased. Technological innovation does nothing but increase uncertainty.
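
The reasoning above can be made concrete with a little arithmetic (my illustration, not part of the original argument): with n equally likely options, the probability of any one selection is 1/n, and the Shannon uncertainty is log2(n) bits, so every new option lowers the former and raises the latter.

```python
import math

def uncertainty_bits(n_options: int) -> float:
    """Shannon entropy, in bits, of a choice among n equally likely options."""
    # Uniform distribution: p = 1/n for each option, so
    # H = -sum(p * log2(p)) = log2(n).
    return math.log2(n_options)

# Each new option lowers the probability of guessing any particular
# selection (1/n) and raises the uncertainty of the choice overall.
for n in (2, 4, 8, 16):
    print(f"{n} options: p = {1/n}, H = {uncertainty_bits(n)} bits")
```

Doubling the options adds exactly one bit of uncertainty, which is why the proliferation of new services is relentless in its effect.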

Because of this, it is incorrect to say that "technology takes people's jobs". Institutions make people redundant, not computers. What causes institutions to do this is their own reaction to the uncertainties produced by technology and other things in their environment. It is the effect of uncertainty on the existential fears of managers, company directors, government ministers and so on which results in the misery of redundancy for staff (and never redundancy for those making the decisions about redundancy!). They tend to respond to it by off-loading their existential crises onto their employees as they seek to defend the structures of their institution in an increasingly uncertain environment.

Present crises of employment, equality, political expression, and institutional corruption are directly the result of institutions struggling to maintain themselves in an increasingly uncertain environment. As institutions seek to defend themselves by reinforcing their structures, sacking staff, and attempting to bend to "market conditions", they feed a growing political crisis which has the effect of ramping-up uncertainty in their environment. It is a positive feedback situation. Political uncertainty leads to new environmental complexities to which the institution has to adapt, sacking staff, reconfiguring courses, etc. Gradually institutions eat themselves.

At the heart of the problem is the way society manages its uncertainty. Traditionally, institutions like churches, universities, hospitals and government have been the means of managing uncertainty, which they have done by attenuating the environment and forcing individuals through regulated pathways. This worked because uncertainties could not proliferate: the transformation of the means of doing things was a slow process. Computers change that. The transformation of the means of doing things is rapid, and the growth of uncertainty is relentless. Institutions in their traditional form are not fit for the purpose of managing this uncertainty. Indeed, they make the situation worse.

The future of the management of uncertainty in society rests with effective use of technology and self-organisation between individuals. This will not replace universities and hospitals completely. But it will do many of the things which we currently associate with institutions. Meanwhile, the dinosaur institutions will hang on. Vice-chancellors will grab as much as they can to preserve their status and identity whilst things fall apart around them. Some of them will even give unwise interviews to national newspapers about how "well" they are doing. There is no better sign of the crisis we are in than a yacht-owning vice-chancellor.

My thoughts this week are with those he's just made redundant. 

Wednesday, 23 August 2017

Aesthetic Judgement and Teaching Evaluation

Is the judgement of good teaching an aesthetic judgement? It certainly isn't treated as if it is - the TEF basically adopts a series of disconnected proxy measures. It is nonsense - the TEF is really all about power structures - both within government and universities. How can judging good teaching not be like judging art? How can it not be highly subjective? How can it not be subject to revision after a period of time? How can it not depend on the degree of knowledge the person making the judgement already possesses? How can it not depend on the social dynamics and context where the judgement takes place? and so on.

There is a sense in which any aesthetic judgement is an assessment of context. A judgement is a process of converting experiences into discourse. If it is a painting, then we might say (perhaps incorrectly) that the object of the painting is converted into discourse. Really it's the experience of the object. If it is a performance - like teaching, or music - then it is a process of articulating the context in which an experience arises (boredom, excitement, the realisation that one has new skills or understanding). What about judging lovers? Or food? Or torture? Or a new car?

Articulated judgements are the result of converting experience into discourse, but discourse creates new objects. Critics and academics create objects out of discourse. These objects carry status and to have a judgement accepted by influential communities (like academic journal editorial boards - or even twitter retweets) is the strategic goal of many academics. Often the pursuit of that goal overrides the authentic articulation of experience in the first place. It becomes easier to cite the judgements of others (i.e. reinforce their object status) than it is to articulate experience from its foundations.

The creation of objects in discourse can reinforce the status of the objects which gave rise to the experience in the first place. Marketeers do this all the time. The new car gives rise to experiences which are codified into a discourse which establishes itself as an object and which reflects back on to the object to which it refers. Only when something breaks down in the original object is this cycle challenged (the VW emissions scandal is a good example).

The process of forming aesthetic judgements is a process of managing uncertainty. Teaching performances, art, music all create an uncertain environment which is confusing. This is its power. Adaptation to this uncertainty involves the perception of pattern and form. That process in itself is a seeking of things which are the same and things which are different. Judgement depends partly on induction and induction depends on regularity, similarity, identifiable succession. This process is also uncertain. The formation of utterances about experiences is rarely purely individual - particularly in the case of teaching. It involves conversation between people experiencing the same thing and the coordination of many different descriptions of the performance. However, in the domain of discourse and conversation, some of the distinctions which might be reflexively perceived individually get lost.

In higher learning, there is a continual disruption to the objectification of discourse. There is a continual pulling apart of concepts within a group so that individual perceptions are not lost. This is a dialogical process which seems to be increasingly rare in universities. I fear it is partly because those individuals who are best at it perform the worst in current measures of teaching excellence and are dispensed with!

Sunday, 20 August 2017

Potlatch and Education

I've been on a bit of a journey in thinking about the relation between education and economics. It seems to me that the proponents of market-oriented rhetoric in education, and its opponents, are both preaching from economics textbooks which are fundamentally wrong on many important issues, and which certainly do not explain the complex things that happen in education.

Wednesday, 16 August 2017

Bateson on "Pride and Symmetry"

I've been thinking a lot about symmetry recently, and in recommending a student read Bateson's paper on Alcoholics Anonymous ("The cybernetics of self"), I noticed a sub-heading which struck me with more force than it did when I last looked at the paper: "Pride and Symmetry". Bateson was interested in symmetrical relations - particularly social symmetrical relations. This is where his notion of symmetrical and complementary schismogenesis comes from, and he uses this idea to explain the double-bind that the alcoholic is in:

The so-called pride of the alcoholic always presumes a real or fictitious “other” and its complete contextual definition therefore demands that we characterize the real or imagined relationship to this “other.” 
A first step in this task is to classify the relationship as either “symmetrical” or “complementary” (Bateson, 1936). To do this is not entirely simple when the “other” is a creation of the unconscious, but we shall see that the indications for such a classification are clear. 
An explanatory digression is, however, necessary. The primary criterion is simple: If, in a binary relationship, the behaviors of A and B are regarded (by A and B) as similar and are linked so that more of the given behavior by A stimulates more of it in B, and vice versa, then the relationship is “symmetrical” in regard to these behaviors. If, conversely, the behaviors of A and B are dissimilar but mutually fit together (as, for example, spectatorship fits exhibitionism), and the behaviors are linked so that more of A’s behavior stimulates more of B’s fitting behavior, then the relationship is “complementary” in regard to these behaviors. 
Common examples of simple symmetrical relationship are: armaments races, keeping up with the Joneses, athletic emulation, boxing matches, and the like. Common examples of complementary relationship are: dominance-submission, sadism-masochism, nurturance-dependency, spectatorship-exhibitionism, and the like. More complex considerations arise when higher logical typing is present. For example: A and B may compete in gift-giving, thus superposing a larger symmetrical frame upon primarily complementary behaviors. Or, conversely, a therapist might engage in competition with a patient in some sort of play therapy, placing a complementary nurturant frame around the primarily symmetrical transactions of the game. 
Various sorts of “double binds” are generated when A and B perceive the premises of their relationship in different terms: A may regard B’s behavior as competitive when B thought he was helping A. And so on. With these complexities we are not here concerned, because the imaginary “other” or counterpart in the “pride” of the alcoholic does not, I believe, play the complex games which are characteristic of the “voices” of schizophrenics. Both complementary and symmetrical relationships are liable to progressive changes of the sort which I have called schismogenesis (Bateson, 1936).
Symmetrical struggles and armaments races may, in the current phrase, “escalate”; and the normal pattern of succoring-dependency between parent and child may become monstrous. These potentially pathological developments are due to undamped or uncorrected positive feedback in the system, and may, as stated, occur in either complementary or symmetrical systems. However, in mixed systems schismogenesis is necessarily reduced. The armaments race between two nations will be slowed down by acceptance of complementary themes such as dominance, dependency, admiration, and so forth, between them. It will be speeded up by the repudiation of these themes. This antithetical relationship between complementary and symmetrical themes is, no doubt, due to the fact that each is the logical opposite of the other. 
In a merely symmetrical armaments race, nation A is motivated to greater efforts by its estimate of the greater strength of B. When it estimates that B is weaker, nation A will relax its efforts. But the exact opposite will happen if A’s structuring of the relationship is complementary. Observing that B is weaker than they, A will go ahead with hopes of conquest (cf. Bateson, 1946, and Richardson, 1935). 
This antithesis between complementary and symmetrical patterns may be more than simply logical. Notably, in psychoanalytic theory (cf. Erikson, 1937), the patterns which are called “libidinal” and which are modalities of the erogenous zones are all complementary. Intrusion, inclusion, exclusion, reception, retention, and the like: all of these are classed as “libidinal.” Whereas rivalry, competition, and the like fall under the rubric of “ego” and “defense.”

Saturday, 5 August 2017

Gombrich on Semiotics

I'm going to begin my talk on Peirce next week with Ernst Gombrich's preface to the 2000 edition of Art and Illusion. This short piece fascinated me as much as the contents of the rest of the book when I first encountered it nearly 20 years ago. Art and Illusion is about the relationship between art and nature, and the Greek idea of mimesis. The relationship between art, pictures and signs is clearly an important sub-topic in this, and this is what Gombrich wrote his new preface about - at a time when semiotics was much discussed in the art schools (late 90s, early 2000s). His aim was to correct the current fashion which argued from a constructivist position that all images were signs, and that the Greek idea of mimesis was nonsense. Gombrich begins:

[the] commonsense interpretation of the history of Western art has recently been attacked on the ground that the whole idea of mimesis, truth to nature, is a will-o'-the-wisp, a vulgar error. There never was an image that looked like nature; all images are based on conventions, no more and no less than is language or the characters of our scripts. All images are signs, and the discipline that must investigate them is not the psychology of perception—as I had believed—but semiotics, the science of signs.
Gombrich argues that this reaction is overstated: the thirst for illusion is unabated - the goal of mimesis captivates the imagination. Gombrich, always ahead of his time, points out the technological advances in pursuit of mimesis:
Simulators were developed for the training of pilots, who put on a helmet through which their eyes were fed the appearance of an environment rushing past, which they were asked to control. More recently, so called "virtual reality" has been perfected, which allows us not only to see and hear an invented reality but even to touch it with specially constructed gloves. I do not know whether this device will, or can become a medium of art; all that matters in the present context is the undeniable evidence that images can be approximated to the experience of reality
Gombrich talks about the 'mental set' - the field of expectations - through which signs are interpreted. He points out the playfulness in the interpretation of signs, and the shifts in mental set. For example, the puppet theatre which might transfix the child's imagination in a story suddenly disrupts this expectation when the giant puppeteer's hand appears in the scene to move a character.

What Gombrich appears to be talking about are the constraints within which signs are interpreted: that a sign is not a construct of some individual mind, but that it is the result of a game played within multiple contexts (or constraints) of sensory stimuli, life experiences, expectations, education, social situations, and so on. The game of mimesis is played between image, perception and illusion, among many other things.

The contributing factors in the game are additional descriptions. He says:

A string of ovals can also be an ornament purely used for decoration, as in this case: 0000. But add the word "PLUM" underneath and you transform the mental set: the oval no longer appears to stand on a neutral background, it is surrounded by an infinite halo of space, because we expect plums to be solid, and not only to be edible, but also graspable—an effect we can further enhance by the suggestion of a foreshortened stalk and leaves.

He comes to the crux of the issue, highlighting the importance of the game that is played in recognising a sign:
We come to realize in such cases that the required mental set did not precede the reading, but followed in a rapid feedback process. Where signs and images appear together on the page the feedback works almost instantly—witness the ease with which our youngest read so-called comics, combining pictures with a simple story. 
The difference between images and signs, then, does not lie in the degree of iconicity or conventionality. Images can function as signs as soon as they are recognized. We need only think of the labels on cans to realize that a perfect iconic image can function as a sign.

Gombrich tells a story about Constable whose judgement about early photography is very revealing of both his and Gombrich's attitudes to the relationship between the image and nature:
In 1823 Constable visited a sensational display, the diorama constructed by Daguerre, later the inventor of the daguerreotype. "It is in part a transparency," he wrote, "the spectator is in a dark chamber, and it is very pleasing and has great illusion. It is outside the pale of art because its object is deception. The art pleases by reminding, not deceiving."
In reflecting on what Constable might have meant by "outside the pale of art", Gombrich says:

Would we go quite wrong in suggesting that, for Constable, art had become something like a game of skill, with its own rules, which must be kept free of labor saving devices? To deceive the eye is to cheat, for the painter must please by reminding, just as the playwright of Shakespeare's Prologue must work on our "imaginary forces." Fidelity to nature has to be achieved within the limits of the medium. Once this compact between the artist and the beholder is destroyed, we are outside the pale of art. Indeed, as soon as Daguerre's and Fox Talbot's mechanical methods entered the field, art had to shift the goalposts, and move the pale elsewhere. 

There's something very profound in what it is to remind rather than deceive. I wrote something about this with regard to music a few years ago. Art reminds by overlaying descriptions on top of one another. I think its interplay of multiple descriptions reminds us of the interplay of multiple descriptions in our lived experience. To deceive us of reality is to identify and reproduce as faithfully as possible the descriptions of actual experience. Since the actual experience of one person and another is different, this deception necessarily abstracts from individual experience the principal descriptions which it takes to be universal - some of these abstracted descriptions can be taken as 'signs'. The complexity of the interaction of abstracted descriptions is never the same as the overlaying of multiple descriptions to produce complexity.

Tuesday, 1 August 2017

Semiotics and Symmetry

Next week I'm giving a talk about Peirce and Quaternions at the Alternative Natural Philosophy Association. It's a fascinating group which was introduced to me by Peter Rowlands at Liverpool, and I've quite enjoyed getting stuck in to thinking about Peirce long after I'd thought I'd left all that stuff behind.

The thing which has dragged me back to Peirce is his interest in quaternions, which Peter Rowlands introduced me to through his physical theory. It was a coincidence that I discovered that Peirce had been fascinated by Hamilton's work too - largely because of his father who was quite an eminent mathematician. In Peirce's writing, the quaternion tables are quite prominent, and I'm pretty convinced that his obsession with tripartite structures derives from this.

What put me off Peirce was a kind of semiotic dogmatism which analysed the stuff of the world as Symbol, Icon and Index, obsessing about Interpretants, signs and representamens. There didn't seem to be any ground for the dogmatism. But of course, this was the fault of those who jumped onto the Peirce bandwagon, not the man himself. Even within more thoughtful scholarship, with its emphasis on semiosis as process (which it clearly is), the Peircian categories are overlaid as if to say "this sign is produced by this process".

What does it mean to say "this sign" anyway? This has got me thinking about contexts, and whether Peirce's sign theory is really a theory about the context of signs.

A tripartite, anticommutative, symmetrical idea like the quaternions is an interesting way of thinking about contexts. We detect sameness and stability through a continually changing context. To say "this is a sign" is to make a declaration about something remaining the same despite changes in the context of its perception. Peirce's distinctions between Representamen, Object and Interpretant are different dimensions of the context, and within each dimension there are a further three subdivisions: the sign's relation to its object breaks down into Icons, Indexes and Symbols, for example. His firstness, secondness and thirdness feel like the three dimensions which hold the structure together.

This has significance for the way we think about analogy, sameness and induction (which is dependent on analogy) - Peirce was doing logic after all.

Sameness, counting, induction and analogy are all declarations: we say "this is a chair" because of its sameness with other chairs. We say "there are three chairs" because of the sameness between them. Of course, in making declarations like that, we are producing signs; but the declaration itself is necessitated by the differences between the context of the perceptions of the objects. Nothing is ever really "the same".

However, things may be symmetrical. To make a sign and say "this is a chair" is to respond to the differences of context in which chairs are perceived. Might those changes in context result from an anti-commutative rotational symmetry? I'd like to explore this. What of the anti-commutative rotational symmetry of the statement "this is a chair?" - are the changes in the contexts related? What are their dynamics? How might we investigate it?
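
To make "anti-commutative rotational symmetry" concrete, here is a minimal sketch (my illustration; the mapping from quaternions to contexts of perception is speculative, as above). The Hamilton product of the quaternion units reverses its sign when the order of multiplication is reversed:

```python
# Hamilton's rules: i*i = j*j = k*k = i*j*k = -1.
# A quaternion is represented as a tuple (w, x, y, z) = w + xi + yj + zk.

def qmul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

i = (0, 1, 0, 0)
j = (0, 0, 1, 0)
k = (0, 0, 0, 1)

# i*j = k, but j*i = -k: the same pair of rotations composed in
# opposite orders gives opposite results - anti-commutativity.
print(qmul(i, j))  # (0, 0, 0, 1)
print(qmul(j, i))  # (0, 0, 0, -1)
```

Composing the same two "rotations of context" in opposite orders yields opposite results; that is the symmetry being asked about.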

The best way we have of investigating a context - or a constraint - is information theory. It's a crude instrument. However, it does allow us to look at the many descriptions of something (the statements people make about it) and see how they relate to one another. The most interesting place to do this is over time-based media like music or video: constraints change over time, and it is possible to explore the dimensions of constraint over time, and particularly the way that changes in one constraint relate to changes in another.

The technique for doing this is known as relative entropy. A similar technique is used for exploring the presence of entanglement in physics. There, the descriptions of charge, mass, space and time seem fixed - and yet, do these properties also change the context for observation?
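
As a hedged sketch of the technique (with invented counts, the letters merely standing in for categories of description people might use), relative entropy, the Kullback-Leibler divergence, measures how far one distribution of selections departs from another:

```python
import math
from collections import Counter

def kl_divergence(p_counts, q_counts):
    """Relative entropy D(P||Q), in bits, between two frequency distributions."""
    p_total = sum(p_counts.values())
    q_total = sum(q_counts.values())
    d = 0.0
    for s in set(p_counts) | set(q_counts):
        p = p_counts.get(s, 0) / p_total
        q = q_counts.get(s, 0) / q_total
        if p > 0:
            if q == 0:
                return float('inf')  # P selects an option Q never selects
            d += p * math.log2(p / q)
    return d

# Invented counts: descriptions offered of an early and a late passage.
early = Counter("AABAC")  # constraint favouring description A
late = Counter("ABCBC")   # constraint shifting toward B and C
print(kl_divergence(early, late))   # > 0: the constraints have moved
print(kl_divergence(early, early))  # 0.0: identical constraints
```

A divergence of zero means the two constraints coincide; the larger the value, the more the context of description has shifted between the two passages.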

Questions like this are intriguing because they hint at a closing gap between the physical and the social sciences.

Sunday, 30 July 2017


A bit more work to do here (I'm using which is great at hosting and updating the versions of documents. It means that this chapter will magically improve over time!)

Thursday, 27 July 2017

Beer and Illich on Institutional Change: Uncertainty at the heart of the system

Stafford Beer's "Platform for Change" is an extraordinary book which sets out  diagrammatically to document the processes by which the world might move from pathological institutions, markets, exploitation and environmental destruction, to a viable world which lives within its means. The diagrams get more complex, as the book goes on, culminating in this:
Which is a bit daunting. However, there are things to notice. Within each of those boxes, there is a smaller box at the bottom with a "U" in it: this is "Undecidability". I think it could equally be called "Uncertainty", but it is worth noting that around every heavy-type box (in bold), there is a lighter type box which is connected to the "U" box, and which is labelled with things like "Metasystem", "conscience", "reform", and so on. 

Beer's point in Platform for Change is that the way society manages its "undecidability", or uncertainty, causes pathology. This is most clear from his diagram of the difference between old institutions and new institutions.
What manages uncertainty in the pathological "old" institutions at the top? The "Metalanguage of Reform". This is the drive for "restructuring", "privatisation", "outsourcing" and so on. What does "structure" mean in the first place? It sits in the middle of the box: the hierarchical organisation of most institutions.

Feeding in to the whole thing in the pathological institution is "Homo Faber" - the maker of increasingly powerful tools which dictate how people should live and drive people into increasing technocratisation. On the other side, we clearly see that this comes at the cost of the "Exploitable earth", with exploitable people, and cost-benefit analysis. On the right hand side at the top, Beer sees the "conservation" movement as the management of uncertainty about the exploitable earth with a metalanguage of "conscience" which is managed by the conservationist's discourse. Of course, this is a reaction to the pathology, but it also appears as part of the overall system of the problem. 

What to do about it?

Uncertainty (or undecidability) has to be managed in a different way. In the lower part of the diagram, Beer imagines a different kind of institution which facilitates the coordination of uncertainty among the different people who engage with it. The Undecidability box is connected to a "Metalanguage of Metasystem" - a way of having a conversation about the way we have conversations. 

Technology works with this not as the continual pathological product of Homo Faber who produces ever-more powerful tools, but as an appropriate response to establishing synergy in the system. Feeding it and monitoring it is "Homo Gubernator" - whose actions are dedicated to maintaining viability, providing safeguards and monitoring the eudemony in the system. 

Of course, it all raises questions - but they're good questions. I've been struck by the similarity between Beer's thought and that of Ivan Illich in his Tools for Conviviality.

For Illich, the problem of the pathological institution (the top of the diagram) is the declaration of "regimes of scarcity": the need to maintain institutional structures in the face of environmental uncertainty, which often takes the form of increasing specialisation, educational certification, division between people in society, and the ever increasing power of tools. This is a positive feedback mechanism whereby increasingly powerful tools generate more uncertainty in the environment which entails a need for more institutional defence, more scarcity declarations, and so on. It is this pathological way of dealing with uncertainty which is the underlying mechanism of the appalling inequality which we are now experiencing. 

For Illich, education lies at the heart of the means to transform this into what he calls a "convivial society". The education system we have produces scarcity declarations about knowledge, and supports professionalisation which alienates people and creates division (we've seen this with populism). 

The solution to this is to invert education - to make knowledge and learning abundant rather than scarce, and to create the conditions for conviviality. Conviviality is an alternative way of managing uncertainty. Its diagrammatic representation is in the "New Institution" box at the bottom of Beer's diagram. Quite simply, conviviality is where each person manages their uncertainty by engaging directly with each other person. Intersubjectivity is the most powerful mechanism for dealing with uncertainty that we have. We do not have to create institutions to manage uncertainty, nor do we need to create ever more powerful tools.

Illich closes the system loop because he sees the limiting of tools as the critical factor in the establishment of a viable convivial society. This limiting is a politicising of technology: it is where a convivial society determines through dialogue what tools are needed, what should be limited, and how it should manage its resources. In effect it is a communitarian approach to managing the commons of education, environment, tools and people - very similar to that studied by Elinor Ostrom.

To do this, educational technology is a critical component. We need abundance of information and skill. We need open education and open resources for learning. 

But the most important thing is to see that the route to viability (and the root of our current pathology) is uncertainty. 

Wednesday, 12 July 2017

Winograd and Flores on Computers and conversation

Winograd and Flores wrote this in 1984. Have things changed much?
Computers do not exist, in the sense of things possessing objective features and functions, outside of language. They are created in the conversations human beings engage in when they cope with and anticipate breakdown. Our central claim in this book is that the current theoretical discourse about computers is based on a misinterpretation of the nature of human cognition and language. Computers designed on the basis of this misconception provide only impoverished possibilities for modelling and enlarging the scope of human understanding. They are restricted to representing knowledge as the acquisition and manipulation of facts, and communication as the transferring of information. As a result, we are now witnessing a major breakdown in the design of computer technology - a breakdown that reveals the rationalistically oriented background of discourse in which our current understanding is embedded. 

[...] Computers are not only designed in language but are themselves equipment for language. They will not just reflect our understanding of language, but will at the same time create new possibilities for the speaking and listening that we do - for creating ourselves in language. (Understanding computers and cognition, p78)

Later on Winograd and Flores defend their argument that computers are tools for keeping track of commitments that people make to each other through recording speech acts. They argue:

New computer-based communication technology can help anticipate and avoid breakdowns. It is impossible to completely avoid breakdowns by design, since it is in the nature of any design process that it must select a finite set of anticipations from the situation. But we can partially anticipate situations where breakdowns are likely to occur (by noting their recurrence) and we can provide people with the tools and procedures they need to cope with them. Moreover, new conversational networks can be designed that give the organisation the ability to recognise and realise new possibilities.   (p158)

I'm curious about this because it resonates with many of the aims of big data today. Winograd and Flores were anti-AI, but the mass storage of speech acts clearly does serve to reveal patterns of recurrence and breakdown, and these patterns provide a kind of anticipatory intelligence (which is what Google Now does).

I think the real issue concerns a deeper understanding of language and conversation, and particularly the inter-subjective nature of conversation - that is, the con-versare nature of it (dancing). 

Saturday, 8 July 2017

Interoperability and the Attenuation of Technological Possibility: Towards Socially Responsible Hacking?

I owe at least 10 years of my career directly or indirectly to generous funding from JISC in the UK and the EU commission. The underpinning rationale which attracted this research money was interoperability in educational technology. It was the presence of the Centre for Educational Technology and Interoperability Standards (CETIS) at the University of Bolton which created the conditions for engagement in a wide range of projects. The University of Bolton, of all places, had the greatest concentration of technical experts on e-learning in the world (something continually reinforced to me when I meet colleagues from overseas: "Bolton? You were a world-leader!").

Most of the project funding opportunities have now gone. JISC survives in a very different form, on a mission to keep itself going on a commercial footing which has become problematic. The EU closed its Technology Enhanced Learning strand a couple of years ago (hardly surprising, since there were rather too many very expensive projects which delivered little - even for the EU!). CETIS survives as an independent Limited Liability Partnership (LLP), albeit in a role of more general IT consultancy for education rather than a focused mission to foster interoperability. The international agency for interoperability in education, IMS, seems to have largely ceded the debate to big commercial players like Blackboard, who talk the language of interoperability as a sales pitch but have little interest in making it happen.

Now that I am settled elsewhere - and, I'm pleased to say, soon to be joined by a former CETIS colleague - it seems like a good time to think about interoperability again. In my current role, interoperability is a huge issue. It is because of interoperability problems that my faculty (just the faculty!) runs four different e-portfolio systems. It is because of a lack of interoperability that the aggregation and analysis of data from all our e-learning platforms is practically impossible (unless you do something clever with web automation and scraping, which is my current obsession). It is because of interoperability problems that individual parts of the faculty will seek new software solutions to problems which ought to require merely front-end adjustments to existing systems. And interoperability problems, coupled with pathological data-security worries, create barriers to systems innovation and integration. Eventually, this becomes unsustainable.
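For what it's worth, the scraping workaround mentioned above needs nothing exotic. The sketch below, using only Python's standard library, pulls tabular data out of an HTML fragment standing in for a fetched e-portfolio screen. The page content and field names are invented for illustration; a real system would first need an authenticated fetch (e.g. with requests or Selenium) before parsing.

```python
from html.parser import HTMLParser

# Stands in for HTML fetched from a platform that has no API.
SAMPLE_PAGE = """
<table id="submissions">
  <tr><td>Alice</td><td>Portfolio A</td></tr>
  <tr><td>Bob</td><td>Portfolio B</td></tr>
</table>
"""

class CellCollector(HTMLParser):
    """Collects the text of every <td>, grouped into rows by <tr>."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_td = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_td = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag == "td":
            self._in_td = False

    def handle_data(self, data):
        if self._in_td and data.strip():
            self._row.append(data.strip())

def scrape_rows(page: str):
    """Return the table contents as a list of rows of cell text."""
    parser = CellCollector()
    parser.feed(page)
    return parser.rows

rows = scrape_rows(SAMPLE_PAGE)
```

Once data from several platforms is extracted into a common row format like this, aggregation and analysis across systems becomes a routine exercise rather than an impossibility.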

So given all the effort that went into interoperability (my first JISC project was an investigation of interoperable web services in E-portfolio in 2004 - the project concluded that the available interoperability models didn't work and that something should be done about it), how have we got here?

Any new technology creates new possibilities for action. The ways of acting with a new tool may be very different from the ways of acting with existing tools. This means that if there is overlap in the functionality of one tool with another, users can be left with a bewildering choice: do I use X to do a,b and c, or do I use Y to do a, c and z? The effect of new technologies is always to increase the amount of uncertainty. The question is how institutions should manage this uncertainty.

CETIS was a government-funded institutional attempt to manage the uncertainty caused by technology. It served as an expert service for JISC, identifying areas for innovation and recommending where calls for funding should be focused. CETIS is no longer funded by government because government believes the uncertainties created by technology in education can be managed within institutions... so my university ends up with four e-portfolio systems in one faculty (we are not alone). This is clearly bad for institutions, but not bad in terms of a libertarian philosophy of supporting competition between multiple providers of systems. Having said this, the interoperability battle was lost even when CETIS was flourishing. The dream of creating an educational equivalent of MIDI (which remains the golden child of systems interoperability) quickly disappeared as committees set about developing complex specifications for e-portfolio (LEAP, LEAP2, LEAP2a), the packaging of e-learning content (SCORM, IMS Content Packaging), the sequencing of learning activities (IMS Learning Design, IMS Simple Sequencing) and, more recently, learning analytics (xAPI).
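For a flavour of what these specification efforts standardise, here is a minimal statement in the actor-verb-object shape that xAPI defines, expressed in Python. The learner, activity names and identifiers are illustrative placeholders, not references to any real system.

```python
import json

# A minimal learning-record statement in the actor-verb-object shape
# standardised by xAPI. All identifiers below are invented examples.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.org",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.org/activities/eportfolio-task-1",
        "definition": {"name": {"en-US": "E-portfolio task 1"}},
    },
}

# In use, this JSON would be POSTed to a Learning Record Store (LRS).
payload = json.dumps(statement)
```

The simplicity of the shape is deceptive: as with the earlier specifications, the committee work lies in agreeing the vocabularies of verbs and activity types, not in the syntax.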

All of this activity is bureaucratic. Like all bureaucratic processes, the ultimate result is a slowing down of innovation (importantly, this is NOT what happened with MIDI). Whilst technology creates new possibilities, it also creates new uncertainties, and bureaucratic processes act as a kind of weir to stem the flow of uncertainty. Institutions hate uncertainty. In the standards world, this is achieved by agreeing different-shaped boxes into which different things can be placed. Sometimes the boxes are useful: we can ask a vendor of an e-portfolio system, "Does it support LEAP2a?" They might say "yes", meaning that there is an import routine which will suck in data from another system. However, much more important is the question "Does it have an API?" - i.e. can we interact with the data without going through the interface, and do new things which you haven't thought about yet? The answer to this is almost always no! The API problem has become apparent with social media services too: APIs have become increasingly difficult to engage with, and less forthcoming in the data they provide. This is for a simple reason: each of the clever things you might want to do with the data is something the company would rather provide as a new "premium service".

An alternative to the institutional bureaucratic approach to the interoperability problem would manage the uncertainties created by technology in a different way: embrace new uncertainties rather than attenuate them, and create situations within institutions where processes of technical exploration and play are supported by a wide range of stakeholders. One of the problems with current institutionally attenuative approaches to technology is that the potential of technology is under-explored. This is partly because we are bad at quantifying the new possibilities of any new tool. In working with most institutional tools, we quickly hit barriers which dictate "We can't do that", and that's the end of the story. But there are usually ways of overcoming most technical problems. This is what might be called the "socially responsible hacking" approach to interoperability. With the failure of bureaucratic interoperability approaches, this may be the most productive way forwards.

Socially Responsible Hacking addresses the uncertainty of new technology in dialogue among the various stakeholders in education: programmers who see new ways of dealing with new and existing tools, teachers who seek new ways of organising learning, managers who seek new opportunities for institutional development, learners who seek new ways of overcoming the traditional constraints of institutions, and society, within which educational institutions increasingly operate as something apart rather than as an integral component.

Saturday, 1 July 2017

Ivory Towers and the Grenfell Tower: The problem with Evidence

The Grenfell Tower fire represents a collapse of trust in expertise and evidence, and will bring about a reawakening of scepticism. Newsnight's report on "How flammable cladding gets approved" raises questions about the role of evidence well beyond fire safety. In health, education, welfare, economics and housing policy, evidence is the principal aid to decision-making. What Enid Mumford calls "dangerous decisions" are supported by studies which demonstrate x or y to be the best course of action. The effect of these studies is to attenuate the range of options available to be decided between. Of course, in that attenuation, many competing descriptions of a phenomenon or subject are simplified: many descriptions are left out; some voices are silenced. Usually, the voices that are silenced are those "on the edge": the poor, immigrants and the occasional "mad professor". From Galileo to Linus Pauling, history tells us that these people are often right.

Understanding "evidence" as "attenuation" helps us to see how easily "evidence-based policy" becomes "policy-based evidence". Evidence can be bent to support the will of the powerful. The manifestations of this exist at all levels, from the use of econometrics to produce evidence in support of austerity, to the abuse of educational theory in support of educational interventions (of which so many educational researchers, including me, are guilty). It helps academics to get published and to raise our status in the crazy academic game - and, once established in the sphere of the University, the habit sticks. Effective decision-making is intrinsic to effective organisation. If organisational pathology creeps in, decision-making within the pathological organisation will be constrained in ways which obscure real existent problems.

The deeper problems concern academia's and society's allergy to uncertainty. We hold to an Enlightenment model of scientific inquiry, with closed-system experiments and the identification of causal relations through the production of event-regularities. Too often we pretend that the open systems with which we engage are closed systems, whose event-regularities are no longer physical events but statistical patterns. Stafford Beer's joke that "70% of car accidents are caused by people who are sober", entailing that we should all drink and drive, highlights the danger of any statistical measure: it is an attenuation of descriptions, and often an arbitrary one at that.
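Beer's joke trades on ignoring the base rate, which a few lines of arithmetic make plain. All the numbers below are invented purely for illustration:

```python
# Beer's joke ignores the base rate. Invented illustrative numbers:
# 90% of drivers are sober, and a drunk driver is ten times more
# likely to have an accident than a sober one.
sober_share, drunk_share = 0.90, 0.10
p_accident_sober, p_accident_drunk = 0.001, 0.010

accidents_from_sober = sober_share * p_accident_sober    # 0.0009
accidents_from_drunk = drunk_share * p_accident_drunk    # 0.0010

share_of_accidents_sober = accidents_from_sober / (
    accidents_from_sober + accidents_from_drunk)
# Nearly half of all accidents (about 47%) involve sober drivers,
# even though drunk driving is ten times riskier per driver.
```

The "70% of accidents" figure describes the population of drivers, not the riskiness of sobriety: the attenuated statistic silently discards the description that mattered.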

The computer has changed the way we do science; in almost all areas of inquiry, from the humanities to physics, probabilities are what we look at. These are maps of uncertainty, not pointers to a likely successful outcome or a statistically proven relation between an independent variable and a probability distribution. What is an independent variable, after all? It is a single description chosen out of many. But its very existence is shaped by the many other descriptions which are excluded by its isolation. And we don't seem to care! I review endless depressing papers on statistical approaches to education and technology, and I see these assertions made without the slightest whiff of doubt, simply because that is how so many other published papers do it. I reject them all (although always gently - I hate horrible reviews - and always inviting authors to think harder about what they are doing).

Uncertainty is very difficult (probably impossible) to communicate through the medium of the academic journal article. The journal article format was devised in 1665 for an Enlightenment science which is radically different from our own. Of course, in its time, the journal was radical. The effect of printing on a new way of conducting and communicating science was only just opening up. Printing was doing to the academic establishment what it had done to the Catholic church a century before. Enlightenment scholars embraced the latest technology to harness their radical new practices.

We should be doing the same. The experiments on building cladding are easily demonstrable on YouTube. Equally, uncertainties about scientific findings can be expressed in rich ways using new media which are practically impossible in the journal. The scientists should learn from the artists. Furthermore, technology provides the means to democratise the making of descriptions of events. No longer is the description of an event the preserve of those with the linguistic skill to convey a compelling account in print. The smartphone levels the playing field of testimony.

Our decisions would be better if we became accustomed to living with uncertainty, and more comfortable living with a plurality of descriptions. The idea of "evidence" cuts against this. We - whether in government or academia - do not need to attenuate descriptions. Uncertainties find their own equilibrium. Our new media provide the space where this can occur. Universities, as the home of scholarly practice in science, should be working to facilitate this.

Friday, 30 June 2017

The Paradox of Institutional Change in Universities: The Strategic Need for a Pincer-Movement

The last 10 years have seen most Universities in the UK undergo significant restructuring. These processes, which are still ongoing - most terribly at Manchester and the OU at the moment - are intended to transform institutions' financial viability and "market appeal", improve the student experience, and increase competitiveness in research and teaching.

The results from the last 10 years of restructuring tell us quite clearly that NONE of this actually occurs. Departments may be closed and salaries saved, but within a few years the salary bill creeps up to exceed what it was before. Staff morale is damaged by the autocratic processes through which friends, colleagues and (most importantly) conversations are broken up. The atmosphere in institutions whilst restructuring occurs is dismal, and this has an impact on students.

The recruitment of new (cheaper, younger) staff can also be highly problematic. Some of these will be adjuncts, paid very little and struggling to survive, let alone teach their large (and highly profitable) Masters class of overseas students in the Information Systems department. These people cling on to the academy in the hope that something better comes up. But things continue to get worse. Other new staff will be recruited on a kind of "metric" basis: the candidate with the most papers wins! Never mind what they are like as people, how collegial they are, or how much they care about their students. And often they are appointed by a few senior colleagues, because the junior staff who keep the department going are all at risk of redundancy.

The spirit of despondency turns out to be highly contagious. The new staff - particularly the good ones - leave. The students complain - although they continue to attend in sufficient numbers to keep the thing on the road because almost everywhere else is the same.

Who benefits from restructuring? Usually, only the person who thought the thing up.  There is a real and deep question about institutional change which needs to be addressed.

Organisms change their structure when the structure of their environment changes. What is the environment of the University? With the student-as-customer rhetoric, are students cast in the role of the "environment" of universities? Universities seem to believe this, because they attempt to adapt to meet student expectations.

But many would argue that society at large is the environment of the university. What is the relation between the University and society? Well, it is one of circular causation. The university produces important aspects of society (its knowledge), and society produces the university through society's requirement to think about itself and produce new components like doctors, teachers and government ministers.  Of course, society includes learners... and teachers, administrators, tax payers, voters, Brexiteers, Remainers, banks, Corbynites and Theresa May.

History tells us that Universities do change over time. Like biological organisms, change comes about through adaptation to changes in the environment - to changes in society. Francis Bacon's 1605 "The Advancement of Learning" was a wake-up call to universities, just as the Reformation was a wake-up call to the Catholic church. Curiously though, the members of the 17th-century "invisible college", beavering away at scientific experiments outside the university, had all been through the academic establishment at some point. The early IT pioneers like Gates and Jobs, the military developers of the internet and the Whole Earth Catalogue existed on the periphery of the institution in a counter-cultural bubble. The same might be said of the off-piste development of Bitcoin in the late 2000s.

University change takes the form of organic absorption of the counter-culture. Jazz improvisation, for example, moves from seedy strip joints to the university classroom (with its professors of jazz) in the space of 90 years.  The only counter-cultural development which has resisted this seems to be the sex industry, and yet its adoption and development of technology has paved the way to the iPlayer and lecture capture!

What can we learn from this?

  • Institutional restructuring is institutional self-harm. 
  • If institutions change in response to changes in their environment, perhaps they should consider nurturing environmental changes which they might find challenging in the short-term, but to which their adaptation will be fruitful in the future. 
  • The obvious thing here is to develop feasible free personal certificated learning - but this is NOT a MOOC and it is not a marketing exercise. The institution doesn't need to make its presence felt, but to support social movements. 
  • Institutional change is likely to result from a pincer-movement: Constructive internal initiatives to help an institutional culture thrive are good, but they go hand-in-hand with initiatives to develop challenging things in the environment. 

Wednesday, 28 June 2017

Viable Institutions

Still a lot to do here, but it's taking shape...

Saturday, 24 June 2017

Government as Steering: Cybernetics and the Coming Labour Government

The joy surrounding Jeremy Corbyn's success in the election masks the need to do some very difficult work if a left-wing Labour government is to deliver on its promise to transform society. There is muddle-headedness about the practicalities of government, the way events can overtake good intentions (no politician would have wanted a Grenfell on their watch), and the sheer challenge of keeping a political machine together which always seems hell-bent on self-destruction (all political parties seem to have this tendency).

Now is a golden opportunity to do this work. Corbyn has the luxury of opposition, where his grip on the party has been strengthened, and public expectation of a Corbyn victory (unthinkable before the election) has shifted significantly. These are real achievements.

Labour, and Corbyn, have got here because the Tories don't know how to govern. They see the world in a linear and hierarchical way, where simple "strong and stable" solutions can solve intractable problems. When things don't work out the way they wished (like the deficit coming down), the Tories tend to carry on regardless: strong and stable. This isn't government. It is ideological extremism.

"Government" and "governor" come from the same Latin root, gubernator. The Watt governor is the simplest idea of governing:

The Watt governor 'steers' the engine, increasing the flow of steam if the engine runs too slow and decreasing it if it runs too fast. The Greek word for governor is kybernetes, from which we get cybernetics. The kybernetes was the steersman on a ship, so cybernetics is about steering. And so is government.
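The steering idea can be sketched as a toy discrete-time feedback loop: nudge the steam supply against the error between target and actual speed. The gain and numbers below are invented for illustration:

```python
# A minimal sketch of governor-style negative feedback: each step,
# the correction is proportional to the error between target and
# actual speed (open the valve if slow, close it if fast).
def steer(speed, target, gain=0.5, steps=50):
    """Return the speed after `steps` rounds of proportional correction."""
    for _ in range(steps):
        error = target - speed      # positive when running too slow
        speed += gain * error       # nudge the speed towards the target
    return speed

final = steer(speed=100.0, target=120.0)   # settles very close to 120
```

The point of the sketch is that nothing "decides" the final speed from above: stability emerges from the continual circular comparison of what is happening with what is intended, which is exactly the sense of government at stake here.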

Stafford Beer is the cybernetic thinker who considered the problems of government (and its related problem, management) in most detail. I have thought about the Viable System Model for many years, and the Cybersyn experiment in Chile of 1971-73 remains the most significant attempt to rethink government (apart from some promising experiments in the Soviet Union which never properly got off the ground).

There is a fundamental problem that the VSM addresses: the problem of attenuating descriptions of the world. In hierarchical power structures - governments, or the bosses of universities, hospitals or any institution for that matter - the "top" relies on filters to give it the most important information from the ground. This is where the pathology starts, because filtering entails removing most of the other descriptions which are not considered important. This is why the election opinion polls got it so wrong: they didn't listen to the variety of description that was out there. Technology has made the situation worse - it can filter more effectively than anything else - although this is a stupid way to use technology!

The VSM is a set of nested loops within which there is attenuation of description (there has to be), but at the same time the attenuated descriptions are organised into the production of a generative model whose engagements with the organisation (or country) being managed are continually monitored. The circular loop continually asks: "Are we right?", "In what ways are we wrong?", "What have we learnt about the world that we didn't know before?", "How should the model be changed?" In other words, there is attenuation, and there is amplification of the abstracted model, in a continual process of organic adaptation (Beer described his model using the metaphor of the human body). This is steering.

In theory this is fine, and the VSM is often used in management consultancy to help heal organisational pathology: I'm hosting a conference on this very topic in Liverpool in November.

But apart from Cybersyn, there has been no real-time empirical attempt to exploit this thinking in government or management. We should do it, because our existing models of government cannot deal with the obvious circular causality which is endemic in our world, from overseas wars and local terrorism to austerity and burning tower blocks. We have to have a practical way of dealing with circular causation, and I worry that Corbyn's Labour isn't prepared.

Beer's Cybersyn was a data-driven operation in a world where data was hard to come by (it was transmitted by Telex machines). Today we have data everywhere, but we don't know how to use it. Most approaches to "big data" seek to amplify automatic "filters" of complexity - this is basically what machine learning does. That's fine up to a point, but whatever filters are produced should be used to create a model which is then tested and improved. The human thinking about the rightness of the models doesn't appear to happen. Every "big data" result is an opportunity for humans to produce new descriptions of the world, and for these new descriptions to feed into higher-level steering processes. But it doesn't happen. Consequently, we allow "big data" to dictate how the world should become, without thinking about what we've missed.

One of the critical signs that any government or management should worry about is a decrease in the variety of description about something. This is usually the harbinger of catastrophe. Our Universities are heading straight for it: they are removing vast chunks of variety in the conversations and descriptions made within them as they close departments, sack staff, become fixated on metrics of academic performance which mean nothing, or chase government targets for "teaching excellence" in the hope of getting more money. Nobody is monitoring the richness of conversation in Universities. Yet the true strength of any university is the richness of the conversations it maintains.
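As a thought-experiment, the "variety of description" at stake here can be given a crude operational proxy: the Shannon entropy of the topics under discussion. The topic labels and counts below are invented for illustration:

```python
from collections import Counter
from math import log2

# Sketch: Shannon entropy as a crude proxy for "variety of description".
def description_variety(descriptions):
    """Shannon entropy (in bits) of a list of topic labels."""
    counts = Counter(descriptions)
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Six distinct topics, equally represented: maximum variety for six labels.
before = ["pedagogy", "physics", "art", "systems", "politics", "biology"]
# After restructuring: conversation collapses onto metrics and rankings.
after_cuts = ["metrics", "metrics", "metrics", "rankings"]
```

Here `description_variety(before)` is log2(6), about 2.58 bits, while the post-restructuring monoculture yields roughly 0.81 bits: a measurable collapse in variety, of exactly the kind nobody is monitoring.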

The same goes for a healthy society. The urgency of thinking about this was impressed upon me a couple of days ago when I received a text message from a bright and brilliant academic and friend in my old institution (one of only a few in that awful place). It's a dismal reminder of how much trouble we are in: "I've just been told I'm being made redundant". So that's another conversation killed.