Saturday, 24 June 2017

Government as Steering: Cybernetics and the Coming Labour Government

The joy surrounding Jeremy Corbyn's success in the election masks a need to do some very difficult work if a left-wing Labour government is going to deliver on the promise to transform society. There is muddle-headedness about the practicalities of government: the way events can overtake good intentions (no politician would have wanted a Grenfell on their watch), and the sheer challenge of holding together a political machine which always seems hell-bent on self-destruction (all political parties seem to have this tendency).

Now is a golden opportunity to do this work. Corbyn has the luxury of opposition, his grip on the party has been strengthened, and public expectation of a Corbyn victory (unthinkable before the election) has shifted significantly. These are real achievements.

Labour, and Corbyn, have got here because the Tories don't know how to govern. They see the world in a linear and hierarchical way, where simple "strong and stable" solutions can solve intractable problems. When things don't work out the way they wished (like the deficit coming down), the Tories tend to carry on regardless: strong and stable. This isn't government. It is ideological extremism.

"Government" and "governor" come from the same latin root: Gubernator. The Watt governor is the simplest idea of governing:
















The Watt governor 'steers' the engine, increasing the flow of steam if the engine runs too slow and decreasing it if it runs too fast. The Greek word for governor is kybernetes, from which we get cybernetics. The kybernetes was the steersman of the ship, so cybernetics is about steering. And so it is in government.
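
As a toy illustration of that steering (a sketch only - the constants below are made up and have nothing to do with real engine physics), the governor can be written in a few lines of Python:

```python
TARGET = 100.0      # desired engine speed (illustrative units)
GAIN = 0.002        # how strongly the governor corrects the error

def governor(speed, valve):
    """Open the steam valve a little if the engine runs too slow, close it a little if too fast."""
    valve = valve + GAIN * (TARGET - speed)
    return max(0.0, min(1.0, valve))      # a real valve has limits

def engine(speed, valve):
    """Crude engine model: speed decays under load, steam pushes it back up."""
    return 0.5 * speed + 60.0 * valve

speed, valve = 40.0, 0.2
for _ in range(60):
    valve = governor(speed, valve)   # steering: adjust the flow of steam
    speed = engine(speed, valve)     # the engine responds

print(round(speed, 1))   # settles close to TARGET instead of running away
```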


Stafford Beer is the cybernetic thinker who considers the problems of government (and its related problem, management) in most detail. I have thought about the Viable System Model (see https://en.wikipedia.org/wiki/Viable_system_model) for many years, and the Cybersyn experiment in Chile of 1971-3 (see https://en.wikipedia.org/wiki/Project_Cybersyn) remains the most significant attempt to rethink government (apart from some promising experiments in the Soviet Union which didn't get off the ground properly - see http://dailyimprovisation.blogspot.co.uk/2014/11/social-ecology-and-soviet-cybernetics.html).

There is a fundamental problem that the VSM addresses: the problem of attenuating descriptions of the world. In hierarchical power structures - governments, or the bosses of universities, hospitals, or any institution for that matter - the "top" relies on filters to give them the most important information from the ground. This is where the pathology starts, because the filter entails removing most of the other descriptions which are not considered important. This is why the election opinion polls got it so wrong - they didn't listen to the variety of description that was out there. Technology has made the situation worse - it can filter more effectively than anything else - although this is a stupid way to use technology!

The VSM is a set of nested loops within which there is attenuation of description (there has to be), but at the same time the attenuated descriptions are organised into the production of a generative model whose engagements with the organisation (or country) being managed are continually monitored. The circular loop continually asks "Are we right?", "In what ways are we wrong?", "What have we learnt about the world that we didn't know before?", "How should the model be changed?". In other words, there is attenuation, and there is amplification of the abstracted model, in a continual process of organic adaptation (Beer described his model using the metaphor of the human body). This is steering.
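
A crude sketch of the shape of that loop, in Python. None of this is Beer's notation - it is just an illustration, with made-up numbers, of attenuation feeding a model which is continually revised against what comes back from the ground:

```python
import random
random.seed(1)

def attenuate(descriptions):
    """The filter: reduce many descriptions from the ground to a single summary."""
    return sum(descriptions) / len(descriptions)

def revise(model, observed, rate=0.3):
    """Ask 'in what ways are we wrong?' and adjust the model accordingly."""
    return model + rate * (observed - model)

model = 0.0                                               # the centre's picture of the world
for month in range(24):
    ground = [random.gauss(50, 10) for _ in range(200)]   # the variety 'out there'
    summary = attenuate(ground)                           # attenuation: most of the variety is lost
    model = revise(model, summary)                        # the abstracted model is questioned and revised

print(round(model, 1))   # the model tracks the world only because it keeps being questioned
```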

In theory, this is fine, and the VSM is often used in management consultancy to help heal organisational pathology: I'm hosting a conference in November at Liverpool on this very topic: http://healingorganisations2017.org.

But apart from Cybersyn, there has been no real-time empirical attempt to exploit this thinking in government or management. We should do it, because our existing models of government cannot deal with the obvious circular causality which is endemic in our world, from overseas wars and local terrorism to austerity and burning tower blocks. We have to have a practical way of dealing with circular causation, and I worry that Corbyn's Labour isn't prepared.

Beer's Cybersyn was a data-driven operation in a world where data was hard to come by (they transmitted it with Telex machines). Today, we have data everywhere - but we don't know how to use it. Most approaches to "big data" seek to amplify automatic "filters" of complexity - this is basically what machine learning does. That's fine up to a point, but whatever filters are produced are used to create a model, and that model must then be tested and improved. The human thinking about the rightness of the models doesn't appear to happen. Every "big data" result is an opportunity for humans to produce new descriptions of the world, and for those new descriptions to feed into higher-level steering processes. But it doesn't happen. Consequently, we allow "big data" to dictate how the world should become without thinking about what we've missed.

One of the critical signs that any government or management should worry about is a decrease in the variety of description about something. This is usually the harbinger of catastrophe. Our Universities are heading straight for this, because they are removing vast chunks of variety in the conversations and descriptions which are made within them as they close departments, sack staff, become fixated on metrics of academic performance which mean nothing, or chase government targets for "teaching excellence" in the hope of getting more money. Nobody is monitoring the richness of conversation in Universities. Yet, the true strength of any university is the richness of the conversations which it maintains.

The same goes for a healthy society. The urgency of thinking about this was impressed upon me a couple of days ago when I received a text message from a bright and brilliant academic and friend in my old institution (one of only a few in that awful place). It's a dismal reminder of how much trouble we are in: "I've just been told I'm being made redundant". So that's another conversation killed.


Monday, 19 June 2017

Technology, Forms and the Loss of Description

When rich descriptions are difficult to bear, methods of attenuating description become attractive. They restrict the mode of expression to that which is permitted by whatever medium is devised for conveying 'standard' messages. We have become so used to this that we barely even notice it. Paul Fussell identified, in "The Great War and Modern Memory", that the means by which descriptions are attenuated emerged from the most brutal and traumatic of events, where it was barely possible to articulate how people felt. Before the First World War, there were no "forms" to fill in.

The military authorities did their best to ensure that richer descriptions of the soldier's experiences were not conveyed home, lest it lead to unrest or loss of morale. Fussell describes a letter sent by a young boy in a platoon which went:

Dear Mum and Dad, and dear loving sisters Rosie, Letty, and our Gladys, -
I am very pleased to write you another welcome letter as this leaves me. Dear Mum and Dad and loving sisters, I hope you keeps the home fires burning. Not arf. The boys are in the pink. Not arf. Dear Loving sisters Rosie, Letty, and our Gladys, keep merry and bright. Not arf. 

Today our whole lives are ruled by forms, and even the scope for protesting the restrictions of the medium is curtailed. The best one can do is not fill it in. Such 'data gathering' processes have become part of normal life. We even conduct social research like this.

Fussell describes the "Form A. 2042" shown above. The Field Service Post Card was sent 
with everything crossed out except "I am quite well" - immediately after a battle which relatives might suspect their soldiers had been in. Such were the hazards of occupying newly blown mine-craters that, according to George Coppard, "Before starting a twelve-hour shift in a crater, each man had to complete a field postcard for his next of kin, leaving the terse message "I am quite well" undeleted."
Soldiers found ways of using the medium to convey messages that the cards were not meant to convey. Fussell notes:
the implicit optimism of the post card is worth noting - the way it offers no provision for transmitting news like "I have lost my left leg" or "I have been admitted into hospital wounded and do not expect to recover". Because it provided no way of saying "I am going up the line again" its users had to improvise. Wilfred Owen had an understanding with his mother that when he used a double line to cross out "I am being sent down to base" he meant he was at the front again. (Fussell, "The Great War and Modern Memory", p. 185)
Fussell claims that the Field Service Post Card is the first "form": "It is the progenitor of all modern forms on which you fill in things or cross out things or check off things, from police traffic summonses to "questionnaires" and income-tax blanks. When the Field Service Post Card was devised, the novelty of its brassy self-sufficiency, as well as its implications about the uniform identity of human creatures, amused the sophisticated and the gentle alike, and they delighted to parody it..."

Today we have video, which has, in many ways, levelled the playing field of testimony: one does not have to be a great poet or writer to convey the complex reality of a situation - anyone can do it. Yet the form remains. How could one summarise the complexity were it not for the tick-boxes?

There is a better answer to this question than tick boxes. The form amplifies a particular set of descriptions as a series of choices. Whatever actual descriptions might be made by individuals, these somehow have to fit the provided descriptions. The interpretation of the fit to the provided descriptions adds a further layer of attenuation.

Institutions and governments fail because they fail to listen to the rich variety of descriptions made within the organisations they oversee. Instead, they collect "data" which they attenuate into "preferred descriptions", and implement policy according to their conclusions. Crisis emerges when the effects of policy are the production of more descriptions which are also ignored. 

Tuesday, 13 June 2017

Open Educational Resources and Book Printing Machines

"Being open" has been a major theme in educational technology for many years. It goes to the heart of why many have been drawn to education technology in the first place: "let's transform education, make it available to all, liberate ourselves from constraints", and so on. There is an associated economic narrative which speaks of "re-use" and highlights the apparent ridiculousness in the redundancy of so much content - why have 100 videos about the eye when one would do?

The opportunity of technology is always to present people with new options for acting: blogging presents new options for publishing, for example. In effect, new options for acting are new ways of overcoming existing constraints. When looking at any innovation, it is useful to examine the new options it provides, and the constraints it overcomes. Sometimes new technologies introduce new constraints.

What new options does OER provide? What constraints does it overcome?

These are not easy questions to answer - and perhaps because of this, there is much confusion about OER. However, these are important questions to ask, and by exploring them more fully, some insight can be gained into how OER might be transformative.

Enormous amounts of money have been spent on repositories of stuff presented as Lego bricks from which teachers could assemble their teaching. Remember learning objects? Remember widgets? Remember JORUM? The rationale behind much of this was that educational content could be assembled by teachers and incorporated as ready-made chunks of knowledge into new courses. So the constraint was the labour of teachers? Or the cost of resources? OER to the rescue!?

But actually none of this addressed the deep constraint: the course. Who cares about courses? Well, universities do... Courses = Assessment = Money.

Of course, away from courses, there are Open Educational Resources on YouTube, Facebook, Twitter, Wikipedia, Stackoverflow, Listservs, blogs, wikis, and various other specialised disciplinary forums. Moreover, the tools for searching and retrieving this stuff have got better and better. Email histories are now a major resource of information thanks to Google's vast storage and the capabilities of its search tools. All of these things have circumvented the constraint of the course.

Universities care about courses. Open Educational Resources cut the costs of setting courses up. And of course, the skill requirements of the teacher might be seen to be lowered to those of a curator, where the cost-saving implications are attractive to university managers: we don't need teachers - we can get this stuff for free, and pay cheap adjuncts to deliver it! So the constraints are financial and organisational.

But... nobody really understands what happens in teaching and learning. Whilst there are ways in which a video on YouTube might be said to "teach", generally teaching happens within courses. But what does the teacher do?

What happens in teaching and learning is conversation. Ideally, in that conversation, teachers reveal their understanding of something, and learners expose their curiosity. This can happen away from formal courses - most notably on email listservs, where (perhaps) somebody posts a video or a paper, and then people discuss it, but it is something that clearly is meant to happen in formal education.

"Revealing understanding" of something is not the same as presenting somebody with ready-made resources and activities (although someone can reveal their understanding of a subject in a video or a book - or indeed, a blog post!). Teachers have always used textbooks, but conversations usually revolve around a negotiation of the teacher's understanding of the textbook. Most textbooks are sufficiently rich in their content to throw up interesting questions for discussion.

Ready-made resources represent someone else's understanding. They can sometimes present an unwelcome extra barrier for the teacher: the teacher is trying to reveal their understanding of the subject, but is caught trying to reveal their understanding of somebody else's understanding instead.

Teachers produce resources to help them articulate their understanding. Some very experienced teachers may even write books about their understanding of a subject. When resources are publishable at this level, things get interesting and a new set of constraints emerges. The big constraint is the publishers.

Let's say a teacher writes a book. They send it to a publisher and sign away their rights to it. In signing away their rights to the content, they are restricted in what they might do with that content in future. The book might be very expensive, so the people the teacher wants to read it cannot afford it. There may be chunks of text which they might want to extract and republish for a different audience. They can't do it.

I think this is about to change. One of the exciting developments in recent years has been print-on-demand self-publishing. Alongside this, professional typesetting has come within easy reach of anyone: LaTeX-driven tools like Overleaf (http://overleaf.com) make a once-esoteric skill accessible to all. And book printing machines like the Espresso Book Machine (marketed with Xerox) are among the most powerful exemplars of 3D printing:



Why will academics exploit this? Because, whilst publishing with a respectable publishing house is often seen as a 'status marker', it also constrains the freedom of the academic to manage their own resources and engagement with their academic community.

A self-published open book can exist on GitHub as a LaTeX file, which an academic can fork, republish, reframe, etc. And why not allow others to do the same? And if copies can be printed for very little, why not do your own print run and distribute your book to your academic community yourself? For all teachers, and for all academics, the point of the exercise is conversation.

More importantly, with the production of high quality resources that can be exploited in different ways, the teacher is able to express their understanding of not just one but potentially many subjects. What is the difference between a book on methodology in education research and a book on methodology in health research? Might not the same person have something to say about both? Why shouldn't the resources or the book produced for one be exploited to do the other?

Saturday, 10 June 2017

Albert Atkin on Peirce

I have always been a little bit reticent about Peirce's semiotics. It's become another kind of theoretical 'coat-hanger' on which media theorists, communication scholars, educationalists, information scholars, musicologists, and much postmodern theory have draped 'explanations' which, it seems to me, don't explain very much. My suspicion, as with many social and psychological theories, is that the clergy are a pale imitation of the high-priests. It's the same story with James Gibson and affordance theory. And whilst believing that there's much more to Peirce than meets the eye of someone surveying this academic noise, I haven't yet found a way into it. Until now.

I'm reading Albert Atkin's recent book on Peirce. He articulates exactly how I feel about the sign theory when he points out, first of all, that philosophy has largely ignored it - partly due to unreflective criticisms by analytic philosophers (most notably Quine) - whereas:

"Interest is much livelier outside of philosophy, but a similar problem lurks nearby. One finds interest in and mention of Peirce's sign theories in such wide-ranging disciplines as art history, literary theory, psychology and linguistics. There are even entire disciplinary approaches and sub-fields - semiotics, bio-semiotics, cognitive semiotics - which rest squarely on Peirce's work. Whilst this greater appreciation of Peirce's semiotic marks a happier state of affiars than that which we find in philosophy, there is still a worry that, as the leading scholar of Peirce's sign theory, T.L. Short, puts it, 'Peirce's semiotics has gotten in amongst the wrong  crowd'. Short's complaint may be a little hyperbolic, but his concern is well founded considering the piecemeal and selective use of Peirce's ideas in certain areas. From a cursory reading of much work in these areas, one might think Perice had only ever identified his early tripartite division of signs into icons, indexes and symbols." (Atkin, "Peirce", p126)
Peirce's biography, which Atkin covers elegantly, is extremely important in understanding how Peirce's logic, mathematics, sign theory and metaphysics fit together. A combination of intellectual isolation - he lost his university position in 1884 and never gained another one - and a unique inheritance from his mathematician father Benjamin Peirce, together with a powerful intellectual life in the family home, set the scene for a radical redescription of logic, mathematics, cognition and science. The simple fact is that the extent to which this redescription is truly radical remains underappreciated - not helped by noisy dismissals by the academic establishment, not only of Peirce himself but also of some of the foundational work which Peirce built on (he gained his interest in Hamilton's quaternions from his father; Hamilton's work too suffered some careless dismissals).

If people think they know Peirce, or they know the semiotics, they should think again. I strongly suspect the time for this true original is yet to come. 

Tuesday, 6 June 2017

Distinctions

This is gradually coming together... It helps me to post on here - it's a multiple description!

Monday, 5 June 2017

Peirce on Quaternions

Had it not been for my discussions with Peter Rowlands at Liverpool University, I wouldn't know what a quaternion was. That I took it seriously was because it plays a vital role in Rowlands's physical theory which unites quantum and classical mechanics, and my interest in this has evolved through a desire to tackle the nonsense that is talked about in the social sciences about sociomateriality, entanglement, etc. But then there is another coincidence (actually, I'm more convinced there is no such thing - these are aspects of some kind of cosmic symmetry). I got to know Rowlands because he is a friend of Lou Kauffman, who has been one of the champions of Spencer-Brown's Laws of Form.

One of the precursors to Spencer-Brown's visual calculus is contained in the existential graphs of Charles Sanders Peirce. So on Saturday, I went looking in the collected writings of Peirce for more detail on his existential graphs. Then I stumbled on a table of what looked like the kind of quaternion matrix which dominates Rowlands's work. Sure enough, a quick check in the index confirmed that Peirce's work is full of quaternions - and this is a very neglected aspect of his work.

To be honest, I've never been entirely satisfied with the semiotics. But the mathematical foundation makes the semiotics make sense. It situates the semiotics as a kind of non-commutative algebra (i.e. quaternion algebra) - and in fact what Peirce does is intellectually very similar to what Rowlands does. It means that Peirce's triads are more than a kind of convention or convenience: the three dimensions are precisely the kind of rotational non-commutative symmetry that was described by Hamilton. I'm really excited about this!

So here's Peirce on the "Logic of Quantity" in the collected papers (vol. IV), p110:

The idea of multiplication has been widely generalized by mathematicians in the interest of the science of quantity itself. In quaternions, and more generally in all linear associative algebra, which is the same as the theory of matrices, it is not commutative. The general idea, which is found in all of these is that the product of two units is the pair of units regarded as a new unit. Now there are two senses in which  a "pair" may be understood, according as BA is, or is not, regarded as the same as AB. Ordinary arithmetic makes them the same. Hence, 2 x 3 of the pairs consisting of one unit of a set of 2, say I, J, and another unit of a set of 3, say X,Y,Z the pairs IX, IY, IZ, JX, JY, JZ, are the same as the pairs formed by taking a unit of the set of 3 first, followed by a unit of the set of 2. So when we say that the area of a rectangle is equal to its length multiplied by its breadth, we mean that the area consists of all the units derived from coupling a unit of length with a unit of breadth. But in the multiplication of matrices, each unit in the Pth row and Qth column, which I write P:Q of the multiplier coupled with a unit in the Qth row and Rth column, or Q:R gives:
      (P:Q)(Q:R) = P:R
or a unit of the Pth row and Rth column of the multiplicand. If their order be reversed,
      (Q:R)(P:Q) = 0
unless it happens that R = P.
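
Peirce's matrix units are easy to check. Here is a small numpy sketch (my own illustration, not Peirce's notation), with P, Q and R as row and column indices:

```python
import numpy as np

def unit(row, col, n=3):
    """The matrix unit 'row:col' - a 1 in the given row and column, 0 everywhere else."""
    m = np.zeros((n, n))
    m[row, col] = 1.0
    return m

P, Q, R = 0, 1, 2
PQ, QR = unit(P, Q), unit(Q, R)

print(np.array_equal(PQ @ QR, unit(P, R)))   # (P:Q)(Q:R) = P:R  -> True
print(np.count_nonzero(QR @ PQ))             # (Q:R)(P:Q) = 0, since R != P  -> 0
```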

Monday, 29 May 2017

Symmetry, Learning and Sociomateriality

A number of ideas are bombarding me at the moment. The best thing that has happened to me at Liverpool University is the encounter with the theoretical physics of Peter Rowlands. This is what universities are really about: creating the possibility of encounter with radically new ideas. Peter is interested in reconfiguring the relation between classical and quantum mechanics. At the root of his approach are three simple concepts: the 'nilpotent' - a quantity which, when raised to a power, produces zero; quaternions - Hamilton's numbers with three imaginary units, which have peculiar properties; and, most importantly, symmetry.

In grappling with this (and I am still grappling with it... this blog is part of the process!) both symmetry and the nilpotent resonate with me in my thinking about cybernetics, learning, music and emotional life. The nilpotent puts the focus on nothing. The universe is about nothing. Now compare this to the importance of absence in  the work of Terry Deacon, or Bhaskar, or Lawson, or the apophatic in the ecological work of Ulanowicz. There is also the category theoretical work of Badiou who places particular emphasis on nothing in his graphs. Is absence nothing? In the sense that absenting is about something "not there"... zero is clearly not there. From Newton's third law (Rowlands has published a number of books about Newton), an obvious point to make is that the resultant force in the Universe is zero. More importantly, the somethings that we see in the universe are the product of constraints of things which we can't see (dark matter/energy). There is a nothingness about dark matter. But there is also nothing in the resultant totality of what we can see and what we can't. The real question is how something emerges from nothing.

In cybernetics, the concept of constraint takes the place of absence - although they are considered to be the same thing (Deacon, Ulanowicz and Lawson are all in agreement about this). The nilpotent idea seems to be mirrored in the tautology of Ashby's Law: a complex system can only be controlled by a system of equal or greater complexity. Something emerges from nothing through the fact that, at any particular level, systems are unstable: the complexity of system A and system B might be greatly different, necessitating systems at higher levels to participate in balancing the variety. This is a dynamic process. In terms of understanding it, this is a process that relies on broken symmetry.

Having a fundamental way of describing symmetry-breaking is something which mathematics struggles with. Perhaps the closest we get is in fractal geometry, or in the dynamics of Conway's game of life. But these are the result of heuristics and recursive functions rather than fundamental mathematical properties. The quaternions present a way of thinking about broken symmetry which is fundamental. i, j and k are all square roots of -1, so ii = jj = kk = -1. But ij is not equal to ji. This anti-commutative property gives quaternions the potential to articulate complex matrix structures which have a kind of 'handed-ness'. This abstract property becomes useful to describe the apparent handedness that we see in the universe, from subatomic particles to DNA to the Fibonacci structures in biology.
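
The anti-commutativity is easy to verify with one standard representation of the quaternion units as 2x2 complex matrices (a sketch for checking the algebra, not Rowlands's formalism):

```python
import numpy as np

# One standard 2x2 complex-matrix representation of the quaternion units.
one = np.eye(2, dtype=complex)
i = np.array([[1j, 0], [0, -1j]])
j = np.array([[0, 1], [-1, 0]], dtype=complex)
k = np.array([[0, 1j], [1j, 0]])

print(np.allclose(i @ i, -one), np.allclose(j @ j, -one), np.allclose(k @ k, -one))   # ii = jj = kk = -1
print(np.allclose(i @ j, k), np.allclose(j @ i, -k))                                  # ij = k, but ji = -k: the handedness
```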

What's fascinating about this is that nilpotency and broken symmetry combined have remarkable generative properties. For Rowlands, one of the key things is the bridging of the gap between classical mechanics and quantum mechanics. In much social science writing (and in educational technology) it has become fashionable to cite quantum phenomena like "superposition" and "entanglement" as a way of articulating the complex 'sociomateriality' of social life. Many realists object to the woolly language. Scientists like Sokal object to the lack of understanding of physics - although some who promote sociomateriality, like Karen Barad, do so from a scientific perspective. Part of the problem lies within physics. Classical mechanics and quantum mechanics are generally considered (not just by Barad) to be different kinds of thing. Rowlands argues that they are the same kind of thing - and in fact, quantum mechanics can be seen as an entirely logical and consistent extension of classical mechanics. So Newton was more profoundly right about the universe than is widely accepted. Through Rowlands's ideas of broken symmetry, issues like superposition and entanglement emerge as logical consequences of the conservation of mass and energy, and the non-conservation of time and space.

So here's a tantalising question: could learning be seen as a product of broken symmetry and nilpotency? My first instinct with these kinds of questions is to ask them of something like learning but more objectively observable: music. Can music be the result of broken symmetry and nilpotency? There have been many studies of the Fibonacci sequence in music - notably in Bartok and Debussy. I strongly suspect the answer to this question is yes. So learning? What is the symmetry of understanding or thinking? Is there a way of answering this question? At a deep level, these are questions about information - and by taking them as such, methods can be devised for exploring them. The way Rowlands is able to explain the emergence of something from nothing immediately suggests a new approach to one of the fundamental questions in the theory of information - the "symbol grounding problem" (see https://en.wikipedia.org/wiki/Symbol_grounding_problem).

A nilpotent broken symmetry of learning would have to entail a nilpotent broken symmetry of education and other social structures. Might they be investigated in the same way? What about a nilpotent broken symmetry of politics? (Is that dialectic?) Are these too questions about information?

Yes...

Sunday, 21 May 2017

New technologies and Pathological Self-Organisation Dynamics

Because new communications technologies liberate individuals from the prevailing constraints of communication, it is often assumed that the new forces of self-organisation are beneficent. The historical evidence from earlier massive liberations of the means of communication tells a different story. Mechanisms of suppression and unforeseeable consequences of liberation - incitement to revolt, revolution, war and institutional disestablishment - followed the invention of printing; propaganda, anti-intellectualism, untrammelled commercialism and totalitarianism followed the telephone, cinema and TV; and the effects of the unforeseeable self-organising dynamics caused by the internet are only beginning to become apparent to us. It isn't just trolling, Trump and porn; it's the vulnerabilities that over-rationalised technical systems with no redundancy expose to malevolent forces seeking their collapse (which we saw in the NHS last week).

What are these dynamics?

It's an obvious point that the invention of a new means of communication - be it printing or Twitter - presents new options for making utterances. Social systems are established on the basis of shared expectations not only about the nature of utterances (their content and meaning) but also about the means by which they are made. The academic journal system, for example, relies on shared expectations of what an "academic" paper looks like, the process of review, citations, the community of scholars, etc. It has maintained these expectations supported by the institutional fabric of universities, which continues to fetishise the journal even when other media for intellectual debate and organisation become available. Journalism too relies on expectations of truth commensurate with the agency responsible for the journalism (some agencies are more trusted than others), and it again has resisted the new self-organising dynamics presented by individuals who make different selections of their communication media: Trump.

But what happens then?

The introduction of new means of communication is the introduction of new uncertainties into the system. It increases entropy across institutional structures. What then appears to happen is a frantic dash to "bring things back under control". That is, reduce the entropy by quickly establishing new norms of practice.

Mark Carrigan spoke in some detail about this last week in a visit to my University. He criticised the increasing trend for universities to demand engagement with social media by academics as a criterion for "intellectual impact". What are the effects of this? The rich possibilities of the new media are attenuated to those few which amplify messages and "sell" intellectual positions. Carrigan rightly points out that this is to miss some of the really productive things that social media can do - not least in encouraging academics in the practice of keeping an open "commonplace book" (see http://dailyimprovisation.blogspot.co.uk/2011/07/commonplacing-and-blogging.html)

I'm wondering if there's a more general rule to be established relating to the increase in options for communicating, and its ensuing increase in uncertainty in communication. In the typical Shannon communication diagram (and indeed in Ashby's Law of Requisite Variety), there is no assumption that increasing the bandwidth of the channel affects either the sender or the receiver. The channel is there to illustrate the impact of noise on the communication, the things that the sender must do to counter noise, and the significance of having sufficient bandwidth to convey the complexity of the messages. Surplus bandwidth beyond what is necessary does not affect the sender.

But of course, it does. The communications sent from A to B are not just communications like the Twitter message "I am eating breakfast". They are also communications that "I am using Twitter". Indeed, the selection of the medium is also a selection of receiver (audience). This introduces a more complex dynamic which needs more than a blog post to unfold. But it means that as the means of communicating increase, so does the entropy of messages, and so do the levels of uncertainty in communicating systems.
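
A back-of-the-envelope sketch of that claim: if the selection of the medium is part of the message, then (on the crude assumption that every message-medium pairing is equally likely) each new medium adds to the entropy of the selection:

```python
from math import log2

def entropy(p):
    """Shannon entropy (in bits) of a probability distribution."""
    return -sum(x * log2(x) for x in p if x > 0)

messages = 8    # the kinds of thing one might say

for media in (1, 2, 4, 8):
    outcomes = messages * media         # every (message, medium) pairing counts as a selection
    p = [1.0 / outcomes] * outcomes     # crude assumption: all pairings equally likely
    print(media, "media:", round(entropy(p), 1), "bits")

# 1 medium: 3.0 bits; 8 media: 6.0 bits. Each new medium adds uncertainty
# about the selection - for the receiver as much as for the sender.
```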

This is what's blown up education, and it's what blew up the Catholic church in 1517. It's also what's enabled Trump's tweeting to move around conventional journalism and the political system as if they were the Maginot Line. As the levels of uncertainty increase, the self-organisation dynamics lead to a solidification (almost a balkanisation - particularly in the case of Trump) of message-medium entities which become impervious to existing techniques for establishing rational dialogue. Government, because it cannot understand what is happening, is powerless to intervene in these self-organising processes (though it should). Instead, it participates in the pathology.

We need a better theory and we need better analysis of what's happening.

Saturday, 13 May 2017

The Evaluation of Adaptive Comparative Judgement as an Information Theoretical Problem

Adaptive Comparative Judgement is an assessment technique which has fascinated me for a long time (see http://dailyimprovisation.blogspot.co.uk/2011/11/adaptive-comparative-judgement-and.html). Only recently, however, have I had the opportunity of trying it properly... and its application is not in education, but in medicine (higher education, for some reason, has been remarkably reluctant to experiment with alternatives to its traditional methods of assessment!).

ACJ is a technique of pair-wise assessment where individuals are asked to compare two examples of work, or (in my case) two medical scans. They are asked a simple question: Which is better? Which is more pathological? etc. The combination of many judges and many judgements produces a ranking from which a grading can be produced. ACJ inverts the traditional educational model of grading work to produce a ranking; it ranks work to produce a grading.

In medicine, ACJ has fascinated the doctors I am working with, but it also creates some confusion because it is so different from traditional pharmacological assessment. In the traditional assessment of the efficacy of drugs (for example), data is examined to see if the administration of the drug is an independent variable in the production of the patient getting better (the dependent variable). The efficacy of the drug is assessed against its administration to a wide variety of patients (whose individual differences are usually averaged out in the statistical evaluation). In other words, traditional clinical evaluation assumes a linear relation,
P(patient) + X(drug) = O(outcome)
where outcome and drug are shown to be correlated across a variety of patients (or cases).

ACJ is not linear, but circular. The outcome from ACJ is what is hoped to be a reliable ranking: that is, a ranking which accords with the  judgements of the best experts. But it is not ACJ which does this - it is not an independent variable. It is a technique for coordinating the judgements of many individuals. Technically, there is no need for more than one expert judge to produce a perfect ranking. But the burden of producing consistent expert rankings for any single judge (however good they are) will be too great, and consistency will suffer. ACJ works by enlisting many experts in making many judgements to reduce the burden on a single expert, and to coordinate differences between experts in a kind of automatic arbitration.
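
A minimal sketch of the idea in Python - a naive win-counting version with simulated noisy judges, not the adaptive algorithm a real ACJ system uses:

```python
import random
random.seed(0)

true_quality = {"scan_%d" % n: n for n in range(10)}   # a hidden 'right' ordering
scans = list(true_quality)

def judge(a, b, accuracy=0.9):
    """A judge picks the more pathological of two scans, and is sometimes wrong (noise)."""
    better = a if true_quality[a] > true_quality[b] else b
    worse = b if better is a else a
    return better if random.random() < accuracy else worse

wins = {s: 0 for s in scans}
for _ in range(600):                   # many judges x many judgements
    a, b = random.sample(scans, 2)
    wins[judge(a, b)] += 1             # the redundancy of repeated comparisons absorbs the noise

ranking = sorted(scans, key=wins.get, reverse=True)
print(ranking)   # approximates the hidden ordering without relying on any single expert
```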

Simply because it cannot be seen to be an independent variable does not mean that its efficacy cannot be evaluated. There are no independent variables in education - but we have a pretty good idea of what does and doesn't work.

What is happening in the ACJ process is that a ranking is communicated through the presentation of pairs of images to the collective judgements of those using the system. The process of communication occurs within a number of constraints:


  1. The ability of individual judges to make effective judgements
  2. The ease with which an individual judgement might be made (i.e. the degree of difference between the pairs)
  3. The quality of presentation of each case (if they are images, for example, the quality is important)

An individual's inability to make the right judgement amounts to the introduction of "noise" into the ranking process. With too much "noise" the ranking will be inaccurate.

The ease of making a judgement depends on the degree of difference, which in turn can be measured as the relative entropy between the two examples. If they are identical, the relative entropy will be zero. Equally, if the images are the same, the mutual information between them will be high, calculated as:
H(a) + H(b) - H(ab)
If the features of each item to be compared can be identified, and each of those features belongs to a set i, then the entropy of each case can be measured simply as a value of H across all the values of x in the set i:
H = -Σ p(x) log p(x), summed over all x in i

The ability to make distinctions between the different features will depend partly on the quality of images. This may introduce uncertainty in the identification of values of x in i.
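
As a sketch of the calculation (the feature labels below are entirely made up), the mutual information between two cases can be computed from frequency counts of their features:

```python
from math import log2
from collections import Counter

def H(counts):
    """Shannon entropy of a frequency table."""
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Feature labels observed at corresponding regions of two scans (made-up data).
a = ["lesion", "lesion", "haemorrhage", "normal", "normal", "normal"]
b = ["lesion", "lesion", "haemorrhage", "haemorrhage", "normal", "normal"]

Ha, Hb = H(Counter(a)), H(Counter(b))
Hab = H(Counter(zip(a, b)))      # joint entropy of the paired features
print(round(Ha + Hb - Hab, 3))   # mutual information: high when the two cases look alike
```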

What ACJ does is deal with issues 1 and 2. Issue 3 is more complex because it introduces uncertainty as to how features might be distinguished. ACJ deals with 1 and 2 in the same way as any information-theoretical approach deals with problems of transmission: it introduces redundancy.

That means that the number of comparisons each judge needs to make depends on the quality and consistency of the ranking which is produced. This can be measured by determining the distance between the ranking produced by the system and the ranking determined by experts. Ranking comparisons can be made for the system as a whole, or for each judge. Through this process, individual judges may be removed or others added. Equally, new images may be introduced whose ranking is known relative to the existing ranking.
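
One simple way of measuring that distance is to count the pairs which the two rankings order differently - essentially Kendall's tau, written out by hand here as a sketch:

```python
from itertools import combinations

def pairwise_disagreements(ranking, reference):
    """Count the pairs which the two rankings order differently (the idea behind Kendall's tau)."""
    position = {item: n for n, item in enumerate(reference)}
    return sum(1 for a, b in combinations(ranking, 2) if position[a] > position[b])

system_ranking = ["scan_3", "scan_1", "scan_2", "scan_4"]   # what the judges produced
expert_ranking = ["scan_1", "scan_2", "scan_3", "scan_4"]   # what the experts say
print(pairwise_disagreements(system_ranking, expert_ranking))   # 2 of the 6 pairs are swapped
```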

The evaluation of ACJ is a control problem, not a problem of identifying it as an independent variable. Fundamentally, if ACJ doesn't work, it will not be capable of producing a stable and consistent ranking - and this will be seen empirically. That means that the complexity of the judges performing ranking will not be as great as the complexity of the ranking which is input. The complexity of the input will depend on the number of features in each image, and the distance between each pair of images.

In training, we can reduce this complexity by having clear delineations of complexity between different images. This is the pedagogical approach. As the reliability of the trainee's judgements increase, so the complexity of the images can be increased.

In the clinical evaluation of ACJ, it is possible to produce a stabilised ranking by:

  1. removing noise by removing unreliable judges
  2. increasing redundancy by increasing the number of comparisons
  3. introducing new (more reliable) judges
  4. focusing judgements on particular areas of the ranking (so particular examples) where inconsistencies remain
As a control problem, what matters are the levers of control within the system. 

It's worth thinking about what this would mean in the broader educational context. What if ACJ was a standard method of assessment? What if the judgement by peers was itself open to judgement? In what ways might a system like this assess the stability and reliability of the rankings that arise? In what ways might it seek to identify "semantic noise"? In what ways might such a system adjust itself, manipulating its control levers, to produce reliability and to gradually improve the performance of those whose judgements might not be so good?

The really interesting thing is that everything in ACJ is a short transaction. But it is a transaction which is entirely flexible and not constrained by the absurd forces of timetables and cohorts of students.



Wednesday, 10 May 2017

The Managerial Destruction of Universities... but what do we do about it?

As I arrived at the University of Highlands and Islands for a conference on the "porous university", there was a picket line outside the college. Lecturers were striking over a deal, agreed with the Scottish Government to establish equal pay among teaching staff across Scotland, which had been reneged on by college management. The regional salary difference can be as much as £12,000, so this clearly matters to a lot of people. It was a good turnout for the picket line (always an indication of how much things matter) - similar to the one when the University of Bolton sacked their UCU rep and his wife, which made the national press (http://www.dailymail.co.uk/news/article-3013860/Lecturer-wife-sacked-failing-University-Bolton-blowing-whistle-100-000-jolly.html).

It is right to protest, and it is right to strike. But sadly, none of this seems to work very well. Bad management seems to be unassailable, and pay and conditions continually seem to get worse.

At UHI, the porous university event was an opportunity to take the temperature of the effects of over five years of managerial pathology in universities across the country. The collective existential cry of pain by the group was alarming. The optimism, hope, passion and faith which are the hallmark of any vocation, and were certainly the hallmark of most who worked in education, have evaporated. They have been replaced with fear and dejection. Of course, an outside observer might remark "well, you've still got jobs!" - but that's to miss the point. People might still be being paid (some of them), but something has been broken in the covenant between education and society which has destroyed the fabric of a core part of the personal identities of those who work in education. It's the same kind of breaking of covenant and breaking of spirit that might be associated with a once healthy marriage destroyed by a breakdown of trust: indeed, one of my former Bolton colleagues described the spirit of those working for the institution as being like "the victims of an abusive relationship".

Lots of people have written about this. Stefan Collini has just published his latest collection of essays on Universities, "Speaking of Universities", which I was reading on the way up to Scotland. It's beautifully written. But what good does it do?

In the perverse monetised world of universities, the writing and publishing (in a high ranking journal) of a critique of the education system is absorbed and rewarded by the monetised education system. In its own way, it's "impact" (something Collini is very critical of). Weirdly, those who peddle the critique inadvertently support the managerial game. The university neutralises and sanitises criticism of itself and parades it as evidence of its 'inclusivity' and the embrace of opposing views, all the time continuing to screw lecturers and students into the ground.

A good example of this is provided by the University of Bolton who have established what they call a "centre for opposition studies" (http://www.bolton.ac.uk/OppositionStudies/Home.aspx). There are no Molotov cocktails on the front page - but a picture of the house of commons. This is sanitised opposition - neutralised, harmless. The message is "Opposition is about controlled debate" rather than genuine anger and struggle. Fuck off! This isn't a progressive way forwards: it is the result of a cynical and endemic conservatism.

I wouldn't want to accuse Collini of conservatism in the same way - and yet the symptoms of conservatism are there, just as they exist in the kind of radical "history man" characters who pepper critical discourse. The main features of this?
  • A failure to grasp the potential of technology for changing the dimensions of the debate
  • A failure to reconcile deep scholarship with new possibilities for human organisation
  • A failure to suggest any constructive way of redesigning the system
If I were to be cynical, I would say that this is because, as Collini himself admits, the "comfortable chair in Cambridge" is a safe place from which to chuck bricks at the system. It does not really wish to disrupt things to the point that the chair becomes less comfortable.

The disruption and transformation of the system will not come from within it. It will come from outside. There's quite a cocktail brewing outside the institution. One of the highlights of the UHI conference was the presentation by Alex Dunedin of https://www.raggeduniversity.co.uk/. Alex's scholarly contribution was powerful, but he himself is an inspiration. He exemplifies insightful scholarship without having set a "formal" foot inside a university ever. His life has been a far richer tapestry of chaos and redemption than any professor I know. Meeting Alex, you realise that "knowledge is everywhere" really means something if you want to think. You might then be tempted to think "University is redundant". But that might be going too far. However, the corporate managerialist "nasty university" I think will not hold sway for ever. People like Alex burn far brighter. 

Another bright note: just look at our tools! The thing is, we have to use them differently and creatively. I did my bit for this effort. I suggested to one group I was chairing that instead of holding up their flipchart paper with incomprehensible scribbles on it, and talking quickly in a way that few take in, they pass a phone over the paper and make a video drawing attention to the different things on it. So paper became a video. And it's great!

Monday, 8 May 2017

Educational Technologists: Who are we? What is our discipline?

I am an educational technologist. What does that mean?

I think we are at a key moment in the history of technology in education and there is a radical choice facing us.

We can either:

  • Use technology to uphold and reinforce the traditional distinctions of the institution. This means VLEs, MOOCs, Turnitin, etc. This enslaves individual brains and isolates them; 
The consequences of this are well summarised by Ivan Illich:
"Observations of the sickening effect of programmed environments show that people in them become indolent, impotent, narcissistic and apolitical. The political process breaks down, because people cease to be able to govern themselves; they demand to be managed."
The alternative?
  • We use technology to organise multiple human brains in institutions and outside them so that many brains think as one brain.

To do the latter, we need to think about what our discipline really is. I am going to argue that our discipline is one that crosses boundaries: it is the discipline of study into how distinctions are made, and what they do.

For many academics, the educational technologist looks after the VLE or does cool videos on MOOCs. They are also the person academics seek help from when the techno-administrative burden of modern universities becomes overwhelming: how do I submit my marks, get my students on this course, etc. For some academics, the educational technologist is a kind of secretary - the equivalent of the secretary who would have done the academic's typing in the 1970s, when typing was not considered to be an academic activity. Some academics blame the educational technologist for the overwhelming techno-administrative nightmare that constitutes so much of academic life today.

Certainly there is a boundary between the academy and the educational technologist. Like all boundaries, it has two sides. On the one hand, the academy pushes back on the technologists: it generally treats them with suspicion (if not disdain) - partly because it (rightly) sees a threat to its current practices in the technology. The educational technologists have tried to push back on the academy to get it to change, embrace open practice, realise the potential of the technology, etc. Right now, the academy is winning and educational technologists are rather despondent, reduced to producing "learning content" in packages like "storyline" which often reproduce work which already exists on YouTube.

This situation has partly arisen because of a lack of identity among learning technologists. In trying to ape the academy, they established themselves in groups like ALT or AACE as a "discipline". What discipline? What do they read? In reality, there is not much reading going on. There is quite a lot of writing and citing... but (I'll upset a few people here) most of this stuff is unreadable and confused (I include my own papers in this). In the defence of the educational technologist, this is partly because what they are really trying to talk about is so very difficult.

I believe we should admit our confusion and start from there. Then we realise that what we are doing is making distinctions. We make distinctions about learning like this:

or we might make cybernetic distinctions like this:
What is this? There are lines and boxes (or circles). 

What are the lines and boxes doing?

What are the lines around the boxes doing? (these are the most interesting)

Scientific communication is about coordinating distinctions. In coordinating distinctions, we also coordinate expectations. The academy, in its various disciplines, upholds its distinctions. However, as physicist David Bohm realised, scientists don't really communicate.


For Bohm, dialogue is a way of exploring the different distinctions we make. The demand of Bohmian scientific dialogue is to continually recalibrate distinctions in the light of the distinctions of others. More importantly, it is to embrace uncertainty as the operating scientific principle.

Scientific Dialogue is about communicating uncertainty.

If we are recalibrating, then we are continually drawing and redrawing our boundaries. But this process is controlled by more fundamental organising principles which underlie the processes of a viable organism. It's perhaps a bit like this:


Here we see the death of boundaries, and the reorganisation of the organism. Much of what goes on here remains a mystery to biologists. Some are exploring the frontiers, however. Deep down, it seems to be about information...
or ecology:

Information, semiotics, ecology all concern the making of distinctions. There are a variety of mathematical approaches which underpin this. In fact, Charles Peirce, founder of semiotics, was also the founder of a mathematical approach to studying distinctions. This is Peirce's attempt to fathom out a logic of distinctions:





And this is the very closely-related work of Cybernetician George Spencer-Brown:


When we talk about education, or technology, or biology... or anything... we are making distinctions.

A distinction has an inside and an outside. We usually forget the outside because we want to communicate the inside. We only know the outside of the distinction by listening.

It is the same in physics - particularly Quantum physics. 


And this is becoming important for the future of computing. Quantum computers are programmed using a kind of "musical score" - like this from the IBM Quantum Experience computer:


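
The same kind of 'score' can be written as code. Here is a minimal sketch using Qiskit, IBM's Python toolkit behind the Quantum Experience (the library is an assumption on my part - the graphical composer draws the equivalent circuit):

```python
# A minimal two-qubit 'score': put one qubit into superposition, entangle the other, measure both.
# Assumes the Qiskit library is installed (pip install qiskit).
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)
qc.h(0)                      # superposition on qubit 0
qc.cx(0, 1)                  # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])
print(qc.draw())             # prints the circuit as a text 'score'
```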

So what does all this mean?

Well, it means that science has to embrace uncertainty as an operating principle. Yet science in the academy is still tied to traditional ways of communicating. The academic paper does not communicate uncertainty.

To communicate uncertainty, we need to listen to the outside of our distinctions.

Our scientific institutions need to reconfigure their practices so that the distinction between education and society is realigned to progress society's scientific knowledge.

It is not to say that distinction between education and society should be removed. But that a discipline of examining ecologies of distinctions is essential for a new science of uncertainty to prosper.

It also means that new media should be deployed to communicate uncertainty and understanding on a much wider basis than can be achieved with academic papers. Where we have struggled is in being able to listen to large numbers of people in a coherent way.

This is one way in which we might do this...

It involves doctors and learners using Adaptive Comparative Judgement tools as a means of making diagnoses of retinal scans in examining Diabetic Retinopathy. Adaptive Comparative Judgement is a technique of getting lots of people to make simple comparisons in order to arrive at a ranking of those scans, with the most pathological at one end, and the normal at the other. In addition to this, there is a simple way in which learners can be trained to do this themselves:

Other technological means of getting many brains to act as one brain include Bitcoin and the blockchain that sits behind it...

The MIT Digital Certificates project is exploring ways in which a blockchain might decentralise education...



What about the distinctions between education and society? How might they be better managed?

What about the distinctions between critique and functionalism and phenomenology in education?

Well, the critique only exists because it has something to push against. The thing it pushed against exists partly because of the existence of the critique (and indeed, it embraces the critique!)... We have a knot.

We should understand how it works....



Saturday, 6 May 2017

@siobhandavies and Double Description at @WhitworthArt ... and reflections on Music and Education

Living around the corner from the Whitworth Art gallery means that I often make serendipitous discoveries. I popped into the gallery on my way into the city centre this morning and found Siobhan Davies and Helka Kaski doing this as part of their work "Material / Rearranged / to / be" - a dance work inspired by photographs from the Warburg Institute collection:



There's something very cybernetic about what they are doing - indeed, the whole installation's emphasis on action and reflection is very similar to the theme of the American Society for Cybernetics conference in 2013 (see https://www.youtube.com/watch?v=bjGcrEl0fJg). This is rather better than we managed in Bolton!

If the cybernetician Gregory Bateson wasn't the first thinker to have considered the importance of 'multiple descriptions of the world' - particularly in the distinction between connotation and denotation - he certainly thought more analytically about it than anyone else. We live with multiple descriptions of the same thing. In cybernetic, information-theoretic terms, we are immersed in redundancy. Why does Siobhan Davies have two dancers mimicking each other? Because the dual presentation is more powerful - perhaps (and this is tricky) more real - than the single description.

In a world of austerity, what gets stripped away is redundancy. We streamline, introduce efficiencies, 'de-layer' (a horrible phrase that was used to sack a load of people in my former university), get rid of the dead wood (blind to the fact that the really dead wood is usually making the decisions!). The arts are fundamentally about generating multiple descriptions - redundancies. It's hardly surprising that governments see them as surplus to requirements under austerity.  But it spells a slow death of civilisation.

Warren McCulloch - one of the founders of cybernetics and, with Walter Pitts, the originator of the first formal model of neural networks - took a particular interest in naval history as well as brains. He was fascinated by how Nelson organised his navy. Of course, there were the flag signals from ship to ship. But what if it was foggy? Nelson ensured that each captain of each ship was trained to act on their own initiative, understanding the heuristics of effective self-organisation even if they couldn't communicate with other ships. McCulloch called this Redundancy of Potential Command, pointing out that the ultra-plastic brain appeared to work on the same principles. This was not command and control - it was generating sufficient redundancy to facilitate the emergence of effective self-organisation. In effect, Nelson organised the many brains of his naval captains to act as one brain.

That's what Davies does here: two brains act as one brain.

This also happens in music... but it hardly ever happens in education. In education, each brain is examined as if it is separate from every other brain. The stupidity of this is becoming more and more apparent, and the desperate attempts of the education system to scale up to meet the needs of society stretch its traditional ways of operating to breaking point. Yet it doesn't have to be like this.

In a project with the China Medical Association at Liverpool University, we are exploring how technologies might facilitate the making of collective judgements about medical conditions. Using an assessment technology called "Adaptive Comparative Judgement", each brain is asked to make simple comparisons like "which of these scans displays a condition in more urgent need of treatment?". With enough people making the judgements, and each person making enough judgements, many brains act as one brain in producing a ranking of the scans which can then be used to prioritise treatment. In practice, it feels like a kind of orchestration. It is the most intelligent use of technology in education I have ever been involved with.
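To give a flavour of how a ranking can emerge from nothing but pairwise judgements, here is a minimal sketch in Python. This is not the code we use in the project - the scan names are made up, and real ACJ systems pair items adaptively and fit a Rasch-type model - but a simple Bradley-Terry estimate like this captures the underlying idea:

```python
from collections import defaultdict

def rank_from_comparisons(comparisons, n_iter=100):
    """Rank items from pairwise judgements with a simple Bradley-Terry
    estimate: item i beats item j with probability s_i / (s_i + s_j)."""
    wins = defaultdict(int)      # how many comparisons each item 'won'
    n_pair = defaultdict(int)    # how many times each unordered pair was compared
    for winner, loser in comparisons:
        wins[winner] += 1
        n_pair[frozenset((winner, loser))] += 1

    items = set(wins) | {loser for _, loser in comparisons}
    strength = {i: 1.0 for i in items}

    for _ in range(n_iter):
        new = {}
        for i in items:
            # sum over every pair that item i actually appeared in
            denom = sum(n / (strength[i] + strength[j])
                        for pair, n in n_pair.items() if i in pair
                        for j in pair - {i})
            new[i] = wins[i] / denom if denom else strength[i]
        # rescale so the strengths don't drift over iterations
        total = sum(new.values()) or 1.0
        strength = {i: s * len(items) / total for i, s in new.items()}

    return sorted(items, key=strength.get, reverse=True)

# judgements of the form (scan judged more urgent, scan judged less urgent)
judgements = [("scan_A", "scan_B"), ("scan_A", "scan_C"),
              ("scan_B", "scan_C"), ("scan_A", "scan_B")]
print(rank_from_comparisons(judgements))   # ['scan_A', 'scan_B', 'scan_C']
```

With many judges each making a modest number of comparisons, the estimated strengths settle into a collective ranking - the "many brains as one brain" effect.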

Orchestration is of course a musical term. Musicians are traditionally orchestrated using a score, but there is much more going on. The fine degrees of self-coordination between players are heuristic at a deep level (much like Davies's dance). The performance and the document which describes the manner of the performance are both descriptions of the same thing. It's redundancy all the way down.

I was mindful of this as I put together this video of the score of a piece I wrote 10 years ago, "The Governor's Veil", alongside a recording of its performance. In the video, with the score following the sound, the double description and the redundancies become much more noticeable.

 

Thursday, 4 May 2017

Teaching, Music and the life of Emotions: a response to distinctions between thinking and knowing

Music makes tangible aspects of emotional life which underpin conscious processes of being – within which one might include learning, thinking, reflecting, teaching, acting, and so on. In education, we place so much emphasis on knowledge because knowledge can be turned into an object. People make absurd and indefensible distinctions between “thinking” and “knowing”, “reflecting” and “acting”, “creating” and “copying”, partly because there is no framework for thinking beyond objects; equally, nobody challenges them because they are left only with feelings of doubt or alienation that they can barely articulate. The emotional life cannot be objectified: it presents itself “through a glass, darkly”. Only the arts, and particularly music, succeed in “painting the glass”.

In Susanne Langer’s view, composers and performers are epistemologists of the emotions: in their abstract sonic constructions they articulate what they know about what it is to feel. What they construct is a passage of time over which, they hope, the feelings of listeners and performers will somehow be coordinated to the point that one person might look at another and know that they are feeling the same thing. It is a coordination of the inner world of the many; a moment where the many brains think as one brain. This is the most fundamental essence of social existence.

We each have something of the composer in us in the sense that we (sometimes) express our feelings. But composers do more than this. They articulate what they know about what it is to feel, and their expression is a set of instructions for the reproduction of a temporal form. In mathematics, this kind of expression through a set of instructions is called “E-Prime” (https://en.wikipedia.org/wiki/E-Prime). It’s a bit like the kind of games that people sometimes play: “think of a number between 1 and 10; double it; divide by …”. But similar in kind though such games are, they have nothing of the sophistication of music.

Great teachers do something similar to composers. To begin with, they work within an immensely complex domain. Broadly, the teacher’s job is to express their understanding of a subject. But when we inquire as to what it is to "express understanding", we are left with the same thing as in music: it is to express what it feels like to know their subject. In great hands, the subject they express and the feelings they reveal are coordinated to the point that what is conveyed is their knowledge of what it is to feel knowing what they do.

Talking about emotions is difficult. It is much easier to talk of knowledge, or to talk about creativity or thinking in loose rhetorical terms, avoiding any specifics. It is easy to point to pictures of brain scans and make assertions about correlations between neural structures and experiences - which somehow takes the soul out of it and gives licence to bullies to tell everyone else how to teach based on the brutal "evidence" of neuroscience. Any child will know they are lying.

We can talk about emotion more intelligently. Wise heads in the past - some from cybernetics - made important progress in this. Bateson's concept of bio-entropy is, I think, the closest description we have of what happens (I had a great chat with Ambjörn Naeve about this yesterday). We should start with music: it is the essence of connotation. It presents the richness of the interaction of multiple descriptions of the world which was at the heart of Bateson's message. It is ecological, and its ecology is explicitly ruled by redundancies. And perhaps the most hopeful sign is that the very idea of counterpoint is beginning to take centre stage not just in the way that we analyse ecologies, but in the way that quantum physicists are programming their remarkable computers.

Tuesday, 2 May 2017

Relative Entropy in the Analysis of Educational Video

Relative entropy is a calculation much used by quantum physicists to measure degrees of entanglement between subatomic particles. Its formal expression is the Kullback-Leibler divergence:
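For probability distributions P and Q over the same set of states, its standard discrete form is:

$$ D_{KL}(P \,\|\, Q) \;=\; \sum_{i} P(i)\,\log\frac{P(i)}{Q(i)} $$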
It isn't as scary as it looks (information theory rarely is!) - it's basically a measure of the distance between a probability distribution P and a distribution Q. If two subatomic particles are entangled (in other words, their behaviour will be coordinated), then the distance between the probability distributions of their behaviour (their expected states) will be zero.

That quantum physics tells us something we already know about nature and social life is reflected in the various fluffy uses of "entanglement" (e.g. Latour, Barad, etc.) in the social science literature. But this is rarely done with any real insight into what the term actually means. It basically seems to say "it's complex, innit!".

I'm grateful to Loet Leydesdorff for pointing me in the direction of Kullback-Leibler after I requested some means of measuring the synergy between the entropy values of different variables. My inspiration for asking this was in thinking about music. Music presents many descriptions to us: rhythm, melody, harmony, timbre, dynamics, etc. Something happens in music when a change in any of these dimensions is accompanied by a similar change in another dimension: the rhythm changes with the melody, for example. At these moments, we often detect some new idea or motif - it's at these moments that things grow. Basically, I'm drawing on a musical experiment I did a few years ago: http://dailyimprovisation.blogspot.co.uk/2015/09/entropy-and-aesthetics-some-musical.html

The same kind of technique can be applied to the analysis of video. Like music, video presents many different descriptions of things. 

I've been looking at Vi Hart's wonderful video on Fibonacci numbers and spirals. 



There is a rich range of descriptions contained in this video, and I was wondering how the probability distribution of each description relates to the distributions of the others. So I've been doing some analysis, using Kinovea for video analysis, Puredata for analysing the pitch and rhythm of the speech, and YouTube to produce a transcript of the video from which I can do some entropy calculations.

After munging the data and converting it into a form I can deal with, I've imported it all into a Jupyter notebook using Python's Pandas dataframes, queried it using SQL (via the pysqldf library), and done entropy calculations on the whole thing.
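The core of the windowing calculation is simple enough. Here is a minimal sketch of the idea (the transcript data and column names are just illustrative, and my own calculation is normalised differently, so its figures are not directly comparable with the ones below):

```python
import math
import pandas as pd

def shannon_entropy(values):
    """Shannon entropy (in bits) of the frequency distribution of the values."""
    counts = pd.Series(list(values)).value_counts()
    probs = counts / counts.sum()
    return -sum(p * math.log2(p) for p in probs)

# one row per transcribed word, with a (made-up) time in seconds at which it is spoken
transcript = pd.DataFrame({
    "time": [0.4, 1.1, 2.0, 3.5, 4.2, 5.1, 6.3, 7.0, 8.8, 9.5,
             11.0, 12.2, 13.1, 14.0, 16.2, 17.5, 18.1, 19.0],
    "word": ["so", "you", "draw", "a", "spiral", "and", "then", "you", "draw",
             "another", "one", "and", "count", "the", "turns", "as", "you", "go"],
})

# entropy of the words spoken in each 5-second window
for start in range(0, 20, 5):
    window = transcript[(transcript["time"] >= start) & (transcript["time"] < start + 5)]
    h = shannon_entropy(window["word"]) if len(window) else 0.0
    print(f"{start}-{start + 5}: {h:.3f}")
```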

My code is still a bit rubbish, but it's beginning to tell me things. For example, I can look at the changes in entropy of the transcribed text over window periods as the video progresses. So here is a list of the first 20 seconds in 5-second chunks:

0-5: -0.25206419825534054
5-10: -0.24292065819269668
10-15: -0.3868528072345415
15-20: -0.3333333333333334

Now I can do the same for the 'events' which occur in the video. Here I was a bit stuck for how to describe things, so when she drew a spiral, I wrote "spiral". She draws a lot of spirals, so the entropy is uninteresting...

0-5: 0
5-10: 0
10-15: 0
15-20: 0

What? Well, maybe there's an error in my coding - I might go back and add some more detail to my analysis. But it makes sense: she keeps on drawing spirals, so with only one kind of event the entropy is 0.

What about the pitch of her voice? That's the interesting one... I used Puredata's fiddle~ object to do this (I first played with fiddle~ years ago in improvisation: http://dailyimprovisation.blogspot.co.uk/2008/06/playing-with-pd-fiddle.html - it just goes to show the importance of documenting everything that we do!)
Now the pitches are more interesting than the video events:

0-5: -0.4533324434922346
5-10: -0.366932572935196
10-15: -0.5315857945285835
15-20: -0.6913119495075026

Is there a correlation there? Well, the range of pitches in the voice increases with the variety of vocabulary used in the text. Perhaps that isn't surprising. But it's not surprising for a reason which has everything to do with relative entropy: the entropy of the use of words is likely to be coupled with the pitch, because with more words, there are more syllables and potentially more opportunities for variety in the pitch. Over a more extended period of time, and taking into account that events do occur in the video which increase its entropy, we can start to examine the relationship between the different aspects of what happens. 
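As a toy illustration of what such a comparison looks like in code (the bins and counts below are invented; the point is just that scipy's entropy function returns the Kullback-Leibler divergence when it is given two distributions):

```python
import numpy as np
from scipy.stats import entropy

# invented counts over the same five bins within one window of the video:
# how often each vocabulary bin occurs, and how often each pitch bin occurs
word_counts = np.array([5, 3, 2, 1, 1])
pitch_counts = np.array([4, 4, 2, 1, 1])

P = word_counts / word_counts.sum()
Q = pitch_counts / pitch_counts.sum()

# relative entropy D(P||Q): zero when the two descriptions are perfectly coupled,
# and it grows as the descriptions drift apart
print(entropy(P, Q))   # a small positive number: the two distributions are close
print(entropy(P, P))   # 0.0: a description is perfectly 'entangled' with itself
```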

The fact that there is a kind of stable ritual of drawing spirals which runs alongside an increase in the variety of words spoken and pitches used suggests that the actions in the video are a kind of 'accompaniment' to the words that are spoken. To begin with, the ritual of drawing spirals is a kind of 'drone' against which other things happen. As in music, the drone maintains the coherence of the piece.

Imagine if she started differently: if she began by doing the maths straight away, then it would have a very different dynamic. The entropies would also be very different.

Saturday, 29 April 2017

@TedXUoBolton, Science and the Managerial Craving for Academic Celebrity

What has actually happened to Universities in the last 20 years? We only see indicators that things aren't what they used to be, but since those whose job it is to commentate on how things are changing are themselves enmeshed in Universities which are in the throes of these transformations, there appears to be no position from which one can gauge how far our institutions are straying from their historic origins.

So here's the latest sign: the second TEDx event to be held at the University of Bolton. For those students and some of the more junior academic staff taking part in this, it is a great opportunity, and on the face of it, a great idea. But the weird thing is that three senior managers plus a couple of professors from Bolton have been instrumental in creating a platform for themselves.

Heading the bill (which is here: http://www.bolton.ac.uk/tedx/speakers/) is Bolton's esteemed Vice-Chancellor, Professor George Holmes DL - cue dancing girls! If that's not enough senior management (he's enough for most, including the former UCU rep - http://www.theboltonnews.co.uk/news/11882630.University_of_Bolton_staff__sacked__over_claims_they_leaked_information_to_the_press/), then you can listen to the Deputy Vice-Chancellor, Professor Patrick McGhee! Wait... Yes! I know you want more! So, here's what you've been waiting for - the University's one-and-only Pro-Vice-Chancellor, Kondal Reddy Khandadi (cue lots of whooping and cheers).

These illustrious speakers are mixed with academics from other institutions, including Steve Fuller, who gave a nice talk about democratising HE and science - something he's been on about for some time (see http://dailyimprovisation.blogspot.co.uk/2014/11/science-and-social-ontology-at-russian.html).

What is this? It looks like a kind of 'academic washing': a way of manufacturing academic credibility and bestowing it upon members of the management team. It runs alongside the 'Royal washing' which Bolton has also been engaged in recently: http://www.bolton.ac.uk/MediaCentre/Articles/2016/Jul2016-07.aspx, and the seriously odd "political washing" of the "Centre for Opposition Studies" - http://www.bolton.ac.uk/OppositionStudies/Home.aspx (I find the picture of the House of Commons curious - there's something they don't understand about opposition... this is entirely sanitised! No reference to struggle, peaceful or violent, whatsoever!)

Then I'm reminded of TED itself, and the particularly unfortunate episode with Rupert Sheldrake, whose talk at TEDx Whitechapel on "The Science Delusion" was banned. Sheldrake is one of my favourite scientists because he has the courage to ask difficult questions of people who call themselves scientists but choose to ignore those questions because they would make peer review more troublesome.

Why, then, does TED ban Sheldrake and willingly host Vice Chancellors, Pro-Vice chancellors, and Deputy Vice Chancellors who haven't got anything remotely as interesting to talk about?

So this is the barometer of where things have got to. Most scientists find Sheldrake's "morphogenetic field" idea too esoteric an explanation for the phenomenon of the simultaneous formation of new crystal structures at different points in the world. But even physics used to be more inquiring and accepting of weird ideas.

At my University, the first head of the physics department was Oliver Lodge, who did pioneering work on electromagnetic radiation in the late 19th and early 20th centuries, and was also a passionate communicator of science. Fuller's appeal for democratised science is not new, and we should examine those who did it long before. (I'm deeply grateful to Liverpool physicist Peter Rowlands (see https://www.amazon.co.uk/ZERO-INFINITY-FOUNDATIONS-PHYSICS-Everything/dp/9812709142) for pointing me towards this). The football at the beginning of this video is a curious distraction!


What Lodge says in this video is not a million miles away from what Sheldrake says. Lodge was not only a physicist, but a spiritualist. It's the kind of combination that would get you sacked from Universities these days. But instead of getting sacked, Lodge went on to become Vice Chancellor of the University of Birmingham.

So here's the barometer. Lodge exemplifies what scientific inquiry looks and sounds like. He epitomises the spirit of inquiry and communication which suffused the university of his time. He wasn't alone in Liverpool: Charles Sherrington was down the road exploring the neural structures in monkeys' brains (in the attic above my office!); he was a contemporary of, and studied alongside, William Bateson, who invented the term "genetics" and was father to Gregory Bateson; he worked with other key thinkers in philosophy, including Whitehead, with whom he no doubt found much in common.

This simply doesn't happen any more, and we are all the poorer for it. Instead we have managers parading themselves as academic celebrities, making pronouncements about education - about which they understand very little (as we all do). Why? Because we have turned education into a business, where status is money, and money gives status.

There are still people like Lodge around. Sheldrake is one, and so is Peter Rowlands. But they are on the fringes, many clinging to the academy in adjunct positions which save the managers money, and help to fund their yachts, and (no doubt) TedXBolton 2018!

Wednesday, 26 April 2017

Educational Content and Quantum Physics

One of the most difficult issues to understand in education is the role of content and its relation to conversation. There are the material aspects of content - physical books, e-books, webpages, interactive apps and tools... What's the difference? There are the many different ways in which teachers can coordinate conversations around the content. And there are the fundamental differences between disciplines. Tools like Maple or Matlab are great for getting students to do virtual empirical work in Maths or Physics. But in sociology and philosophy?

The phenomenology of these tools is radically different. It's not simply about the rather shallow view of "affordances" which was popular a few years ago. It's a much deeper ecological process (which perhaps is where Gibson's original work on affordance should have led us - but didn't). We can say books afford "flicking through", but in effect that is to reduce the richness of experience to a function. The problem is that the systems designers, with their functionalist bent, will then try to reproduce the function in another form. We only have as many functions as we can give names to. Yet each function is implicitly dependent on every other function, and on aspects of the phenomenology which we cannot articulate.

I'm thinking about books a lot at the moment, partly because I've been learning quantum mechanics using Leonard Susskind's "Theoretical Minimum" book (see http://theoreticalminimum.com/) in conjunction with the videos of the lectures he gave at Stanford. This is the first time I've found any kind of MOOC-like experience actually worth it: a book plus online lectures. Of course the objection would be that it is so expensive and resource-hungry to produce a book that this isn't practical unless you are famous like Susskind. But this isn't true any longer.

Book printing machines are extraordinary things. Combined with high-quality typesetting using LaTeX-based tools like Overleaf, the results are as good as anything that Penguin can produce for Susskind. And it's cheap - with most of the self-publishers, the equivalent of Susskind's book could cost less than the £9.99 charged by Penguin. All universities can now do the Open University thing at a fraction of the cost.

But what about conversation? In my case, my interest in quantum physics is being driven by a conversation with one of the physicists in Liverpool about the use of ideas of 'entanglement' in the social sciences (i.e. sociomaterial stuff, Latour, Barad, etc.). Without wanting to "do a Sokal" (https://en.wikipedia.org/wiki/Sokal_affair), it does seem that quantum theoretical terms are being used without deep understanding of what they refer to. Equally, it may be the case that the physics and its mathematical techniques do indeed reflect a wider reality which is already known to our common sense. I think both propositions may be true, and that one way of exploring them is to make a deep and clear connection between the physicists and the social scientists.

Might I pursue the interest in Susskind without my physicist friend? Maybe... but there'll always be a conversation somewhere where I can process this stuff. But it may not be online.

That is the crucial point - conversations about matters of curiosity do not necessarily happen online. The current online education model assumed that conversation had to happen online because otherwise the education could not be coordinated. But with a good book and a set of video resources, we can do our own coordination independently of any central authority.

The reason why the online education model forced conversation into forums was, I think, because it confused learning conversation with assessment processes. In order to assess learners, obviously there has to be some record of the transactions between learners and teachers which reveals their understanding. It might also indicate to teachers new kinds of interventions which might be necessary to steer student learning in particular ways. But if the learning is left to self-organising processes, and free choice is given to use a variety of different resources (books, webpages, etc.), then what needs to be focused on is a flexible and reliable method of tracking (or assessing) development.

But it's not as simple as separating assessment from learning. Assessment is a key moment of learning - it is the moment when somebody else reveals their understanding in relation to the learner's revealing of understanding. That is a key aspect of conversation. In formal education, it can also be a formal transaction - particularly where marks are involved.

This is perhaps where the interaction with online content can be developed. Could it be an explicitly formal interaction of exchanging different understandings of things, and passing judgements about each other's understanding? In the emerging world of learning analytics, there is already something like this going on - but its lack of focus and theoretical clarity is exacerbating the confusion rather than deepening understanding.