Digital isolation v Digital exile

A snippet from some correspondence with a dear friend:

Ooh, sounds like a blog post in the making – ‘digital isolation v digital exile in a digital world hurtling towards “technological Singularity”’. Perhaps the government needs a change management plan to deal with those suffering from digital isolation or exile – this is a complex problem connected with their NBN mess, amongst other things – or is this more of a ‘wicked’ problem? I will think on it!

So, what do we have here?

Digital Isolation. What is this? Is it new, or is it something that we didn’t have a name for, but now have a whole pandemic of? Now this is interesting… imagine actually running a crowdsource to understand what people make of their current state of digital isolation – do the quality, speed, availability, cost and ‘capability of using’ (thinking literacy, and that famous quote: choose your authors like you choose your friends) contribute to varying degrees of isolation? What about those who choose to isolate themselves [digitally] from others, by CHOOSING not to engage using social media, online tools, mobile devices or ‘my’ accounts? Are they ‘digital hermits’, or suffering from some socially technological dysfunction? Are these people our odd-uncle-at-the-family-picnic ‘pod-mates’, or dinosaurs in the next ‘village over the hill’ in the pod-farm of comfortable, home-like, acoustically pleasing partitioning?

Of course, a crowdsource activity which didn’t also allow participation through offline channels would limit participation to only those who were not digitally isolated. So the activity would measure the degree of digital isolation, rather than 100% digital isolation… and the science would have flaws too, in not counting the [analogue?] socially isolated (even harder to ferret out than the ‘digital, but not isolated’) as part of the 50th percentile, juxtaposed with the ‘digital’.

How digitally isolated are you, at this very moment?

Digital Exile. What is this?  Is this being ‘grounded’ and Mum turning off the WiFi after 10pm, when all the homework has been done? Is this something that happens when you move away from a fast-food outlet that delivers free wifi, or are unlucky enough (socioeconomic night-breed or zombie-apocalypsed) not to have 24/7 access?

Technological Singularity? Well, we all know what that is, yet we choose not to do anything about it. At least, from a public policy perspective – there isn’t any money left to think about the future; we are too busy trying to fix the problems of yesterday. Or at least, too busy trying to get re-elected, so that we can consider yesterday’s problems. Is the technological singularity connected with the rise of Homo Evolutis?

Change management plan for the government? Why should they start thinking ahead, now? Did you notice that I am being nonspecific about which country, party or level of government?  I thought you noticed.

What is ‘Coding’ and why should you care?

The future of change management? I wonder if the point of the article was not so much about ‘code’ as about understanding the power of a ‘language’.

Taken out of the context of any particular code or code engine, isn’t the issue here to understand what ‘code’ does and what a code engine could produce? The key is understanding the code and using it to best effect – or even stretching the limitations of the language. That’s how new language is born!

Language to human intelligence is like code to ‘artificial intelligence’ – arguably, the language defines the edges of capability to ‘instruct’ something else to perform a task – and the limit of the ability to communicate the task (how, what, why etc) lies in the language and more specifically, the translation of one language into another, sometimes through an innocent third-party language.

So, the understanding of ‘code’ (being an analogy for language applicable to ‘[artificial] intelligence’) is the important part – understanding what the code is for you, and the code for the receiver of the instructions, and the translation requirements (perfect or more likely, imperfect) between the two.  So, code is really ‘communication 101’ – the sender and receiver model.

Sender Receiver Model

Try as I might, I don’t seem to be able to find a model that doesn’t use the ‘encode-decode’ terminology, in a non-technology sense.  Interesting in itself.
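The sender-receiver model the post keeps circling back to can be sketched in a few lines. This is a toy illustration only – the vocabularies, function names and ‘messages’ below are all invented – but it shows the essential point made above: meaning survives the round trip exactly as far as the two codes overlap, and anything outside the overlap is an imperfect translation.

```python
# Toy sketch of the sender-receiver model: the sender encodes a message
# into a shared 'code', and the receiver decodes it back into its own
# language. Both vocabularies are hypothetical, chosen for illustration.

SENDER_CODE = {"ship it": "DEPLOY", "fix it": "PATCH", "stop": "HALT"}
RECEIVER_CODE = {"DEPLOY": "release to users", "PATCH": "apply a fix"}
# Note: the receiver has no entry for "HALT" -- a deliberately
# imperfect translation between the two 'languages'.

def encode(message: str) -> str:
    """Translate the sender's language into the shared code."""
    return SENDER_CODE.get(message, "UNKNOWN")

def decode(signal: str) -> str:
    """Translate the shared code into the receiver's language."""
    return RECEIVER_CODE.get(signal, f"?? could not decode {signal!r}")

if __name__ == "__main__":
    for msg in ["ship it", "stop"]:
        print(f"{msg!r} -> {decode(encode(msg))}")
```

‘Ship it’ decodes cleanly because both parties share that part of the code; ‘stop’ does not, which is precisely the change manager’s problem: the failure happens silently, at the receiver’s end.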

So, what is there to think about?

  • Those that don’t ‘have’ language (Autism spectrum, people requireing reasonable adjustments in the workplace, or simply, ‘not able to get emails daily’)?
  • Those that do have language, but have the wrong one (English as a Second Language, IT-Geeks trying to speak to ‘Business Freaks’)?
  • Learning a language that has no ‘words’ – symbology, syntax, understanding?
  • The phenomenon of ‘jargon’?
  • Where does behaviour fit in? Can you say something and communicate something else?

So, I have been very obvious, and provided some of the underlying issues with every piece of change management practice I have ever attempted. Bugger trying to get people to do something, or change behaviour – it would seem that organisations fail miserably (and continue to fail without learning from their mistakes – that isn’t good science!) at simply making themselves understood – or, as executives, communicating to employees – in a way that the receivers understand: using their ‘code’, or something close enough that they can ‘get it’.


By the way – the most successful change management practitioners are YOUR ‘code engine’ – they can do the encoding of your language (verbal, behavioural, symbolic or otherwise) into something that those receiving can decode.  I’m available. 😉

Careening into a ’24×7 Society’


How can a society be anything other than 24×7?

There was a recent doco that I watched, looking at cognitive ability-enhancing drugs (Horizon: Pill Poppers, BBC). The initial part of the documentary focused on how drugs are ‘discovered’, then looked more closely at some of the therapeutic uses of various drugs, specifically Ritalin.

(Incidentally, it is estimated that the average ‘healthy’ person will consume more than 14,000 over-the-counter pills in their lifetime…)

The documentary touched on the benefits to the lives of those children and younger adults with behavioural and other cognitive issues, where correctly identified and treated.

Here’s where it gets absolutely fascinating!

The next part of the documentary looked at professionals (specifically surgeons, but others as well) using medical supplements and pharmaceuticals to maintain concentration and alertness during complex procedures – specifically Ritalin, a product traditionally used to treat behavioural and concentration disorders, but other prescribable drugs too (incidentally, seeking to produce the sort of concentration and focus seen in aspects of the Autism spectrum and Asperger’s Syndrome, though in less debilitating circumstances).

The ‘new’ uses of these drugs by people ostensibly without behavioural and concentration issues are bringing astounding results, including reducing some of the risks associated with surgeries requiring patients to be under anaesthesia for many hours – where handing over to a ‘fresher, but different’ surgeon could introduce a greater risk to the patient than the primary surgeon soldiering on, or where a less experienced surgeon takes over in order to rest the primary surgeon.

Don’t get me wrong – I am the first to want the best doing my neurosurgery (touchwood!) with the lowest amount of risk! However, what happens to these ‘cognitively enhanced’ professionals when they stop taking concentration and performance enhancing drugs?

What is the effect of a future organisation who is ‘cognitively enhanced’?

What are the ethics, and how does this apply in an organisation that isn’t doing surgery, but simply seeking a competitive edge? Does this mean that people with attention or cognitive underperformance can now be gainfully employed? Or does it mean that healthy people are now expected and encouraged to be even further ahead of the curve – or worse, frowned upon if they aren’t?

Does this mean that only the wealthy (those who can afford it) or executives can be cognitively enhanced, so that the divide between executive, management and operational front line becomes wider, leaving a bigger void in between, and a much harder leap across? The Australian Public Service already has programs addressing the (widening) gaps between APS, Executive Level and Senior Executive Service, so it is reasonable to think that non-government organisations have the same issues – the APSC is not, after all, renowned for moving fast, or for being on the leading edge of career progression and staff development.

How do we manage the change management plan for that? How do we support operational staff (have nots), management (want to haves) and the executives (haves) to work together (which they are less and less if you read recent treatises on ’employee engagement’) and to support career progression?

Does this phenomenon already exist, or does this make way for individuals to be identified to be ‘enhanced’ and become better strategists, better ‘somethings’ and the role of change management is to console those not ‘tapped on the shoulder’ or left behind or at least, divert the attention of those not fated for meteoric stardom?

Socially, are we expecting people to function at a very high rate for the whole 24 hours – not just at work? If we only do this professionally, what happens when we are returned to our ‘dull selves’ after work? Or perhaps even, on the way home from work (while driving)? What will this do to our relationships and our families?

Does this mean that the risks of the interim state are the responsibility of the individual or the organisation?

Or do we just stay ‘turned on’ in our 24×7 society and ‘expire’ sooner? Is this a cost we are prepared to bear?

Messy business, this…


Organisational parasites


Picture of the Earth, surrounded by a fractal image of many coloured trees, all interconnecting

Ecology of things

If normal flora and fauna are beneficial, are organisational parasites beneficial? If I said: point out someone in your organisation who could be described as an ‘organisational parasite’ (I wouldn’t ask that!), I am sure you could, very quickly. If I asked: what do you plan to do about the parasite? What would happen next?

It is a very simple thing to make an analogy between an organisation and a biological organism – in fact, it is very common to consider that the structure and function of an organisation is what it is because that is what it needs to be: the same reason that the heart, lungs, gills, eyes, brain, abdomen etc. are what they need to be, and where they need to be, to most effectively do what they need to do. Structure and function is the basis of most organisational development, design and business process management practice, even if this concept is only intuitively and tacitly understood, rather than explicit and up front.

Personally, I would favour an organisational change manager or business process engineer with a background in biology, ecology or human anatomy every time. Why? Because they have a sense of what is going on – the essential ‘cosmic interconnectedness’ (thanks, Douglas Adams, for the concept; to make it more sensible, let’s call it ‘Big Data’) that they are tweaking, fondling and grossly dismembering – in a way that even the most sophisticated financial analyst, with their finger on the pulse, cannot. And that is because of their relationship to the organisation as something living and actual, rather than theoretical and symbolic. In my experience, those who ‘don’t get it’ are invariably focused on the bottom line, or the left and right sides of a ledger.

So, it is a known fact that there are advantages of some parasites in the human body – do [some] organisational parasites come with advantages too?

There is argument (bloodless, but quite fierce) to support the concept that humanity is either currently ‘attempting to evolve’ into our next genetic mutation, or is on the verge of evolving (whether we like it or not), with the event horizon of ‘the Technological Singularity’ fast approaching. Some cite, as evidence of this ‘attempt’ to evolve, the higher incidence of Autism, Asperger’s Syndrome, Down Syndrome etc. – the species attempting to find a new genetic or neuro-intellectual deviation or mutation able to ‘cope’ with, or take advantage of, the new ‘pressures of existence’: pressures which include the need to ‘think’ a different way, to synthesise the vast quantities of data into information, and therefore knowledge and wisdom, which we have never, anthropologically, had to deal with before.

Image of a symbolic brain, with connections from the internet to feed it with more and more data

Getting fed!

So, as cultivators and repair-practitioners of organisational cultures (in the current context, we could also be talking ‘cultures’ in the same biological vein – ‘culturing likeness in a petri dish in order to study or alter behaviour and action on the world around the culture’, a definition which quite well describes the organisational change manager, in my view), do we have the scientific background to understand what we are looking at, down the microscope, now or in the near future? Do we have the biological-engineering equivalent in our organisations to reliably predict the effect of our proposed changes on an organism – in the context of an organisation that moves within an ecosystem, and that has arisen through a process of anthropological evolution which ensured its current structure and function was exactly what it needed to survive – to now – but which may not be able to cope with the environmental pressures that the world will place on it in the near future?

Evolution has always been at the behest of changes to the environment, of which genetic diversity seeks to take advantage – and vice versa. So, that would make the change manager’s role one of understanding the environment, and the actors on the environment, and then seeking to provide an environment that encourages the growth of sameness (or culture) – promoting or highlighting the ‘piece of diversity’ that takes advantage of the identified changes. Else, the change manager seeks to introduce a ‘piece of diversity’ to a receptive (by design or accident) culture, which in turn effects a desirable change in the environment. Or both, at the same time, considering the myriad cause-and-effect reactions, flavoured by the emotional and behavioural diversity already existing in the organisation. Looked at from this perspective, how would you rate your skill at achieving this, rather than at ‘communicating something’, sending an email, or ‘engaging’ with a stakeholder?

So, for the change manager, does this mean that, as well as understanding the structure and function of the organisation, it is also a requirement for us to understand the symbiosis in the organisation of the parasites and the normal flora and fauna of the organisation – so that we can be change managers gardening and tending the ecology of those organisms/organisations that we seek to serve?

What eco-management skills and capabilities do you have, in relation to your own or any other organisation? Further, if you don’t currently have the skills to keep a whole ecology alive (for how can you achieve change within an ecology if you don’t consider the whole of it – including the individuals, the individual ‘species’ and other collectives, and their relationship to the greater whole, which may even include the external factors acting on your ecology?), where are you going to get those skills, and how do you know when you have them? And how do you know that those evolutionary changes (for it is EVOLUTION of the organisation we are seeking to influence here – don’t mistake me, nothing short of ‘evolutionary engineering’) are going to achieve the desired effect – or whether the desired effect is something for now, or something for the future, when the changes actually come to bear on the organisation?


Getting Jacked in


Or: How to be part of the future, when living in the technological past:

So, welcome to the 20th century. Don’t I mean the 21st century? Nooo… I currently live outside the city limits. This poses some problems, most of which revolve around ‘patience’.

Patience to wait for tradespeople

Patience to wait for news

Patience to wait for visitors

Patience to wait for technology.

It’s a virtue. I’m told.

At times like these, my mind considers a 12-step program, but then, I don’t have the patience for more than 3 or 4 steps. Got anything shorter?

When and how did I become an instant-gratification monster? As a Millennial, I shouldn’t have this driving need; I should be happy with Betacord, cassette tapes, ‘life before the interwebs’, landline phones, travelling to the corner store for some bread and milk, stopping for a chat with the woman in the pet store who asks after my dogs (naming them in order of age), and enjoying a coffee on the porch, watching my horses feed.

So, why the guilt? Why do I feel somehow ‘less’ without something smart or ‘i’ in the palm of my hand or on my lap? Why do I nurse technology to see just how far from the house I can walk before the wifi starts to fade – and why am I considering a more powerful wifi modem? Do I really intend to watch TED lectures from the stables, and hook up wireless, wifi, infra-red cameras (which I will barely check) on the boundary fences and select paddocks? I live in the country – why am I so suspicious of my neighbours with the attack-dogs (one with only three legs)?

Is this particular post an ironic, oxymoronic attempt to seek absolution for my technical guilt, when I thought I had put all that ‘confessional farce’ behind me years ago?

I wonder, in today’s age, why it takes:

  • more than 15 hours of telephone complaints (resolved by 1 online complaint to the Telecommunications Industry Ombudsman)
  • 10 hours of telephone technical support
  • repeatedly refusing offers of mobile connectivity (there IS no mobile reception here – why do you keep offering it to me when you KNOW your own provider doesn’t provide reception here?!?)

to feed the obsession that I feel I *should* have to get reconnected, when I have a sublime coffee in my hand, and an Australian Shepherd who keeps bringing me a ball to throw?

Shouldn’t I remember life back in the dark ages (or am I thinking of the FUTURE Digital Dark Age?), when there was no personal or professional digital identity – or at least, when there will be no discernible distinction between a personal and a digital identity, in a Technological Singularity, post-human world? Does this still make me human, or am I transitioning to a higher state?

Instead, I seek to get my family ‘hooked’ – I am so concerned with perpetuating the ‘digital virus‘ at home and at work: I ‘drop round’ to make sure my mother’s wifi works at optimal speed, ensuring the laptop and iDevice that I got her has constant connection; that everything is ‘virus free’ (who would want to plant a data virus on my mother’s ‘Smurfs’ app usage?); I make sure that my FaceTime and Skype connections to my father are working; I video chat with my niece on her birthday; I facebook my cousin while she is in the waiting room, while her son is in surgery in a hospital.

I wonder if I should be feeling some sort of guilt about the secondary and tertiary bullying needed to ensure that my 90-year-old grandparents become hooked, and that we can all participate in digital e-Health (light-years ahead of the government), so that my grandfather can video-chat with any of his 8 children, 47 grandchildren and exponentially increasing great-grandchildren and fifth-generation offspring on his third-generation iDevice tablet and fifth-generation smartphone, if he or my grandmother become unwell? It seems that we trust our private family technology network before we trust the phone to call an ambulance!

And yet, we still insist on printing documents at work.

In thinking about this, the choice to quit my last job – when my boss didn’t trust the technology he had invested hundreds of thousands of dollars in, to facilitate virtual connectivity and productivity, because he wanted me to remain geographically located with him instead of with my stakeholders – doesn’t seem like such a bad decision. Except when I am interrupted, checking my email on the way to getting milk and bread, by the pet store owner asking after my dogs.

However, this seems to be a common bullying tactic by dinosaurs wanting to remain surrounded by their food, in their lair – the resistance to allowing employees to actually use the technology that they spend 15 hours per day designing, planning, installing, training on and patting each other on the back for achieving would seem significant. Have *I* failed in implementing the change management plan that would give my former boss the comfort to evolve, or did we reach the limit of his genetic, professional predisposition to transition into the universe he chose to create (over spending time with his family)? Or was I really seeking to ‘un-plug’ myself: to create a digital sabotage event and an identity schism between my actual and digital personalities; to create a scenario where I would not be able to reconcile with my digital self, in a healthy multiple-personality disenfranchisement, breaking the cycle of personal violence between my actual and digital selves; seeking some solitude, a good coffee and a pat for the dog?

What would you miss? I asked my grandmother of 92 years which of all today’s ‘mod-cons’ (many of which I could not possibly do without) she thinks she could not bear to part with. Every time, she says, without hesitation and with a wry smile: ‘Running water’.