Category Archives: Change

Models, Metaphors and Analogies.

It is common to consider business structures by metaphors to other things. For instance, some people discuss organisations as if they are machines, where if you were to pull a handle a specific outcome will happen. Others consider societal models – battalions, tribes and so on. (Business leaders in particular seem to like military metaphors. Perhaps they like to imagine themselves as generals deploying the troops. Misguided, I’d say.) Yet others use medical metaphors, discussing organisations as if they were diseased bodies with issues to be cured.

Metaphors do serve a purpose: they help us to understand an organisation by thinking about it in a different way.

But they can also confuse. Organisations are made up of people, they are not strictly controlled and limited machines. They rarely work in the way you expect them to.

The more you know about the original subject of the metaphor, the weaker the metaphor can seem. There are two biological – botanical – metaphors that I have regularly come across that baffle me, because I happen to know a fair amount about botany (I have two degrees in the subject); and when I hear people using them, it seems that maybe the words don’t mean what they think they mean.

Let’s start with “rhizomes”. Rhizomes are, botanically, underground stems. They grow through the soil, occasionally putting up the visible bits of the plant. Irises have rhizomes; banana plants have rhizomes; bracken, the plant I am most familiar with, has rhizomes. I spent five years working with rhizomes. Whole hillsides can be covered by bracken, the visible fronds rising up from the subterranean rhizome. The rhizome grows through the soil, occasionally dividing, unseen. It divides and divides, growing on. The old rhizome rots away and, reaching a dividing point, the divided rhizomes become two separate plants (though genetically identical, save for any random mutations that might have occurred).


Not a rhizome.
Photo by Tylerfinvold on Wikimedia, used under CC free licence GFDL. https://en.wikipedia.org/w/index.php?curid=13160509

When people talk to me about organisations with rhizome structures, what I see is a hillside covered in bracken, the rhizomes underground. True, it might be that my knowledge of bracken and other rhizomatous plants fogs the discussion somewhat. I’m sure that’s not the meaning they intended. But then I can’t help thinking maybe I know more about rhizomes than they do. And perhaps they should use a different model.

Not a “mycelium”, though. Mycelia might be considered the fungal equivalent of rhizomes. They are one of the fundamental parts of many (though not all) fungi: they grow as unseen filaments through a substrate – the soil, a dead tree, your skin. (Athlete’s Foot is a fungal infection, the fungus’s mycelia growing in your skin.) Mycelia are multinucleate – fungi have a very different form to most organisms we’re familiar with – and they divide and reconnect. The visible parts of fungi we’re familiar with, mushrooms and toadstools growing above the ground, are specialised reproductive structures formed from mycelia: the fibres of the mushrooms we eat are composed of mycelia.


Not a mycelium.
Photo by Chris 73, own work, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=19658

When business people talk to me about mycelium organisation structures, what I see is athlete’s foot, or a fairy ring, the outward sign of fungal mycelia growing out from a point in the soil.

I can’t help but wonder what it actually is that they’re trying to describe, but I’m pretty sure it’s not that. (I have tried to understand, but the use of the word seems woolly; I got several pages in on Google before I found a definition, unclear as it is.)

The thing is, the mental models we create to explain and describe the world matter. If we think of organisations using a doctor-patient metaphor, it will lead to examining them in that way; if we instead use a mechanical metaphor, or a military one, we will find something different.

There’s no right answer, of course. Models, metaphors and analogies help us investigate the world around us. But they also skew our thinking. And they need to be used with care.

Implementation.

For processes, design is coupled with implementation. It can be hard to separate the two, sometimes: poor design leads to poor implementation.

The City of Edinburgh Council has redesigned its domestic recycling processes, and over the last month it has been implementing the new design in my neighbourhood.

In addition to large communal bins for non-recyclable rubbish (destined for landfill), domestic recycling used to work with weekly kerbside collections: cardboard and plastic bottles (only plastic bottles) collected one week; glass, paper and tin cans the other; and weekly collections of food waste for composting.

Each type of waste had a separate receptacle – red and blue boxes for cardboard and glass, a sealed container for food.

It wasn’t perfect. I would frequently forget to put out my recyclable waste and have to wait for the next collection. In a windy city, where gales seem to coincide with the weekly collections, the emptied containers would be blown all over the place.

The council announced in January that they were changing the process in February.

Instead of weekly collections, there are to be permanent large communal bins for general recycling (cardboard, paper, tin cans and plastic containers – not plastic bags, which are not recyclable), for glass, for food waste, and for landfill. Four different communal bins.

In many ways, this will work better: no more missing the weekly collection, no more boxes scattered across the road by the wind. Drop off recycling when it suits.

But…

(You knew there had to be a “But…”, didn’t you?)

But, firstly, communication has been poor. The launch date of “February” was unclear: when in February? Should I keep putting out recyclable waste for pick up before then? Where will the new communal bins be?

(These questions remain unanswered, despite the system already changing…)

Secondly, the one new communal bin for general recycling – cardboard etc (but not plastic bags) – looks very like the old landfill bin. It arrived a couple of weeks ago in the same place as the landfill bin it replaced. OK, it does say “general recycling” on it, with a long list of acceptable items and another list of unacceptable items. Maybe my neighbours can’t read. Maybe they didn’t notice the signs. Either way, on the first day it was on the street, it was full of plastic refuse sacks (presumably full of waste for landfill, not recycling). A couple of days later, it had just recycling waste in it, and someone – possibly the council, possibly not – had taped a new notice to it detailing again what could go in it – plus stating that putting the wrong items in the bin would mean it all had to be sent to landfill.

A couple of days later, I looked in it to see someone had placed polystyrene blocks in it – another no-no, meaning the whole bin is apparently destined for landfill.

The other bins – for glass and food waste – haven’t appeared, leaving my neighbours (and me!) clearly confused as to what to do with our waste. Before the changes, today would have been a cardboard collection; several neighbours have put out cardboard for collection, despite the new communal bin for cardboard (if it hasn’t been contaminated with landfill waste…).

According to the leaflets put through the door, the reason for the change was to increase the rate of recycling. In the short-term, this has clearly failed: at least two full recycling binloads will, according to the notice stuck on the bin, be destined for landfill because they contained the wrong kind of waste.

People are confused, and most likely throwing out recyclable waste in the landfill bins because of the uncertainty. The glass and food waste communal bins haven’t appeared. (Perhaps they have been camouflaged…)

So – in the short-term at least – the implementation has failed. It might just be bedding in. I can’t help thinking, though, had the design been better, with clearer communication (and, for instance, a timetable of what was happening when), it could have been a lot smoother.

Update: not only would today have been a cardboard collection – it was. The neighbours’ cardboard has been collected from the kerbside. For many systems, dual running during implementation makes a great deal of sense. But it has left me even more confused than before.

Talking ‘Bout My Generation.

Since the queen first saw the light she has seen invented and brought into use … every one of the myriads of strictly modern inventions which, by their united power, have created the bulk of modern civilization and made life under it easy and difficult, convenient and awkward, happy and horrible, soothing and irritating, grand and trivial, an indispensable blessing and an unimaginable curse.
Mark Twain

The Economist recently wrote about the difficulties for firms of managing different generations in the workplace; Gerry Taylor from Orangebox gave a talk at the Business School a couple of months ago on similar issues, particularly those resulting from the changing demographics and the impact of technology. The Economist article focuses on managing these different generations – and their different wants and needs; Taylor on the impact on workspaces of these different generations.

Taylor was talking about research from a couple of years ago, Office Wars 2012 [PDF], and I don’t think any of the facts were particularly new: changing technology (particularly social media), the rise of China, the internet of things – we all know it’s coming! – but his analysis was somewhat different, and interesting. His delivery was passionate and engaging, too!

Boomers – people like me – are in the middle or towards the end of their careers (depending on how we manage the pensions timebomb…): by 2015, a third of the workforce will be over fifty years old. They are often defined by their work and they’re highly skilled (they’ve had a while to get there). They invented youth culture, and they’ve rewritten the rules with each decade. They are IT adopters who enjoy change and like trying new things.

Millennials believe they move faster than anyone working who’s older than them, are “digital natives” for whom being unconnected is not an option, and they are huge customers of technology – including in the workplace.

These pictures of two generations are clearly stereotypes, and there are other generations – “X” and “Y” – in between; they are perhaps relevant to only certain strata of society, but they help forecasters make plans. Including what the office may look like.

One big difference between boomers and millennials is that, despite all their technological know-how, millennials are much less secure than boomers. This might just reflect the power balance: it is the boomers who know how the world works and are confident about their place in it. After all, they made the rules.

Boomers have taken to changing office structures, particularly working from home. Taylor described a high tech company – it might have been IBM or Microsoft (but I can’t for the life of me find a reference to it – have you ever tried Googling “Microsoft office”?) – who brought in flexible working, including hotdesking and homeworking. The boomers jumped at it; the millennials rebelled. Lacking the security of the office and the personal networks that the boomers had spent their careers building, the millennials floundered.

Apparently what millennials value most about the office environment is the experience the boomers have. They need mentors. (This might not be just millennials: it is perhaps their stage in their careers rather than their generational status that is relevant.)

Our office spaces have been shaped by technology. The thing is, it’s the technology of the nineteenth century. Electric lighting, the telephone, lifts (“elevators” to our American cousins) and the typewriter made offices what they are. (You could probably add air conditioning to that list.)

And what they are are social spaces where people can make connections.

Technology will continue to make a huge impact on offices and how we work, and in particular it can free people from the workspace. But this also changes the dynamics of the workspace; and perhaps it is the boomers who will benefit.

Businesses need to change what they do with their offices, and how they relate to their younger staff, because otherwise the competition for talent will mean that those staff find employers who do. There is a contradiction here, of course: with the economy still in the doldrums if not recession, where precisely is that competition for talent? Have millennials really got the pick of several job offers? I doubt it, right now; but these things can change quickly.

Why have offices – when technology lets you work anywhere, why do we need to all be in the same place? Because work is more than work: people – particularly those millennials – value the social aspects of the office.

But I have doubts. As the Economist itself states, youth unemployment is high around the world and there is a generation who may not know employment – no wonder those millennials are feeling insecure. Whilst recruiters and HR people still go on about “the war for talent”, for most people in most roles, just keeping a job is a struggle. Real wages are falling in the UK and in the USA, as well as the Eurozone.

The differences between generations may just reflect the differences in their life stages – differences which have always been there, as workers move through their work and actual lives. Of course what young employees want is different from those approaching retirement. One major difference with previous generations is that those approaching retirement may be doing so with trepidation, with rising retirement ages and falling annuity rates, and may be keen to keep working (and earning). People at different stages of their lives will have different outlooks – saving, “nest-building”, and so on. These may lead to conflict (some commentators talk of “intergenerational war”).

But for most jobs, nothing much has changed. And the employers can dictate the workplace and work culture that suits them best. To most people in most roles in organisations I imagine that the issues raised seem distant and unimportant. The workplace may be changing, but slowly, imperceptibly – for most.

Hearing Voices.

I have long disliked machines that talk. I don’t even like machines that beep at me, switching all system sounds off at the first possible opportunity. I think my attitude to voice interaction with computers was established by seeing “2001: A Space Odyssey” as a child. HAL set my default reaction.

So when Ben Cowan spoke to Edinburgh Skeptics a couple of weeks ago on human-computer interaction, particularly voice activated systems like Apple’s Siri, it evoked a strong reaction.

I started using computers in the early 1980s: I first used a device (which I think must have been an Apple) as an undergraduate in 1982; my postgrad work required me to use a mainframe for some stats, and I wrote my thesis on a BBC Micro. A keyboard was the only way to interact with these machines, and that could be very frustrating – with the mainframe, “interaction” could take hours, as the machine was very unresponsive. Feedback is powerful. They tried to talk to me – an irritated beep when they didn’t like something (which back in those days was often) – but one only talked back in frustration. Or anger. (Come on – who hasn’t sworn at their computer?!)

I first used a mouse in 1986, again on an Apple. But when I first started using PCs, the keyboard and command line was still the main way of interacting. I still use keystrokes rather than mouse actions for a lot of programmes.

My experience of voice activated systems has been limited to supermarket self-service checkouts and telephony systems. I hate self-service checkouts with a vengeance, largely down to the universally patronising tone of voice used. And my experience of telephony systems is similar to that of these poor miscreants, shown by Ben.

It’s not just Glaswegian accents – Birmingham City Council installed a voice activated telephony system which couldn’t recognise Brummie accents. They must have done extensive testing of that one!

And then of course there was HAL, lurking at the edge of my technological nightmares.

Perhaps it is a matter of control.

The thing is, voice interaction is becoming much more common. My phone and my tablet – both Android devices – have the ability to use voice activated systems (most commonly Google Now, which is the standard app). You’ve probably realised I’ve not tried them. It appears I’m not the only one.

But voice interaction systems are likely to become more common. As well as Siri and Google Now, Google Glass is voice activated. Satnav appears almost ubiquitous (though I of course abstain…).

I’m beginning to think this might be my problem rather than HCI’s, and Cowan explained why this might be. I’d say I have an issue with the aural “uncanny valley” (my words, not Cowan’s) – the closer to a human voice they sound, the stranger, more passive and downright unemotional – unhuman, even – they seem.

Cowan discussed some of the psychology that goes into this. There are rules in conversation – like “partners in a dance”, even if we’re not aware of the steps. We learn the steps as we learn to talk. Computers don’t. They have to be programmed, and at the moment those programmes are largely database-driven and deterministic. They work off keywords, rather than natural language. Instead, we fall into line with the machines: Cowan explained how when we talk to people, we model their usage and align our vocabularies. (Starbucks works hard to get us to model their language.) Interestingly, people communicating with computers align their language, too. This has been going on as long as there have been computers: when writing Fortran or Basic programmes back in the 1980s, the vocabulary I could use was very restricted and had very specific meanings. I had to use those words and the programmes’ syntax because otherwise the programmes wouldn’t work, or would give different results from those expected.
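To make “working off keywords” concrete, here is a toy sketch – my own illustration, not anything Cowan showed, and the keywords and replies are entirely made up. The machine matches a small fixed vocabulary, so it is the human who has to align with the machine’s words, not the other way round:

```python
# A toy keyword-driven "assistant": it matches fixed keywords rather than
# understanding natural language, so the burden of alignment falls on the human.

RESPONSES = {
    "weather": "Today's forecast: rain, obviously (this is Edinburgh).",
    "time": "It is ten past something.",
    "music": "Playing something you probably didn't ask for.",
}

def respond(utterance: str) -> str:
    """Return the canned response for the first known keyword, else give up."""
    for word in utterance.lower().split():
        if word in RESPONSES:
            return RESPONSES[word]
    return "Sorry, I didn't understand that."  # no keyword matched

# This works only because the sentence happens to contain the keyword...
print(respond("What's the weather like?"))
# ...while a perfectly natural rephrasing of the same question fails.
print(respond("Will I need an umbrella?"))
```

Ask it about the weather using the word “weather” and it answers; ask the same question as “Will I need an umbrella?” and it fails – exactly the vocabulary-alignment Cowan described.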

When we speak, whether to another person or using voice interaction with a computer, though, the modelling would be internal – subconscious – rather than deliberate.

I was surprised to learn that Siri has a single, masculine voice in the UK (apparently with an American accent). In the USA, Siri seems to have a feminine voice (which can be changed). Presumably its implementation in other languages takes on different voices or accents. Perhaps in the future we will be able to programme computerised voices, as some people do with satnav, which I am sure would go some distance to overcoming the uncanny valley – though it may raise other issues (who owns the sound of their voice? What if one decides to use an ex’s voice? …And so on).

Still, it would appear that Siri has a sense of humour…

Which I think is where I came in…

Edit: I have been reliably informed that in the UK, Siri has a British accent. Chris says: “UK Siri [is] decidedly British; he sounds like a sarcastic airline pilot.”

Some Thoughts About Unconferences…

I spent several hours during the summer at two conferences – both very interesting, and on different topics, but in the same building, which has a classical lecture theatre with a raked auditorium; both conferences featured talks followed by short Q&A. This seemed somewhat ironic, given that both were about change (one technological, the other political): the format of these conferences was one that had remained constant for decades, with one person lecturing to the audience, who sat and listened.

I mentioned to someone in the coffee break how it might have been beneficial to have more participation and engagement – after a day and a half in the lecture theatre, I certainly needed it – and I started to talk about the unconference format. My friend hadn’t come across the idea of an unconference before, and as I tried to explain it, I thought I must have blogged about it before and told her I’d send a link.


It turns out I was wrong. I have written about several unconferences before – BarCamp and another BarCamp, ConnectingHR and another ConnectingHR, TweetCamp, and BarCampBank, for instance – but I have only obliquely written about unconferences per se.

This post aims to put that right. (Also, I have just read Lloyd’s post asking what people would like an unconference about.)

Unconferences sprang out of the experience that many conference goers have – that the real value of some conferences comes from the conversations over coffee and lunch rather than the lectures themselves. Lectures didn’t engage and could inhibit discussion – one person standing at the front of a room of peers, holding forth.

The first unconference-like events I attended were corporate events to aid organisational change programmes, using open space technology. (The “technology” here refers to the process – just as a stone can be technology, and a pencil can be technology. Some of the best open space events I have been to have been the most lo-tech; and the best technology talk I have been to used no technological aids at all!) The ground rules for open space events are (as far as I remember them…):

  • it starts when it starts
  • the people who are there are the right people to be there
  • everyone has to be prepared to contribute to the discussion
  • when it’s over, it’s over (so don’t keep going over the same issue if you have sorted it out)
  • “the rule of two feet” – it is fine to get up and walk away, either because you want to hear another talk or because you feel you have nothing to add to the one you’re at

The same rules apply for unconferences, and they have some very profound effects. They are very empowering: there is no one controlling the discussion. The agenda is decided by the participants – an unconference is “self-organised”. If there’s nothing on the agenda that interests you, do something about it! Start your own session – even if it is “Nothing else interests me – what should we talk about?” You don’t need to ask anyone for permission. You are only there because you want to be.


There is a paradox, though: unconferences happen because someone – the lead-organiser, perhaps – has decided there is something worth discussing – a central topic. (This is not the case for BarCamps.) So there is an element of curation. And self-organised doesn’t mean disorganised: there is usually a lot of pre-event organisation which, I imagine, takes a lot of work. Someone has to find the space, find sponsorship, sort out catering, and so on. (I am sure it would be possible to do without some or all of these things, but not for an unconference of any size.)

But the day itself is self-organised. That agenda – it is decided by participants offering to run sessions, slotted into a skeleton timetable. And there can be “wash-up” sessions, so people can share anything they have learned.

There is one last common feature of open space events that is not necessarily found in unconferences. Open space events are often focused on action and change, and one of the outcomes is therefore a list of actions. An unconference may produce an agenda for change, but it is often more personal change rather than organisational – people taking away their own change.

Because of the participation and the large amount of sharing of ideas, unconferences can be very energising – and exhausting! I believe they work best when they attract people from many disciplines who come together to explore their ideas. If you want to learn how to do plumbing, don’t go to an unconference; but if you want to explore what different things you might be able to do with pipes – well, that sounds about perfect… An unconference can be very liberating!

(Here are a couple of posts I came across recently on others’ experiences at unconferences – one about a ConnectingHR event I wasn’t at, the other about a pair of events. It isn’t just me who finds these things empowering, energising and liberating!)

Everything in the Garden…

When I was thinking about how the online world might change in the future, I suggested there may be some pressure for the resurrection of “walled gardens” – specialist areas of the internet where interactions happen behind password-protected access.


When I first started using the internet, back in 1995, this was common. I signed up with Compuserve (which I am amazed to see still exists!), and there were specific areas of interest curated by the organisation; to access the internet proper, you had to click an icon to leave the walled garden. There were other, similar services, too – AOL, for instance.

Accessible browsers like Internet Explorer and Netscape – and, later, Firefox, Chrome et al – freed up the internet and enabled non-technically savvy users (like me) to get around. The walled gardens more or less died: users didn’t like being walled in, and had no reason to be so.

My reasoning for thinking walled gardens might make a return was a commercial one: suppliers of content – like Facebook, Twitter, Apple or Google – want to keep you on their site as long as possible, to sell your eyeballs to advertisers and others willing to pay them. That’s how they make their money. Part of the bargain is that we get to look at the content they provide (albeit, if they’re Facebook or Twitter, that it is created by other users). In a more competitive environment, they will set up walls to keep you there.

(A corollary was that there would also be a move to more openness, driven by an increasing awareness and technical know-how.)

A couple of weeks ago I saw a demonstration of what this might be like, and I’ll admit I didn’t really get it. Kiltr is

the largest social media platform focused on connecting Scottish interests globally to create economic, cultural and social value for its members

and, believing that the only way to understand new networks is to play around with them, I signed up to join last September.

I made some connections, mostly with people I follow on other networks, looked at some brands (most of which I follow on other networks…), and I don’t think I have been back in the last three months (until just now!). The reason for my resistance is that the “exclusive” nature of Kiltr doesn’t really make sense to me: if I want to share stuff, I want to share it with my friends and contacts pretty much anywhere – many of whom may be on Kiltr, but most are not.

A couple of guys from Kiltr were demonstrating their new platform at the University business school’s Entrepreneurship Club; they are rolling it out in June, I think. It was very, very snazzy – a whole world away from the current experience. But I am not sure that an excellent new interface will make a difference to me. I think that Flipboard is an excellent app – it looks great, works very well, does what I might want. But I hardly ever use it: however good it looks, it didn’t actually add anything to my use of social media.

(They will be developing the interface for other enterprises as part of their business model.)

Similarly, I am not sure what more I will get from Kiltr that I can’t get from my existing social networks. To connect with the people and content I want, I would need to go to Kiltr in addition to the other networks. I assume that I can connect to the same people and brands elsewhere that I can on Kiltr – all it gives me, aside from a superb experience after the relaunch, is the walled garden of Scottishness.

I can see what’s in it for brands and advertisers – they get access to consumers or businesses with less noise; there may be fewer eyeballs, but there will be more attention.

But for individuals, I’m not sure.

It may just be me. There are other networks I’m a member of which I rarely if ever visit. The ConnectingHR network (developed using Ning, I think) is full of interesting, like-minded people – most of whom I connect with in other ways, mostly Twitter. It is so long since I logged into the site that I have no idea what my password is; but I engaged with other members on Twitter just this morning.

I am not sure what it would take for me to use these walled gardens more effectively – what would make the loss of “shareability” with those not on a particular network worth paying.

I will certainly give Kiltr’s new site a go when it is launched in June – but I am doubtful it will be enough.

Hans Rosling on “The Big Picture”

My first event in this year’s Edinburgh International Science Festival was to hear Hans Rosling give a statistics lecture.

This wasn’t the typical kind of statistics lecture; I reckon I have had at least four stats courses over the years, and whilst I know enough to know what to do (or where to find out what to do), I think it is fair to say that I don’t really get statistics. I can do it, but it never really makes sense. And all those stats lectures were dull, dull, dull, and dry.

This one was different. Not talking so much about stats as our ignorance of stats, and largely based on data rather than significance tests, Rosling was as much entertainer as statistician. (I think one of his slides described him as an “edutainer”.)

He was talking about how numbers can be used to describe the world – not to the exclusion of other inputs, but to produce a rounded picture.

The bulk of his talk was about population growth and world poverty, and the causes of change in these global phenomena – largely economics. In between, he told stories of his life amongst the numbers, when to trust them and when not. (“Not” seemed to be mostly when you don’t actually have the data – he highlighted how wrong our assumptions about the world can be.)

Rather than try to reproduce what he said (without the laughs), here are some of his TED talks covering similar issues…

…on stats

…on poverty

…on population growth

All the data and the manipulations he used can be viewed on Gapminder, where one can play around with the data and visualisation. A great way to while away the Easter break…

“More Like People”?

My working life has been spent with organisations, in one way or another. (And of course my life before that: schools and universities are organisations too…) I love exploring the way organisations work – what makes them tick. That is why, believe it or not, I loved auditing: auditors dig into organisations, discovering the real processes and structures that enable them to function. (Clue: it isn’t what managers tell you. And it doesn’t have anything to do with shareholders!)

When talking about organisations – something I do often – I repeatedly find myself describing them as dysfunctional. I don’t think that I have come across or worked in an organisation that couldn’t work better in one way or another, from multinational banks to small, two-man operations. I have long wondered why this is. It isn’t that people in the organisation don’t know this: one thing consultants learn very quickly is that what they tell their clients is very rarely news: organisations know what’s wrong, even if they need someone from outside to help them articulate it.

Their processes could be better, their communications could (almost always) be improved, their structures changed to help the business. Hierarchy and structures get in the way rather than enable, and people in organisations know the workarounds – big and small – to get things done.

(A caveat: “could be better” is a value statement: the corollary has to be “better for whom?” Customers? Employees? Managers? Owners? The wider population? The environment? These groups may not be exclusive, but better for one may very well not be better for all.)

Organisations could be – well, better organised. They are dysfunctional.

I have only one answer. Organisations are made up of people, not processes; people make the organisation work. And people are dysfunctional.

Despite the idea that organisations are separate from people, it is people that are the organisation. We pretend they aren’t. We even pretend that organisations are people!

The thing is that whilst some organisations behave as if they were psychotic, most large organisations’ dysfunctionality works in peculiarly non-human ways. (Small organisations’ dysfunctionality is just like the people behind the organisation!) The veil of incorporation lets everyone in an organisation hide behind the processes, hierarchy and bureaucracy that let the organisation continue to believe it is “rational”.

Liam Barrington-Bush started a campaign to counter this and humanise organisations, “#morelikepeople“, and he’s developed some of his ideas into a book, “Anarchists in the Boardroom“. (I should declare an interest: I’ve known Liam for quite a while, we’ve discussed his ideas many times, I was involved in focus groups around his book, and I read early drafts of a couple of chapters; he and I agree on much, and probably disagree on more!)

Liam’s focus is on not-for-profits and social enterprises, but I think his ideas are relevant to all organisations. Broadly, Liam reckons (amongst other things) that new media – particularly social media – can act as a counter to the rigid hierarchies and management processes that twentieth century industrialisation created. This is a topic that has interested me for a long while – Benjamin Ellis covered it particularly well in a one day conference at Cass Business School three years ago.

Using collaborative tools to develop self-organising, flatter structures would clearly have an impact on the nature of work and business; if large organisations were able to embrace them, they might become flexible and responsive.

More likely, I feel, is that small organisations – already more flexible than large ones, and often unencumbered by rigid structures and processes – will adapt faster to social media, perhaps becoming more openly networked rather than hierarchical.

(Liam is using crowd sourcing to publish his book – itself an interesting example of the changing nature of business in a new, social and collaborative world; he is still looking for supporters.)

Trends About Trends

William Nelson and Richard Hepburn explored some long term trends in the UK – reassessing them and exploring new qualitative techniques such as crowd sourcing. Such trends have an impact on economics and government policy, as well as fundamentally affecting the way we live our lives (ten years ago I would never have guessed the impact carrying a mobile phone would have on my behaviour!).

The themes they identified were

  • changing structure of households (what Nelson called “home-alone v ‘all together now'”): there have been increases in young people staying at home, people living by themselves, couples cohabiting, and young people sharing till later in their lives. The current state of the economy and the jobs market is driving a lot of this as young people stay at home – or have to – because they can’t afford a place of their own (apparently leading to an increase in squatting in London and some novel approaches to communal living and working elsewhere), but of course it also has economic impacts. Immigration and demographics (which Nelson also covered) will have an effect, too.
  • “smart v connected”: drawing on “the internet of things” – the ability to give any object its own internet identifier – Nelson argued against the need for “smart objects” (all those food-ordering fridges PR-savvy white goods manufacturers say we’ll be buying) but reckoned our homes would become more connected – but under our control. He foresees us using our mobile phones as universal controllers, switching on heating, lights and cookers remotely. As technology converged, he also believed that it would be gas or electricity companies who would own the interface, not the telecoms or media companies that currently own our broadband connections, prompting competition for control of our homes: remote controlled central heating might be the killer app. (Maybe Sky will buy British Gas?)
  • social networking to networked socialising: we’ve been living in a technology-mediated networked society since the advent of the telephone in the early 20th century, but we’re increasingly connected. The ability to carry the internet in our pockets has changed the way we behave. Whilst our lives might be more and more busy, we’re also procrastinating more: we might arrange to meet people, but the details – where, when, what – are more flexible and subject to change: we are less willing to commit to a fixed schedule, with frequent and repeat rescheduling. People are more willing to take the best offer that comes along (apparently 40,000 people are stood up every day!). A lot of this happens on mobiles – people are checking what’s on and booking more last minute tickets, which affects artists’ and venues’ planning and pricing strategies. There is also an increase in “leisure as performance” – people tweeting or Facebooking (is that a verb? I guess so…) photos of themselves at events – the ease of one-to-many communications is turning us into a nation of show-offs – and sharing information about our plans to go to events becomes a currency. Interestingly, one doesn’t actually need to go to the event – you can share the information that, for instance, you’ve got a ticket for the Olympics (posting the details and a photo of the ticket, perhaps) before selling it on. Data about the event can be more valuable than the event itself.

    (It also means we are under self-imposed scrutiny: the more we share online, the more we are building the panopticon… And I am shocked that there is a data analysis firm called Panopticon. Maybe we get the future we deserve.)

  • the gender revolution finally happens: decades after the 1960s, Nelson reckoned that changes in gender relations have now become so normal as to cease to be newsworthy – and when things get boring, change has happened. (I know many feminists who may disagree with this; please don’t blame me for sharing his views with you!) There are now more female graduates than male, and they get better degrees; they’re also better at getting jobs than male graduates. Nelson said that women aged 20-29 now have higher hourly wages than men (I have searched the ONS website, which is full of fascinating data, but I can’t find figures split by gender and age, so I’ll just have to take his word for it!). As women become more equal to men, they are becoming less equal to each other: there are growing disparities between women. And whilst hourly pay might have moved in their favour, women still spend more time on housework (in the US) and are the prime providers of childcare. It’ll be interesting to see if those roles change with women having the higher earning potential.

    There may also be pressure on employers to change their models of employment (strongly rooted in the early 20th century?) to cope with highly qualified, high earning women who want to fit in childcare and their home life, too: this might add pressure to develop more flexible models of employment.

  • ageing population: the “demographic timebomb” has almost become a cliche, but it remains important, affecting policy and opinions for decades. 2012 sees a spike of people reaching 65 – the result of a mini baby boom in 1946 and 1947 as soldiers returned from the war. Since Britain didn’t really recover economically for another decade or so – it was in 1957 that Macmillan asserted “you’ve never had it so good” – it won’t be until the 2020s that the wave of over-65s resulting from the 1950s baby boom reaches 65. The ONS predicts that the proportion of over-60s will continue to grow whilst the proportion of under-14s is static and the proportion of those aged 15-59 decreases – hence worries of a decreasing working population having to support an increasing number of the old.

    None of this is news – the “demographic timebomb” has been written about for decades. But by looking at the detail, we can plan and change – both public policy and our personal choices. For instance, Willie pointed out that the market for Saga will grow by 7% pa (I think – I didn’t write the figure down!), without the company doing anything at all. The effect of demographics on policy – the provision of health care, pensions and social care for the elderly, for instance, as well as, indirectly, transport, housing and industrial policies – and of course on the economy will be felt for decades.

  • “the youth of today”
    It was in his discussion of youth that Nelson really challenged our assumptions. The young are not hoodie-wearing rioters drunkenly threatening passers-by: Nelson gave figures from the UK for reducing youth crime, decreasing youth drug and alcohol use and a decreasing teenage pregnancy rate – not the stuff of tabloid headlines. At the same time, parents are being more protective of their children – driving them to school and managing their leisure time (back to the panopticon there…) – in part driven by a culture of fear: children are taught about “stranger danger” when other risks may be more relevant. What effect will “paranoid parenting” have on future generations? Will they learn to assess risk if protected during childhood – surely a key part of growing up? And what effect will such cosseting have on our children’s future health?

These are just some of the trends that Nelson has worked on; perhaps most interesting is where they intersect: for instance, the effect of the changing nature of networked socialising as the population ages; or the changing form of households when examined through the lens of changing, less rebellious youth; or the impact of changing economic power of (some) women on household structures and the balance between generations.

“The Future Is Already Here”: looking at future trends

July saw the Edinburgh University Business School alumni conference, the topic this year being futurology and “Trends”.

It started off with a panel discussion on “Futurology”, in which all three speakers were very careful not to talk about the future. They did though have much to say that was interesting about the past and present, and what that might mean for the future…

Murray Calder told a great story of American Scotch whisky salesmen asking why it wasn’t possible to simply make some more… of a 25 year old malt! Coming from an industry where current possibilities are clearly shaped by decisions made years – decades – before, Murray described the future as a range of outcomes and opportunities – I pictured the diagrams used by the Bank of England to describe possible inflation rates.


From www.clearonmoney.com

Looked at this way, the future is necessarily contingent – possibilities looking like quantum maps rather than binary outcomes. And as we are all too frequently reminded, the past is no predictor of future performance (though it may be the best model we have!). Murray described the future in Darwinian terms: those best adapted to change are likely to survive.

Alan Fowler similarly viewed the future as a range of options – but options which we can shape. By taking control and envisaging the future we wanted to create, he proposed “backcasting” to work out how to get there. For organisations, tying in strategy and workstreams to the hoped-for future could lead to big rewards; it also needed constant re-planning, since of course the starting point will have changed the moment one finishes planning. (He expanded this in his talk later in the day – the Isochron website outlines their approach to change management.)

William Nelson also concentrated on clients’ strategic objectives. Starting from a quantitative perspective, his clients want – need – narratives that they can work with: stories that can help define their future. (Francesca Elston recently wrote eloquently on the power of stories to shape our thoughts.) The need to create compelling narrative to help effect change is a powerful story of its own. Interestingly, Nelson also said that PR people and journalists need facts rather than the general story: they want solid information, despite there being a lot of evidence that journalists don’t know what to do with data.

They were each asked to identify the top trends they saw emerging; all picked some variation on mobile technology and social interaction. For Murray, “social” was nothing new, but we had access to new tools to accomplish these interactions; as mobile devices become ubiquitous, access to information and networks changes the way we behave in social situations, on- and offline. (We must all have seen people sitting in a bar with their friends – all interacting with people elsewhere through their mobile smartphones.) Google, Wikipedia and IMDB have killed many old-form pub conversations…

Willie reckoned that smartphones have the capacity to become universal controls in our homes, allowing us to interact with otherwise “dumb” tools like central heating through an interactive hub. His really interesting take on this was that it would probably be a utility firm like British Gas which would get first mover advantage on this, not a technology giant or an ISP. Technology firms have been predicting the “internet of things” for a long while – Microsoft thinks we’ll have thinking mobiles, 3d screens and wholly integrated lives whilst Ericsson foresees sentient hoovers, bickering cookers and interactive tv – all vying for our attention – and they may be right (at least for a tiny percentage at the top of the ladder); but I agree with Willie that using our phones to switch on the heating when we’re on the bus home or switching on lights at home when we’re away on holiday seems a much more likely future for most.

Willie’s other key trend was the changing structure of families – as cohabitation and divorce become more common, different ways of organising in social structures may emerge. (I have friends who talk about “their family of choice” – albeit that’s what I think I call friends…) As people marry or settle down later (if at all), and the economy continues its sideways slide, different ways of organising homes might increase – such as “co-living” or (for the less well-off?) squatting (the latter especially if the sympathy for the Occupy movement and the opprobrium heaped on rich bankers continue). Willie looked at both these trends in more detail in a later session.

Alan’s take was somewhat different: he saw mobile technology disrupting as well as strengthening social interactions, to the extent that it could damage community. (At this point all those who see “online community” as the future will be throwing their arms up in disgust.) He definitely saw technology increasing the gap between the “haves” and the “have-nots” – the government certainly sees access to technology as a driver of economic growth – and those without access by economic situation, location or choice are at risk of getting left behind. Alan foresaw an increasing gap between those at the front edge of technological innovation and those at the back.

The questions raised many more issues, such as

  • is democracy hard wired to think short term only – politicians rarely have a horizon beyond the next election (although Alan pointed out that in his experience working with the public sector, ministers and their civil servants are often involved in considering the impact of policies and planning for decades ahead)
  • envisioning the future often leads to its crystallisation – we create our own futures (the point of Alan’s session in the afternoon – a way of making that happen)
  • technological change prompts behavioural change – but behavioural change takes a lot of time, and may happen in ways that are not foreseen
  • “the future” needs defining – it starts now; unless something catastrophic happens tomorrow will be much like today for nearly all of us – so it makes sense to keep doing what works today (though continuing to be adaptable to change)
  • large organisations don’t seem adept at managing the unforeseen – catastrophic outages at RBS in June and O2 in July were surprising in that backup plans didn’t seem to work

The main takeaway message was never to make predictions – because you’ll be wrong – but that coming at trends laterally and challenging the assumptions may produce some interesting ideas.

I went to three other sessions: one each by Alan and Willie, and another by a digital agency, entitled “Digital Futures: Trends in Social Media”. They did discuss trends in social media, but they didn’t discuss the future at all: everything covered already existed, even if some of the content may have been new to some attendees. William Gibson is quoted as saying “The future is already here — it’s just not very evenly distributed” – maybe it’s just difficult picking out which future will actually come to be.