Friday, 16 January 2015

The rotten apple.


Steve Jobs is dead. My best wishes go to his family and close friends - who are no doubt experiencing the pain and loss that go with losing a loved one, especially when they die before old age. He was a father, a husband, a son. RIP.

Now. I am not his wife, his daughter, his mother, not even a remote acquaintance of his. Yet, much as I have my daughter's smiling face on my desktop, every time I access the internet I am faced with this.
Steve Jobs was a man. He made computers. Some of the computers he and his company made were different. That is it.

Like Diana (who was a woman who married the man who was the son of a Queen), in death he is being revered in a quasi-religious style: confirming the suspicion that Apple is more of a cult than a company. Cults are dangerous.

As a User Experience Designer, I know many will try to argue with me about the 'good work' that Apple has done. Largely this will centre around the (now decades-old) innovation of the Graphical User Interface (GUI) that used human-oriented visual metaphors to represent the environment of a computer (desktop, folder, file, document etc.) so that people could more easily interact with, interpret and manipulate their data. Great. But innovations happen all the time. No one cried (beyond the appropriate close family and friends) when the inventor of the portable defibrillator died (http://en.wikipedia.org/wiki/Frank_Pantridge) - yet he invented something that had a dramatic impact on the very existence or not of many, many people.

But then of course, Jobs has done more than that. He - through his leadership of the Apple corporation - has created personal devices that entertain us. He has enabled us to stay in contact with other people just like us across the world, and to shut out the existence of the less pleasant that might physically surround us. We can now help anyone, anywhere instantly just by spreading information, ensuring that thousands of pounds of disposable income each year are neither wasted on supporting our local businesses nor available to charity.

He has created a global unification of the upper-middle classes. A joint aspiration to have the latest version of portable consumer electronics. This ensures that there is a constant demand for manufactured goods with a sales model that encourages the disposal of these items. I'm sure I could go on.

But the main argument is often that 'Apple is better because it's easy to use'. And yes, it is. But does this mean that it is better?

Cigarettes are quite easy to use. They come in handy packets of 10 or 20, with enticing packaging and, until recently, seductive marketing campaigns. And they are addictive. Once you start using them, they are always on your mind - you want them all the time; even the idea, the haptic sensation of opening the packet, the feel of the slim tube between the fingers - you'll *need* to have one as soon as possible.

Now consider the analogy, but not in a knee-jerk way. Just because something is easy to use does not make it inherently good. Nor does it make it bad. It just makes it easier. Yes, Apple's products have been easy to use, they have been fun, they have been revolutionary. But so are cigarettes.

Apple's products are toys at best. Made for generations of self-interested kidults with short attention spans.

Friday, 25 February 2011

Mitigate and move on...

Some of you SIG-IA subscribers might be familiar with this, but recently I've been trying to resolve a rather sticky issue with an ecom site: a seemingly critical piece of the ecom journey was taken away from the UX - descoped - during the development cycle.

Solving this challenge brought to light several issues that just keep coming up in discussion at the moment within the UX community:

1 - Deliverables. Over on IXDA there's a thread about a pattern library that covers the same ground: what should come out of IA/UX, and who is it for? Part of this situation with the cart arose because IA specs defined a desirable user experience, but the detail of how this should be implemented was either unresolved or lost in a sea of other specification. When it came to the dev sprints (completed by a 3rd-party supplier), PDF page designs were used as the 'specification' and reinterpreted, and from the stories that emerged it became clear that the feasibility of some core functionality was very low.
IAs make nothing - we research, conceive, communicate. If our deliverables are not effective, then we aren't fulfilling our role. It's more than just picking a software package, throwing it all in and hitting 'generate'. Our deliverables should meet the same high standards of being 'user-or-usage-oriented' as our solutions themselves.

2 - Process. Which brings us to the Agile issue. Devs work in sprints. The site owners have projects and programs and goals. Design and IA are done at a site-section level (home page, checkout etc.). By the time a sprint can be undertaken, there are already many dependencies in the designs of each feature. User stories get retrofitted to completed designs, which get built on assumptions about tech feasibility, because feasibility gets assessed only after the user stories. It's messy. What ought to happen (in an ideal world without clients, budgets and time-zones) is that a high-level IA/UX is worked out and assessed for feasibility by all involved. Then design and dev can work in sprints on features, slowly filling in the big picture. The challenge for the IA/UX here is keeping a hold on the current state of the big picture so that coherence, user flows and consistency can be maintained - any hints!?

3 - The art of mitigation. As an IA/UX, I don't actually build anything. As such, I can't control many of the factors that affect the outcome. My job is to mitigate against and for those factors, and the value of that role has been made very clear in this circumstance. While the debate continues to rage every so often about 'who needs UX anyway' (no doubt fuelled by the massive increase in demand and the dubious skills of many presenting themselves for roles), this situation desperately needed UX. Dev couldn't have solved it; dev could only sticky-plaster over the individual effects of the descope as they became apparent. Design couldn't have fixed it; they needed to resolve the visual display issues that arose, but that wouldn't have resolved the system issues on which those displays were dependent (such as the persistence of cookies).

Autocratic or Anarchic ecosystem

We couldn't change the way it was getting built. We couldn't change the available interface elements. But we could mitigate the effects of those for the user through careful use of visual concepts (such as placement, prominence, change in state) and language to manage the expectations of the user so that they were met by the experience on offer. I get nervous when I see IA/UX specs that say 'things should always be like this'. This kind of 'benign dictator' approach is at best over-optimistic and at worst damaging to the user's experience.

Apparently, a guy called Ammon Hennacy, who was an anarchist of sorts, once said of rules:
"The good people don't need 'em and the bad people don't pay attention to them, so what use are they anyway?"
He kind of has a point. If all we do as IAs is set out hard and fast, unbending rules for others to follow, then we're in danger of becoming more of an obstacle to a good UX than a creator of one. If we take care to understand the issues of the products we're specifying, in the same way a traditional architect understands steel, plumbing, soil erosion, and respects and seeks the opinions of the experts in these issues, then we can provide guidance to the people that actually do the designing, the development, the marketing, the owning of the products, and advice when things go wrong.

Over and Out.

Monday, 6 September 2010

Once a UX, always a UX - the human in the machine

Right now there is a cleverly orchestrated spat going on in the blogosphere about whether the role of user experience designer is really necessary, or even if it exists. Sparked (I believe) by a blog post from Ryan Carson (http://thinkvitamin.com/opinion/ux-professional-isnt-a-real-job/) and countered by Andy Budd (http://www.andybudd.com/archives/2010/09/why_i_think_rya/index.php) as well as a flurry of tweets, essentially the argument goes like this:

1. User Experience is just a product of code+colouring in
2. User Experience is easily addressed by a diligent developer+designer
3. User Experience therefore doesn't exist.

It's obviously a no-brainer that I'm going to disagree. I've been working on the 'human factors' side of the internet for 10 years or so (I remember the days when I was keen to stress how long; now it just betrays my age!). But what has sucked me into this argument is not a need to add my voice to the yeas, but the fact that so far the reasoning is all wrong.

UX is a valid 'profession' because...


1 - Designers and Developers are professionals, and should necessarily be immersed in the business of producing the best code / visual experience possible.
2 - Users are not a definable or static entity. So the argument that you can learn to do 'good UX' while billing all your time for dev or design is impossible. The point of having a UX onboard is to assess the current context of the user experience and discover the particular requirements. If you pay a UX to come in and just tell you what their last client's users needed, you're not getting good value.
3 - Jack of all trades is master of none. As a UX pro, I like to keep up to date with development, hardware and tech, design styles etc. so that I can have useful conversations with these fields and anticipate issues. But I don't code and I don't draw (apologies to the design folk, I know I'm being reductive), and if I were to spend all my time learning how to and staying up to date then I would let my specialism slip. I have suspicions about any pro who claims to be such a polymath that they're master of all.

Size does matter

I've been paid by tiny clients to work on 5-page brochureware before. And I've tried to persuade them not to. In this case, there is a big argument for sharing best practice and letting the team get on with it themselves. So I'd focus on improving that client's process: best practice libraries, reusable questionnaires to elicit needs, overarching principles etc. But this is not where I earn my bread and butter.

My road to UX started at Ofsted, a high-profile UK government department responsible for the inspection of (then) schools, educational institutions and childcare across England and Wales. These reports were published on a website. Internally, a staff of thousands spread across the country and at home struggled to keep in tune. Quite frankly, it was a mess. Yes, there were developers. Yes, there were contracted designers sometimes. But the UX niche was carved to create a backbone of rational response to human and organisational need, which could be delivered by these parties.

I've heard web development and communications teams around the globe sigh with relief at the emergence of the 'UX professional' as a resource - something that can deliver the high level strategic requirements and the detailed interface and structural designs to fill in the gaps of their existing team knowledge.

It's not who you are, it's where you come from?

OK, but my biggest gripe with the current debate is to do with the background of practitioners. Way back, in the glory days of the Sig-IA list, there was an almost continuous feeling of 'show and tell' as the community emerged from a wide variety of different backgrounds. At that time, 'Web Developer' was way down the list of likely backgrounds.

The list of skills in a UX role has grown alarmingly in recent years. Now, it is not unusual to see a job spec that expects candidates to develop and design interfaces. I'm not suggesting that proficiency in these areas should prohibit people from being a good UX, but there is a genuine worry that the closer the UX designer gets to the 'machine', the more sympathy they may have for it. When we are good at something, we find it hard to imagine that anyone else might struggle. I now use a lovely Mac on a fast broadband wifi connection. But it is nice to remind myself of what it's like using a 5-year-old Dell on dial-up every now and then. Some people still don't have iPhones you know.

If what you are after is a basic level of easiness in your web projects, fine: Read a few articles on A List Apart and get it done. But if you genuinely want to deliver digital services to a population, then get away from the keyboard, walk the street, and find someone to be the advocate of the technophobe.

Developers: give yourself credit. Development is important and highly complex. You don't need to hang a fashionable UX tag round your necks to feel necessary.

Thursday, 27 May 2010

CAB Direct is live



I'm proud of this baby; these battles were hard fought.

http://www.cabdirect.org/


Lovely database full of endless info and bibliographic references. Now all wrapped up with filters, promos, help and clear text.

Quite something for an academic site, I'll tell you :)

Hi Christina!

Friday, 12 March 2010

customer support on twitter (anecdotes)

Further to IBF's and others' musings on using Twitter as an aftersales/customer service channel, here are some anecdotal comments taken from the BNM (Brighton New Media) mailing list. The demographic of the list is naturally early-adopter-oriented, but these anecdotes demonstrate the comparative (or at least perceived) effectiveness of Twitter vs. telephone support:

For those with BT Broadband woes, myself (and others on this list I've
noticed) have recently been helped by the social media pixies on twitter,
@btcare.
A quick moan about BT services on there and they'll message you back and
seem to get the problem sorted quicker than through the usual channels.


I found the same with Quark - when I had moaned all morning on twitter about installation and wrong codes blah blah I had a message from a bod in California who had it sorted in an hour! Tech eventually got back to me 3 days later by conventional methods...

I get IntelliJ, Grails, and Tapestry support via twitter. All you have
to do is mention an issue you're facing and somebody (often the authors)
will jump at the chance to help.

Really useful!


Thanks BNMers (again).

Tuesday, 9 March 2010

Cutting out the middle-man(agement)

Following on from the IBF member meeting and the New Directions in Usability report (not just yet, out in March), here are some more examples from this month's Wired (UK) mag about how work is changing.

HubSpot - ditch staff holidays
Treating employees like contractors, leaving them to manage their own workflows and productivity with monitoring processes in place. Allowing out-of-office, flexible-hours work patterns.

BestBuy - Global Brand, local responsiveness
Uses Twitter sales support and customer feedback to provide local service tweaks (including an all-night NY store and a 'Coffee & Doughnut' session in Tampa). Local teams respond locally to demand.

LiveOps - enable networks of freelancers
Telephone support operators employed as home-based freelancers make up the staff for this "call centre in the sky".

Mosaic - "not an employer, but the hub of an entrepreneurial community"
Investment banking consultancy that 'cloud sources' its staff - creating an enabling framework (including Skype meetings, Yammer channels). Consultants run concurrent careers and projects; Mosaic keeps overheads low.

Cancer Research UK - transparent, relevant, responsive
Scientists directly communicate with and monitor contributions from donors - keeping them motivated and focused on the goal, and keeping donors (customers) in touch with developments and progress.

All nice examples, and a flavour of the way in which the world of work, business and employment is heading.

Wired Magazine UK - Work Smarter
Intranet Benchmarking Forum Member Meeting roundup (eek! with a vid of me!)
forthcoming report New Directions in Usability

Monday, 22 February 2010

Chaos and the art of poor presentation

Last week I presented a keynote to the February gathering of the UK's leading Intranet Managers at the IBF's Member Meeting.

And I was about as effective as a Llama on the London Underground.

Here are some of the feedback comments:
"too much was fit into one day and not enough exploration"

"slightly chaotic"

"maybe not forward thinking enough"


Why? It certainly wasn't the fault of the event organiser, the IBF (www.ibforum.com) who put on meetings like these year round and across the globe, sharing best practice and insights in a confidential, cooperative environment.

Nor was it the gathered crowd, who represented the cream of intranet professionals and practitioners and all arrived enthusiastic to engage and share with their peers in what, to me, was a uniquely collaborative environment.

For my part, public speaking is not an issue; I've delivered many well-received sessions on usability etc. and facilitated enough meetings.

So how did it all go so wrong?

Well, besides the over-exposure of being thrust into the attendees' faces all day long :-) I committed the first mortal sin of usability design. Oh the irony.

Whilst preaching about the importance of envisioning the context of content, users, and goals, I completely failed to apply the practice myself.

-The content was about sharing knowledge - not broadcasting information.
-The users were expecting to contribute - not to receive.
-The goal was to support members - not to provide for them.

Doh!

So apologies to all of the attendees - but also thanks, I could have sat there for four more days happily soaking up the knowledge and experience in that room. My failure is a credit to you all - never have I seen a room so full of cooperation between effectively competing organisations.

The IBF meeting is one example of the best way to improve user experience for all - not through monothetic blabbering by conference-circuit professionals, but by enabling the sharing of ideas and experiences between active practitioners like yourselves.

Bravo!