thoughts and personal opinions about the great wide world of being a consumer, being consumed with things, and dealing with customer service both bad and good.

Thursday, 15 January 2015

and the winners are... - not really a post

CBC announces the top marketing blunders of 2014, and in the video segment (sadly not in the printed version), Target Canada is called out for failing, apologizing for the failure, and then not actually doing anything to fix it (at about 1:42). Plus some other doozies.

And we'll start the year out with a few more.

Target is out. Turns out treating Canada like a podunk backwater, instead of the geographically challenging but sophisticated market it is, didn't go well. I really thought they'd stick it out, which proves I really don't understand these mass-market general merchandise retailers.

Also Tilley. Not out, but who knows what this will become. While I'm definitely not in their target market, it's always tough to see a great Canadian success story come to some sort of end.

Saturday, 3 January 2015
"and a little of that human touch" (springsteen lyrics)
It's 2015. We're living in the future of the movies of (some of) our youths: Back to the Future was set in 2015, and Blade Runner's date is rounding the corner. So while we collectively look to see what fiction imagined, we find ourselves living with innovations that weren't imagined, or perhaps weren't imagined the way they actually manifested. We don't have driverless cars, but cars that park themselves are here, and Unattended Train Operation (driverless) transit is already in use.
So while technology continues forward, the most sophisticated algorithm in the world often still doesn't quite get it. I'm always awed that the PC Plus program has a pretty good idea of when I need more popcorn, a product I don't consume on any cyclical basis, but there are times when a little decorum and humanity are needed.
The recent episode with Uber comes to mind. In the middle of the hostage taking in Sydney, Australia, Uber's booking system noticed an unexpected uptick in cab usage – you know, from people running for their lives – and the system's usage metrics kicked in and started upping the fares. An innocent enough reaction when the circumstances fall within the normal range, but clearly a case where a touch of good old human metrics would have 'got it'. Similarly, though not quite so crudely, this year's Facebook 'year in review' feature took the posts with the most Likes and used them to create a snapshot of a person's year. The part the coders failed to account for is that sometimes people - myself included - will Like a post that isn't positive at all, as a show of support, say when a goal isn't reached or a loved one passes on.
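To make the gap concrete, here's a rough sketch in Python - very much my own illustration, not Uber's actual code - of demand-based pricing, with a made-up guard that holds fares steady when demand lands far outside the normal range. Every threshold here is invented for the sake of the example.

```python
# A minimal sketch of demand-based surge pricing - an illustration, not
# Uber's actual system. The multiplier climbs with the ratio of ride requests
# to available drivers; the ANOMALY_RATIO guard is a hypothetical "human touch"
# that holds fares at the base rate when demand spikes far outside the norm.

NORMAL_RATIO = 1.5      # assumed typical requests-per-driver ratio
MAX_MULTIPLIER = 3.0    # assumed cap on surge pricing
ANOMALY_RATIO = 6.0     # assumed point where a person should take a look

def surge_multiplier(requests: int, drivers: int) -> float:
    ratio = requests / max(drivers, 1)
    if ratio >= ANOMALY_RATIO:
        # Hold fares at the base rate; a spike this extreme (people fleeing
        # an emergency, say) is a decision for a person, not a pricing model.
        return 1.0
    return min(max(ratio / NORMAL_RATIO, 1.0), MAX_MULTIPLIER)

print(surge_multiplier(requests=60, drivers=20))    # busy Saturday night: 2.0x
print(surge_multiplier(requests=200, drivers=25))   # extreme spike: held at 1.0x
```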
I consider myself an amateur enthusiast of analytics, and the part I find most interesting is the storytelling side of the analysis, but in both telling and creating stories, options have to be considered. We all know computer code doesn't do well with grey concepts. And truthfully, it's almost impossible to consider every angle - a human fault - but consideration of how people actually act and use features should be part of the story here. The Uber example was almost impossible to predict; the idea was to increase cab fees during peak demand times, like the end of a Saturday night, after a major sporting event, or, more vindictively, during a transit shutdown or a major storm. A concept not loved, but understood by anyone who has tried to rent a car over a holiday or book a hotel room for a major event. The Facebook situation, though, isn't that far of a reach. Yes, the post is editable, but the articles I found mentioned instances where a loved one who had passed away was featured in the auto-generated post, which was understandably startling to those users.
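The Facebook side of the gap can be sketched the same way - again an illustration, not Facebook's actual feature. The sentiment score below is a made-up stand-in for whatever signal would tell the system that a heavily Liked post is a condolence rather than a celebration.

```python
# A minimal sketch of a "most Liked posts" year-in-review - an illustration,
# not Facebook's actual feature. The sentiment field is an assumed score from
# some classifier; the filter stands in for the missing consideration that a
# Like is sometimes support for sad news, not a celebration.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    sentiment: float  # assumed: -1.0 (negative) .. +1.0 (positive)

def year_in_review(posts, top_n=5):
    # Naive version: simply the most Liked posts of the year.
    most_liked = sorted(posts, key=lambda p: p.likes, reverse=True)[:top_n]
    # With the missing consideration: keep only the celebratory ones.
    return [p for p in most_liked if p.sentiment >= 0]

posts = [
    Post("We got the keys to the new house!", likes=80, sentiment=0.9),
    Post("Dad passed away last night.", likes=120, sentiment=-0.95),
]
print([p.text for p in year_in_review(posts)])
# -> ['We got the keys to the new house!']
```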
Most of these systems are pretty ubiquitous, like the popcorn, or the Amazon recommendations, or the fact that certain government bodies might be watching my consumption habits to stay keyed in to my potentially devious plots… unfortunately, what they'd find out is that I buy a lot of makeup and eat a decent amount of pizza. Revelatory, I know!
I had drafted a tersely written, complain-y description of the automated job search filters currently in use. Needless to say, since I'm actively looking for work, a bit of human review might make the entire process just a little more fruitful, for both parties.
Similarly, trying to track down the answer to a question on a government form used to be a tedious task of being puzzled, reading complicated language on a website, and sometimes, in a fit of exasperation, calling to listen to the options on an automated phone system. My last few interactions with government, though, were very human indeed: I was assigned a person who handled my case, answered my questions, and whom I talked to each time I needed information. And believe me when I say, it's rather unsettling to get straightforward, logical answers from your government.
As we mature into this technology - and we all use it: algorithms suggest what we might like on shopping sites, on movie sites, and when we're browsing for the next thing to read - perhaps we, the royal we, will get better at considering the situations that don't fit the norm and require intervention. Thinking in evolutionary terms, we humans may be the weak link, but I think we're still best equipped; after all, our grey matter considers all the colours.