This is the text version of a talk written for BBC Radio 4’s Four Thought programme, first broadcast on October 23rd, 2013. It was recorded at Somerset House in front of a live audience with David Baddiel hosting. Huge thanks to Giles Edwards for the invitation to speak, and for help and advice. Thanks also to Bill Wasik’s excellent And Then There’s This for the account of Jonah Peretti’s work.
Empires of Attention
Thank you for inviting me to come and talk today, and in particular, I want to thank you all for your attention. Your attention is a very valuable thing, and to decide to spend it listening to this talk here today, or at home on the radio, or later online, is not an insignificant act.
I’ve worked in digital media and broadcast for over 15 years, and I’ve become obsessed with attention. My story tonight will be about this obsession, and what I’ve learnt about the way attention defines our culture.
Because how we understand audience attention – how we ask for it, measure it, and build business empires by selling access to it – is fundamental to our culture. For the last few hundred years, the business of culture has essentially been the business of measuring audiences’ attention. We can trace a line of entrepreneurs of attention from today’s culture backwards through the last two centuries – from Jonah Peretti, who has used his intimate knowledge of the patterns of digital attention to build The Huffington Post and Buzzfeed, two of the biggest news and culture sites on the web; through Arthur Nielsen, who invented the ratings technology that the US TV giants ABC, NBC and CBS were built on; to Charles Morton, who took the raucous entertainment of supper-clubs and taverns and developed the more mainstream and wildly popular Music Halls of Victorian England, from which came the talent that would dominate the early years of cinema and radio.
These entrepreneurs were not leaders, but listeners – their particular skill was in realising that audiences were consuming culture in new ways, finding new ways to measure these new patterns, and new ways to make money out of them. The story of these ‘empires of attention’ is the story of how we – the audience – have engaged with culture, and how the interaction between artists and audiences has moved from visceral participation to abstract measurement and back again. This story starts amidst the raucous popular culture of Victorian England.
Back in the 18th and 19th century, public entertainment was found in pleasure gardens, inns and taverns, with acts brought in to perform as the audience worked their way through dinners and rounds of drinks. ‘Song and Supper Rooms’ such as the Coal Hole, in The Strand, The Cyder Cellars in Maiden Lane or Paddy Green’s of Covent Garden offered food and drink till the small hours of the morning. London Gentlemen would go there to find foaming tankards of stout, a dinner and a cigar, all set to the warbling strains of the comic or sentimental vocalists attached to each establishment. The proprietor would act as ‘chairman’, leading the entertainment and calling on regulars to perform their favourite songs and comedy acts.
One regular act was the ‘judge and jury’ show, a thinly-veiled skit in the form of mock-trials of society scandals, with the audience and performers acting as barristers, jury and witnesses – in many ways the fore-runner of TV satire like That Was The Week That Was or Spitting Image.
The entertainment in these venues was a collaboration between the audience and the acts, with the line between the two often blurred by bonhomie and alcohol. The audience was noisy, and the acts used this noise as a feedback loop – a connection between performers and audience that created culture through call and response.
But this connection would start to fray as the 19th century went on. First, the Theatre Regulation Act of 1843 relaxed the rules on theatre ownership, but on the condition that no eating, drinking or smoking could happen during a performance. The law was vague and poorly enforced, but it led entrepreneurs like Charles Morton to start a new kind of entertainment venue, one that appealed not only to the male clientele of the taverns and supper clubs, but to women and families too.
Dining tables were replaced by rows of seats, the acts became more visual and spectacular through introducing scenery and sets, the participatory judge and jury shows were replaced by comedians and novelty acts, and the shouting and carousing moved from the auditorium to the bar in the foyer. Audiences became quieter, and stage lights were introduced to focus attention on the acts instead of the audience. Here’s a description of Charles Morton’s Canterbury Hall from 1858:
“We make our way leisurely along the floor of the building, which is really a very handsome hall, well lighted, and capable of holding fifteen hundred persons; the balcony extends round the room in the form of a horseshoe. At the opposite end to which we enter is the platform, on which is placed a grand piano and a harmonium, on which the performers play in the intervals when the professional singers have left the stage.
Let us look round us; evidently the majority present are respectable mechanics, or small tradesmen with their wives and daughters and sweethearts there. Every one is smoking, and every one has a glass before him; but the class that come here are economical, and chiefly confine themselves to pipes and porter. The presence of the ladies has also a beneficial effect; I see no indication of intoxication, and certainly none of the songs are obscene.”
By the late 19th Century, Music Hall was the dominant form of popular culture, and the audience experience had moved from the rowdy tavern to the theatre experience we would recognise now. The evening’s entertainment shifted with changes in popular taste, with opera and classical music giving way to variety and comic acts. The acts became increasingly professional – the comic George Leybourne was hired by Charles Morton for the princely sum of £20 per year – and they started to become well-known names, drawing a crowd on their own terms. But although these developments allowed the music hall entrepreneurs to build hugely valuable empires, they also sowed the seeds for Music Hall’s downfall.
At the beginning of the 20th century, cinema arrived with an even more spectacular form of entertainment than music hall, perfectly suited to the increasingly dark and quiet auditoriums. The transition took decades, with early films appearing as ‘acts’ within music hall itself, sharing the same comedians and singers popular at the time. But over time the longer feature film format became an evening’s entertainment by itself, and the music hall circuits and entrepreneurs – the ‘empires of attention’ of the 19th century – gave way to the new empires of cinema, and, by the mid-20th century, broadcasting.
This was a turbulent time for popular culture, with traditional business models struggling, and new technologies rapidly innovating formats and distribution models as audiences moved their attention to these new forms of culture. That might, perhaps, sound familiar to anyone here working in the media industry.
Throughout this transition, the one constant was the gradual breaking apart of the relationship between the audience and the artist. In 1850 popular entertainment happened in rowdy and participatory taverns and supper clubs. By 1950, the audience sat quietly in cinemas or at home, hearing and seeing entertainment that was recorded in a different time or space altogether. The feedback loop – the call and response of the music hall – had disappeared almost completely.
The rise of broadcasting brought new problems with understanding audiences’ attention. Advertisers had no way of knowing how many people would actually hear their commercials, so without finding a way to measure attention, early radio broadcasters would struggle to compete for ad income with print and other forms of media.
In the early years of radio many techniques were used to measure audience attention, from sampled surveys and interviews to ‘radio diaries’ that asked listeners to record their own listening habits. For a brief period, fan mail was used as a way of indicating audience size and interest, often linked directly to the product sponsoring the show.
For example, Kate Smith was an early radio star with shows like ‘Kate Smith Sings’ and ‘The Kate Smith Matinee’ on CBS and NBC. Her show ‘Kate Smith and her Swanee Music’ was sponsored by La Palina Cigars, who ran a competition for listeners to send in cigar bands in return for a signed photograph. Over 50,000 cigar bands were submitted in response, demonstrating to La Palina the power of radio as an advertising medium.
But competitions like these were unpredictable and too reliant on the individual appeal of stars like Smith. Radio broadcasters understood that they needed to agree on a consistent measure of attention to convince advertisers to invest in radio, so they worked together to define a single measure across their industry.
This wasn’t just happening in radio. The period from the 1940s to the turn of the 21st century is known by academics studying the ratings industry as the era of the ‘one big number’, as competition for advertising money between newspapers, magazines, radio, TV and outdoor media forced competitors in each sector to agree on how to measure their audiences. The result in every case was a single measure – the one big number that was used to represent the behaviour of millions of people.
Central to the development of these measures was Arthur C Nielsen, who took his expertise in market research in the 1920s and 30s and developed new techniques to measure radio audiences in the 1940s. Nielsen ratings then became the main measurement of US TV from the 1950s, based around ‘Nielsen Sweeps Week’ – the periods during the year when Nielsen sampled the television audience, initially by sending out paper diaries for viewers to record their viewing, then later introducing boxes in selected viewers’ homes that automatically measured the programmes they watched.
This data would be used to set local and national advertising rates, so broadcasters would fight for audience attention with cliff-hangers, ‘stunt-casting’ (for example, bringing in a Hollywood celebrity to guest on an episode) and special editions of popular shows, as in 1997, when the hit show ER performed and broadcast an episode entirely live. In fact, the popular phrase for the moment when these desperate tactics mark a show’s decline is ‘jumping the shark’, based on a Happy Days special produced for Nielsen Sweeps Week in 1977 when, in a desperate bid for attention, the show’s writers had The Fonz jump over a shark-pen on water-skis.
Most of the multi-billion dollar ad spend around the world today is bought and sold based on ideas of audience measurement that can be traced back to Arthur Nielsen and his early work in the fledgling radio industry. There is probably no single person who has been more influential in the creation of the global media industries – the ‘empires of attention’ – of the 20th century, yet he is nowhere near as well known as the writers, directors, performers and executives of the media empires built on his work.
But for artists and performers, these numbers were a thin and ineffectual connection to the audience. The rich engagement with the audience in live performance had now become a series of numbers on a report. During the late twentieth century – the ‘golden era’ of broadcasting – creators would sometimes not even see these audience reports, and would have to find other ways to get feedback.
In a documentary about the making of Blackadder, the writer Richard Curtis said that in the early 80s he never saw the ratings for the early series, and to this day does not know how many people watched. He said he used to walk the streets of Shepherd’s Bush during the broadcast, looking in people’s windows to see if they were watching, and if so, whether they were laughing, as it was the only way he could gauge the audience’s response.
It’s a wonderful anecdote, but also a telling illustration of how far the gap between audiences and artists had grown in the 20th century – from the communal rowdiness of the supper clubs and taverns to the silent and invisible audiences of the broadcast era.
By the time I started working for the BBC in 2001, multiple generations of TV executives and creatives had only ever understood audiences as numbers. Ratings had become an obsession, but real connection with audiences was rare.
I remember being at one of Greg Dyke’s awaydays for senior BBC management during his time as Director General, where senior executives were asked for their opinions about the lives and tastes of sample members of the BBC audience. At the end of the session, there was a theatrical reveal as the real people we were discussing were brought into the room, invited to sit on our tables and talk to the BBC executives about the kind of programmes they watched. I remember the extraordinary sense of surprise and novelty amongst executives that we were meeting actual members of the public, and I felt a bit sorry for the people brought into the room as if they were some kind of exhibit. It made me realise how far TV industry executives were removed from the audience who enjoyed their content – they obsessed over abstract representations of audiences, but were almost speechless when they were brought into the same room.
This made me curious about how we’d ended up like this, how audiences and artists had ended up so far apart. I started researching the history of how we measure attention, and how this affects the way that we make and experience culture. I discovered that the last 50 years or so have been an unusually quiet and stable period – the dominant forms of culture, the business models behind them, and most importantly the way we measured audiences’ attention, hadn’t changed that much at all.
If you grew up in this period, you experienced mainstream culture as a passive member of an audience, either sitting at home watching TV, or in a darkened cinema, with the exceptions being live music or comedy, which were mainly aimed at younger audiences. I remember seeing a chart at the BBC showing how the volume of TV watching changes as we age. The curve starts with a few hours a day as children, then drops sharply in our late teens and twenties as we go out to parties, gigs and pubs to get drunk and find mates. After we settle down as couples and families in our twenties and thirties, TV viewing goes up again – the most voracious TV viewers in the UK are not the advertisers’ main target of 16–34 year olds, but the middle-aged and retired.
A friend who works in digital media once asked a senior UK TV executive whether he was worried by younger audiences spending most of their time on social media and gaming consoles instead of TV. “Don’t worry”, the executive replied, “they’ll get older and more tired and then they’ll want to just flop out on the sofa and watch TV”.
Perhaps these younger audiences will return to TV as they get older, but with a new and very important twist. The last 10 years have seen the return of the noisy crowd. The invisible voices of the audience are now being heard again over Twitter, Facebook and other social networks. When I moved from the BBC to Channel 4 in 2007, Twitter had only just launched and ratings were the only measure commissioners cared about. By 2011, when I left, nearly every commissioner knew how to search for the number of tweets alongside their shows, and the most followed people on Twitter were the actors, presenters and performers that execs were commissioning to make TV. The feedback loop between audience and artists had, in a very new and odd way, been reconnected.
The new entrepreneurs of attention in the 21st century understand this new connection – they understand that culture spreads not by distribution, as with cinema and broadcast, but by circulation – sharing between friends over digital networks.
One of the most influential cultural entrepreneurs of the early 21st century has been Jonah Peretti, the co-founder of the Huffington Post and Buzzfeed. Peretti’s understanding of how digital culture spreads started in 2001, when Nike launched a ground-breaking online campaign that let people stitch personalised messages onto their trainers. Peretti, at the time a graduate student at MIT in Cambridge, Massachusetts, chose the word ‘sweatshop’, and when Nike refused his order, he started an email exchange to find out why. He sent this story to a few friends, and they in turn circulated it to more people, sharing it on email lists, websites and forums. Within a few days it had been read by millions of people and Peretti was on the TODAY show on NBC, talking about both the story and the interesting new way it had spread across the internet.
The Nike experience made Peretti curious about how stories spread through digital circulation, so he went around MIT asking for help to understand how it worked. The first factor was the science of networks and the number of nodes – or people in this case – you need for stories to become ‘contagious’ and spread widely. The second factor was cognitive science – why do we decide to share stories with each other? What kind of emotions – laughter, shock, outrage – drive us to circulate stories around our friends, and what are we saying about ourselves and our communities when we share stories?
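The first of these factors – the contagion threshold from network science – can be illustrated with a toy branching-process simulation. This is purely a sketch: the `spread` function and its numbers are invented for illustration and are not drawn from Peretti’s actual models.

```python
import random

def spread(p_share: float, contacts: int, seeds: int = 10, rounds: int = 12) -> int:
    """Toy branching-process model of a story circulating by email.

    Each person who receives the story passes it to `contacts` friends,
    and each friend shares it onward with probability `p_share`.
    Returns the total number of people reached.
    """
    total = current = seeds
    for _ in range(rounds):
        # count how many of this generation's contacts share the story onward
        new = sum(1 for _ in range(current * contacts) if random.random() < p_share)
        total += new
        current = new
        if current == 0:  # the story has fizzled out
            break
    return total

random.seed(1)
# The reproduction number R = contacts * p_share decides the story's fate:
# below 1 it usually fizzles out; above 1 it tends to grow exponentially.
print(spread(p_share=0.04, contacts=20))  # R = 0.8: usually fizzles out
print(spread(p_share=0.06, contacts=20))  # R = 1.2: usually spreads far more widely
```

The point of the sketch is the threshold effect: a small change in how likely each reader is to share a story tips it from dying out to becoming ‘contagious’, which is why the emotional ‘hook’ matters so much.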
Peretti called this ‘the social imperative’ – the ‘hook’ that drove people to share something they’d found with their close friends. After more experiments with creating viral culture he was hired by Arianna Huffington in 2005 to launch The Huffington Post. Derided at its launch as a vanity news project, its mixture of traditional news content and ‘meme’ culture grew to an audience of 13.2m unique visitors by 2011, when it was acquired by AOL for $315m. By this time, Peretti had already launched Buzzfeed, an even purer distillation of circulation-friendly content, with endless streams of list articles (called ‘listicles’ by Buzzfeed staff), amusing cat pictures and celebrity gossip. In July 2013, Buzzfeed had an audience of 37.9 million unique visitors, not far behind the Huffington Post and mainstream media brands like the New York Times and CNN. Peretti had succeeded in creating not one, but two of the biggest ‘empires of attention’ of the 21st century. In a memo to staff in September 2013, he emphasised how this success was different from the broadcast media empires of the 20th century:
“There are many exciting, tempting, glamorous, lucrative opportunities that we will NOT do in the coming year and as more of these opportunities present themselves it will take discipline to stay on track. We will NOT launch a BuzzFeed TV show, radio station, cable network, or movie franchise — we’ll leave that to the legacy media and Hollywood studios. We will NOT launch a print edition or a paywall or a paid conference business — we’ll leave that to other publications. We have a great business model that has a bright future as social and mobile continue to become the dominant form of media consumption. We will stay away from anything that requires adopting a legacy business model, even a lucrative one like cable syndication fees or prime time television ads. What seems like a lucrative opportunity today is often a distraction from building something much more exciting tomorrow.”
That last line is a stark illustration of how the empires of attention are shifting as we move from an era of distribution to an era of circulation. Peretti’s memo lists a series of traditional media business models that BuzzFeed will not be following, and then closes by emphasising that they aren’t lucrative opportunities but are actually distractions from his core goal – inventing a new kind of media empire for the 21st century.
Peretti is building a new empire of attention – one that synthesises the call and response of 19th century music hall with the incredible scale of 20th century broadcast distribution. The combination is a potent one – the sheer visceral impact of thousands or millions of people sharing and discussing your stories is a new experience for anyone used to traditional broadcast media, and we’re having to learn how to tell stories in an age of digital attention. We’re already hearing TV commissioners complaining that knee-jerk responses from audiences on Twitter are killing new TV shows before they have a chance to build a following. We are no longer a passive audience, but the judge and jury of what will survive and be recommissioned, deciding the fate of culture by how we spend our attention.
This new feedback loop can be incredibly empowering, but it is also destructive – the anonymity of social media can encourage trolling and other kinds of abuse. Crowds amplify the good and the bad in human behaviour, and the internet amplifies this even further. But I don’t think it’s possible to have one without the other – the noise is also the signal, and we will have to develop new ways to tell stories that take this into account.
The culture of the 21st century will be defined by how we synthesise these contradictions – scale and intimacy, spectacle and conversation, signal and noise. We have seen the relationship between audiences and artists move from intimacy to distance, and now back to a strange kind of intimate distance. What will culture look like in an age of digital attention, and what new empires will emerge around it? How we will we measure attention, and how will this change the relationship between artist and audience? Who will be the Charles Morton, the Arthur C Nielsen, or the Jonah Peretti of the next 50 years?
So, to finish, thank you again for your attention. Thank you to those of you here in the room, for your attention is the feedback loop that made it possible to tell my story. For those of you listening at home or online, we can now use the feedback loop of social media – I’m @matlock or @storythings on Twitter. Thank you again for your attention, and I look forward to giving you some of my attention in return.
UPDATE: Jack Knight has written a fantastic comment on this post, giving a lot more background to BARB and sampled ratings in general. I highly recommend reading the comments after the post. If you’re interested in the history of audience ratings, I also recommend ‘Rating The Audience: The Business of Media’.
BARB – the organisation that measures ratings for UK TV channels – has admitted that there were errors in its tracking system, and as a result some Channel 4 and ITV shows have ended up with false ratings. Broadcast Magazine’s article says that one of the programmes given false ratings was ITV’s X Factor – the channel’s highest-rated show, and one of the biggest advertising targets in broadcast television:
“The entertainment show originally recorded an overnight audience of 8.96m (33.6%) on ITV1 and ITV1 HD in figures released last Monday, but this has now grown to 9.84m (36.91%) under the revised data. Meanwhile, C4 shows including 999: What’s Your Emergency and Grand Designs have also experienced audience uplifts. However, others have fallen, such as the 22 September episode of The Comedy World Cup, which dropped from 1.8m (7.9%) to 631k (2.76%) on the back of the gaffe.”
How can mistakes like this happen in a multi-billion pound industry reliant on accurate audience metrics? Looking for an answer to that question opens up lots more questions about why such a huge and influential industry relies on relatively crude measuring techniques that haven’t changed much in decades.
TV ratings are measured using mechanical devices that record the presence of viewers in the room when the TV is on, usually by the viewers pressing a button to register that they’ve entered the room. So it really registers presence, rather than attention – the viewer could be reading a newspaper, doing the ironing or using their iPhone, but for the sake of the ratings they count as an avid viewer.
Ratings technologies have been refined over time, but the basic concept hasn’t changed since it was invented by Arthur C Nielsen to measure radio audiences in the 1930s. BARB is the UK version of TV ratings, using a panel of 5,100 homes to represent the UK TV viewing public, so each percentage point in the examples above stands for a measurement sample of just 51 homes. The number of people in these homes is around 11,300, so each percentage point stands for a maximum of 113 people pressing their buttons when they walk into the living room. It’s often far fewer, as the percentages above are shares of the total viewing audience (BARB calls this the ‘universe’) at that time – many BARB panellists might be out of their homes, or might not have the TV on at that time.
If we take the numbers in the sample above, we can work out the size of the TV viewing universe when these errors occurred. The 8.96m audience originally reported for X Factor was 33.6% of total viewers that night, so one percentage point of share represents 8.96m ÷ 33.6 – roughly 267,000 viewers. This means that BARB’s estimate for the total UK TV viewing audience on a Saturday night is around 26.7m people, which is roughly 43% of the UK population of 62m. So we can roughly work out that the number of BARB panellists registering themselves as viewers that night was 43% of 11,300 – around 4,860 people.

Still with me? Let’s now take X Factor’s reported share to work out how many BARB panellists registered themselves as watching that programme. The original share reported was 33.6% of total viewers, and the total BARB panellists watching TV was around 4,860, so the number watching X Factor was 33.6% of 4,860 – around 1,630 people. So BARB measures around 1,630 people watching a TV programme, and extrapolates that number to report an audience rating of 8.96m viewers. No matter how scientific and representative the survey, it is remarkable to think that multi-billion pound creative decisions are made on such a small sample.

Now let’s look at the size of the error. BARB under-reported X Factor’s ratings by 3.31 percentage points of share – a difference of 880,000 viewers. Each panellist stands for roughly 62m ÷ 11,300 – about 5,500 viewers – so a correction of 880,000 viewers corresponds to around 160 panellists.

An error in measuring around 160 people pressing a button when they walk into a room means that one of the UK’s largest media businesses under-represented the performance of their most important programme by 880,000 viewers. Is it just me, or is that completely insane?
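The back-of-envelope arithmetic above can be sketched in a few lines. This is a rough estimate that simply scales viewers by the panel ratio; real BARB reporting applies demographic weighting, so the exact figures are illustrative rather than BARB’s actual method.

```python
# Back-of-envelope sketch of the BARB panel arithmetic.
# Headline figures come from the Broadcast Magazine report; the panel
# sizes are BARB's published numbers.
PANEL_PEOPLE = 11_300          # people living in BARB's 5,100 panel homes
UK_POPULATION = 62_000_000

reported_viewers = 8_960_000   # X Factor's original overnight audience
revised_viewers = 9_840_000    # the corrected figure

# Each panellist 'stands for' this many members of the population
people_per_panellist = UK_POPULATION / PANEL_PEOPLE

panellists_watching = reported_viewers / people_per_panellist
panellists_in_error = (revised_viewers - reported_viewers) / people_per_panellist

print(f"Each panellist represents ~{people_per_panellist:,.0f} viewers")
print(f"~{panellists_watching:,.0f} panellists behind the 8.96m rating")
print(f"The 880k correction is worth ~{panellists_in_error:,.0f} panellists")
```

The striking thing the arithmetic shows is the leverage: every button press on a BARB meter moves the national headline figure by several thousand viewers.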
When I started working in broadcasting in 2001, the idea that digital technologies were changing the media industries was pretty much a fringe debate. Most people had barely any experience of the internet, and many of the people I was working with were planning long careers in broadcasting, publishing or advertising – industries that had hardly changed in the last half-century. By the time I left broadcasting in 2011, everything was different, and digital had transformed these sectors beyond recognition.
Instead of denial, the tone of conversation in traditional media industries now is weary submission, as each sector sees the tidal wave of digital change break over their old business models, and hopes that they can find enough high ground left to survive. But I think that we’re barely halfway through, and there’s evidence that most significant shifts in culture take around 30 years to fully play out.
The reason for this is that cultural change is not just about technology or economics, but about changes in behaviour. The important phase of cultural change is not the adoption of new technologies, but the way those new technologies change the way we consume or engage with culture. It’s often the case that the first cultural products for new technologies merely mimic old forms, and it isn’t until the majority of audiences have moved to the new technology that new behaviours emerge clearly enough to sustain new forms of culture, and in turn new business models.
For example – the CD is 30 years old this week. Reading accounts of the development and launch of the CD as a technology, it’s interesting how much it was defined by traditional ideas of what listening to music should be like – i.e. listening to albums in their entirety, a behaviour learnt through years of buying and listening to vinyl records.
In fact, even the earliest CD players contained the seed of a radical shift in listening behaviours that would change the economics of music forever. This was the idea of random access to tracks – the ability to shuffle and skip through albums in ways that weren’t defined by the track listing set by the artist. Although this was a minor feature of early CD players, the new behaviour was a significant shift in how we listened to music, and developed over the next 30 years to create a new industry based around individual tracks, streams and playlists, dominated by companies like Apple that were not even legally allowed to be in the music industry in 1982.
Just as the CD contained the seeds of a new behaviour that would eventually change the music industry 30 years later, new behaviours around books and TV on platforms like Kindle and YouTube are starting to sow the seeds of disruption for the publishing and broadcasting industries. We’re only just beginning to see what behaviours might emerge, like the shuffle, to change these industries beyond recognition. Rather than being at the end of a decade of digital change, we might only be at the end of the beginning – by 2030 we’ll be looking back at YouTube and Kindle much as we now look back at early CDs, marvelling at how much they resembled the media platforms they were only just beginning to replace.
In an interview with Broadcast Magazine last week, Simon Cowell suggested that YouTube could soon be considered competition for traditional TV channels:
“There’ll be a point in the not-too-distant future when we’ll be able to watch TV and YouTube will be Channel 6. When we reach that point, they’re going to be serious competition.”
The comment comes just after the X Factor channel on YouTube joined Syco’s Britain’s Got Talent to become one of the few UK channels to top 1 billion views, closely followed by the BBC. Taking into account YouTube’s 2011 redesign to focus navigation around channels, and the launch of its Original Content channels in the US and Europe, is Cowell right? Is YouTube becoming more and more like a ‘traditional’ TV channel?
What is a TV Channel?
To answer the question, we need to remind ourselves what we mean by a TV Channel. Nowadays, channels are navigational elements – brands that convey to the audience a set of values about their programming schedule. But originally, channels and schedules were a solution to a specific problem – when you could broadcast content over a network all day (rather than the limited duration of a theatre or opera programme), what kind of structure would help audiences know when your content was available?
The first people to have this problem were the early ‘telephone newspapers’ created in the late 19th Century. They thought that telephones would be used to broadcast, not just for person-to-person conversation, and invited users to subscribe to content available on special one-way telephones installed in their homes.
The Telefon Hirmondo in Budapest was one of the most successful telephone newspapers, with around 15,000 subscribers at its peak. They solved the problem of how to organise content for their 12-hour daily broadcasts by creating ‘issues’ – what we would now call ‘schedules’ – carving the day up into chunks of hours or part-hours. Here’s a sample of a Telefon Hirmondo ‘issue’:
2:30 PM – 3:00 PM: Parliamentary and local news.
3:00 PM – 3:15 PM: Latest exchange reports.
3:15 PM – 4:00 PM: Weather, parliamentary, legal, theatrical, fashion and sporting news.
4:00 PM – 4:30 PM: Latest exchange reports and general news.
4:30 PM – 6:30 PM: Regimental bands.
7:00 PM – 8:15 PM: Opera.
As the broadcast technologies of radio and then television emerged, they adopted the structure of these ‘issues’, and over a century later, we’re still organising broadcast content in pretty much the same way. The complex art of organising content for optimal viewing – the art of ‘scheduling’ – became one of the critical skills in broadcasting, defining the success of one channel over another, and therefore the price of advertising on that channel. A traditional TV channel is, in essence, its schedule.
What is a ‘channel’ on Youtube?
Youtube has traditionally been seen as a platform, not a channel. Rather than an editorialised schedule of content, it’s an open, searchable platform, allowing users to upload as much or as little content as they want, and for audiences to view content on those same terms. As the platform grew over the last 10 years, video views emerged as the most common metric of attention, with success seen purely in terms of the highest number of views. Reaching 1bn views is a significant milestone, one that only 50 channels have currently achieved.
But since the redesign, Youtube have pushed for subscribers to be the core metric, and for creators to focus on channels rather than individual videos. Yet Youtube videos are shared and circulated in lots of different ways, and the patterns of attention around videos are far more complex than broadcast viewing. Channel subscriptions are not yet the most popular way to find content on Youtube, with most viewing sessions starting with organic search – Youtube is the second biggest search engine on the internet. Youtube’s channel strategy is an attempt to change this, and to encourage more loyalty and ‘channel-like’ behaviour in its audience. The aim is to get longer viewing sessions, and to raise the channel brands on Youtube above individual videos, making it more suitable for the kind of viewing patterns expected on smart TVs in the living room. But this transition to a channel strategy is still in its early days.
What makes a successful Youtube channel?
As Youtube makes the transition from videos to channels, it’s worth comparing the list of most viewed videos to the list of the top 50 most subscribed channels on Youtube. The top of the most viewed list is dominated by mainstream content brands and talent-branded VEVO channels. Here’s the top 10 channels by video views:
Whereas the most subscribed is dominated by Youtube-native talent channels – Rihanna is the only VEVO channel to make the top 30 most subscribed. Here’s the top 10 most subscribed channels:
This illustrates the different strategies being used to make successful Youtube content. For some, the brand or talent is established enough outside of Youtube to drive views through organic search alone – this explains the number of VEVO channels in the top 50 video views list. Channels without other sources of traffic have to work harder to get attention for their content – this is why the top 10 most subscribed channels are Youtube-native comedy and games channels.
What’s interesting is when you look at the amount of work people put in to get subscribers. If we divide the number of subscribers by the number of videos uploaded to the channel, we get an unscientific, but interesting statistic – the average number of subscribers added per video. Here’s the channels that have done the least work to get their subscribers:
And here’s the channels that have worked the hardest, sometimes only adding a handful of subscribers per video upload:
It’s interesting how many ‘traditional’ broadcasters are in that second list – AP, CBS and the BBC. And just outside this list, The Ellen Show comes in at number 12, TheXFactorUK at number 14, and BritainsGotTalent09 at number 16.
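For what it’s worth, the ‘subscribers added per video’ statistic is simple to compute yourself – a rough sketch in Python, where the channel names and figures are invented placeholders purely to show the arithmetic, not real Youtube data:

```python
# Made-up placeholder figures: (total subscribers, total videos uploaded).
channels = {
    "MusicActVEVO": {"subscribers": 2000000, "videos": 250},
    "BroadcasterClips": {"subscribers": 500000, "videos": 5000},
}

def subs_per_video(stats):
    """Unscientific 'average subscribers added per video upload' statistic."""
    return stats["subscribers"] / stats["videos"]

# Rank channels from 'least work per subscriber' (most subs per upload) down.
ranked = sorted(channels, key=lambda name: subs_per_video(channels[name]), reverse=True)
for name in ranked:
    print(name, round(subs_per_video(channels[name])))
```

With these placeholder numbers, the music channel earns thousands of subscribers per upload, while the clip-heavy broadcaster channel earns only a handful – the pattern described above.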
There are so many competing strategies on Youtube right now that comparisons like this are not hugely revealing, but there do seem to be three kinds of channels emerging:
Talent-led channels – broadly music based, views driven by organic search, very few uploads
Broadcast-led channels – linked to existing TV shows/channels, lots of uploads, (mainly clips), views largely driven by organic search, but few subscribers
Youtube-native channels – lots of subscribers, lots of uploads, most traffic driven by links within the Youtube platform
So, will Youtube become a TV Channel?
It’s early days in Youtube’s channel strategy, but at the moment, it’s hard to see the different strategies that talent, broadcast brands and native Youtube creators are using merging into something as coherent and consistent as a traditional TV channel brand. When Simon Cowell looks at Youtube and recognises it as a ‘TV Channel’, he’s seeing it from the perspective of someone who has been immersed in broadcast TV for years, and is more familiar with those patterns of attention than with the new patterns emerging from native Youtube talent. To a man with a hammer, everything looks like a nail, so a man with some of the biggest broadcast TV brands will look at Youtube, which sometimes looks like traditional TV, and assume that it will eventually become like the things he has spent his life building.
I think it’s more likely that TV channels will become a bit more like Youtube. If Smart TVs and other VOD boxes take off, we’ll start to see some new user journeys around content on our TV – organic search, subscriptions to channels/shows we love, social and algorithmic recommendations, etc. This will change the way that schedulers think about TV channels as much as the rise of multichannel satellite and cable did in the 1990s. Channel 4’s 4seven channel – scheduled partly in response to online buzz about Channel 4 shows – is an early indicator of this trend. Simon Cowell might be looking at the right thing, but from the wrong perspective – it’s not about Youtube becoming more like TV Channels, but about TV Channels becoming more like Youtube.
A few weeks ago, I was asked to talk on a panel about Marshall McLuhan in Bristol. One of the other panellists was Paul Morley, someone I greatly admire and who shaped much of the culture and many of the ideas I grew up with in the 1980s (the first time I heard about Dada & Situationism was through his sleeve notes on Frankie Goes To Hollywood albums).
The panel was asked to discuss how relevant McLuhan was to 21st century digital culture, but quickly got sidetracked into a nostalgic eulogy for late 20th century culture, and in particular Punk. Paul Morley was very dismissive of the landscape of digital culture, accusing social media networks of merely pandering to consumerist behaviours, and of not creating anything of value or with real impact. From his perspective, the radical power of punk, and its impact on late 70s culture, was nowhere to be seen today.
Last night, Pete Townshend gave the John Peel lecture, and the people I follow seem divided between criticising his views on iTunes (which were really only a small part of the talk) and agreeing with his nostalgia for the role the record industry played in nurturing artists. The lecture was actually a pretty well-balanced view on the record industry today, with some specific insights into how the music business used to run, and how Apple and others could step up to play these roles.
But I think the nostalgia about the old days is misplaced. We are eliding a series of memories – about the way we consumed music, the role of companies in developing and distributing culture, and the physical artefacts themselves – into a set of assumptions about how culture should be supported, distributed and consumed. In doing this, we’re ignoring the fact that these assumptions were the product of a particular pattern of consumption, driven solely by the technical and economic drivers of the time.
In the late 20th Century, in a world of limited channels for media distribution, achieving scale was incredibly hard, but the rewards were huge. With only a limited number of TV/Radio channels, or magazines, or shelf space in the shops, anything new had to displace the old. This led to a very predictable pattern of consumption, in which waves of ‘new’ content attempted, and occasionally managed, to break through into the main focus of audiences’ attention – the cover of NME, or Top of The Pops. Once there, the potential rewards from being in one of the few spotlights of attention were massive – easily enough to support artists for years, if they could manage to remain in the spotlight, or thereabouts.
This pattern – of working unknown in the shadows, and then ‘breaking through’ into the mainstream – is the thing we’re actually mourning when we talk about the last century. The media industries that were created around these patterns had the advantage of limited competition and stability, and as such could afford to indulge artists, or support bands over years, knowing that the reward of breaking into the spotlight would more than repay this investment.
If you grew up in this period, you learnt that this was the pattern of culture – a broad spectrum of niche, marginal culture, and a tightly defined mainstream that dominated attention. The positions of individual actors would periodically change, but the stage would remain the same.
The patterns now are very, very different. There are no technical limits to publication and distribution, but getting and focusing attention over a long period of time is a great deal harder. Scale is no longer a guarantee of stability. Production of culture is now open to anyone and everyone. Platforms and tools are becoming more central than publishers and distributors. None of this is new – our virtual book shelves are groaning with analyses of how the internet is changing content industries.
But in all of the studies of the technical and economic changes, we’ve missed the underlying shift that is driving them. The ways in which audiences’ attention can be driven to new culture are infinitely more complex than in the late 20th century, and it’s only been in the last 5 years or so that we’ve started to see what the new patterns of attention are. Some of them look familiar, with niche content organically (or calculatedly, in the case of shows like The X Factor) getting large amounts of attention. But these patterns are much more unstable than they used to be, and the rewards are nowhere near reliable enough to offset the misses.
Alongside the familiar patterns of mainstream attention, there are a huge number of new patterns that could only exist in digital culture. Some of these patterns are very slow, with attention accruing over months or years, as social recommendation or small groups of fans gradually accrue around content. Some are extremely fast, synchronising audiences’ attention around a piece of culture within days, before moving on just as quickly. Some are driven by deliberate plans, orchestrated between broadcast channels and social media. Some emerge via the organic connections of lots of smaller drivers, from blogs and niche channels to SEO and twitter accounts.
But, regardless of the pattern itself, the difference is that they’re Spiky – there are no technical or economic constraints keeping the spotlight in one place anymore, so attention can move on as quickly as it arrived. This is the major shift that we are missing when we are nostalgic for the 20th century. We’re only just beginning to learn what culture looks like in spiky networks, and only just beginning to invent the companies and institutions that can survive long enough to support and invest in culture in this landscape.
Change no longer happens all at once for everyone, as it did with the rush of Punk puncturing the ennui of 1970s mainstream culture. In digital networks, change is happening everywhere, constantly, and the mainstream is a much more fragile and temporary consensus than it once was. There will still be moments when something breaks through to enough people at the same time to feel like Punk, but it won’t be the same thing. There are a hundred punk moments happening every day, if you look hard enough.
McLuhan would have understood this – he was, above all else, a master at recognising patterns in culture. What he did in the middle of the last century was point out that mass media was creating a phenomenal spotlight of attention through TV and other mass broadcast networks, and that the patterns of attention they created would be as important – financially, politically and culturally – as the content itself.
If he were alive today, I would like to think that McLuhan would be pointing out a slew of new patterns, and exploring the economic and cultural consequences as they emerged. Although McLuhan was a deeply religious man who resented the dominance of broadcast mainstream culture, he was intellectually fascinated by what these emerging patterns said about us. He didn’t mourn the patterns of the 19th century, but sketched out the landscape of the new culture, and was a prophet for the media industries of the last 50 years. We should take his lead, stop being nostalgic for the patterns of the last century, and start building the media industries of the future.
Something very interesting happened on Channel 4 last Wednesday. About half-way through the latest episode of Seven Days, one of the characters, Cassie, took out her laptop and started talking about how people were talking about her on the show’s website. Sitting at home, monitoring the performance of the site on my laptop, I saw a huge spike in traffic as thousands of other people logged onto the site to see what all the fuss was about. This spike was higher than we’d seen the week before, when the rush of people coming to the site on launch night crashed the servers, and even higher than the biggest peak we saw in the final series of Big Brother earlier this year. We’d clearly hit on something, but what was it?
For the last 11 years, Big Brother has been the poster-child for cross-platform projects – a show which was inextricably bound up in the interaction between the format, the audience and the ripples it caused in the outside world. But those ripples never made it back inside the house – we never saw BB contestants pull out a laptop and see what people were saying about them outside those high Elstree fences. The spike in traffic we saw in the middle of Seven Days was something new – it was an audience realising that they could become part of the conversation, part of the story, part of the lives of the people they were seeing on television. Cassie and the rest of the Seven Days cast were recognisably people living their own lives – in cafes, living rooms and bars – not the artificial tasks and traumas of Big Brother.
Seven Days has demonstrated that we’re living in a new world – a place where our audiences see their own lives broadcast to friends across networks like Facebook and Twitter, and where jokes, arguments and love affairs are conducted through comments and responses, likes and retweets, friending and tagging. Broadcasters have probably been a bit slow to create formats fast enough and open-ended enough to reflect the way we live our lives now. Seven Days feels like it’s starting to explore what this might look like. It’s an exhausting, messy and complicated project to be working on, with a constant cycle of chatter going on between contributors, commissioners, producers and web teams. It’s hard, two weeks in, to get a grasp on what the show is, what it might be, and how we can best harness the intense spikes of attention we’re seeing around every episode.
I sat at home last Wednesday, watching my TV with my laptop, watching someone else reading about themselves on a laptop, whilst thousands of other people were doing the same. This is the world we’re in now, and Seven Days is an innovative and ambitious attempt to represent this world. Like Big Brother 10 years ago, it’s probably not right yet, but it does feel like the first step on a very interesting journey.
I’ve just had an interesting email conversation with Nicholas Lovell, the excellent games consultant and Gamesbrief blogger, prompted by his appearance at the Edinburgh TV Festival on a panel about the crossover between TV and Games. The session left me very frustrated, partly because it seemed to assume that the only reason TV people would be interested in games is if they wanted to license their IP to produce a spin-off game. Nicholas (and Paulina Bozek, who made SingStar) did give a different perspective, but this came after two long sessions of pretty dull history about Sony and Nintendo in the AAA game industry.
Having spent nearly a decade working for broadcasters, I know that this isn’t the way to get a bunch of creative people excited about your sector. How much more interesting it could have been if there were more creative talent there – Ben from Zombie Cow, Darren from Littleloud, or Phil from Preloaded – to explain how their creative process works. Making a TV programme and making a game share a lot of common skills, from great writing to stunning visual production and a keen understanding of your audience. The session at the TV Festival would have been a lot more valuable for everyone involved if it had focused on these issues, rather than a history of the games industry.
I was particularly frustrated, as I’ve spent the last few years (together with Alice Taylor) trying to get broadcasters to understand that games are valuable ways of delivering public value projects, not just parasitical, licensed projects feeding off a linear TV programme’s brand equity. The common ground between TV and Gaming isn’t licenses and IP – it is talent, stories and audiences. It’s a pity that the panel in Edinburgh didn’t illustrate this.
A couple of years ago, when I started working at Channel 4, I came up with a model for thinking about social spaces online that focused on how users felt about being online, rather than the technical capacities of the platforms themselves. I did it so that people pitching to us would think about users rather than tech, but the model has stuck in my thinking, and seems more and more relevant today.
This is partly down to something that I thought about at the time, but didn’t write about. In the original post, I tried to describe a crude taxonomy of spaces in which users have relatively consistent expectations about what happens to the information they share, and relatively consistent expectations of the behaviour of other actors (both real and technological) when they occupy that space. For example, if a space feels like it’s shared only by a group, users will share their information accordingly, and expect others to share their assumptions.
Likewise, user transgression is when someone shifts someone else’s information from one register to another in a way that wasn’t expected. A common illustration of this is newspapers taking photographs from Flickr without respecting the copyright limitations that users had put in place when uploading the photo. Loaded magazine was recently cleared of breach of privacy by the PCC following a complaint from a woman who uploaded a picture of herself to Bebo in 2006. Over the next few years her picture was circulated widely on forums, and she became an internet meme as the ‘Epic Boobs’ girl. When Loaded magazine called for their readers to help track her down, she claimed the article had caused her considerable upset. But the PCC ruled that as the picture was already so widely distributed online (appearing in the top 3 Google searches for ‘boobs’) the Loaded article could not be considered to infringe her privacy, although it would have been a different case if they had taken it directly from her Bebo profile in 2006. It was the gradual dissemination of her image between groups of users online that made it ‘public’ – not her original act, which she probably imagined to be for a group that she controlled, but the actions of groups who could access and share her image without her knowledge or control.
What is remarkable about the Epic Boobs and Facebook transgressions is that they are gradual and hard for the person involved to track. In an analogue media world, the transgression between registers is sharp and obvious – a newspaper would have had to contact you to get a copy of a photo for them to use, and your personal photographs couldn’t become a global property without you knowing about it. We now live in an age where transgression is insidious and invisible, where users can’t understand the potential risks of sharing until it’s caused them significant pain.
Understanding transgression is going to be *the* most important thing for businesses and users working online in the next few years. Users will need to interrogate the services they use for potential transgressions of their information across contexts (as with Facebook’s gradual publicising of user data); platform creators will have to be more explicit with users about how information transgresses different contexts, and make these transgressions more tangible to the user (simply ticking check boxes is not enough – these transgressions need to have grain and weight built into the interaction); and large organisations will need to understand the implicit and assumed contexts of the spaces they are using to connect to their users, and how to ask permission when they take contributions or data from one context to another.
We’ve been through nearly a decade of excitement about creating and scaling these new social spaces online. We now need to focus very clearly on how information moves between them, as these transgressions are not simply about data and networks. The boundaries that users understand implicitly are defined by emotions, not software, and we need to bear this in mind when we cross them.
[This is the third part of a short series, based on a talk I gave at MIPTV in March 2009, sharing some insights from our commissioning social media projects at Channel 4 Education]
We’ve had a couple of projects this year – like Yeardot and Battlefront – that run live for 9 months to a year. It’s incredibly hard to keep hold of people’s attention over such a long period of time, and to be honest, we didn’t expect to. The web is a smorgasbord of distraction, so you have to be realistic about how often people will come back to your project. This poses real problems for an ongoing narrative project – do you start a project with a big bang to capture attention? How do you deal with people coming late to the project? What if people drift away for weeks and then come back to the project again?
Designing a narrative structure that can cope with such diverse patterns of attention is really tough. It’s probably easier for factual projects than fiction, partly because we’re used to drifting in and out of our friends’ online streams, so it’s simple to replicate this in factual/documentary projects. It’s no accident that the first popular fiction projects – like Lonelygirl15 and its early precursor Online Caroline – used self-authored video and text to tell the story from the protagonists’ point of view.
Most users now carry with them a strong conceptual expectation about how stories are ‘read’ online, developed from their experience of following their friends’ lifestreams on Facebook et al. So it makes sense to follow a few simple principles to take advantage of these assumptions, rather than working against them and confusing your users:
Keep it simple, and signpost clearly
We massively overdesigned some of our projects when we first launched them. We tried to create too much atmosphere through strong designs, and the general response from user groups was “this looks great, but what *is* it?”. Through many iterations, we ended up simplifying all our sites a lot, with clear explanations of what the project was, who was speaking there, what you could do, and what was new. You probably have only a few seconds to engage someone before they move on – don’t risk being enigmatic, unless you’re dealing with a brand or project that the audience already knows and loves. If it’s a completely new thing, explain the project clearly and keep the navigation simple and consistent – most users will probably not come through the main site/home page, so the project’s purpose needs to ring clear from every possible interaction.
Have a clear voice for the project
Erika Hall’s Copy As Interface rightly points out that most navigation of the web relies on text, not images or video. For social media sites, this text is not the neutral voice of a machine or nameless authority, but vernacular, oral speech, written as if it were a conversation with a friend. As Erika puts it – “we’re not writing, we’re speaking with text”. Again, this is inherited from the fact that most of our interactions on the web now are with friends, not ‘sites’, meaning that we respond better to projects that use vernacular language. We find such language more engaging, approachable and interesting – as Erika Hall quotes Walter J Ong – “Orality knits persons together into community”. The single most effective thing you can do to engage your users and keep their attention is to have a clearly identified, oral, vernacular voice for the project. Ideally, this would be a named person, real or fictitious. On Battlefront, we have the excellent Orsi, who blogs for us on Bebo and generally gives the site a sense of identity. Incidentally, I think Battlefront’s tag line – ‘You’re Already Involved’ – is one of the best bits of copywriting on any of our projects.
Make it easy for people to leave footprints in the project
There are two good reasons for this. First, if users contribute to a project, no matter how small the contribution, they’re more likely to remember it and come back. Secondly, a site with lots of user activity looks busy and active to other users – just like restaurants, people are more likely to stick around on a busy site than one that feels like you’re the only visitor. There is a third aim – to get some personal data so that you can regularly communicate with the user – but this can be a real barrier until people have spent a fair bit of time with the project and feel like they’re getting value. You can use promotional competitions as a shortcut to get user data, but I don’t think this is that valuable, as it drives a wave of attention that mostly just drifts out again like a tide once the competition is over. Ideally, you want your users to gradually increase the size of their footprint in the project as they get more immersed. This could mean initially clicking a simple vote or poll, then friending on a social network, subscribing to a newsletter, commenting and finally creating or embedding content in their own social spaces.
On the home page of Battlefront, we’ve got a really simple interactive word-cloud with issues that users have uploaded for others to vote on, creating an immediate call for participation to new visitors. In fact, most new visitors seem to come through Facebook, meaning that they’re responding to a specific call to action from the individual campaigners. On Yeardot, we deliberately tried to move users through to the individual contributors’ pages on Myspace, thinking that this is where the real conversation would happen. But we underestimated the complexity of this journey, and also the social barriers many feel in leaving a comment on a total stranger’s Myspace page, so after a few months we redesigned the main hub site to encourage commenting there as well.
I don’t think we’ve really explored this yet, though, and we’ll be looking at the entry and exit routes through these projects to understand more about how to gradually encourage and feedback on activity from users – the user journey needs to start from their streams, not our site, and end up back in their streams again.
Or, you could just make games…
If you really want to engage people, get them to participate, and get them to return again and again, you might as well make games. As Aleks Krotoski pointed out in her talk at Dconstruct last year, there has been a baffling lack of communication between web design and game design, although this is now happening, and quickly. My fellow commissioning editor, Alice Taylor, knows far more than me about games, and is commissioning some excellent projects, such as Bow Street Runner, Routesgame, and a project with SixToStart that I’m incredibly excited about, which will launch later this year. I hope Alice will write more about her experiences commissioning these projects on her blog in the next few months. But we should all learn from how the best games drag you in without making you read a manual; encourage early, simple interaction which is rewarded out of proportion to your effort; and then set you iterative challenges that get the balance between effort and reward just right. Amy Jo Kim and Jane McGonigal have both written inspiring accounts of how gaming metaphors can be applied elsewhere on the web, and in real life. Sometimes I think it’s only a matter of time until gaming becomes the main metaphor for most of our social interactions. And then sometimes, I think it’s already happened…
In the next essay, I’ll talk about the holy grail – turning users’ attention into valuable interactions. For many, this will mean getting money out of them, but as I work for a public service broadcaster, I’m going to talk about more intangible things – how you know if people are learning about themselves, their lives, the world around them; and whether they’ve been inspired to act upon and change things as a result. Not too much to aim for, then…
[This is the second part of a short series, based on a talk I gave at MIPTV in March 2009, sharing some insights from our commissioning social media projects at Channel 4 Education]
“2008 is the year we hit Peak Attention. You can either carry on encountering as much as you do now, giving every input less and less attention every year, or you can start managing it, keeping some back to take long-haul attention flights. What are the consequences of living post-Peak Attention? Nobody will be able to understand anything hard unless they make sacrifices.”
When I started at C4 in 2007, the first challenge we faced was how to get the attention of the 14-19 age group we were targeting. Up until Jan 2008, C4 Education made (brilliant) TV programmes that were broadcast in the morning schedule, during term time – a throwback to earlier Education programming that was aimed directly at teachers and classrooms. About 6 years ago, the strategy shifted to trying to reach teens directly, making this morning slot an anachronism, as most of the target audience would have been in work, school or college. Janey Walker took over C4 Education in 2006, and realised that TV just wasn’t the right platform anymore, at least not for this audience, with these slots. This led her to make one of the most radical decisions in UK broadcasting – she decided that from Jan 2008, the entire budget (about £6m) for the department would go on cross-platform projects, trying to reach teens in the spaces where they spend their attention, rather than this rather empty part of the morning schedule.
A major broadcast channel like C4 has a number of routes to getting attention – a prime-time slot, an established programme brand, hiring a major talent/celebrity, or a massive above the line marketing push. We didn’t have any of these, so had to go out into the wild web and try and drum up as much attention as we could in whatever ways we could. Over the last 18 months we’ve tried a number of things – partnering with spaces where teens are paying attention, like Myspace and Bebo; distributing content over many different 3rd party networks; making content available for download to podcasts and mobile phones; seeding content in existing special interest communities; running competitions with glamorous prizes – to be honest, I’d fly a plane over a city with our project URL on it if I thought that would bring it to people’s attention. Not all of this worked, obviously. The traffic to our sites isn’t anywhere near the kind of traffic that C4 gets for Hollyoaks or Skins. But here’s a few things I think we’ve learnt.
Design for streams, not for sites
Most people using the web, especially in younger age groups, now experience the web as streams, not sites. It might be the stream of updates in Facebook, or their contacts' Flickr photostreams, or a string of results on Google, or posts in an RSS reader.
The average number of sites people regularly visit is generally reckoned to be five or six, and most of these are services that organise and stream information to you – email, social networks, search engines, media libraries (e.g. YouTube or iTunes) and so on. Every now and then a new site emerges that takes its place in this hallowed pantheon, but I wouldn't bet on your project being one of them. Much better to design content that plays nicely with streams – content that can be interesting and enticing as a one-line text result in a search query, and that doesn't mind being broken up into small pieces (cf. Matt Jones's reference to the Coke bottle in my earlier post). This is much harder for narrative projects than for functional ones, as storytelling tends to rely on controlling the context for its impact – think of how the audience is controlled in a theatre or cinema, or how television has built its aesthetics around its ownership of the living room and the strictures of scheduling.
I don’t think we’ve really got this right in any of our projects yet, but it’s the thing I’m most interested in exploring and playing with. Funnily enough, I was playing with similar ideas in a project with Tim Etchells in 2000, called Surrender Control, which assumed that we had no control over the audience or context at all, and tried to create an intriguing piece of theatre using just 40 SMS texts. But maybe this is going too far – maybe you need to base a story in a solid context, but let it be manipulated, ripped up and shared around the web as much as your users want. Dan Hill summed this up beautifully in a post about the social-media ‘ripples’ around LOST a few years ago. In the end, I don’t think there’s one model here, just a set of principles – design your content to play nicely with streams, or to be ‘spreadable‘, in Henry Jenkins’ terms.
Don’t be snotty about marketing
This is a simple one, really. Too many people in the storytelling business look down on marketing. Some people I know in web startups think that marketing is a tax you pay for having an inferior project. Bollocks. The reality is, in a post-Peak-Attention world, sitting there expecting people to discover your genius is frankly naive. We’ve had our digital agencies and TV companies working as one team, making decisions together about casting, shooting, web design and marketing. It’s all the same project, and it should be the same people making decisions about everything. Marketing now is just another way that people find out about you – another ripple in the stream. It’s too important to draw an imaginary line that fetishises some activity as ‘creative’ and dismisses the rest as mere ‘marketing’. If you do that, you’ll miss the opportunity to be really creative with marketing, and you’ll fail to market your creative genius. Think of the Coke bottle again – every fragment and shard of the project should be as beautiful, enigmatic and thrilling as the whole.
One last piece of advice – take your time. Trying to mimic TV and get millions of people online to look at the same thing at the same time would be hugely expensive and futile. In a spreadable-media world, it will probably take weeks or even months for people to find your project. Designing something that has to be experienced within a certain period, or synchronously with thousands of other people, is a nice idea, but it plays against the asynchronous nature of the internet. Unless you’re building a site that works alongside a major piece of event TV – think The X Factor or Strictly Come Dancing – just relax, and don’t worry about overnights.
Instead, launch the project early and often. Put out lots of little bits of content over time, and reward people who stick with you. Take the time to listen and work out why people are coming to the project – and, more importantly, why they’re not. Make it easy for newcomers to pick up the story at any point, and to view content in any order if they want to. Attention is far too precious a resource these days to act like a bouncer and pull across a velvet rope if people turn up too late. The dirty secret of the web is that, although it’s never been easier to publish, it’s never been easier to be ignored. Worship every bit of attention you get from your users – it’s their gift to you, not the other way around. Which brings me to the next issue – keeping attention. More on that in the next post.