Everyone knows the SXSW PanelPicker is too sprawling to actually navigate through conventional means like searching and browsing. No, we rely on social media, and the goodwill of those brave souls who actually hunt through the list for the gems. I’ve seen many of these lists, and it’s now the last day of SXSW voting. So I thought I’d compile a list of some of the lists I’ve spotted coming over the transom. Assume a media skew, since that’s my crowd. Feel free to add others (lists of picks, or just your picks) in the comments. Oh, and vote for my session, on what journalism can learn from science.
- Joy Mayer (Mizzou J-School): Journalism/community/education-themed
- Doreen Marchionni (Mizzou J-School): Journalism/social media-themed
- Cindy Royal (Texas State): Journalism-themed, w/ a tech & design twist (+ more picks from Cindy)
- Liz Henry (Bookmaniac): Culture-themed
- Patrick Ruffini (Engage DC): Politics-themed
- Linda Quach (Rovi): Connected TV-themed
- Maurice Cherry (3eighteen Media): panels featuring black folks
- Dana Herlihey | Evan Jones | Stephan Macleod (Stitch Media): Media-themed
- Michael Schaffer (ioStudio): Social media marketing-themed
- Laurel Miltner (PR2020): Marketing-themed
- Olivia Khalili (Cause Capitalism): Philanthropy/volunteerism-themed
- Jessica Lawrence (NY Tech Meetup): Entrepreneur-themed
- Nick Faber (BlogAds): Branding-themed
Media organizations are still grappling with the fact that on the Web, a page is only a tiny, tiny hyperlink away from any other. We’ve spent years trying to obfuscate this fact, at first refusing to “link out” beyond our artificial domain-spaces, then opening these “outside” links in new windows, building up elaborate schemes to suggest that these domain-spaces really are entirely separate places, that moving from one site to another really does require travel. As our media begin to disintegrate into their component parts – newspapers fragmenting into stories dissipating into excerpts breaking down, at last, into data – the last great refuge of the coherent, bundled mediascape of yesteryear has become the brand. Brand power – brand equities, brand identities, brand loyalties – will keep us relevant even after fragments of our content stand alone in the Webby wilderness, we’ve believed.
But what if it’s not true? What if the concept of “the brand” – which really kicked into high gear as the industrial era hit its stride – is eroding or devolving as well? Where would that leave us? What might it enable?
There’s some evidence that we hit Peak Brand long ago. Brands seem to have been reduced to products, becoming commoditized and generic. As easy as it’s become to “launch” a new one (Glenn Beck’s “The Blaze” seemed to explode out of the ether this month), it’s even easier to forget the dozens that seem to launch every moment (one year ago today, Mediagazer was heralding the launch of Fox’s mobile Hulu-killer, “Bitbop” – anyone remember that one?).
Today you can launch your media company and call it “TBD,” and folks will just shrug or chuckle. A year from now, some genius will launch a hot new media property called “Whatevs.” It’s only a matter of time until someone starts to go to all the trouble of launching a brand, and then realizes what they really need is a hashtag.
This matters because to us media folks, brands are still key to how we organize the digital universe in our heads, and it’s not at all clear that users think the same way. Now that we’ve started talking about Twitter “breaking” stories or users finding stories on Facebook, it’s become clear how porous the concept of a brand really is. Can we imagine a way to thrive in a universe where our brands might be invisible?
It also matters because concern for brand integrity can make us excessively risk-averse at a time when experimentation is vital. We’re in constant danger of designing media experiences that serve our brands first, our content second, and our users last. If every digital experience were crafted in isolation, freed from brand constraints, what would we be able to do? If we designed every page, every popup, and every app by asking how the brand might serve the user experience rather than vice-versa, where would that lead us?
Even if all of the above is true, brands aren’t going anywhere. AOL, among others, is doubling down on a brand-driven strategy. So we also have to imagine our roles as media organizations and media individuals in a world where brands only become bigger and more important, where in many users’ minds, our orgs are just “micro-brands” feeding the “umbrella brands” of Twitter/Facebook/Google/Bitbop/etc.
All these questions and suppositions are just teasers. For the real exploration of these ideas, you’ll have to vote for (and then attend) the session I’m hoping to lead with Megan Garber at this year’s Online News Association conference: “The Brand Is Dead, Long Live the Brand!” I promise great fun, tweetable nuggets of insight, and some rollicking surprises.
My entry for the legendary David Cohn’s Carnival of Journalism. Here’s his setup:
One of the Knight Commission’s recommendations is to “Increase the role of higher education… as hubs of journalistic activity.” Another is to “integrate digital and media literacy as critical elements for education at all levels through collaboration among federal, state, and local education officials.”
Okay – great recommendations. But how do we actually make it happen? What does this look like? What University programs are doing it right? What can be improved and what would be your ideal scenario? Or is this recommendation wrong to begin with? No box here to write inside of.
Get serious about self-driven learning.
We’re just at the beginning of an amazing moment for self-directed education. Dedicated auto-didacts can already sift through a lifetime of courses on everything from astrophysics to zoological medicine. For the most part, though, the world of self-guided learning still feels like the Wild West. Lectures are only spottily available; coursepacks are incomplete. Little guidance is provided for assembling a curriculum. Few mechanisms exist for students to work together, synchronously or not. And crucially, there are still very few markers of educational attainment that recognize these efforts. The self-taught astrophysics expert might have wonderful smarts, but no pathway to employment.
Schooling, at its best, isn’t about cramming our heads with facts. It’s about teaching us how to learn. And increasingly, it’s about teaching us how to learn as much as we want to, at our own pace. Colleges and universities must embrace that shift, and as they begin to open up, they should begin to design learning experiences that work not just for enrolled students, but for motivated cyber-auditors. We’re far from the moment when the advantages of a classical four-year liberal arts education environment can be easily matched by an online learning experience (in most situations; we’re probably already past that moment for disciplines such as computer programming). But colleges need to start preparing for it now, and figuring out how they’re going to thrive in that world.
A few years ago, the vision behind Wikipedia looked like a pitiful impossibility, probably the way Wikiversity looks today. I absolutely would not discount the possibility that a free, crowd-powered educational experience might become a formidable competitor to an expensive degree program, and sooner than you think. I hear the derisive guffaws of a thousand assistant professors, fresh off another long night of grading their students’ work. I used to know a journalist or two who thought that way.
Teach “Your City 101.”
Most undergrads are dropped into higher-ed institutions with little appreciation for the workings of the city that houses their school. Meanwhile, every college has a surplus of longtime faculty who are gadflies in their communities, deeply immersed in local civic life. Get these two factions hitched up. Offer an elective to all students – for credit – in the history and politics of the college town. And here’s a twist: invite local non-enrolled residents to take the course for free.
Students will be able to transcend the usual town/gown divisions with a greater awareness of their effect on their city. Locals will develop a systemic understanding of the civic machinery that powers the place they call home. (You might even get a few would-be lifelong learners interested in the charms of the extension school.) All involved will get an introduction to the all-important discipline of micropolitics.
Make local news the textbook. Collaborate with a metro editor on the syllabus. Prove that city planning is just as exciting as it looks on The Wire.
Build a local wiki.
Journalism professors: the next time you get the idea to do an experimental one-off academic journalism project, consider building something that might outlive your class. Already, academic environments have been fertile incubators for some of the best local wikis. When three UC Davis students decided to create DavisWiki in 2004, they probably didn’t guess the site would live far beyond their time in Davis, CA. It’s gone on to become an indispensable resource for residents, the site of a collaborative news investigation that won attention from the likes of the NYTimes. For a while, an Omaha professor of marketing led a class of students in the construction and maintenance of a local wiki for that city.1
Wiki software and design remain abysmal – a problem that I hope DavisWiki founder Philip Neustrom is able to crack with his Knight-funded LocalWiki project. But that hasn’t prevented several existing city wikis from becoming impressive local guides, and I suspect that what exists is only the beginning of what could be if universities truly jumped on this bandwagon. The trio of (1) better wiki interfaces, (2) the dedication of student journalists and professors, and (3) a journalistic sensibility could make a college-powered local wiki a marvel to behold.
In fact, forget the local news. Aim to make your local wiki the textbook for Your City 101.
Journalism students should blog.
This little sub-head will probably inspire everyone who attends, teaches at, or works for a journalism school to rise up with a thousand counterarguments, many of which have some merit. “But many j-school students already do blog!” “Who reads blogs?!” “How will this help our information needs?!”
First, most j-school students don’t really blog. Many contribute occasional posts to blogs. Few learn the lessons of the format. I’ll make the case that blogging is the most malleable container for journalism today. A blog post can take the form of a classical inverted pyramid story, a video broadcast, an audio podcast, a photoessay, a bulleted list, a Q&A, or any of hundreds of other story formats you could devise. So students who aspire to report in any other format can train on a blog. But they gain many other powers that I think will be essential to tomorrow’s journalism: a sustained relationship with a crowd, an understanding of the potential life of a news story, the immortal hyperlink.
Second – and here’s where I really think society could benefit – I think that journalists will increasingly find success in specialization. Blogging encourages us to find a niche, to accumulate expertise and share our deepening understanding of a subject. The last half-century of journalism produced an army of generalists skilled at storytelling and parachuting, but bereft of the subject knowledge necessary to deliver what we need most – context. I hope that journalism finds its renaissance in a new generation of truth-chasers driven to grasp the bigger picture and render it in living stories we can all follow.
J-school profs, help your industry colleagues out.
The extent to which more industry-pointed research could help satisfy the information needs of society is arguable, but it would sure help journalists. More here.
- OmahaWiki is sadly no longer in existence. It’s been replaced by a somewhat sad-looking, thoroughly Monetized™ site called “TownCommons” that clearly harbors aspirations beyond Nebraska. [↩]
A call for help from the academy
For the past couple of days, I’ve been sitting in on the International Symposium on Online Journalism at the University of Texas. This gathering has piqued my interest more with each successive year. My past experience of the event has been watching from afar as brilliant folks from domestic and international journalism schools and organizations have delivered absorbing presentations on how journalism is changing and how society is responding to those changes. I’m glad I finally have the good fortune to attend.
Being here has given me the kick in the pants to write something I’ve had in mind for a while now.
The word “symposium” tends to produce smirks in the newsroom. During the year I spent at RJI, there were times I could palpably feel my newsroom colleagues suppressing eye rolls as I described the research being conducted by professors all around me. A strain of proud anti-intellectualism runs strong in the American press, where political reporters casually disdain political scientists, and working journalists fantasize about “retiring” to academia – retreating from the rigors of the field to navel-gaze about it for a spell.
My own career has straddled the worlds of industry and scholarship – in addition to my fellowship year at RJI, my first paid job in journalism was at Poynter. Several of the most brilliant journalists I know are now in the academy, shaping minds and asking far-sighted, important questions. I value the insight of folks like Jay Rosen – a journalism professor who’s never been a reporter – because they were forcefully articulating visions 20 years ago that the long-begrudging industry is finally accepting among its brightest hopes today. So I’m a fierce defender of the academy and its relevance to the discipline.
My belief in that relevance animates the requests I make today. Because I also believe the academy can do more for us – much more. Having hung around journalism schools a fair amount, I think they have a lot of surplus capacity to make our work better, and I intend to tap it, if you’ll let me.
I’m working on this project for NPR that reminds me every day how much we don’t know about the journalism we’re trying to create. Don’t get me wrong – there’s a lot we do know, and one of the goals of this project is to facilitate the sharing of that knowledge among NPR, its member stations, and others in our network. But we’ve developed most of that knowledge in the school of hard knocks, using crude derivations from broad metrics, responding to rough trends in user behavior, slowly grasping for something better than we have.
As we make specific decisions about engineering our sites, I’m frustrated by how many decisions we have to make on the basis of hunches, rather than data. Our work raises a steady stream of questions with no readily available answers, only our suppositions drawn from our experiences. What does a user want most when she encounters a topic page? How do we organize a series of links in a way that least taxes attention? What signals on a blog post are most effective at promoting sharing?
My teammates and I are advocates and practitioners of usability testing, A/B testing, eyetracking, and the like, and we’re incorporating those tools into our planning. We’re using an Agile approach to incorporate our on-the-fly learning into our ongoing site development. NPR is the most delightfully research-friendly news organization I’ve ever had the pleasure of working for. And of course, we’ll be watching our analytics like hawks.
But an Omniture report is no substitute for a peer-reviewed study, and our research powers pale in comparison to those of the collective journalism academy. So help me out. I’ve got five requests – some bigger-picture, others more focused. If you think you can help with any of these, please give me a shout.
1. Help us build research into our practice.
As we construct our Argo sites, we’ll be building in hooks for research, devising ideas for A/B testing, and tweaking the sites in response to findings we uncover in real time. But we won’t have the backgrounds or bandwidth to develop studies that produce lasting insight. We’ll lean on our research team as much as we can to help reveal the motivations behind user behavior, but I suspect a collaboration with a top-flight academic team could supercharge what we might discover. I’d love to partner with some numerophilic doctoral students who could geek out over our traffic reports and divine long-term patterns across our sites that might escape our daily perusal.
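To make the “hooks for research” idea a bit more concrete, here’s a minimal sketch of the kind of A/B-testing plumbing a site like this might build in. Everything here is illustrative – the function names, experiment labels, and variant names are my own assumptions, not anything from NPR’s actual codebase. The key idea is deterministic assignment: hashing a user ID together with an experiment name gives each visitor a stable variant without storing any state.

```python
import hashlib

def ab_bucket(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically assign a user to an experiment variant.

    Hashing the user ID together with the experiment name keeps each
    visitor's assignment stable across visits, with no server-side
    state, and lets different experiments bucket users independently.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same user always lands in the same bucket for a given experiment...
bucket = ab_bucket("user-123", "topic-page-layout")
assert bucket == ab_bucket("user-123", "topic-page-layout")
# ...but may land in a different bucket for a different experiment.
other = ab_bucket("user-123", "related-links-placement")
```

A researcher partnering with a newsroom could then log `(user, experiment, bucket, outcome)` tuples and analyze them offline – which is exactly the division of labor suggested above: the site supplies the instrumentation, the academic team supplies the study design.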
2. Organize (and open up) your findings so they find us.
I have a strong suspicion that several of the questions I posed above have answers – or at least the beginnings of them – in the existing scholarly literature. But the world of journalism scholarship is so dispersed that I have no idea where to go to find that literature.
It’s disheartening that the academy hasn’t gotten farther along in organizing its work on the Web for maximum impact and findability. Each year, PEJ’s State of the News Media report achieves ubiquity in newsrooms – partly because of its enormous scope, but in no small part because of the terrific elegance of its presentation online, with its consistent, easy-to-understand taxonomy. Why have none of the major academic journalism institutions built on this idea for other domains of scholarly research?
At the very least, there should be a Docuticker for journalism alone – a hub collecting and cataloguing the flood of research around journalism that flows daily. Something even more ambitious could easily become a mainstay of newsroom planning.
3. Synthesize your work.
Malcolm Gladwell sparked a cottage industry of fantastic storytellers who could translate ideas and insights from the academic world for a non-scholarly audience. Another reason for the State of the News Media’s impact is the fact that Tom Rosenstiel and company work with economical, lucid writers to convey complicated points.
In the design and technology worlds, scholars have developed a tremendous ecosystem for processing findings from the lab into insights for practitioners. Sites such as SmashingMagazine.com are brilliant at distilling the best information from the world of research into wonderfully useful guidance for Web creators. What’s keeping us from developing a similar information ecosystem for the news industry?
I saw a number of studies this weekend that working journalists would find fascinating and helpful. Yet they’re not available in forms I’d feel comfortable sending around the newsroom. In fact, I’ve never seen scholarship cited in the newsroom that wasn’t accompanied by a readable narrative translation of its findings. I understand that most scholarship is pointed at the academy rather than the industry. But that shouldn’t preclude industry-relevant conclusions from being written up in industry-readable language.
4. Give us a place to ask you questions.
This blog is something of a poor platform from which to deliver a request for help. I’d love to be able to target more focused research queries to domain-specific professors and students. Meanwhile, on the flip side, I’ve worked with and heard from professors and grad students hungry for ideas for research. There’s a disconnect here that makes no sense.
An Ask MetaFilter or Stack Overflow for news research could help foster a marvelous connection between academics and professionals. If a journalism department somewhere were to pioneer that type of forum and commit to addressing queries that came in (even if the answer is, “We have no answer”), I promise I’ll be the first to sign up.
5. Help us devise topic-based story structures.
Topic pages are janky. We know this. With a few exceptions, news site topic pages are good at one thing – attracting Google juice.
I’ll discuss in detail some of my problems with topic pages in another post, but before we can build a better topic page (or topic bundle), we need to understand some specifics. How do the goals of users who arrive at topic pages via search differ from those of users who reach them from our site? Are users generally attracted or repelled by the diversity of content on topic pages, and how does design play into those reactions? Is there an ontology of topic page types? What are their effects?
I have tons more questions along these lines, but I won’t crowd this post with them. Suffice it to say, if questions like these interest you, please e-mail me.
Longtime readers of this site probably know that I’ll be speaking on a panel at SXSW on Monday with NYU’s Jay Rosen, Apture’s Tristan Harris and paidContent’s Staci Kramer about the future of context. I trust that if you’ve been reading and you’ll be in Austin for SXSW, you’ll be in Hilton H on Monday morning at 9:30. This is a preview of my opening argument for the panel. If this seems like familiar territory for me, don’t worry, the panel is going to cover plenty of untrodden territory. And the session will be all the better if you share your thoughts and questions in the thread below. Also see Jay’s conversation-starter here.
If you’re like most people, you have a certain amount of ambient knowledge that health-care reform is happening. You pay attention to headlines, and you see a lot of stories about Nancy Pelosi saying this, or Mitch McConnell saying that. You catch a line or two about it in a Presidential address. You’ve watched some headlines about it in the evening news.
Chances are that most of the information you’ve encountered about this subject has been what I’d call episodic. Over time, you may have heard a lot about budget reconciliation, insurance premium hikes, the public option, the excise tax, the Wyden-Bennett bill, the Stupak amendment, and on and on and on. You know that Democrats are trying to do something to the health care system, but it’s either a government takeover or an insurance industry giveaway. Hard to tell.
This constant torrent of episodic information is how many of us encounter information about current events. This has been true for as long as any of us has been alive, but in the wake of the real-time Web, it’s become ever more constant and ever more torrential.
Hundreds of headlines wash over us every day. And part of why many of us engage in this flow is because we have faith that over time, this torrent of episodic knowledge is going to cohere into something more significant: a framework for genuinely understanding an issue. And we live with it ’cause it sort of works. Eventually you hear enough buzzwords like “single-payer” and “public option” and you start to feel like you can play along.
But mounting evidence indicates that this approach to information is actually totally debilitating. Faced with a flood of headlines on an ever-increasing variety of topics, we shut off. We turn to news that doesn’t require much understanding – crime, traffic, weather – or we turn off the news altogether.
It turns out that in order for information about things like the public option and budget reconciliation to be useful to you, you need a certain amount of systemic knowledge to be able to parse it. You need an intellectual framework for understanding health care reform before the episodic headlines relating to health care reform make any sense.
It further turns out that this systemic knowledge is actually a whole lot easier to provide than the episodic stuff. At the pace of daily news, health care reform seems really, really complicated. But one of the most knowledgeable journalists reporting on the health-care process has already distilled almost every health care system in the world into four essential types. It would take maybe ten minutes to fill in the details on this framework, but once you get that knowledge, it suddenly becomes a lot easier to understand the system we have in the US, and the system that the Democrats are trying to turn ours into. From there, all those headlines about “bending the curve” actually start to make sense.
Right now, the most common way the news industry attempts to impart systemic knowledge is by wedging it into our episodic reports. We’ll give you tons of stories on Congresspeople sneezing something that sounds like “reconciliation” and every time, a little ways in, we’ll say something like, “Reconciliation is a procedural tactic originally designed to speed adoption of budget resolutions through Congress.”
This is completely bass-ackwards. Journalists spend a ton of time trying to acquire the systemic knowledge we need to report an issue, yet we dribble it out in stingy bits between lots and lots of worthless, episodic updates. We do this for several reasons – high among which is your continued willingness to read story after story and watch ad after ad to get updates we could sum up in a sentence – and also high among which is the fact that we used to deal exclusively in media that are pretty rigidly bounded by time. The only way we knew how to tell the story was in terms of “What happens next?” not in terms of “What’s happening.”
These terms I’ve been using – “intellectual framework,” “systemic information,” etc. – this is what I mean when I say “context.” I’ve pitched you on the consumer benefits of context, but information creators are also slowly beginning to come around to the long-term ROI of delivering context as well, for several reasons. For one thing, our information becomes much more valuable and much more desirable to you as your framework for understanding it becomes better. Jay Rosen has astutely noted the uptick in attention to financial crisis stories after This American Life’s Giant Pool of Money episode laid out the context of the crisis. For another thing, the success of Wikipedia and the enduring popularity of items like “The Ultimate Guide to Everything You Need to Know About Social Media” has taught us there’s a real market for context. There are also significant advertising benefits to having more sophisticated structures for information than “latest updates.” We could dwell on the “why” for a long time.
But I want to use our time at SXSW to explore a more forward-pointed question: How?
For the first time, we have a medium perfectly equipped to capture and deliver both episodic and systemic information. How will these two modes of information interact on the Web? What sort of design and storytelling structures must we invent to impart context? Fundamentally, in a medium that’s not constrained by time, what is the future of the Timeless Web?
Help make our panel better. What are your thoughts, and what are your questions?
Folks are emailing/Tweeting over links to Google’s “Living Stories” prototypes, done in collaboration with the New York Times and Washington Post. I’m about to hop a plane to Amsterdam to give a talk about the future of context, in which this idea plays a prominent role (as you know), so I figure I should lend some thoughts. (Update: Had to board before I finished the post, so I’m publishing from Amsterdam. Hoi!)
First, all the organizations involved deserve props for looking beyond the current news story format. Even with all its flaws, the static news article on the Web is an overwhelmingly dominant paradigm. To reimagine it – especially from within the walls of a giant, classical institution – takes vision.
Second, it’s not the most impressive incarnation of the ideas behind it. It feels a touch austere, like the quiet tinkerings of a Google engineer’s idle hours. I say that having built something much like it (without some of the cool bits). In fact, Columbia Tomorrow probably felt the same way to the folks who viewed it – “All those big ideas, and this is the product?”
The lack of sizzle is evident in Howie Kurtz’s story about the project. He calls it “a new online tool that, well, isn’t exactly going to revolutionize journalism.” I think NYT digital CEO Martin Nisenholtz gets it about right in the Times story about the initiative: “In it,” he says, “you can see the germ of something quite interesting.”
I don’t think the fact that it’s still only a “germ” at this point diminishes the thought or work that’s gone into these efforts. We really haven’t built anything quite like this before. Inventing the future takes time! And I suspect the first time many people laid eyes on Wikipedia, their reaction was much the same: Some fancy encyclopedia you got here. Um, there’s a typo on the “List of Goonies characters” page.
So I’m tremendously heartened by the fact that influential organizations are starting to act on these ideas. Every groping step away from the conceptual and toward the concrete pushes this conversation forward. The basic question – “What might this look like?” – becomes less relevant, leaving room for bolder and more interesting questions to sprout.
Right now, the main reaction flitting around in my head is this: both Columbia Tomorrow and Google’s living stories seem, from one angle, like a retreat from Wikipedia rather than a step toward (or beyond) it. They’re tugging the radical reality of the Wikipedia topic page – pure, organized, ever-changing – back to a somewhat familiar, news-oriented frame. What if we started with a Wikipedia topic page, and began to imagine how a newsroom could improve that? How might we improve the storytelling? What might the talk page become? What would bring people back to follow the story as it progresses?
Footnote: By the way, Danny Sullivan has the best take I’ve seen, if you want a read on how “Living stories” work.
This is the keynote address I gave last Saturday at the Twin Cities Media Alliance Fall Forum. Please excuse the bad audio quality, like the thumps every time I advance a slide. I might record a better-quality version when I’ve got a moment.1 A transcript of the remarks is below the fold.
Sorry, Gina, but it strikes me that WikiCity could serve as a poster child for what’s generally wrong with the direction of hyper-local news efforts. Once again, what we’re seeing is a quasi-franchise business model based on selling low-CPM ads against freely generated content. Nothing special.
Spend any time wandering around WikiCity, and what you find is the same dog who doesn’t bark. No sense of each town’s quirkiness; no sense of place. Instead of a local cafe where the cook knows you like your eggs scrambled, you get an Egg McMuffin.
I’ll allow myself some snark here, ’cause I think it’s deserved. I would bet that most of what you need to know about this acquisition can be gleaned from this sentence in the World-Herald’s article about it: “WikiCity (http://www.wikicity.com) has more than 13 million Web site pages and is one of the largest ‘wikis’ in the world.”
How many of those millions of “Web site pages” do you think was ever touched by a real person? And how many will ever be seen by a single person?
WikiCity in its current state strikes me as a textbook example of a site built by robots. Such sites tend, in my experience, to appeal mostly to other robots.
Contrast it to Wikipedia, whose every page was built, word by word, link by link, on the actions of individual people. Or to Everyblock, whose pages run on powerful algorithms, lovingly engineered and hand-polished by a brilliant and careful team of makers. These are large sites built on millions of niches, but neither was built that way to start. Wikipedia began as a small collection of pages that became a massive collection over time. Everyblock started as a selection of data sets in a handful of cities, and has grown over the years to encompass hundreds of data sets in more than a dozen cities. They started small and built up, like every success story I know, rather than the reverse, which is the WikiCity approach.
“Scaling down” remains a problem for the Web, on site after site. Sites such as Wikipedia and Delicious function beautifully in domains where they can garner enough attention. If a Wikipedia topic is significant enough to draw the interest of even a dozen editors in a few months, chances are it will be pretty decent. But the more niche you get on Wikipedia1, the shallower and spottier the pages become. Look for a popular topic like “usability” on Delicious, and you’ll find a wonderfully curated selection of links, courtesy of the wisdom of crowds. But for a significant topic outside the site’s core niche of designers and techies, Delicious underperforms.
Howard Owens has written passionate criticisms of approaches to “hyperlocal” news that start with a giant, anonymous maze of computer-generated pages, all alike, all imagining that users will spontaneously arrive to populate their pages with genuine, quality material. Everything I’ve seen tells me Howard’s criticisms are right. These efforts are attempts to bring a mass mentality to a niche world. I’ve never seen a successful wiki that wasn’t built like Wikipedia, from the bottom up, page by page.
If I were advising the World-Herald, I’d tell them to reboot WikiCity and start building a wiki just for Omaha. Better yet, start with just one of the city’s six regions. Build on what you can from Wikipedia – giving proper attribution, of course – but begin with the understanding that it’s not going to be very complete just yet. Assign someone to add as much information as they can to the site every day. Create a content plan to prioritize what information you’ll pursue first. Early on, create pages for the most trenchant issues affecting the neighborhood; diligently and prominently link to those pages when the issues appear in your coverage.
For months, I expect this exercise will seem like a never-ending, pointless slog, and no one will join in. After a few months, your traffic will still be underwhelming, but you’ll notice a tiny stream of fellow-travelers who’ll timidly participate here or there. Keep at it, and in a year, you’ll have a small but dedicated community. And you will probably have built something more significant than you realize. After two years, it will begin to seem like it was worth the investment.
Come to think of it, that last paragraph could probably be applied to most successful businesses on the Web.
¹ That’s my neighborhood
Now comes the fun part. Over the next couple of months, I’ll be setting up a website for the panel, which I hope will be a great resource for anyone looking for what’s being tried and what’s needed to create a more contextual Web. There, we’ll begin collaboratively setting the agenda for the panel. I hope you’ll all take part in that process, and I look forward to seeing many of you in Austin in March! Thanks again for voting.
For the longest time, reading the news has left me with the depressing sensation of lacking the background I need to understand the stories that seem truly important. Day after day would bring front pages with headlines trumpeting new developments out of city hall, and day after day I’d fruitlessly comb through the stories for an explanation of their relevance, history or import. Nut grafs seemed to provide only enough information for me to realize the story was out of my depth.
I came to think of following the news as requiring a decoder ring, attainable only through years of reading news stories and looking for patterns, accumulating knowledge like so many cereal box tops I could someday cash in for the prize of basic understanding. Meanwhile, though, with the advancements of the Web and cable news, the pace of new headlines was accelerating—from daily to minute-by-minute—and I had no idea how I’d ever begin to catch up.
In 2008, I encountered a study describing others from my generation who seemed to share my dilemma. The Associated Press had commissioned professional anthropologists to track and analyze the behavior of a group of young media consumers. Their key conclusion: “The subjects were overloaded with facts and updates and were having trouble moving more deeply into the background and resolution of news stories.”
The study’s participants seemed to respond to this ever-deepening ocean of news much like I had. We would shy away from stories that seemed to require a years-long familiarity with the news and incline instead toward ephemeral stories that didn’t take much background to understand—crime news, sports updates, celebrity gossip. This approach gave us plenty to talk about with friends, but I sensed it left us deprived of a broader understanding of a range of important issues that affect us without our knowing.