I should start a category in this blog called “the spilt
milk department”. I’d include all the great ideas I had in recent
years that were never developed or written about. I could make up
excuses and say I’ve been busy (I have) or that I’m lazy (I am), but the
fact is things evolve quite rapidly on the interwebz: it’s easy to
blurt out a few ideas that quickly dissolve into cyber-oblivion,
while building a solid concept takes time. But the rule
of thumb is “just do it”, not “procrastinate until it’s perfect in
your mind”. Enough crying. Here goes:
Two years ago I thought about how we could trace back to the original
source of information, mostly news, on the web: the first tweet, the
igniting blog post, the seminal article that got shared, sliced to
quotes, linked to, built upon, changed, remixed, archived. Not only is
it a good way to understand the current online ecosystem; if analyzed
correctly, this flow could provide new insights for news content
distribution strategies, and a real assessment of the
impact of a specific event on social media (all media is social these
days, by the way). I presented
my thoughts about this last year to a class of MA students in their Cyberjournalism class at Porto University and the plan was to build a tool that would track that flow from the very start.
Of course, easier said than done, except at the NYTimes, where they
built a tool quite similar to my original concept, called Cascade.
They did it in 3D, like I wanted to, and it’s beautiful to look at. Check
their video, but read on afterwards to dive into my own concept, which is
similar in its basics but somewhat different.
The way information about a news event is distributed has
changed dramatically in the last years. The so-called traditional media
are no longer the driving force in this process; they have been replaced,
in part, by the active participation of users, who became creators,
distributors, and sharers of news content, using tools like Twitter,
YouTube, Facebook, blogs and other social networks.
We already know this, but what I’m looking for with this proposal is a
model that provides a clearer view of this change, its consequences,
and how media should rework their strategy around this decentralized logic
and make the most of it. Understanding that logic allows a better,
more profitable management of their resources in gathering and
then distributing information: leaving behind their central role as
the source of all journalistic content, and instead intervening in
different parts and ways along the flow. It’s the end of the mainstream
media concept and their transformation into stream media.
The ideas presented here were the basis for a prototype that could
be validated and improved with a more scientific and empirical approach.
My goal was to create a three-dimensional, dynamic visualization that
corresponded to reality. The NYTimes project presented above is a good
example of that. But for now, I’ll use my drafts, which are quite
simple, to give a basic perspective on the current and future changes in
the news paradigm.
In the age of pyramids

The old paradigm
For decades, news was built and distributed in the same way, and we
have to understand the classic news model that dominated from the
industrialization of journalism, especially in the 20th century, and that
remained relatively unchanged even with the coming of new media like
radio or television. The breaking point came with the internet and the
Digital Revolution.
Media as source of the information flow
In a pretty simple way, this is how it worked:
- Event;
- Journalists gathered information in the field, from the people involved in the story, witnesses, and official entities;
- The information was edited and published;
- The audience had access to it in print the next day, or in the following news segments on TV or radio;
- After consuming the news, the information could be discussed by
readers/listeners/viewers in a more or less private way, in
direct, interpersonal relationships. It could be recovered
in new news pieces if new developments occurred.
And its useful life ended here, not long after it was created. Unless
audience members created their own archives, there weren’t many
chances to recover or reuse that information, since all content (or most
of it) was archived in the possession of media companies. The main
feature of this half-life of news was its perishability. Only the more
relevant events became part of the audience’s collective memory,
and even then in a scattered, individualized and disorganized way, in most cases.
The very ritualization of the journalistic process, with news at
fixed hours and institutional standing alongside communities and
sovereign bodies, helped keep it closed and limited to a reduced
number of people who determined the degree of importance of a specific
event, following rules and guides almost clerically. They were the
defenders of the public interest, the makers of public opinion, and the power
to reach the masses granted them the status of the Fourth Estate. News
consumers participated in the information routine simultaneously as
actors and public, with very little interference in the practice of news
professionals, who held absolute control over what should be
published and how.
The media were the hub where all news content was created, and all their
work was directed from top to bottom, in a pyramidal structure, whether
inside news organizations, in content distribution, or in content
building (the inverted pyramid), and they were its keepers. But with the
new technologies it all collapsed, and we watched the horizontalization
of the news process, which now also occurs beyond the borders of the
traditional journalistic structures.
From carpet bombing communication to relational communication
Relational
- Of or arising from kinship.
- Indicating or constituting relation.
So if in the previous paradigm information was static, closed, finite
and short-lived, the current situation is pretty much the
opposite. Where the audience was once indiscriminately bombarded with
information, the internet provided room for niches with specialized
information. And links and the link economy changed everything: we can
comment on and quote a specific piece of information while providing
immediate access to it. We share and point the path to information. Add the
social media/aggregation/recommendation/distribution tools, and we are no
longer passive elements in the news flow, but active characters in the
creation and distribution of information.
With the 24-hour news cycle and the permanent breaking-news status
(news is no longer “breaking” these days, it’s just “happening”;
news can be “exclusive”, but basing its importance on a time factor
is, to say the least, irrelevant), the pressure on media to keep the
information flowing has become intense. For example, the traditional news
cycles of newspapers no longer meet the needs of a permanently
connected audience, and neither does ritualized information
published at a specific moment when it’s built with “breaking news” in mind.
Another thing that doesn’t work is closed content, or content that
cannot be shared or distributed across different platforms. Facebook has
become an important place to access information, where individual
articles from specific brands chosen by users are shared and commented
on by personal networks immediately. Twitter was probably the first place
where this happened on a massive scale. Recommendation became a simple
process that took news from media platforms to individual
platforms, which are based on a sharing logic. Check this
Pew Research Center report about how people navigate news online.
What we are watching today is the “user curation of content”, where
news spreads faster and farther than it ever did, simply because users are
no longer mere recipients of indiscriminate information but active
participants in its distribution. This new ecosystem broke the previous
model, where media stood on a stand bombarding the audience with information
in a one-way relationship, and moved us to a situation where exchange is
the rule.
And media are no longer the sole source of news; just remember how
many events were first transmitted by users through social networks and
online tools, only to be picked up by news pros afterwards: earthquakes,
the Iran election and, more recently,
Bin Laden’s death are great examples of the role of users and social media in the distribution of news.
With these factors in mind, I thought about how we could visualize
the flow of information, from the very first tweet, post, video, etc.
The upward spiral

a not-as-cool-looking representation of the news flow
The main difference between my idea and the NYTimes visualization is
that I thought of a spiral instead of ramifications of content. This
seemed the most effective way because I had a few parameters in
mind: time, range and audience attention. In my draft, Facebook wasn’t
considered because it was 2009 and it wasn’t as important in the flow
as Twitter was, and blogs had more relevance in this ecosystem.
So, this is how it works: at the epicentre there’s the event and the first
tweets and posts, picked up afterwards by media as breaking news, then
retweets/shares in social networks, comments, more media articles, new
user-generated content based on new information or built upon the
existing one, and so on and on. Audience attention is highest at the
beginning, and it fades as the flow widens. This is not a process closed
in time, because information can be recovered and reutilized days,
weeks, months, even years later, which is another feature of digital content:
it’s perennial, and database or archive journalism is something that
has been growing recently.
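As a rough sketch, each item in that flow could be placed on such a spiral by hand: angle advancing with time, radius widening with reach, height standing in for attention. Everything here (the turn rate, the decay constant, the function names) is my own assumption, not something from Cascade or any real tool:

```python
import math

def spiral_point(hours_since_event, reach, turns_per_day=4.0):
    """Map one content item to a point on the news spiral.

    hours_since_event -- time elapsed since the originating event
    reach             -- how far the item travelled (e.g. shares + links)

    The angle advances with time, the radius widens as the flow
    spreads, and the height stands in for audience attention,
    which fades as the story ages.
    """
    angle = 2 * math.pi * turns_per_day * (hours_since_event / 24.0)
    radius = 1.0 + math.log1p(reach)                 # flow widens with reach
    attention = math.exp(-hours_since_event / 48.0)  # fades over ~2 days
    x = radius * math.cos(angle)
    y = radius * math.sin(angle)
    z = attention
    return (x, y, z)

# the first tweet, minutes after the event, vs. a widely shared article a day later
print(spiral_point(0.2, 3))
print(spiral_point(24.0, 500))
```

The later, more-shared item lands farther from the axis but lower down, which is exactly the “attention is highest at the beginning, and it fades as the flow widens” shape described above.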

Side view. The arrows mean "time"
These are measurable parameters, if only there were a tool to compile and calculate them…
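If each item in the flow carried a timestamp and a share count, compiling those parameters would be straightforward. A minimal sketch, with invented data and an arbitrary six-hour window for “early attention”:

```python
from datetime import datetime

# hypothetical items in one news flow: (timestamp, kind, shares)
flow = [
    (datetime(2011, 5, 2, 3, 30), "tweet",   12000),
    (datetime(2011, 5, 2, 3, 45), "article",  8000),
    (datetime(2011, 5, 3, 10, 0), "blog",      300),
]

def flow_metrics(items):
    """Compile the three parameters: time span, range, attention."""
    times = [t for t, _, _ in items]
    span_hours = (max(times) - min(times)).total_seconds() / 3600
    total_reach = sum(shares for _, _, shares in items)
    # crude attention curve: share of total reach in the first 6 hours
    start = min(times)
    early = sum(s for t, _, s in items
                if (t - start).total_seconds() <= 6 * 3600)
    return {"span_hours": span_hours,
            "reach": total_reach,
            "early_attention": early / total_reach}

print(flow_metrics(flow))
```

Even this toy version shows the pattern the drafts assume: almost all of the reach concentrates in the first hours, with a long, thin tail afterwards.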
Of course, in reality things might differ and the constant elements
may vary, because media companies can be the creators of the first
tweet, or other tools can be used (the Iran elections had a huge impact
also because of the YouTube videos made available by the protesters),
but the core idea is this: there is a root (or roots) that
generates more content, linking opportunities through sharing,
recommendation or referral, and the construction of new content based on the
pre-existing information.
What I had in mind, and what Cascade does superbly, is to track the
various connections and evaluate the ripple effect caused by a single
piece of content. Mapping the origins of information and assessing its
impact can be useful not only to validate that information but also to
develop new strategies for content distribution.
The Tornado Effect
The development of this flow, represented by a spiral, would create
what I call the Tornado Effect. Imagine a horizontal axis, a timeline
representing the event developing chronologically, and a vertical
axis where the information spirals upwards in time and connections. The
more numerous the connections and links, the higher the spiral; the
longer the interest lasts, the lengthier the timeline. Most events would
briefly touch down and dissolve away, while others would be powerful enough
to drag a significant number of users and platforms into them, or even
generate new tornadoes.
There is a scale to measure tornado power, the
Fujita scale, which goes from F0 to F5,
and I was thinking about having one applicable to these phenomena,
let’s say from G1 to G10 (the Gamela scale – meant as a joke, ok?).
Events like Michael Jackson’s death, the revolutions in Northern Africa,
or William and Kate’s wedding would be high up the scale; blog posts
and articles shared on Twitter by a small group of people within a short
period of time would be close to zero.
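Joke or not, such a scale is easy to operationalize. One way (entirely my own assumption, thresholds included) is to log-scale the total number of connections an event generated, so each grade up means roughly an order of magnitude more activity:

```python
import math

def gamela(connections):
    """Rate an event on the (tongue-in-cheek) Gamela scale, G1-G10.

    'connections' is the total count of tweets, shares, links and
    articles in the flow. A handful of shares lands at G1 or G2;
    the grade grows with each order of magnitude and caps at G10.
    """
    if connections < 1:
        return 1
    grade = 1 + int(math.log10(connections))
    return min(grade, 10)

print(gamela(40))         # a post shared by a few people: near the bottom
print(gamela(5_000_000))  # global-event territory: high up the scale
```

Like the Fujita scale, the point isn’t the exact number but having comparable grades across events.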
And if we selected paths of information, from an original source to
its various ramifications, like we can observe in Cascade, we would
have something like lightning inside the tornado, connecting the
related content dots along the spiral. Sounds fun, doesn’t it?
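Tracing one of those lightning paths is just a walk through a link graph: given who linked to or built upon whom, a plain breadth-first search finds a path from any piece of content back to the root. The graph below is made up for illustration:

```python
from collections import deque

# hypothetical link graph: item -> the items it linked to / was built on
links = {
    "first_tweet": [],
    "breaking_article": ["first_tweet"],
    "blog_reaction": ["breaking_article"],
    "retweet_wave": ["first_tweet"],
    "follow_up": ["blog_reaction", "retweet_wave"],
}

def lightning(item, root, graph):
    """Trace one path from a piece of content back to the root event."""
    queue = deque([[item]])
    while queue:
        path = queue.popleft()
        if path[-1] == root:
            return path
        for source in graph.get(path[-1], []):
            queue.append(path + [source])
    return None  # no connection between item and root

print(lightning("follow_up", "first_tweet", links))
```

Because BFS explores shortest paths first, the path returned is the most direct “bolt” from the item back to the original source.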
Implications
Leaving visual metaphors aside, what’s the purpose? First, we could
analyze where the information comes from; then how it is shared and
used; and finally, who contributed the most. A thorough analysis of the
path of information could represent a shift in strategy for media
companies, which could reconsider how to find, distribute, create and
aggregate content, making it more viral and rethinking it to extend
its longevity and usefulness.
The role of users in this process would be better observed and
weighed, and a more complete view of the platforms and mechanisms they
use to access and share information would be possible.
And above all, new types of journalism practices would have to be
applied. Curation has become a huge concern for media pros, and there
are many tools to curate content, professionally produced or not.
Archive journalism would become more important than it is now, since
curation is, in a way, creating archives in almost real time; and with
these tornadoes mapped out we could infer who tweeted first, and what
the relationships between events and related content were, down
to the millisecond, since everything online has a timestamp. Information
is now perennial, and should be used with that feature in mind.
Of course, this is not a fully developed idea, and I lack the skills
to build such an analytics tool. But as a concept, I believe it would be
useful for understanding the spread of news and creating new strategies
from that understanding, while at the same time mapping out and monitoring
the evolution of information. And the Cascade project is quite close to
that.
Leave your disagreements, ideas, praises, this week’s lottery numbers in the comment box. Thank you.

picture shamefully stolen from a place I can't remember now