All the news that's fit for you
By CHARLES ARTHUR
LONDON - One of the enduring lessons about predictions of the future is that they don't quite happen in the way that the predictors expect.
Take the forecast, quite common a few years ago (and still found among design undergraduates who, for reasons I don't quite follow, keep designing "electronic papers" for their degree projects), that the newspaper of the future would be precisely tailored to our tastes, produced individually and printed near our home.
According to the theory, the paper would consist of cherry-picked items from all the papers that you like - say, Robert Fisk's reports from Baghdad, that funny TV columnist who works for Another Paper, and that cricket writer who used to work here but has Gone Elsewhere.
It wouldn't be the Independent, but it wouldn't be any other paper either.
And your neighbour's paper might omit Fisk but include Thomas Sutcliffe's TV reviews.
And you would pay different amounts for your papers, because the columnists and news analysis and news would all be costed individually.
Sounds nice, but it ignores the reality pointed out a few years ago by the Forrester analyst Mary Modahl, who explained that when you split content up it becomes more expensive.
If you were to try to gather, individually, every piece of information in this paper, it would probably cost you 50 times more than it does.
And you'd have to keep renewing the contracts.
Aggregation makes things cheaper.
Yet the internet has enabled a diversity of views to explode, principally through "blogging" (posting a continually updated stream of one's thoughts, views and discoveries on a personal website) and by the "syndication" feeds that many major news sites now run.
All you need to find the opinions and news you want online for free is a "news aggregator" program and the patience to search for the sources.
The key to this is a web publishing format called RSS, for "Really Simple Syndication" (or "RDF Site Summary" or "Rich Site Summary" - opinions differ).
It grew out of a good idea from the software company UserLand in 1997, which was picked up and melded with a similar idea of Netscape's.
Essentially, it takes a certain amount of text from a site and makes that available over the web, rather like being able to tune into a radio signal; instead of choosing a radio station preset, you "subscribe" to a particular URL (ending in .xml) for that website.
Then your aggregator program can periodically ask that RSS URL, and those of all the other sites you've subscribed to, for any new headlines and text.
The new items are then shown in your aggregator program, each linked back to the site it came from; if one catches your eye you can go over to the parent site and read the full story, since RSS feeds are often very limited.
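Under the hood there isn't much to it. Here is a rough sketch in Python of that fetch-and-show loop (the feed address is a placeholder standing in for whatever .xml URL you subscribe to, not any particular site's feed, and real aggregators poll many feeds on a timer and remember what you've already read):

# A minimal sketch of what an aggregator does; not how any particular
# program is built. FEED_URL is a placeholder subscription address.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "http://example.com/index.xml"  # placeholder .xml feed URL

def fetch_headlines(url):
    """Download an RSS feed and return (title, link) pairs for its items."""
    with urllib.request.urlopen(url) as response:
        tree = ET.parse(response)
    headlines = []
    for item in tree.getroot().iter("item"):  # each <item> is one story
        title = item.findtext("title", default="(untitled)")
        link = item.findtext("link", default="")
        headlines.append((title, link))
    return headlines

for title, link in fetch_headlines(FEED_URL):
    print(title, "-", link)

Everything else in the programs below is bookkeeping and presentation layered on top of that loop.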
A good first application is AmphetaDesk, which runs on pretty much everything from Windows 95 and Mac OS 7 onwards.
The download is around 2MB, and it's pretty self-explanatory (if a bit of a memory hog at times).
Alternatively there's Feedreader, a Windows-only program (it certainly runs on XP).
It has a rather neat story-search facility.
The "normal" view is a familiar three-pane one: sites in a long strip at the left, headlines from the chosen site in a top pane, and the RSS text - what there is - of the selected story in the bottom pane.
If you're on Mac OS X there's the very nice (and free) NetNewsWire Lite, which also has a three-pane layout, although it presently lacks a search system.
The author, Brent Simmons, does say, though, that he'll incorporate one into the free version some time. (The paid-for version, which also does weblogging, has it already.)
There are various paid-for versions too, but you'll probably want to try out each of the free programs first to see whether you want to persist.
The more difficult question, though, is: where can you find the news sources you want? Each news aggregator comes with its own presets, and with links to sites - such as NewsIsFree and Moreover - where you can range far and wide for the subjects that interest you.
Gruesome pictures of war death? Analysis of Xbox pricing? The latest BBC headlines? Anything you want is out there.
The only problem is finding it, and of course keeping up to date with it.
This leads us back, of course, to the argument we began with.
If news is free and can be found anywhere, read for no cost and downloaded to your laptop (and presumably, in time, to your handheld; can it be long before someone writes a Java aggregator that can run on a mobile phone?), then one might wonder what newspapers are for.
This is a tangled question, and it's easy to think that people who work in or for newspapers should almost be disqualified from considering it, in the same way that people who work for record companies can have no reasonable opinion about what the internet will do to their industry.
The key is in the thing that helps us all get by - money.
Would you find someone willing to take the risks that, say, Robert Fisk or the other journalists in Baghdad did to report what happened without getting paid, and without the resources a news organisation can provide? The blog by Salam Pax clearly came from Baghdad, but as the city fell last week he hadn't posted since March 24.
News aggregation is great - but some news takes that little bit more to unearth or report.
- INDEPENDENT