Gerry McGovern paid us a visit

He gave us a talk about the internet in general. While I enjoyed the talk overall, there were some ideas I really liked and some with which I adamantly disagree.

Here goes.

The task-centric internet.
That’s the main theory. We went from a tool-centric internet to a content-centric internet. Now the web is (or should be) task-centric, that is, focused on what the people who come to your web site want to do. All the rest is clutter.

I’m not entirely convinced. I like the idea of helping visitors achieve what they want to do, right from the homepage and without hassle. But in a web site design, you should also consider what you want your visitors to do. Yes, the choices a visitor faces should be kept to a minimum. But in my opinion it is OK to orient those choices. It is OK to send a message, to tell your visitors about something they were not necessarily looking for but which may be of interest to them.

Navigation should help people, not reflect the brand.
I mostly agree with that. This echoes what Jakob Nielsen says about links, which should look like links, i.e. in blue and underlined, with a different color for visited links. Now Nielsen is more subtle about this than McGovern was. Navigation links, menu options etc. are seldom underlined and this is generally for the best.

In your text, use words that people search for.
The two examples he gave were “low fares” vs. “cheap flights” and “climate change” vs. “global warming”. It turns out that airline companies liked to use “low fares” while customers were searching for “cheap flights”. And in the academic literature, you’d find more mentions of “climate change” than “global warming”, although, again, people search for the latter. So the advice was to use the expressions people search for.

While this makes sense in the first case, it’s more questionable in the second. If you write a website for academics, you want to attract the people who searched for “climate change”, not necessarily those who searched for “global warming”, even if they are more numerous.

Don’t write links in paragraphs.
Huh? While I agree you shouldn’t write a paragraph around a link when the link itself suffices, I don’t see anything wrong with using a link within a paragraph, far from it. When writing for the web, connecting with other resources and websites has many benefits. The rationale he gave was that people are either reading or clicking, so a link in the middle of a paragraph interrupts reading. To that I say: not anymore! There are so many things you can do with a link now, like opening it in a new tab, bookmarking it or tagging it for future reference.

Keep headings short.
Indeed. There are only advantages to that. It was quite interesting to see him bashing our clippings.

An interesting point he raised was that before the internet, news releases were never meant to be published. Now they are available to the general public, and often redistributed as-is by some e-journalists.

Blogging is really a conversation.
Blogging is about exchanging rather than proposing. I really didn’t like that analysis. In my book, unless you have something to say, unless you have substance, no one will want to exchange with you. You just can’t run a blog that says: OK guys, tell me what I should write about. Interlocutors won’t materialize out of thin air. There are quite a few successful blogs without comments. In my view, comments are a side effect of blogging rather than its essence.

Update your content frequently.
Some content has a shorter lifespan than other content. He showed us great examples of that on our own website. Basically, everything you write in the future tense is soon outdated. Some content, however, can in my opinion stay online for a while.

Monitor your content and take it out when needed.
That was a very interesting point. On hearing it, anyone’s first reaction would be: my site is already huge, so I need extra resources to monitor my content. His approach is the opposite. He says you should only build a site as big as you can monitor with your current resources. If your web site is too big, you should downsize it. And in fact, most organizations are taking large chunks of their public web sites offline!


The state of presentations in 2008

There have been many changes in how people understand presentations in 2008. How far have we gone?

In 2008, two major books on the topic were published: Presentation Zen, by Garr Reynolds, and slide:ology, by Nancy Duarte.

People are accepting that a well-executed presentation can change the world. An Inconvenient Truth earned nothing less than two Academy Awards and a Nobel Prize. And rumors about the health of master presenter Steve Jobs caused stock markets to panic.

People are also finding that the tools to create successful presentations are incredibly commonplace. From a technical standpoint, anyone with a computer could have created “Shift Happens”, which has been viewed by 5 to 10 million people.

As a result, blogs are now swarming with sensible presentation advice. A Google query for “death by powerpoint” returns 397,000 hits today. A year ago, searching for presentation tips yielded ideological (as opposed to evidence-based) guidelines such as “no more than 7 bullets per slide” or “one slide per minute”. (You can still find those as well.)

2008 was also the year Slideshare took off. Not only did viewership and the amount of content increase drastically, but the quality, relevance and sophistication of the best presentations are now incredible. Empowered by inspiring examples, clear guidelines and adequate tools, many are striving to emulate great presenters.

If I ended here, one could conclude that the world is definitely saved from ineffective presentations. The reality is slightly different.

This year, I have seen so far approximately 400 live presentations, and god knows how many online. Some were excellent, many were good, most were at least adequate. But a good proportion of them were still boring and I’d be lying if I claimed I could remember as much as 10% of them.

One explanation I came up with is that many presenters still focus on the final deliverable rather than on the fundamentals. These folks are very receptive to advice like “mind your typography”, “illustrate your slides with large images”, or “forget bullets”. Now, typography and images are important and can make the difference between a good and an excellent presentation. But it’s crucial to have a message to deliver, and to focus on that message.

Bulleted text is accused of cluttering presentations. But if every little point or anecdote is illustrated with a vividly-colored image, then the images themselves become the clutter and clog everyone’s limited attention. You remember the images and cool effects but not the point. And a week later, you’ll have forgotten the images and the presentation altogether.

So my own piece of advice is that your big images won’t make your presentation. Your angle, structure and consistency will. The best advice I got from Presentation Zen was to prepare a presentation away from the computer and only produce it once it’s final. It works. It really does.

Once this becomes accepted practice, seminars, classes and meetings will be much more exciting (let’s hope!).


Go deep rather than go wide

Yesterday, I attended a Presentation Zen webinar. One phrase that struck me was the advice to go deep rather than wide. In a presentation, there is only so much time to present information before everyone’s attention collapses.

Rather than trying to cover as much ground as possible, it’s much more effective to focus on one subject and make sure to deliver.

And if you are expected to deliver lots of information on a wide scope, then a written report is a more appropriate medium.

The presentation should be available on Slideshare soon.


Re: flowing data contest, code for my entry

Here is the Processing applet with source code for my entry. The code is based on a charting program I’ve been working on, on and off, for a while, which I’ll publish once it’s more polished.
Click the applet, then press any key to alternate between the two representations. The text file is the data, and the two PNG files are images of the results.

Inserting Processing applets in WordPress is not obvious. Here’s a discussion post for anyone interested.

source: fdcontest.pde

media files: params.txt image.png image_matrix2.png


Is data visualization needed?

Today, Flowing Data has a post on jobs in data visualization and gives a list, in my opinion quite restrictive, of the industries that need data visualization specialists: media, research, design.

The truth is, any activity that produces data, which is to say every activity on earth, needs a better way to represent and communicate that data. And it also often needs a way to engage others through data. This is exactly the mission statement of data visualization.

As far as I know, the areas with the most complex data visualization challenges are the sciences, like medicine, physics or meteorology, which use gigantic datasets. Other fields, like finance or even intelligence, also demand advanced visualization solutions. But really, all activities handle data, because they all generate data.

So everyone needs data visualization. But do they need data visualization experts? The answer is a resounding yes, even if they are not aware of it yet. Graphical representations are as pervasive as data. Even the most humble of datasets stands a good chance of being displayed on a graph, more or less successfully. Improving the effectiveness of these charts can have a tremendous impact on businesses.

At a larger scale, complex problems can be tackled with data visualization. Visual analytics, for instance, helps people think with data and visual interfaces. It can help people discover outstanding elements in a large dataset, or hidden relationships between elements, among other things. The possibilities are limitless.

All in all, there are many situations that need data visualization. Once people and businesses realize that, finding a job as a data visualization expert might get easier.

PS. That reminds me of an old post I once read stating that “deep understanding of […] databases is a key skill that too many managers lack.” Fast-forward four years, add the visual element, and the conclusion is the same: understanding how to show data can really help you go places.


Junk charts

The enemy.

I’d like to write a few words about blogs I read regularly, and to start this series, I am happy to talk about Junk Charts.

Junk Charts is Kaiser Fung’s one-man crusade against ineffective charts; he has been working on the blog since 2005. Kaiser is a convinced follower of Tufte, to whom he owes his blog’s name: Tufte describes chartjunk as whatever clutters a graph and obscures the information contained in the chart’s data.

To this end, Kaiser collects ineffective, misleading or just plain wrong graphs and redraws them according to sound principles. He fights a current trend that puts too much emphasis on the aesthetics of a published graph, at the expense of its meaning.

He is now assisted by readers who gladly submit candidates for redesign. I am grateful that no OECD chart ever had the honors of Junk Charts, although a couple of charts which were direct copies of ours have been featured, such as this one.

In three years, Junk Charts has amassed quite a diverse community of readers, and posts are always followed by a healthy discussion – just what blogging is about. I have found great tips and ideas in both the articles and the comments.

So thank you, Kaiser, for Junk Charts – I am always looking forward to the next post!


Flowing Data’s chart contest

This week, FlowingData organized a chart contest: a chart was posted, and contestants were asked to improve it.

A lot of my job revolves around reviewing and correcting graphs, so I was more than happy to compete. 

Here is the original graph, hosted & designed by Swivel:

Immigration to the U.S. by decade

The rules of the contest stated that the new graph should use the same data. But instead of re-using the dataset hosted on Swivel, I went back to the source to answer some questions I had.

Here goes: 

Decade    Total      Europe     Asia       America    Africa    Oceania*
1820-30   151,824    106,487    36         11,951     17        33,333
1831-40   599,125    495,681    53         33,424     54        69,911
1841-50   1,713,251  1,597,442  141        62,469     55        53,144
1851-60   2,598,214  2,452,577  41,538     74,720     210       29,169
1861-70   2,314,824  2,065,141  64,759     166,607    312       18,005
1871-80   2,812,191  2,271,925  124,160    404,044    358       11,704
1881-90   5,246,613  4,735,484  69,942     426,967    857       13,363
1891-00   3,687,564  3,555,352  74,862     38,972     350       18,028
1901-10   8,795,386  8,056,040  323,543    361,888    7,368     46,547
1911-20   5,735,811  4,321,887  247,236    1,143,671  8,443     14,574
1921-30   4,107,209  2,463,194  112,059    1,516,716  6,286     8,954
1931-40   528,431    347,566    16,595     160,037    1,750     2,483
1941-50   1,035,039  621,147    37,028     354,804    7,367     14,693
1951-60   2,515,479  1,325,727  153,249    996,944    14,092    25,467
1961-70   3,321,677  1,123,492  427,642    1,716,374  28,954    25,215
1971-80   4,493,314  800,368    1,588,178  1,982,735  80,779    41,254
1981-90   7,338,062  761,550    2,738,157  3,615,225  176,893   46,237
1991-00   9,095,417  1,359,737  2,795,672  4,486,806  354,939   98,263
2001-06   7,009,322  1,073,726  2,265,696  3,037,122  446,792   185,986

(187 years of data)

* includes others unidentified by nationality


FAIR (the Federation for American Immigration Reform), which published this data on its website, also made a chart out of it:


So let’s take a look at the data. 

At first glance, the data is highly aggregated: it is not available per country or per year, but per continent and per decade. Moreover, the last “decade” is only 6 years long, and Oceania includes all the unidentified immigrants. Immigrants from Africa and “Oceania” are a tiny fraction of the total flow, so it would be difficult to draw conclusions from their data.

So if I want to tell a story about this dataset, I would choose the following. 

The total flow of immigrants to the USA has gone through major changes. 

Looking at the composition of this flow: over 90% of the immigrants were Europeans at some point, but that ratio is now down to around 15%.
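That second story is easy to check against the table. Here is a minimal Python sketch (my own addition, not part of the contest entry) computing the European share from three rows transcribed above:

```python
# European share of total immigration, from rows transcribed
# from the table above: (decade, total, from Europe).
rows = [
    ("1881-90", 5_246_613, 4_735_484),
    ("1991-00", 9_095_417, 1_359_737),
    ("2001-06", 7_009_322, 1_073_726),
]

for decade, total, europe in rows:
    print(f"{decade}: {100 * europe / total:.0f}% European")
# 1881-90: 90% European
# 1991-00: 15% European
# 2001-06: 15% European
```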

Now, for a critique of these two graphs. 



The Swivel chart:

  1. It’s not very telling to keep presenting these numbers aggregated by decade.
  2. Especially when the last, shorter “decade” is not corrected: all the curves seem to dip, although the underlying variables are actually growing.
  3. You can clearly see the point where American immigrants overtake Europeans (and later, where Asians do the same). But again, those absolute figures are not very interesting. You cannot see the share of each continent in the total.
  4. The Africa and Oceania curves clutter the graph and bring little information.
  5. The fact that Oceania includes other, unidentified immigrants is not disclosed (not that it would change the graph tremendously).
The FAIR chart:

  1. To make this graph, they annualised the data, which is a more sensible option.
  2. The year labels are difficult to read.
  3. The last column (2001-2006) looks exactly like the others, which each cover 10 years.
  4. Again, Oceania and Africa don’t bring much to the graph.
  5. It’s very difficult to see the evolution of any given continent, except Europe.
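To make the point about the shortened last period concrete, here is a small Python sketch (my addition) annualising the decade totals the way FAIR’s chart does, using raw totals from the table above:

```python
# Annualising decade totals so the 6-year 2001-06 period becomes
# comparable with full decades (totals from the table above).
periods = [
    ("1981-90", 7_338_062, 10),  # (label, total immigrants, span in years)
    ("1991-00", 9_095_417, 10),
    ("2001-06", 7_009_322, 6),   # only 6 years of data
]

for label, total, years in periods:
    print(f"{label}: {total / years:,.0f} immigrants per year")
# 1981-90: 733,806 immigrants per year
# 1991-00: 909,542 immigrants per year
# 2001-06: 1,168,220 immigrants per year
```

Once annualised, the last period is in fact the highest of the three, not a dip.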
An idea that I had and discarded was to show cumulative values (stocks).
The left graph shows the cumulative values as a share of the total. The second shows the cumulative values in absolute figures.
On both graphs, one can see the decline of the share of European immigrants. It’s more striking on the second, where the blue curve suddenly flattens around the turn of the century, while the green one (America), then the red one (Asia), start to thicken.
So we have a story there. But then, what are these numbers? What would the sum of all those migrants mean, over nearly 200 years? It’s a very different number from the stock of all migrants currently living in the USA, because over so much time, most of them have died. And it is also a very different number from the sum of all immigrants who ever came to the USA: starting at 1820 is quite arbitrary, and does in fact exclude most African arrivals. So based on this dataset alone, which is the rule of the contest, it’s just not possible to work with cumulative values and get meaningful results.
Then, I thought of doing a matrix chart instead of FAIR’s stacked column chart.


Doing a matrix chart like this (several charts on top of one another, using the SAME scale, which can be added vertically – and visually) is the textbook way of showing variables so that one can see both their evolution over time and their proportion of the total.

This kind of chart is not natively supported in Excel, so I’ve done it with Processing.

(I wrote a program to make them in Excel, but will talk about that in a later post.)
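For readers without Processing, the principle can be sketched in plain Python. This is a toy text rendering of the idea (my own mock-up, using only three continents and three decades from the table), not the code behind my graph:

```python
# A text mock-up of a matrix chart: one mini-chart per continent,
# all sharing the SAME scale so bars can be compared across panels.
# Decade values come from the table above.
decades = ["1901-10", "1951-60", "1991-00"]
data = {
    "Europe":  [8_056_040, 1_325_727, 1_359_737],
    "Asia":    [323_543, 153_249, 2_795_672],
    "America": [361_888, 996_944, 4_486_806],
}

# One common scale for every panel: the largest value anywhere.
scale = max(v for values in data.values() for v in values)

for continent, values in data.items():
    print(continent)
    for decade, v in zip(decades, values):
        bar = "#" * round(40 * v / scale)  # 40 chars == the common maximum
        print(f"  {decade} {bar}")
```

Because every panel maps its bars to the same maximum, the eye can sum the panels vertically, which is exactly what a stacked chart hides.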

It’s an interesting graph: it shows European immigration peaking, then America taking off, followed by Asia. In the early 20th century, the Mexican Revolution caused much emigration to the US; that’s the ripple in the graph.

But then, I decided it was too complex. Frankly, by glancing at it, you don’t get anything; you only learn something by examining it.

So I made this one, which I am going to submit.

And here I have my two stories in a much lighter graph.

The blue rectangles are the total immigrants. Various laws and events have shaped that curve; I first wanted to annotate it, but decided against it. I only kept the Immigration Act, which was in force between 1924 and 1965 and largely explains the drastic drop in immigration during that time.

Without any other variable to compete with it, you can clearly follow its story. 

Then, I added the share of Europeans among all immigrants. That’s another clear story: in the early 19th century, they made up the bulk of the immigrants, but then their share dropped sharply, to around 15%. My guess, though, is that the shape of the first leg of this curve (from about 70% to over 90%) is due to many unidentified immigrants really being Europeans.

For the title of the left axis, I chose naturalization over number of immigrants or another label, because most of the “immigrants” of the last few decades are really people already residing in the USA who get naturalized.

But that’s another contradiction in the dataset. In 1868, when the 14th Amendment to the Constitution came into force, about 4 million former slaves became American citizens. They are not shown in the data. In 1924, Native Americans who were not yet citizens were also granted citizenship; they too are not included in the dataset. However, since 1965, most “immigrants” have been change-of-status migrants who were already in the USA. But the rules say we are to play with this dataset, so that’s the best I could come up with.

Lastly, a few words about the design. I took some of the colours from a chart I really liked, by Viveka Weiley. In her chart she uses the Myriad Pro font (guess she’s a Mac, but I’m a PC). I am using Frutiger, which is quite similar.