Step 5: Criteria ranking

Ranking methodology

The guidelines recommend ranking the criteria after scoring, because by that time participants have reached a more detailed understanding of the criteria. However, some sites performed this step while selecting the criteria, mainly to save time. In practice, the timing of the ranking matters little, and both approaches are acceptable. (The methodology guidelines may be revised accordingly.)

The workshop reports contained little information on the ranking process; mostly only the result was presented, so it is not clear whether a discussion among participants took place. Some sites, such as Chile and Spain, let the stakeholders vote for their three favourite criteria in each category; in Spain, the number of votes each criterion received was then used for the ranking, as sketched below. Others (Turkey, Russia) report that a group discussion was held. The ranking step was not carried out at the Cape Verde study site, although a ranking was later entered in the Facilitator software, nor in Morocco, where the Facilitator software was not used during the workshop.
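The reports do not document exactly how the votes were tallied or entered into the Facilitator software; the following is a minimal sketch, with illustrative criterion names, of one plausible way to turn "three favourite criteria per stakeholder" ballots into a ranking for a category.

```python
# Hypothetical sketch: deriving a ranking from stakeholder votes, roughly as
# described for Spain. Criterion names and ballot data are illustrative only;
# the actual procedure and the Facilitator software's input format are not
# documented in the workshop reports.
from collections import Counter

# Each stakeholder names their three favourite criteria within a category.
ballots = [
    ["fish stock health", "water quality", "employment"],
    ["water quality", "fish stock health", "tourism income"],
    ["fish stock health", "employment", "tourism income"],
]

votes = Counter(criterion for ballot in ballots for criterion in ballot)

# Rank criteria by number of votes received (most votes = rank 1).
for rank, (criterion, count) in enumerate(votes.most_common(), start=1):
    print(f"{rank}. {criterion} ({count} votes)")
```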

Result of Step 5

The expected result of step 5 is that the weight / importance of each criterion is identified and agreed upon. It must be assumed that this was not achieved by all study sites. Most sites probably underestimated how strongly the ranking influences the result of the Facilitator software: the ranking effectively assigns a weight to each criterion. Portugal was the only site that also ranked the main categories (economic, ecological and socio-cultural) in addition to ranking the individual criteria within them. By giving the main categories equal rank, they ensured that no category outweighed another in the overall result, e.g. that economic criteria were not ranked above ecological criteria. This point needs more attention and must be explained and stressed more clearly in the workshop guidelines.
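The Facilitator software's internal weighting scheme is not described in the reports; the sketch below simply illustrates, under the assumption of a simple rank-sum weighting, why a ranking acts as a weighting and how giving the main categories equal shares (as Portugal did) keeps any one category from dominating. Category and criterion names are invented for the example.

```python
# Illustrative sketch only: rank-sum weights per category, with the three main
# categories given equal shares of the total weight. The formula and the data
# are assumptions; the Facilitator software's actual scheme may differ.

def rank_sum_weights(ranked_criteria):
    """Convert a ranked list (best first) into weights that sum to 1."""
    n = len(ranked_criteria)
    points = {c: n - i for i, c in enumerate(ranked_criteria)}  # rank 1 -> n points
    total = sum(points.values())
    return {c: p / total for c, p in points.items()}

# Criteria ranked within each main category (best first), invented for illustration.
categories = {
    "economic": ["employment", "tourism income"],
    "ecological": ["fish stock health", "water quality", "habitat diversity"],
    "socio-cultural": ["traditional knowledge"],
}

# Equal share per main category, then distributed among its ranked criteria.
category_share = 1.0 / len(categories)
overall_weights = {}
for ranked in categories.values():
    for criterion, w in rank_sum_weights(ranked).items():
        overall_weights[criterion] = category_share * w

for criterion, w in sorted(overall_weights.items(), key=lambda kv: -kv[1]):
    print(f"{criterion}: {w:.3f}")
```

In this toy example the overall weights sum to 1 and each category contributes exactly one third, so a category with many criteria cannot outweigh one with few; without the equal category shares, the category with the most (or most highly ranked) criteria would dominate the overall result.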