[ncdnhc-discuss] The NCDNHC's .org report is numerically inconsistent.

Milton Mueller Mueller at syr.edu
Tue Aug 20 21:18:42 CEST 2002


OK, I see now what accounts for a lot of the confusion.

Annex 5, which was a separate Excel file, has been
added into the PDF file. That explains why Dany doesn't 
recognize the page count of 49, and it also explains 
why the rankings on p. 49 were not sorted.

Annex 5 (an Excel file) was not formatted for public view; it was
used purely for calculation purposes. That is why it appears 
confusing. We didn't sort the data by the final scores, which 
is what you complained about in your first message, because
we didn't think it would show up in the final report. 
We did sort them in the report itself. And obviously,
we made one mistake (GNR) in copying data from
the spreadsheet to the file. But the final scores in the
report are correct. ISOC really is a 21.25; check your
arithmetic. I think you made the arithmetic mistake 
this time ;-)

I don't agree with your critique of averaging the
rankings. The problem is that the different dimensions
of evaluation - differentiation, public support, and 
responsiveness/governance - are not commensurable. 
In ranking applications on each of the three dimensions,
we created distinct numerical scales - one for "public 
support," one for differentiation, and one for governance. 
Each of these scales is SOMEWHAT arbitrary, but each is 
internally consistent in measuring the specific thing it 
measures. But to then treat all three of those scales 
as if they could be measured against each other takes the 
arbitrariness well past the breaking point. We don't really know 
HOW a score of 21.75 in "responsiveness" relates to
a score of 84 in "public support." And it is, in my opinion,
bad practice to "normalize" or combine them in any way.
So the ONLY useful measure of an overall ranking, imho,
is to average the rankings themselves, or simply to
look at the three rankings together. Dany and I could
not agree on this, so we just agreed to put both approaches
in there. Really the Board has to decide which one(s)
to use.
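
To make the point concrete, here is a minimal sketch in Python of the
rank-averaging approach (the applicant names and scores below are made
up for illustration and are not the report's data): rank the applicants
within each dimension separately, then average the rank positions, so
the three raw scales are never combined directly.

# Hypothetical applicants and scores, for illustration only
# (not the report's data).
scores = {
    "Applicant A": {"differentiation": 8, "public_support": 60, "governance": 22},
    "Applicant B": {"differentiation": 6, "public_support": 90, "governance": 15},
    "Applicant C": {"differentiation": 9, "public_support": 40, "governance": 30},
}
dimensions = ["differentiation", "public_support", "governance"]

# Rank the applicants within each dimension (1 = best); each raw
# scale is only ever compared with itself.
ranks = {name: {} for name in scores}
for dim in dimensions:
    ordered = sorted(scores, key=lambda n: scores[n][dim], reverse=True)
    for position, name in enumerate(ordered, start=1):
        ranks[name][dim] = position

# Overall standing: the average of the three rank positions.
for name in scores:
    avg = sum(ranks[name][d] for d in dimensions) / len(dimensions)
    print(name, ranks[name], "average rank:", round(avg, 2))

Averaging the rank positions this way sidesteps the question of how many
"public support" points equal one "governance" point, which is exactly
the question a normalized combined score pretends to answer.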
--MM

>>> Thomas Roessler <roessler at does-not-exist.org> 08/20/02 02:39PM >>>
On 2002-08-20 14:14:55 -0400, Milton Mueller wrote:

>The "real" numbers were in a spreadsheet which automatically 
>tallied the total according to the weightings. In the process of 
>transferring the results in the spreadsheet to a Word document 
>table, a typographical error was made, and GNR was given a 5 in 
>"Relationship with Community" when it actually received a 4. 

OK.  Now, what about the error in ISOC's score?

>The original spreadsheet was delivered to ICANN as Annex 5. The  
>title of the Excel file was "rank-calc.xls". On the worksheet  
>entitled "Echelle 2" can be found the correct numbers. 

The PDF document I've been looking at contains an Annex 5 on pages 
43-49, and that one contains the further errors I pointed out in my 
original message.

-- 
Thomas Roessler                        <roessler at does-not-exist.org>