clt says apparently we don't exist

Good grief…smh

Used to our athletic teams being disrespected. Didn’t expect this. Flat out untrue.

Apparently Mr. Parilla was using these rankings when speaking:

http://www.leidenranking.com/ranking/2015

[size=18px]Selection of universities included in the ranking[/size]

[size=13px]The 750 universities included in the Leiden Ranking 2015 were selected based on their publication output in the period 2010–2013. Only so-called core publications were counted, which are publications in international scientific journals. Also, only research articles and review articles were taken into account. Other types of publications were not considered. Furthermore, collaborative publications were counted fractionally. For instance, if a publication includes three addresses of which two belong to a particular university, the publication was counted with a weight of 2 / 3 = 0.67 for that university. About 1100 fractionally counted publications were required for a university to be included in the Leiden Ranking 2015.[/size]
It is important to note that universities do not need to apply to be included in the Leiden Ranking. The universities included in the Leiden Ranking are selected by CWTS according to the procedure described above. Universities do not need to provide any input themselves.

Data quality

[size=13px]The assignment of publications to universities is not free of errors, and it is important to emphasize that in general universities do not verify and approve the results of the Leiden Ranking data collection methodology. Two types of errors are possible. On the one hand, there may be false positives, which are publications that have been assigned to a university when in fact they do not belong to the university. On the other hand, there may be false negatives, which are publications that have not been assigned to a university when in fact they do belong to the university. The data collection methodology of the Leiden Ranking can be expected to yield substantially more false negatives than false positives. In practice, it turns out to be infeasible to manually check all addresses occurring in Web of Science. Because of this, many of the 5% least frequently occurring addresses in Web of Science have not been manually checked. This can be considered a reasonable upper bound for errors, since most likely the majority of these addresses do not belong to universities.[/size]

Apparently we aren’t a “top” research university…

What was really odd was that there was zero mention of the university. You’d think they’d say we’ve made good strides in research but need to do more, talk about PORTAL, etc.

BTW, we recently had our Carnegie classification upgraded because of our research work.
http://graduateschool.uncc.edu/news/unc-charlotte-granted-new-carnegie-classification

The lack of a medical school is really holding us back…this is just more evidence.

I wonder if CHP has seen this.

[quote]I wonder if CHP has seen this.[/quote]

He’ll reply with Dorn’s recruiting bio and your 49erclub ranking.

[quote=“bleedsgreenandgold, post:4, topic:30262”]Apparently Mr. Parilla was using these rankings when speaking:

http://www.leidenranking.com/ranking/2015[/quote]

clt is really sad after seeing that list.

[quote=“cltniners, post:10, topic:30262”]clt is really sad after seeing that list.[/quote]

Perhaps we are not on the list because our “leaders” did not see the need to complete the paperwork?

Would you rather be #750 or not on there and think we are top 50?

[quote=“49r9r, post:11, topic:30262”]Perhaps we are not on the list because our “leaders” did not see the need to complete the paperwork?[/quote] You might be on to something there, 49r9r

It’s a long slog when you’re battling the rest of the state, which is trying to keep your region down, while the rest of your university system does the same.

Nailed it.