Guest post by Nick Wise on "remote short-term research fellowships" - where academics can earn £££ by adding the affiliation of "the University of Religions and Denominations" to their publications.

http://deevybee.blogspot.com/2024/02/an-intellectually-enriching-opportunity.html

#fraud #research #integrity #misconduct #publishing #SalfordUni

An (intellectually?) enriching opportunity for affiliation

Guest Post by Nick Wise. A couple of months ago a professor received the following email, which they forwarded to me. ...

@deevybee
What I cannot quite grasp is how university-wide schemes to amass publications have any likelihood of being worthwhile. If an individual researcher embellishes their CV with fraudulent papers, that will get them ahead when their employer treats evaluation as a metrics game. But a university must face more scrutiny - and these schemes are so poorly designed, so over the top, that the only way they can work is if everyone closes their eyes. Who is this for? International rankings?
@deevybee
What I want to know is how this Bhatia Khan lady is a top 2% researcher. What metric is being used here? She has a lifetime count of 2,300 citations and a low h-index. What are these Stanford rankings? Could you link to them? Sounds like a BS ranking to me.

@ShravanVasishth
I checked with Nick. He replied: "This is according to Elsevier's ranking: https://elsevier.digitalcommonsdata.com/datasets/btchxktzyw/6

She is not in the top 2% of scientists judged over whole careers, but for the single year 2022. The ranking may be BS, but it's as official as they come."

I've added a PS to the post with this information.

October 2023 data-update for "Updated science-wide author databases of standardized citation indicators"

Citation metrics are widely used and misused. We have created a publicly available database of top-cited scientists that provides standardized information on citations, h-index, co-authorship adjusted hm-index, citations to papers in different authorship positions and a composite indicator (c-score). Separate data are shown for career-long and, separately, for single recent year impact. Metrics with and without self-citations and ratio of citations to citing papers are given. Scientists are classified into 22 scientific fields and 174 sub-fields according to the standard Science-Metrix classification. Field- and subfield-specific percentiles are also provided for all scientists with at least 5 papers.

Career-long data are updated to end-of-2022 and single recent year data pertain to citations received during calendar year 2022. The selection is based on the top 100,000 scientists by c-score (with and without self-citations) or a percentile rank of 2% or above in the sub-field. This version (6) is based on the October 1, 2023 snapshot from Scopus, updated to end of citation year 2022.

This work uses Scopus data provided by Elsevier through ICSR Lab (https://www.elsevier.com/icsr/icsrlab). Calculations were performed using all Scopus author profiles as of October 1, 2023. If an author is not on the list it is simply because the composite indicator value was not high enough to appear on the list. It does not mean that the author does not do good work.

PLEASE ALSO NOTE THAT THE DATABASE HAS BEEN PUBLISHED IN AN ARCHIVAL FORM AND WILL NOT BE CHANGED. The published version reflects Scopus author profiles at the time of calculation. We thus advise authors to ensure that their Scopus profiles are accurate. REQUESTS FOR CORRECTIONS OF THE SCOPUS DATA (INCLUDING CORRECTIONS IN AFFILIATIONS) SHOULD NOT BE SENT TO US. They should be sent directly to Scopus, preferably by use of the Scopus to ORCID feedback wizard (https://orcid.scopusfeedback.com/) so that the correct data can be used in any future annual updates of the citation indicator databases.

The c-score focuses on impact (citations) rather than productivity (number of publications) and it also incorporates information on co-authorship and author positions (single, first, last author). If you have additional questions, please read the 3 associated PLoS Biology papers that explain the development, validation and use of these metrics and databases (https://doi.org/10.1371/journal.pbio.1002501, https://doi.org/10.1371/journal.pbio.3000384 and https://doi.org/10.1371/journal.pbio.3000918).

Finally, we alert users that all citation metrics have limitations and their use should be tempered and judicious. For more reading, we refer to the Leiden manifesto: https://www.nature.com/articles/520429a

Source: Mendeley Data
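(Aside: for anyone curious how a composite like the c-score can be assembled from the pieces named in the description above - total citations, h-index, co-authorship-adjusted hm-index, author positions - here is a minimal Python sketch. The exact formula is in the three PLoS Biology papers linked above; the log-scaling, the function names, and the made-up inputs below are my illustrative assumptions, not the authors' published code.)

```python
# Illustrative sketch only - not the authors' exact c-score code.
import math

def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    cites = sorted(citations, reverse=True)
    return sum(1 for i, c in enumerate(cites, start=1) if c >= i)

def hm_index(citations, n_authors):
    """Co-authorship-adjusted hm-index (after Schreiber): each paper
    contributes a fractional rank of 1/(number of authors)."""
    papers = sorted(zip(citations, n_authors), reverse=True)
    rank, hm = 0.0, 0.0
    for c, n in papers:
        rank += 1.0 / n          # fractional effective rank
        if c >= rank:
            hm = rank
    return hm

def composite_score(indicators, maxima):
    """Sum of log-scaled indicators, each normalized by the largest
    value seen across all scientists (maxima assumed >= 1). This is
    an assumed form, in the spirit of the c-score description."""
    return sum(math.log(x + 1) / math.log(m + 1)
               for x, m in zip(indicators, maxima))

# Example with invented numbers: one author's per-paper citation
# counts and author counts.
cit = [120, 45, 30, 8, 2]
auth = [3, 2, 5, 1, 4]
print(h_index(cit), round(hm_index(cit, auth), 2))
```

The point of such a composite is visible even in this toy version: an author's rank depends not just on raw citation totals but on how those citations are spread across papers and diluted by co-authorship, which is exactly why schemes that buy affiliations target highly cited individuals rather than prolific ones.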
@ShravanVasishth
PS - if you want to reply to this, please comment on the blog! Thanks.