Counting citations hasn't been a reliable measure of scientific impact for a while, especially on platforms like Google Scholar that index citations from virtually any document they crawl, vetted or not. Hyper-authorship, predatory journals, and the like have all contributed to the problem.
This preprint just drives home how important it is to measure scientific impact carefully, without over-reliance on automated metrics.
Google Scholar is manipulatable
https://arxiv.org/abs/2402.04607
#science #academia #publishing #citations
Frank Aylward
in reply to El Duvelle
@elduvelle Excellent point! In many cases it is not needed or desirable.
But even when trying to self-evaluate, I often ask which discoveries would have the broadest relevance, and which lines of inquiry could lead to the most dramatic developments. But that is a much more nuanced discussion of impact that can't be captured by metrics.
Frank Aylward
in reply to Frank Aylward
So if we are going to use these metrics anyway, we should at least be aware of what biases they have.
El Duvelle
in reply to Frank Aylward
Sure, but why do you have to report those “scientific impact” measures at all? What are they even used for? It's not like researchers get paid more if they have more “impactful” papers (whatever that means; also, that would be completely unethical). So are they used to estimate future “impact”? But how can one predict the other? You might be lucky and get one really surprising and impactful result, but how often is that going to happen in one's career?
Maybe people should instead measure the things that actually indicate good science: clear protocols, well-designed experiments, reproducibility of the findings, proper sharing of data and code, accurate and unbiased reporting of the existing literature, etc.
Frank Aylward
in reply to El Duvelle
I agree with you that there are better ways of doing this, and moving away from over-reliance on these metrics would probably be the first step.
Self-citations in around a dozen countries are unusually high — Dalmeet Singh Chawla

MAHanson
in reply to El Duvelle
@elduvelle Yeah, chiming in: in many countries researchers are incentivized or outright pressured to publish in "high-impact" or other box-ticking-exercise journals, and they receive bonuses, salary raises, or positions based on their CV and what is on it.
That's not to say it's good, but it is the reality that roughly half the world experiences. And these conversations from Western perspectives often dismiss it as "bad" without much appreciation for what drives it.
El Duvelle
in reply to MAHanson
@MarkHanson So… what drives it? Why would a university or government reward their researchers for publishing “impactful” science, whatever that means? Why don't they instead encourage them to publish good science?
@foaylward
Shravan Vasishth
in reply to Frank Aylward
Hey, I like this! Where do I sign up to buy citations, you ask? Here you go:
https://spubl.com.ua/en
" Intrigued by a citation-
boosting service that we unravelled during our investigation, we contacted the
service while undercover as a fictional author, and managed to purchase 50
citations."