Outdated practices and lack of simplicity result in ‘unfindable’ work, Carole Goble tells Jisc Digital Festival 2015
The way in which academic papers are published makes much research “unfindable”, while scholars’ lack of transparency about their research methods renders many of their conclusions highly questionable.
This is the view of Carole Goble, a professor in the School of Computer Science at the University of Manchester, who last week addressed the Jisc Digital Festival 2015 (Digifest) in Birmingham, a conference organised by Jisc, the UK’s higher education IT consortium.
She told delegates that the current “knowledge-turning mechanism”, whereby researchers publish “PDFs and the odd Excel [spreadsheet]” in journals, resulted in the “burying” of data and research, rendering them “unfindable”.
Professor Goble told Times Higher Education that she knew of young researchers who wished to present their data more clearly and visibly, and who wanted to dedicate time to achieving that. “Their [supervising] professors have said: ‘Well, what are you doing wasting your time doing that? You could be writing a paper.’”
Researchers who were failing to embrace more forward-thinking methods of publication needed re-education, she continued.
Too often, she told delegates, academics sought to write extremely complicated papers, based on elaborate methods, in the hope of ensuring that their research was submitted to the research excellence framework – even though this approach meant that their work was read by “the square root of bugger-all people”. The current system, she said, meant “RIP” for research papers: “rest in publication”.
Interviewed for a THE podcast, Professor Goble said that pressure to produce overly complicated work sometimes stemmed from a desire to avoid “academic trolling” – bullying by scholars who are critical of someone’s work.
“[You can be] trolled because you made something straightforward, because you wanted a community to understand it, when your job was to make it look clever,” she said.
To describe how this could work, she gave the example of one of her own papers. “I could have presented a paper about ‘the detailed denotational semantics of the lambda calculus used underneath the workflow engine’, or I could have just said, ‘here is a workflow engine and this is how to use it’. But that kind of useful and highly cited paper is [often viewed as] merely ‘useful’, as opposed to ‘academic’.”
According to Professor Goble, another problem in research is academic rivalry, which, particularly in some disciplines, stifles collaboration.
“In biology, if I am looking at the function of a gene and you are looking at the function of a gene, then the first person to publish wins,” she said. “You are not going to get a paper [if you are] the second person to discover the purpose of this gene…which leads to this very defensive, quite competitive publishing world.”
Professor Goble also had a low opinion of the approach many researchers take to software, suggesting that, even when papers were read, opacity about research methods meant that conclusions had to be treated with caution.
She cited research by the Software Sustainability Institute that suggested that one-fifth of academics who develop their own software for use in research have had no training in programming. “If we have broken software, we have broken science,” she said.