Tuesday, August 27, 2019

Curricular changes in Medicine


UEM's Medicine program carries out faculty development initiatives for the Course and the University


The Academic Council and the Structuring Faculty Nucleus (Núcleo Docente Estruturante) of the UEM Medicine Course will take part in faculty development activities over the coming days.

The first event, an extension course aimed at faculty from the 11 departments that offer curricular components to the Course, will hold its first meeting on the evening of August 29, 2019. Its main objective is to evaluate the Course's pedagogical project in light of the current National Curricular Guidelines (DCN, 2014) and to compare it with other curricular models.

On September 2 and 3, 2019, we will take part in a round table on undergraduate education at UEM, where coordinators will present successful experiences and the challenges facing their Courses.

These discussions are intended to inform changes that value undergraduate education and encourage its integration with extension, research and graduate studies.

Monday, August 26, 2019

Preprints



Questionable Rejection


Sociologist says journal dismissed her paper because she'd shared it elsewhere as a preprint -- even though the publication had a pro-preprint policy. How often does this happen?



Most academics have lots of rejection stories. Far fewer have rejection stories like Alison Gerber’s.

The U.S.-trained postdoctoral researcher in sociology at Lund University in Sweden recently got a terse email from the editor of an unnamed journal saying it would not publish her paper because she had already shared it in the preprint repository SocArXiv.

“One of the reviewers who agreed to evaluate your paper for the journal had the presence of mind to plug the title into a search engine,” the email reads. “He asked if he should review the paper for us given that it is already published (without the benefit of peer review). Of course not.”

The email’s haughty tone in itself is stunning, if not unusual. But Gerber was struck by the idea that sharing her preprint with other social scientists made her ineligible to formally publish it elsewhere. After all, many sociologists and academics in dozens of other fields are pushing for increased sharing of manuscripts and data in preprint archives in the name of transparency and better science. The idea behind unrefereed preprints is to get research in the public domain faster than the traditional peer-review process allows, to get feedback from colleagues that might make eventually published papers better, and to find potential collaborators.

Luckily, Gerber didn’t take the editor’s word for it and asked a librarian to help her investigate. The trusty librarian (also unnamed) soon reported back to Gerber what she’d expected all along: that the journal had no policy against publishing papers that had been submitted elsewhere as preprints. In fact, the journal had a policy encouraging preprint sharing.

Gerber declined requests for an interview and declined to name the journal. She did say that the journal's editorial team worked mostly in the U.S., United Kingdom and Canada. Beyond that, she referred questions to her Twitter thread about her misadventure. According to her account, Gerber sent a polite follow-up to the journal editor, who soon admitted the error and said the paper would be considered for publication after all.

Interest in Gerber’s mini-saga was high -- probably for a number of reasons. A supposedly “blind” reviewer had searched for her paper title, which could have easily given away her name, gender and academic credentials even if she’d just shared it at a conference and not on SocArXiv. Neither the reviewer nor the editor knew their journal's own policy on preprints. And the exchange revealed either a misunderstanding of or antipathy for preprint repositories on the part of at least one journal.

What is the purpose of a preprint repository such as SocArXiv, and how does it differ from that of a traditional publication? Can, and should, the two systems really coexist? Philip Cohen, professor of sociology at the University of Maryland at College Park and a member of SocArXiv’s steering committee, said that he and his collaborators designed the platform “not to replace journals but to supplement them.”

The point of preprint repositories “is to get work out faster and for free, and then still use the peer-review system for validating what's good and/or important,” Cohen added via email. “Almost everyone in the journal disciplines, as opposed to humanities, is still publishing in journals, even if they are also posting papers on systems like SocArXiv. Journals are how we get formal recognition and get tenure.”

Preprint archives typically have some standards for publication, but they are nowhere near as stringent as most peer-reviewed journals, which publish on regular schedules and are space-limited. SocArXiv papers, for instance, are moderated before they appear online -- a process that takes up to two days, not months. SocArXiv's policy says that papers must be scholarly, relevant to the social sciences, "plausibly categorized" and correctly attributed, and in moderated languages.

Even if the journal Gerber submitted to doesn’t have a problem with preprints, do other publications have policies against them? Jessie Daniels, a professor of sociology at Hunter College and the Graduate Center of the City University of New York who has written several books on digital sociologies, said that the vast majority of academic journals have no issue with preprints.

Situations like Gerber’s aren’t common because blind reviewers aren’t typically googling paper titles, she added. But academics as a group remain “woefully ignorant about open access, scholarly communication and the way the landscape of knowledge production is changing in the digital era.”

Sunday, August 25, 2019

University and Society



“Society has changed and we did not notice”

At USP Talks, the rectors of the three São Paulo state universities stress the need for greater interaction with society

Rectors Marcelo Knobel (Unicamp), Vahan Agopyan (USP) and Sandro Valentini (Unesp) with journalist Herton Escobar during the debate on education
Photo: Cecília Bastos/USP Imagens


Universities need to interact more with society and communicate better with it, recognizing that the demands on these institutions and the public's perception of them are constantly evolving.

“That is our great challenge,” said USP rector Vahan Agopyan. Not because “the university has become isolated in an ivory tower,” as the saying goes, but because “society has changed, and we did not notice,” he added.

The comments were made at the latest edition of USP Talks, which brought together the rectors of the three São Paulo state universities on August 20, in São Paulo, to discuss the challenges of education in Brazil.

According to Vahan, society today is “more demanding, better prepared and has higher expectations” than in the past, and those expectations need to be taken into account in how universities relate to it. “There is no point in talking about the number of papers or the prestige of a paper, because that is not what society wants,” he said. “I want society to understand that the university is where it will be able to hold its discussions and pursue its development. That is our goal.”



Communication with society, better education and internationalization were among the topics highlighted by the rectors at the special USP Talks event
Photo: Cecília Bastos/USP Imagens
The rector of the University of Campinas (Unicamp), Marcelo Knobel, spoke about the need to make the system more flexible and to broaden the multidisciplinary character of university education, in order to better prepare students for the new challenges of the 21st century, including the ability to adapt quickly to new careers and new technologies that will emerge, often unexpectedly.

“We are going against the grain of the entire world in how we teach and in what we teach,” Knobel said. “Our system is ossified, extremely content-heavy, rigid and very, very complicated. Everything runs counter to what the global trend indicates.”

“We undoubtedly need to move forward and start educating for the 21st century, rather than continuing to teach as in the 19th century,” stressed the rector of São Paulo State University (Unesp), Sandro Valentini. He recalled his humble origins in the interior of São Paulo state, as the son of an auto-body repairman, and said he sees education as a form of emancipation. “Every young person should have that opportunity.”

Valentini also highlighted the role of internationalization, “not just to climb the rankings, but to transform young people,” and the importance of demonstrating the social impact of the research done at universities, “very important for recovering or strengthening our legitimacy in the eyes of society.”

Each rector gave an individual 15-minute presentation and then took part in a debate with the audience, in which several topics were raised, among them the federal government's Future-se program and the parliamentary inquiry (CPI) into the universities under way in the Legislative Assembly of the State of São Paulo (Alesp).

Videos of the event are available below and on the USP Talks YouTube channel, where all previous events in the series can also be watched.


Event information

USP Talks is a USP initiative aimed at bringing universities closer to society. The events are held monthly in the auditorium of the São Paulo Museum of Art (Masp), with free admission and live streaming on the internet. To learn more and follow the schedule, follow our pages on social media: Facebook, YouTube and Twitter.

Tuesday, August 6, 2019

Overestimation of results



Abstract ‘Spin’

Study says authors exaggerate their findings in paper abstracts, and that's a problem when readers take them at face value.




We’ve all been told not to judge a book by its cover. But we shouldn’t be judging academic studies by their abstracts, either, according to a new paper in BMJ Evidence-Based Medicine. The study -- which found exaggerated claims in more than half of paper abstracts analyzed -- pertains to psychology and psychiatry research. It notes that “spin” is troublesome in those fields because it can impact clinical care decisions. But the authors say that this kind of exaggeration happens in other fields, too.

“Researchers are encouraged to conduct studies and report findings according to the highest ethical standards,” the paper says, meaning “reporting results completely, in accordance with a protocol that outlines primary and secondary endpoints and prespecified subgroups and statistical analyses.”

Yet authors are free to choose “how to report or interpret study results.” And in an abstract, in particular, they may include “only the results they want to highlight or the conclusions they wish to draw.”

In a word: spin.

Based on the idea that randomized controlled trials often inform how patients are treated, researchers used PubMed to find these kinds of studies. Their sample included those published from 2012-17 in well-regarded psychology and psychiatry journals: JAMA Psychiatry, American Journal of Psychiatry, Journal of Child Psychology and Psychiatry, Psychological Medicine, British Journal of Psychiatry and Journal of the American Academy of Child and Adolescent Psychiatry.

Crucially, they analyzed only trials with results that were not statistically significant, and therefore were susceptible to spin -- 116 in all.
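
To make that retrieval step concrete, here is a minimal sketch in Python using Biopython's Entrez interface to PubMed. The journal names and the 2012-17 window come from the article; the query syntax, the result limit and the contact address are illustrative assumptions, and the study's actual search strategy may have differed. The decisive steps -- keeping only the 116 trials with nonsignificant primary results and rating each abstract for spin -- were carried out by the researchers themselves, not by a query.

# Minimal sketch, not the study's actual code: query PubMed for randomized
# controlled trials published 2012-2017 in the six journals named above,
# using Biopython's Entrez module. Query details and limits are assumptions.
from Bio import Entrez

Entrez.email = "your.name@example.org"  # NCBI asks for a contact address

journals = [
    "JAMA Psychiatry",
    "American Journal of Psychiatry",
    "Journal of Child Psychology and Psychiatry",
    "Psychological Medicine",
    "British Journal of Psychiatry",
    "Journal of the American Academy of Child and Adolescent Psychiatry",
]

# Build a query restricted to RCTs in these journals, 2012 through 2017.
journal_clause = " OR ".join(f'"{j}"[Journal]' for j in journals)
query = (
    f"({journal_clause}) "
    "AND Randomized Controlled Trial[Publication Type] "
    'AND ("2012/01/01"[PDAT] : "2017/12/31"[PDAT])'
)

handle = Entrez.esearch(db="pubmed", term=query, retmax=1000)
record = Entrez.read(handle)
handle.close()

print("Candidate trials found:", record["Count"])
# Narrowing to the 116 trials with nonsignificant primary results, and rating
# each abstract for spin, was a manual step by the study's reviewers.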

Evidence of spin included focusing only on statistically significant results, interpreting nonsignificant results as if they showed equivalence, using favorable rhetoric with regard to the nonsignificant results and declaring that an intervention was beneficial despite its statistical insignificance.

How often did articles’ abstracts exaggerate the actual findings? More than half the time, or 56 percent. Spin happened in 2 percent of titles, 21 percent of abstract results sections and 49 percent of abstract conclusion sections. Fifteen percent of abstracts had spin in both their results and conclusion sections.

Spin was more common in studies that compared a proposed treatment with typical care or placebo than in other kinds of studies. But industry funding was not associated with a greater likelihood of exaggeration: just 10 of the 65 trials with spin (56 percent of the 116 analyzed) had any such funding.

The study notes several limitations, including that looking for spin is inherently subjective work. But it says it’s important to guard against spin because researchers have an ethical obligation to report their results honestly and clearly, and because spinning an abstract “may mislead physicians who are attempting to draw conclusions about a treatment for patients.” A majority of the time, physicians read only an article's abstract rather than the entire article, it says, citing prior research on the matter, and many editorial decisions are based on the abstract alone. Positive results are also more likely to be published in the first place, the paper notes, citing one study that found 15 percent of peer reviewers asked authors to add spin to their manuscripts.

What’s to be done? Journal editors may consider inviting reviewers to comment on the presence of spin, the article suggests.

Several journals already use reporting guidelines to “ensure accurate and transparent reporting of clinical trial results, and the use of such guidelines improves trial reporting,” the paper says. While the Consolidated Standards of Reporting Trials (CONSORT) guidance for abstracts doesn’t contain language discouraging spin, it says, “research reporting could be improved by discouraging spin in abstracts.”

Lead author Sam Jellison, a medical student at Oklahoma State University, underscored that his paper is not the first to explore academic spin. Yet making more readers “aware of what spin is might be the first and largest step to take to fight this problem,” he said. Jellison said that the existing literature suggests spin is not unique to psychology and psychiatry, and that those fields are actually “middle of the road” in terms of prevalence.

Philip Cohen, a professor of sociology at the University of Maryland at College Park who blogs about research, pointed out that reviewers already look at abstracts as part of their process, so in addition to the journal editor, "reviewers should be able to see if the abstract is overstating the findings.”

Still, a common way that sociologists inflate research findings in general is to mention those that are not statistically significant while downplaying the lack of significance, attributing it to a small sample or using phrases such as “does not reach statistical significance,” he said, “as if the effect is just trying but can't quite get there.”

Beyond questions of spin, Cohen said, there is surely a problem with “people only publishing, or journals only accepting, dramatic findings.” So the greatest source of exaggeration is probably in what gets published at all, with null findings or those that contradict existing positive results never seeing the light of day -- what Cohen noted has been called the "file drawer" problem.

While psychology isn't alone in the spin room, the field has had its share of data integrity and public perception problems. A landmark study in 2015, for example, found that most psychology studies don’t yield reproducible results.

Brian Nosek, a professor of psychology at the University of Virginia and lead author on the reproducibility study, said that spin involves two “connected problems,” neither of which is easy to solve. Authors are “incentivized to present their findings in the best possible light for publishability and impact, and readers often don't read the paper.”

As an author, he said, “even if I want to avoid spin,” it’s “entirely reasonable for me to try to make the narrative of my title and abstract as engaging as possible so that people will read the paper.” And at the same time, it’s “very difficult to capture the complexity of almost any research finding in a phrase or short abstract.” It’s really a “skill” to present “complex findings briefly without losing accuracy.”

As a reader, Nosek continued, “even if I want to make the best possible decisions based on research evidence, I don't have time to read and evaluate everything deeply." In some cases, he said, "I need to be able to trust that the information conveyed briefly is accurate and actionable.”

Ultimately, when “decisions are important, we should have higher expectations of readers to gather the information necessary to make good decisions,” he said. “But we need to recognize pragmatic realities and develop better tools for readers to calibrate the confidence in the claims they see in brief, and provide cues prompting them to dig more deeply when the evidence is uncertain.”

It’s also “in our collective interest to provide authors more training in communicating their findings in abstracts and press releases," Nosek added.