The purpose of the articles posted on this blog is to share knowledge and current events concerning ecology and biodiversity conservation and protection, since biodiversity underpins human security. Remember, these are meant to be conversation starters, not mere broadcasts :) so I kindly request, and would vastly prefer, that you share your comments and thoughts on the blog version of Focus on Arts and Ecology (past, present and future posts alike).


How are researchers using AI? Survey reveals pros and cons for science

Despite strong interest in using artificial intelligence to make research faster, easier and more accessible, researchers say they need more support to navigate its possibilities. 

By Miryam Naddaf, 4 February 2025

Researchers think that for some tasks, generative AI tools can already do a better job than humans. Credit: Getty

Using artificial intelligence (AI) tools for processes such as preparing manuscripts, writing grant applications and peer review will become widely accepted within the next two years, suggests a survey of nearly 5,000 researchers in more than 70 countries by the publishing company Wiley.

The survey asked researchers how they are currently using generative AI tools — which include chatbots such as ChatGPT and DeepSeek — as well as how they feel about various potential applications of the technology. The results suggest that the majority of researchers see AI becoming central to scientific research and publishing (see ‘Acceptable use’). More than half of the respondents think that AI currently outperforms humans at more than 20 of the tasks given as example use cases, including reviewing large sets of papers, summarizing research findings, detecting errors in writing, checking for plagiarism and organizing citations. More than half of the survey participants expect AI to become mainstream in 34 out of 43 use cases in the next two years.

Source: ExplanAItions report, Wiley

“What really stands out is the imminence of this,” says Sebastian Porsdam Mann at the University of Copenhagen, who studies the practicalities and ethics of using generative AI in research. “People that are in positions that will be affected by this — which is everyone, but to varying degrees — need to start” addressing this now, he adds.

Wiley, headquartered in Hoboken, New Jersey, posted the survey findings online on 4 February. Josh Jarrett, senior vice-president and general manager of the publisher’s AI growth team, says he hopes they will serve as a road map for innovators and start-ups looking for opportunities to develop AI tools. “There's broad acceptance that AI is going to reshape the research field.”

Limited uses

The survey polled 4,946 researchers worldwide, 27% of whom are early-career researchers. Perhaps surprisingly, says Jarrett, the results show that “people aren't really using these tools much in their day-to-day work”. Only 45% of the first wave of respondents (1,043 researchers) said that they had actually used AI to help with their research, and the most common uses they cited were translation, proofreading and editing manuscripts (see ‘Uses of AI’).

Although 81% of these 1,043 respondents said they had used OpenAI’s ChatGPT for personal or professional purposes, only one-third had heard of other generative-AI tools such as Google’s Gemini and Microsoft’s Copilot. However, there are clear differences across countries and disciplines, with researchers in China and Germany, as well as computer scientists, being the most likely to use AI in their work.

Source: ExplanAItions report, Wiley

The majority of survey participants expressed interest in expanding their AI use. About 72% want to use AI for preparing manuscripts in the next two years — for tasks such as detecting errors in writing, plagiarism checks and organizing citations. Sixty-two per cent think that AI already outperforms humans in these tasks (see ‘Who does it better: humans or AI?’).

Around 67% of respondents also expressed interest in using AI to handle large amounts of information, for example helping to review the literature, summarizing papers and processing data. Early-career researchers showed greater interest than did more senior colleagues in using AI for writing grant applications and finding potential collaborators. “These are both things that come easier with experience and seniority,” says Porsdam Mann. “Using AI will help even those things out a little bit.”

However, researchers are less convinced about AI’s capabilities in more-complex tasks such as identifying gaps in the literature, choosing a journal to submit manuscripts to, recommending peer reviewers or suggesting relevant citations. Although 64% of respondents are open to using AI for these tasks in the next two years, the majority thinks that humans still outperform AI in these areas.

Source: ExplanAItions report, Wiley

Obstacles and opportunities

Despite a burgeoning interest in AI tools, the survey suggests that researchers need more support to use them confidently. Nearly two-thirds of respondents said that a lack of guidance and training is preventing them from using AI to the extent that they would like (see ‘Causes for concern’). Researchers are also worried about how safe it is to use these tools: 81% of respondents said they had concerns about AI’s accuracy, potential biases, privacy risks and the lack of transparency in how these tools are trained.

“We think there's a big obligation of publishers and others to help educate,” says Jarrett. About 70% of respondents want publishers to provide clear guidelines on what uses of AI are acceptable, and 69% think publishers need to help them avoid errors and biases.

Source: ExplanAItions report, Wiley

“Some centralized training must be carried out, and that should be made mandatory just like good clinical-practice training is mandatory across the world,” says Tejaswini Arunachala Murthy, an intensive-care nutritionist at the University of Adelaide in Australia who took part in the survey. “We are ready to give the time. We are ready to learn, and we want to learn,” she adds. “All these AI researchers … need to train us how to use it appropriately.”

Wiley is currently interviewing more researchers and collecting feedback to update its own guidelines for using AI, which it plans to release in the coming months. These guidelines will help researchers to better understand how to use AI safely in research, including when human insight is necessary and what disclosures should be made. “I don't think any of us are ready to recommend this tool over that one,” says Jarrett. The aim is to “give people some general guidelines of how to stay safe and start to share best practice”.

(Source: Nature)
