Overview of the 4th Conference on Research Evaluation and Sci-Tech Journals
The 4th Conference on Research Evaluation and Sci-Tech Journals, held from January 13 to 16, 2026, in Zhuhai, China, marked a pivotal moment in the evolution of academic publishing and research assessment. Organized under the leadership of Yang Liying, the gathering brought together over 500 experts, policymakers, journal editors, and researchers from China and abroad. The event focused on pressing challenges in scientific and technological (sci-tech) journal development, emphasizing reforms in evaluation metrics, the rise of open access (OA) models, and sustainable publishing ecosystems. Zhuhai's modern convention center provided an ideal backdrop, fostering discussions amid China's rapid advances in global research output.
Attendees explored how China, now the world's largest producer of scientific papers, is shifting away from over-reliance on traditional metrics like the Journal Impact Factor (JIF), a measure rooted in Eugene Garfield's 1955 proposal for citation indexing: a journal's JIF is the average number of citations received in a given year by the articles it published in the two preceding years. This conference built on reforms initiated in 2018, aiming for a more holistic assessment that values quality, innovation, and societal impact over sheer publication volume.
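To make the metric concrete, here is a minimal sketch of the standard two-year JIF calculation; the citation and article counts are hypothetical, purely for illustration.

```python
def journal_impact_factor(citations_to_prior_two_years: int,
                          citable_items_prior_two_years: int) -> float:
    """Standard two-year JIF: citations received in year Y to items the
    journal published in years Y-1 and Y-2, divided by the number of
    citable items (articles and reviews) published in those two years."""
    return citations_to_prior_two_years / citable_items_prior_two_years

# Hypothetical journal: 1,200 citations in 2025 to its 400 citable
# items from 2023-2024 yields a JIF of 3.0.
print(journal_impact_factor(1200, 400))  # 3.0
```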
Key Themes and Sessions Highlighted
Sessions delved into academic publishing trends, with a spotlight on digital transformation and AI-driven tools for peer review. One prominent discussion examined the integration of alternative metrics (altmetrics), which track online mentions, downloads, and policy citations alongside traditional bibliometrics. Experts debated the pros and cons of these approaches, noting that while the JIF has propelled China's rise (evidenced by China overtaking the US in high-quality publications in the 2022 Nature Index), it often incentivizes quantity over quality.
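To make the altmetrics idea concrete, the sketch below assembles a simple composite attention score from several online signals; the signal names and weights are hypothetical assumptions for illustration, not any provider's actual scheme.

```python
# Hypothetical weights; real altmetric providers use their own schemes.
WEIGHTS = {"news_mentions": 8.0, "policy_citations": 5.0,
           "social_mentions": 0.25, "downloads": 0.01}

def composite_altmetric_score(signals: dict[str, int]) -> float:
    """Weighted sum of a paper's online-attention signals."""
    return sum(WEIGHTS.get(name, 0.0) * count for name, count in signals.items())

# Example paper, tracked alongside (not instead of) traditional citations:
paper = {"news_mentions": 3, "policy_citations": 2,
         "social_mentions": 40, "downloads": 1500}
print(composite_altmetric_score(paper))  # 24 + 10 + 10 + 15 = 59.0
```

Such scores are best read as complements to bibliometrics: they surface policy and public reach that citation counts miss, at the cost of being easier to game.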
Journal evaluation reforms were central, with panels outlining China's policy of 'breaking the four onlys' (evaluations based solely on papers, professional titles, degrees, and awards). Formalized in 2020, the policy encourages evaluations based on peer recognition, contributions to national priorities, and real-world applications. Case in point: the National Natural Science Foundation of China (NSFC) now prioritizes project outcomes over publication counts, leading to a 15% increase in applied research funding allocations since 2023.
Distinguished Speakers and Their Insights
International voices enriched the dialogue. Nandita Quaderi, Editor-in-Chief of Web of Science at Clarivate, shared global perspectives on responsible metrics, drawing on the San Francisco Declaration on Research Assessment (DORA, 2012), which advocates moving beyond the JIF. She highlighted how China's reforms align with international trends, citing a 25% global decline since 2020 in JIF-based evaluation in the humanities as assessment practices diversify.
Domestic leaders like Yang Liying emphasized the internationalization of sci-tech journals. Chinese sci-tech journals, numbering over 5,000, have seen English editions surge by 30% in the past five years, per China National Knowledge Infrastructure (CNKI) data. Other speakers included representatives from the Chinese Academy of Sciences (CAS), who discussed integrating big-data analytics for fraud detection in submissions.
China's Research Evaluation Reforms: Historical Context
China's journey began in the 1990s with a heavy emphasis on Science Citation Index (SCI) papers for promotions and funding. By 2015, Chinese authors contributed 20% of global SCI papers, but problems like 'paper mills' emerged, with over 10,000 retractions involving Chinese papers between 2018 and 2023, according to Retraction Watch. The 2018 'Several Opinions on Deepening Reform' marked a turning point, promoting evaluation by 'representative works'.
Institutions implemented changes step by step: first, universities like Tsinghua reduced JIF thresholds for tenure; second, ministries introduced 'zero-based' budgeting tied to impact; third, databases like CNKI incorporated OA mandates. A 2022 study of China's research evaluation reform analyzed the consequences, finding a 12% rise in interdisciplinary papers post-reform.
Open Access and Future of Sci-Tech Journals
Open Access (OA), under which research is freely available without paywalls, dominated the future-oriented sessions. China aims for 50% OA by 2030, up from 25% in 2025, per Ministry of Science and Technology (MOST) goals. Discussions covered green OA (self-archiving), gold OA (immediate publisher-side access, typically author- or funder-paid), and diamond OA (no-fee, community-led). The main challenge is funding: Article Processing Charges (APCs) average $2,500 globally, straining Chinese institutions (a rough cost sketch follows the list below).
- Benefits: Increased citations (OA papers cited 18% more, per OECD STI Outlook 2025).
- Risks: Predatory journals, with 5,000+ identified in China by 2024.
- Solutions: National OA platforms like OpenCIF.
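To gauge that funding strain, here is a back-of-the-envelope estimate of an institution's annual gold-OA bill; the paper count is hypothetical, and only the $2,500 average APC comes from the figure cited above.

```python
def annual_apc_burden(gold_oa_papers: int, avg_apc_usd: float = 2500.0) -> float:
    """Rough annual gold-OA cost: papers published times average APC."""
    return gold_oa_papers * avg_apc_usd

# A hypothetical mid-size institution publishing 2,000 gold-OA papers a year:
print(f"${annual_apc_burden(2000):,.0f}")  # $5,000,000
```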
Experts proposed hybrid models, blending subscriptions with OA, as seen in Nature Portfolio's China partnerships.
Statistics and Case Studies from the Conference
Delegates presented data: China's sci-tech papers grew 8.5% annually from 2015 to 2025, per CAS reports, but quality metrics like Field-Weighted Citation Impact (FWCI) improved only 4% (a minimal FWCI sketch follows the table below). Case study: Peking University's shift to narrative CVs resulted in 20% more grant wins for early-career researchers.
| Metric | China 2020 | China 2025 | Global avg |
|---|---|---|---|
| SCI papers (% of global total) | 21% | 28% | - |
| OA share | 15% | 25% | 22% |
| Retractions per 1,000 papers | 2.1 | 1.4 | 1.0 |
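For readers unfamiliar with FWCI, the sketch below shows the field-weighted calculation: each paper's citations are divided by the world-average citations expected for papers of the same field, year, and document type, then averaged, so a value of 1.0 means exactly world average. All numbers here are hypothetical.

```python
def fwci(paper_citations: list[int], expected_citations: list[float]) -> float:
    """Field-Weighted Citation Impact: mean ratio of actual citations to
    the world-average citations for comparable papers (same field, year,
    and document type). A value of 1.0 equals the world average."""
    ratios = [c / e for c, e in zip(paper_citations, expected_citations)]
    return sum(ratios) / len(ratios)

# Hypothetical three-paper portfolio against hypothetical field baselines:
print(round(fwci([10, 4, 6], [8.0, 4.0, 12.0]), 2))  # (1.25 + 1.0 + 0.5) / 3 = 0.92
```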
Another case: a synthesis report from the International Science Council informed several sessions, showcasing pilot programs in Shanghai that reduced publication pressure by 30%.
Stakeholder Perspectives: Editors, Researchers, Policymakers
Journal editors advocated for AI ethics in reviewing, while researchers shared the burdens of 'publish or perish' culture. Policymakers from MOST outlined incentives such as tax breaks for OA publishers. Balanced views emerged; a Fudan University professor noted, 'Reforms liberate creativity but require robust training.' Global attendees appreciated China's scale, predicting it will lead in narrative assessments.
For those navigating these changes, exploring research jobs on AcademicJobs.com can connect you with institutions prioritizing innovative evaluations.
Challenges, Impacts, and Proposed Solutions
Challenges include metric gaming and gaps in international recognition. The impacts cut both ways: collaborative OA stands to benefit global science, but output could slow if reforms overcorrect. Proposed solutions:
- Standardized peer review AI tools.
- International benchmarks like the Leiden Manifesto (2015).
- Training via platforms like AcademicJobs career advice.
Conference resolutions called for a 2027 national framework integrating these measures.
Global Implications and Future Outlook
China's model influences Asia and Africa, with Belt and Road partnerships exporting evaluation tools. OECD projections suggest diversified metrics could boost global innovation by 10% by 2030. Looking ahead, expect AI-enhanced journals and blockchain-based tools for research integrity.
Researchers eyeing opportunities in China should review postdoc positions amid rising demand for evaluation experts.
Practical Takeaways for Academics and Institutions
Actionable insights for academics: diversify portfolios with preprints (arXiv usage is up 40% in China), engage in altmetrics tracking, and pursue interdisciplinary work. Institutions should adopt DORA principles for hiring. For career growth, leverage professor jobs listings tailored to the reformed landscape.
This conference underscores China's commitment to quality-driven science, positioning the country as a global leader in research assessment.