Ever since I first started working in publishing (a looooong time ago), there have been complaints about the process itself — from authors, editors, and reviewers, not to mention my publishing colleagues. If anything, these seem to have gotten worse in the 25+ years since the transition to online publications, because the move to digital publishing was not accompanied by a move to a truly digital publishing process. Until now…
The last few years have seen progress in the development of end-to-end publishing platforms that are intended to finally address this challenge, including through Silverchair’s acquisition of ScholarOne, the launch this month of Morressier’s Journal Manager, and the rollout of Wiley’s Research Exchange system.
In this interview, my former Wiley colleague, Liz Ferguson, now their Senior Vice President of Research Publishing, shares her thoughts on the challenges and opportunities for Research Exchange and for publishing platforms and processes more generally.

Let’s start with how you currently gather feedback on the publishing process at Wiley and what you’ve been hearing — good and bad — from your authors, editors, and reviewers.
We’ve built multiple feedback channels into our approach so we can synthesize input from a range of sources. We run regular virtual roundtable sessions where 500+ editors across 50 countries have learned more about the platform and shared any feedback or concerns ahead of their journal migrating to Research Exchange. We have also run in-person workshops with editors in the US, UK, Germany, China, and Australia.
Through these conversations and other channels, including regular surveys, user interviews, live intercepts in the system, and customer support interactions, we’ve collected over 3,300 inputs during our Research Exchange transition process. Most recently, we launched the Research Exchange Advisory Board with external stakeholders — editors, society partners, publishing professionals — who will provide direct input into our development priorities.
We’re hearing a mix, as you’d expect with such a complex and fundamental system change. On the positive side, 92% of authors have told us the submission process is fast and easy to use — that’s encouraging validation that the streamlined workflows are making a difference to them. But we’ve also heard about pain points, particularly around the learning curve for editors moving from legacy systems and constraints of the system — some a question of timing, some deliberate, some we can improve on.
What’s been valuable is that the collective feedback has directly contributed to more than 40 of the 80 major platform updates since the start of the year. Many of these were already planned, but when the community tells us where we need to focus, we review our roadmap and act accordingly by re-prioritizing or refining what we deliver.
After many years of relying on older platforms that were built to support the print publication process, there’s now a new wave of publishing platforms, including Wiley’s own Research Exchange, that are truly digital-first. What sorts of improvements will authors, editors, and reviewers see as a result?
Having worked with publishing systems that were essentially digitized versions of print workflows, I can appreciate how transformative truly digital-first thinking can be. For authors, we’re seeing AI-powered workflows that can extract metadata automatically, making submission faster and more accurate. Instead of manually filling out forms, authors can focus on their research while the platform handles the administrative details. Additionally, the platform allows authors and institutions alike to automatically identify Transformational Agreement eligibility, with open access coverage clearly displayed at the point of submission.
For editors, the biggest shift is having integrated tools that work together rather than juggling multiple systems. Our reviewer matching uses AI to search across all publication records, not just limited databases, and provides visibility into reviewers’ current commitments across journals. It’s new, so it doesn’t always generate perfect results, but we’re continuing to improve it with the aim of offering editors better matches and reducing reviewer fatigue. The upfront screening capabilities are also filtering out submissions that shouldn’t make it into peer review, again reducing pressure on both editors and reviewers.
Reviewers benefit from more targeted invitations based on their actual expertise, consolidated manuscript PDFs instead of multiple files, and the ability to submit reviews even after deadlines if decisions haven’t been made yet. It’s about reducing friction at every step while maintaining the rigor that scholarly publishing requires.
How will you be measuring their satisfaction with the new platform?
We’re tracking both quantitative metrics and qualitative feedback. On the numbers side, we monitor submission completion rates, time-to-decision metrics, and user engagement patterns. But what tells us more about genuine satisfaction are the ongoing conversations, post-migration surveys for editors, post-submission surveys for authors, inputs from the Research Exchange Advisory Board, and systematic analysis of support inquiries.
What we learned early on is that satisfaction isn’t just about individual features working well — it’s about the overall experience feeling seamless and supportive. So, we’re measuring things like whether editors feel they can focus more on content quality rather than administrative tasks, and whether authors feel supported throughout their publishing journey.
While the improvements that you and other companies are making will be very welcome, managing the transition to a new platform or process is always challenging — all the more so when you’re doing it on a massive scale as Wiley is. What are some of the key lessons you have learned?
It is impossible to anticipate every challenge with change at this scale, and we certainly did not anticipate them all. Some of our early assumptions taught us valuable lessons. One crucial insight was that our pre-migration communication about the platform and the migration itself needed to be much more specific. Editors were receiving high volumes of communication, creating frustration and fatigue.
We also learned that training needs to be iterative — not just a one-time session before launch, but ongoing support as people encounter real-life scenarios. We continued to adapt our learning approach as we found that editors strongly preferred shorter, more modularized training, as opposed to longer-form training manuals.
Perhaps most importantly, we learned that transparency about what has been delivered, what is coming in soon, and what is coming later builds more trust than presenting everything as perfectly polished. We’ve published a public-facing product roadmap including release notes, created new training resources for editors, and shared “behind the scenes” videos to acknowledge challenges openly and show how we’re addressing them.
Research integrity is clearly a major concern in our community at present and, like several other scholarly publishers, Wiley has had some major problems with this in recent years. What are the most important things that platforms like Research Exchange can do to tackle issues around research integrity — before, during, and after publication?
This is where modern platforms make a real difference, though technology alone isn’t the solution. Research Exchange conducts more than 25 comprehensive integrity checks at initial screening — things like detecting signals of potential papermill content, AI-generated text, and unusual publication patterns — but every potential issue gets flagged for human expert review. We developed all but two of these checks internally and are also working with industry initiatives such as the STM Integrity Hub. Research Exchange recently won a Silver EPIC Award from the Society for Scholarly Publishing for these capabilities.
What’s particularly valuable is cross-journal analysis that wasn’t possible with fragmented systems. We can identify concerning patterns across the entire portfolio, providing scale that’s difficult to achieve with legacy platforms. The platform also includes automatic conflict of interest detection and researcher identity verification to prevent some issues from arising in the first place.
But technology has to work alongside human expertise and clear policies. The screening tools help editors focus their attention on the submissions that need deeper review, rather than spending time on routine checks. It’s about augmenting editorial judgment, not replacing it. This applies at both the journal and portfolio level. Our screening technology and data can also help recognize developing trends and support post-publication investigations.
Trust markers, like those in ORCID records, are growing in importance. Publishers can both contribute to them (by adding information to authors’, editors’, and reviewers’ records) and benefit from them. Is this something that you’re doing or planning to do?
We’re actively integrating ORCID throughout the Research Exchange workflow. When authors submit, their ORCID profiles help with automatic metadata extraction and affiliation verification. For reviewers, we’re implementing a system where, after final decisions are made, we can update ORCID profiles for reviewers who’ve opted in — giving them credit for their peer review contributions.
What’s exciting about this integration is that it creates a positive cycle. The more complete ORCID records become, the better our matching and verification systems work, which in turn makes the publishing process smoother for everyone.
This feels like a natural evolution to me. Publishing has always been about building scholarly reputation and networks, and ORCID provides infrastructure to make that more systematic and portable across publishers and institutions.
What’s next for publishing platforms in general and Research Exchange in particular? What further improvements do you think authors, editors, and reviewers can expect in the next five years or so?
Across the industry, we’re seeing a move toward more predictive and anticipatory systems rather than just reactive ones. I see platforms becoming better at suggesting potential collaborators, identifying emerging research areas, and even predicting which manuscripts might need additional integrity screening based on pattern recognition.
For Research Exchange specifically, we’re working on enhanced reviewer matching through keywords, improved conflict of interest detection using email address cross-referencing, and transparent peer review options where journals want to publish review histories alongside accepted papers. We’re also exploring better integration with data repositories like Dryad to support open research practices.
What excites me most is the potential for platforms to reduce the administrative burden that pulls researchers away from actual research and editors away from quality content. Imagine systems that can handle manuscript transfers seamlessly between journals, automatically check compliance with funder requirements, or provide real-time feedback on research integrity concerns during submission rather than weeks later. These are some of the step changes that we’ve been able to accomplish with Research Exchange, and which we’re excited to continue building on.
It’s worth noting that we also have a unique advantage in the development trajectory. Since we’re in full control of the development roadmap, we can be agile and embrace new capabilities as soon as they are ready for primetime. For example, as our editors and partners identify new priorities, we can move quickly to address them. And we can ensure that the platform makes best use of AI capabilities where they can be additive to human expertise – for example, through robust research integrity screening.
The goal isn’t to replace human expertise but to give researchers, editors, and reviewers better tools so they can focus on what they do best – advancing knowledge and maintaining the quality of scholarly discourse. We’re still in the early stages of discovering the many opportunities for truly digital publishing infrastructure to support and uphold the research process, and we’re excited about the role that we can play in creating this future state together with the scholarly community.