News & Views: In Research We Trust

Lori Carlin • February 28, 2024

In Research We Trust

By Meg White and Heather Staines


The Traditional Role of Publishers


Maya Angelou famously said, “Do the best you can until you know better. Then when you know better, do better.”


Professor Angelou probably did not have scholarship or science top of mind when she made this observation, but in truth, her statement is a concise summary of how scholarship works: a constant evolution of ideas, one discovery built upon another, in which hypotheses are tested, validated, and retested to gain consensus. Scholars and scientists constantly challenge, probe, and replicate in order to advance human knowledge and discovery.


Publishers have long played a significant role in the creation and dissemination of scholarly information, with a key goal being to ensure the validity and efficacy of the works that they publish. Publication in a scholarly journal is a “stamp of approval” from the most learned minds in a specific field. Collaborating with scholars and researchers, publishers provide peer review, work to uncover false or misleading data and information, and endeavor to expose “bad” scholarship or science so that quality research shines through. However, the growing number of high-profile cases, ranging from plagiarism to outright fraud, has raised questions about the effectiveness of the systems currently in place to ensure research integrity. Is scholarship facing an integrity crisis, and if so, what can be done to regain trust in the system?


Challenges on Multiple Fronts


The Expanding Ecosystem


From increased funding levels to the globalization of research, the pace of discovery and knowledge expansion has never been greater. Since 2010, the total number of scholarly papers published has more than doubled, from approximately 2.4M to nearly 5M (Source: Delta Think OA DAT Annual Market Sizing Update 2023). Some of this growth can be attributed to a more global and diverse author pool, with papers from authors in the Global South growing from just under 25% to around 45% of total published output (Source: Delta Think OA DAT Annual Market Sizing Update 2023). While this expansion contributes to a more inclusive landscape, the rapid growth is straining the existing peer review infrastructure. At a time when scholarship is exploding across multiple fields at an unprecedented rate, even small publishers and societies need to operate at scale in order to keep pace.
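As a rough sanity check on those figures, growth from roughly 2.4M papers in 2010 to nearly 5M in 2023 works out to a compound annual growth rate of close to 6%. The snippet below is an illustrative back-of-the-envelope calculation using the rounded numbers cited above; it is not Delta Think's market-sizing methodology.

```python
# Illustrative arithmetic only, using the rounded figures cited above
# (~2.4M papers in 2010, ~5M in 2023); not Delta Think's methodology.
papers_2010 = 2_400_000
papers_2023 = 5_000_000
years = 2023 - 2010

growth_multiple = papers_2023 / papers_2010    # ~2.08x, i.e. "more than doubled"
cagr = growth_multiple ** (1 / years) - 1      # ~5.8% compound annual growth

print(f"Output grew {growth_multiple:.2f}x overall, about {cagr:.1%} per year")
```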


Technology and Artificial Intelligence


For all of its benefits, artificial intelligence is creating an entirely new class of challenges, such as text and image generation and manipulation that defy detection via traditional development and review processes. Frontiers recently retracted a paper that included AI-generated images after it went viral. In 2021, at the request of the US Office of Research Integrity (ORI), Temple University launched an internal investigation, still ongoing, into alleged data manipulation. However, while technology helps create challenges, it can also be part of the solution. To combat fraud, Science recently announced plans to use AI-powered proofing to help identify suspicious images. All of this leads to the question: has the amount of fraud and bad scholarship actually risen as a percentage of the overall scholarly record, or does technology simply provide the tools to identify it more efficiently? The answer is probably yes and yes. Is technology the problem and the solution at the same time? Yes and yes here as well.


Publishers in a Leadership Role


Publishers can and should lead on research integrity. This challenge also presents an opportunity. At a time when some in the scholarly community openly question the value of a traditional publisher, what publishers have always done well – contribute to and support the integrity of the scholarly record – is MORE critical than ever. Publishers can play a unique role by building upon existing systems and processes and collaborating to create and utilize new tools that help ensure the integrity of the content that they create and disseminate. If done well, the publisher brand and the authority it confers will continue to be the “stamp of approval” for vetted science and scholarship, even as “bad” or “questionable” content continues to proliferate.


It’s Everyone’s Job


We can all contribute to ensuring trust in the integrity of the scholarly record. Many new tools will help publishers leverage their experience and “best practices” as good stewards in the scholarly information ecosystem, as well as innovate to ensure that their processes are ready to meet new challenges.

  • Bibliometric tools and databases such as Dimensions, Web of Science, and Scopus (and even our own Data & Analytics Tool) help publishers explore citation metrics and research analytics, while providing more transparency around research output.
  • New fit-for-purpose tools such as Signals and the Papermill Alarm from Clear Skies, among the many AI initiatives highlighted at the recent STM Innovations Day, help researchers, editors, and publishers detect and prevent publication fraud by looking at researcher profiles, collaboration, or submission patterns.
  • Image duplication and manipulation tools like Proofig and Imagetwin make it easier than ever before to check for faulty research practices or fraud.
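To make the image-checking idea concrete, the sketch below shows one common building block, perceptual hashing, which can flag near-duplicate figures across a set of submissions. This is a simplified illustration only: the folder name, file pattern, and distance threshold are hypothetical, and commercial tools such as Proofig and Imagetwin use far more sophisticated (and proprietary) methods.

```python
# A minimal sketch of near-duplicate figure detection via perceptual hashing.
# Assumptions: a local folder of extracted figures ("submission_figures"),
# PNG files, and an arbitrary Hamming-distance threshold of 5. This is NOT
# how Proofig or Imagetwin actually work; it only illustrates the concept.
from pathlib import Path

from PIL import Image      # pip install Pillow
import imagehash           # pip install ImageHash


def find_near_duplicates(image_dir: str, max_distance: int = 5):
    """Return (file_a, file_b, distance) for images with similar perceptual hashes."""
    hashes = {p.name: imagehash.phash(Image.open(p)) for p in Path(image_dir).glob("*.png")}
    names = sorted(hashes)
    pairs = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            distance = hashes[a] - hashes[b]    # Hamming distance between hashes
            if distance <= max_distance:        # small distance suggests the same image,
                pairs.append((a, b, distance))  # even after resizing or recompression
    return pairs


if __name__ == "__main__":
    for a, b, dist in find_near_duplicates("submission_figures"):
        print(f"Possible duplicate figure: {a} vs {b} (hash distance {dist})")
```

In practice, a flagged pair is only a prompt for human review; editors still need to judge whether reuse is legitimate or a sign of misconduct.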


New players, as well as long-standing initiatives, are focusing efforts and resources on combating predatory practices.

  • STM’s Integrity Hub is designed to help publishers “effectively and efficiently respond to the increasing and alarming volume of materials entering scholarly communications that violate accepted research integrity.”
  • United2Act is a global coalition of commercial and non-profit publishers and societies focused on exposing paper mills, and it is working to establish shared tools and resources in support of integrity in the scholarly publishing process.
  • COPE, a long-time advocate for ethics in publishing, recently published a position statement calling for immediate action against paper mills and in support of United2Act.
  • Watchdog groups such as Retraction Watch, recently acquired by Crossref, and PubPeer, a post-publication peer review effort, have helped shine a light on fraudulent papers and publications.


Conclusion: Progress, Not Perfection


Where do we go from here? Scholarship is not now, and has never been, a perfect system. Self-correction is a feature, not a bug. Fraud has always existed and theories are disproven, yet scholarship continues apace as new discoveries are made and new knowledge gained. Publishers have an opportunity to play a critical role in this process by clearly defining their value as guardians and defenders of integrity in research and scholarship. What questions is your organization focusing on now in this critical space? What tools are you interested in exploring? How can you combine your expertise and resources with others to further your mission?


Let’s Talk


Delta Think helps publishers, professional societies, technology companies, startups, and others find their place in the rapidly transforming scholarly communication ecosystem. From sustainability to research integrity to diversity, equity, and inclusion, our experience is your opportunity. We’d love to share more about what we are seeing and hearing in the world of scholarly communication, and how your organization can manage change in this fast-moving landscape. Contact us today to get the conversation started.


This article is © 2024 Delta Think, Inc. It is published under a Creative Commons Attribution-NonCommercial 4.0 International License. Please do get in touch if you want to use it in other contexts – we’re usually pretty accommodating.


TOP HEADLINES


Report of the 2nd Diamond Open Access Conference – February 20, 2024

"The 2nd Diamond Open Access Conference brought together stakeholders from around the globe to explore, discuss, and share insights on the diamond open access scholarly communication ecosystem. This report reflects the mission of the conference to showcase good practices and policies from all over the world, offering attendees a comprehensive perspective on the role of diamond open access in scholarly communication."


US funders meet to discuss cultural change in open science – February 15, 2024

"In January, Frontiers hosted a discussion forum for United States-based research funders to explore the challenges and opportunities funders face in supporting and incentivizing open science. Representatives from the Gordon and Betty Moore Foundation, the Bill and Melinda Gates Foundation, the National Institutes of Health, the Howard Hughes Medical Institute, and the Alzheimer’s Association came together to address topics."


The UNESCO Open Science Outlook: OS progresses, but unequally – February 1, 2024

"Last December, UNESCO published the first global report on the trends of Open Science (OS). In this blog post, the main findings are highlighted: OS is increasing but does so unevenly and its monitoring is mainly focused on outputs, missing potential progress in participation and dialogue."


The European landscape of institutional publishing - A synopsis of results from the DIAMAS survey – January 31, 2024

"DIAMAS is a HORIZON Europe project that aims to understand and support institutional publishing, paying particular attention to initiatives that do not charge fees to read or publish scholarly outputs...The following synopsis presents a summary of the DIAMAS project’s Landscape Report 'Institutional Publishing in the ERA; results from the DIAMAS survey' highlighting its main findings."


The Second Digital Transformation of Scholarly Publishing – January 29, 2024

"Today, the scholarly publishing sector is undergoing its second digital transformation...In this current second digital transformation, many of the structures, workflows, incentives, and outputs that characterized the print era are being revamped in favor of new approaches that bring tremendous opportunities, and also non-trivial risks, to scholarly communication...It is our objective with this paper to examine the needs for shared infrastructure that will support this second digital transformation."


OA JOURNAL LAUNCHES


February 6, 2024

All EMS Press journals open access in 2024 following successful Subscribe To Open round

"EMS Press is delighted to announce that all 22 journals in its Subscribe To Open (S2O) programme will be published as open access for the 2024 subscription period. This means that for the first time the Press’s annual journal output will be entirely open access, with a blend of S2O and Diamond publications."

 

