
News & Views Special Edition: How much scholarly publishing is affected by US Presidential Executive Orders?

Dan Pollock and Ann Michael • February 11, 2025

Overview


Following the 2024 US election, the new US administration has instructed employees in some key federal agencies to retract publications arising from federally funded research, so that representatives of the administration can review the language used and ensure it is consistent with its political ideology. In this special edition of News & Views, we quantify how many papers might be affected and estimate their share of scholarly publishers’ output. The initial numbers may be small, but we suggest the effects on scholarly publishing could be profound.


Background


On 20 January 2025, Donald J. Trump took office as the 47th President of the United States. Within hours he signed Executive Order (EO) 14168,¹ proclaiming that the US government would recognize only two sexes and ending diversity, equity, and inclusion (DEI) programs inside federal agencies. The following day, his administration instructed federal health agencies to pause all external communications – “such as health advisories, weekly scientific reports, updates to websites and social media posts” – pending their review by presidential appointees. These instructions were delivered to staff at agencies inside the Department of Health and Human Services (DHHS), including the Food and Drug Administration (FDA), the Centers for Disease Control and Prevention (CDC), and the National Institutes of Health (NIH).


The events that followed are important, as they directly affect scholarly papers and our analysis.


A memo on 29 January instructed agencies to “end all agency programs that … promote or reflect gender ideology” as defined in the EO. Department heads were instructed to immediately review and terminate any “programs, contracts, and grants” that “promote or inculcate gender ideology.” Among other things, they were to remove any public-facing documents or policies that are trans-affirming and to replace the term “gender” with “sex” on official documents.


By the start of February, more than 8,000 web pages across more than a dozen US government websites had been taken down. These included over 3,000 pages from the CDC (among them 1,000 research articles filed under Preventing Chronic Disease, STD treatment guidelines, information about Alzheimer’s warning signs, overdose prevention training, and vaccine guidelines for pregnancy). Other departments affected included the FDA (some clinical trials), the Office of Scientific and Technical Information (OSTI, removing papers in optics, chemistry, and experimental medicine), and the Health Resources and Services Administration (covering care for women with opioid addictions, and an FAQ about the Mpox vaccine).


Around this time, it further emerged that CDC staff had been sent an email directing them to withdraw manuscripts that had been accepted but not yet published and that did not comply with the EO. Agency staff members were given a list of about 20 forbidden terms, including gender, transgender, pregnant person, pregnant people, LGBT, transsexual, nonbinary, assigned male at birth, assigned female at birth, biologically male, biologically female, and he/she/they/them. All references to DEI and inclusion were also to be removed.


The effects of the EO


Commenting on the merits of policy and ideology lies beyond our remit. However, when these matters affect the scholarly record – as they clearly do here – then they are of interest for our analyses. Specifically, what might the effects of the EO be on the publication of papers, and what effects might accrue from withdrawal of research funding?

If federal agencies are being instructed to withhold or withdraw submissions, then, to quantify what this might mean to publishers, we have estimated the volume of output from a few key federal agencies. It is summarized in the following chart.



Sources: OpenAlex, Research Organization Registry (ROR), Delta Think analysis.


The charts above show the output funded by a few key US federal agencies as a share of global output (left) and US output (right).



  • The data span the previous 5 years.
  • The US accounted for around 15% of global output.
  • The CDC accounted for a tiny share: 0.1% of global output and 0.6% of US output.
  • The Department of Health and Human Services (DHHS), of which the CDC is a part, accounted for just under 6% of global output, but just over 40% of US output.
  • The NIH produces around 95% of DHHS output.
  • Note that these samples are based on OpenAlex data, where papers are indicated as funded by the agencies classified above or by one of their subsidiaries (as best we can estimate).
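
As an illustration of how such shares can be estimated, the sketch below queries OpenAlex’s public works API for a five-year window and compares a funder’s paper count with global and US totals. This is a minimal sketch, not a description of our actual pipeline: the funder IDs are placeholders, and the filter and parameter names (grants.funder, institutions.country_code, per-page) should be checked against the current OpenAlex documentation.

# Minimal sketch: approximate a funder's share of global and US output
# using the public OpenAlex works API. Funder IDs below are placeholders;
# look up real ones via the /funders endpoint before running.
import requests

BASE = "https://api.openalex.org/works"
YEARS = "from_publication_date:2020-01-01,to_publication_date:2024-12-31"

FUNDERS = {
    "CDC": "F0000000001",   # placeholder OpenAlex funder ID
    "NIH": "F0000000002",   # placeholder OpenAlex funder ID
}

def count_works(extra_filter: str = "") -> int:
    # Return the number of works in the date window, plus an optional extra filter.
    filt = YEARS + ("," + extra_filter if extra_filter else "")
    resp = requests.get(BASE, params={"filter": filt, "per-page": 1})
    resp.raise_for_status()
    return resp.json()["meta"]["count"]

global_total = count_works()
us_total = count_works("institutions.country_code:us")

for name, funder_id in FUNDERS.items():
    n = count_works(f"grants.funder:{funder_id}")
    print(f"{name}: {n:,} works "
          f"({100 * n / global_total:.2f}% of global, {100 * n / us_total:.2f}% of US)")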


The proportion of CDC-authored papers is tiny, so their suppression is unlikely to lead to a drop in overall publishing output. However, should the orders spread to other areas of health research, the effects could be profound – especially for journals and publishers that rely heavily on US-authored papers. As we saw in our last market sizing update, significant numbers of authors moving away from publishing venues can have profound effects on publishers’ output and revenues.


As ever, the averages are unevenly distributed, so we looked at how individual journals might be affected.


Sources: OpenAlex, Research Organization Registry (ROR), Delta Think analysis.


The charts above show the impact of funding on individual journals. They show the share of journal output attributable to some key funders, relating it to the overall size of the journal. Each point represents one journal.
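
To make the calculation behind each point concrete, the sketch below shows one way to derive a journal’s funder share from a works-level table, assuming one row per paper with the journal it appeared in and a flag for whether any of its grants came from the funder of interest. The column names and sample journals are illustrative only, not our actual schema or data.

# Minimal sketch: per-journal share of papers attributable to a given funder.
# Each resulting row corresponds to one point on the charts:
# (total_papers, share of papers funded by the funder in question).
import pandas as pd

# Illustrative works-level table; in practice this would be built from
# OpenAlex (or similar) records for the period of interest.
works = pd.DataFrame({
    "journal": ["Journal A", "Journal A", "Journal A", "Journal B", "Journal B"],
    "cdc_funded": [True, False, True, False, False],
})

per_journal = (
    works.groupby("journal")
         .agg(total_papers=("cdc_funded", "size"),
              cdc_papers=("cdc_funded", "sum"))
)
per_journal["cdc_share_pct"] = 100 * per_journal["cdc_papers"] / per_journal["total_papers"]

# A threshold like the 0.5% cut used for the right-hand chart would simply
# filter this table: per_journal[per_journal["cdc_share_pct"] > 0.5]
print(per_journal)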


The left-hand chart focuses on papers arising from CDC-funded research.


  • On average, only 5% of a journal’s papers are attributable to CDC-funded research.
  • However, the chart’s sample is small. Only 10% of journals receive these submissions.
  • Where the CDC affects a larger share of papers, the journals tend to be small.
  • For larger journals, the CDC accounts for smaller shares of output.
  • A very few journals are heavily reliant on CDC-funded research. (The outlying journals at 100% published only the one CDC-funded paper that year; they could either be CDC journals or reflect a shortcoming in the underlying data.)


The right-hand chart puts the CDC figures in context.


  • It compares the CDC (green dots, clustered bottom left) with all the NIH (orange crosses, which includes the CDC) and all US-funded papers (blue plus signs). The point of the charts is to show the overall patterns, not to highlight details of specific journals.
  • To keep the data manageable, it shows only journals with more than 0.5% of their papers funded by the funders in question. We can therefore focus on journals that may have a significant proportion of papers affected.
  • The small cluster of dark green CDC dots bottom left shows how small the numbers of journals with CDC-funded submissions are relative to the wider market.
  • Across many more journals, the NIH and US federal agencies can account for significant proportions of output, even for modestly sized journals. Many journals are wholly reliant on papers from these funders.


So, while a few journals may be affected by CDC submissions, and a very few of those significantly so, the effects are mostly small. Only a few percent of journals would see any significant drop in their overall submissions due to the suppression of CDC papers.


Conclusion


It’s not unheard of for an incoming US administration to ask for a pause to review information before it’s publicly released. However, the scope of the current orders appears to be unusually broad and aggressive. There were no similar restrictions on communications issued at the beginning of the last two administrations.


From one perspective, the effects of the orders on the CDC are small. They affect a relatively small volume of papers, which will not make much difference to most journals. If like-for-like language can be found that doesn’t affect the science, then the issues blocking the papers may be easy to fix.


However, grants are being withheld and reviewed across much larger agencies such as the NIH and NSF. Even if the orders are later rescinded, or moves reversed, their chilling effect is already taking hold. Pauses in funding may disrupt a project’s cashflow, causing it to shut down even if the funding is later reinstated. It already appears that there could be a significant drop in funding for future research, and the NSF is vetting existing projects. Some publishers are already seeing a drop in submissions. As the footprint of reduced funding grows, we may see a much greater drop in submissions.


We have examined the effects on submissions here, as they are directly under the control of federal agencies. But, as a further EO (14173) explicitly targets the private sector, might publishers be sanctioned for publishing unacceptable papers?


We are on a journey, as the new administration pushes ahead and worries about course correction later. En route to whatever the steady state is, publishers may be required to make a choice between the fantasy of the ideology and the reality of the science. E pur si muove.²



1 "Executive Orders (EOs) are official documents … through which the President of the United States manages the operations of the Federal Government.” The directives cite the President’s authority under the Constitution and statute (sometimes specified). EOs are published in the Federal Register, and they may be revoked by the President at any time. Although executive orders have historically related to routine administrative matters and the internal operations of federal agencies, recent Presidents have used Executive Orders more broadly to carry out policies and programs." – US Bureau of Justice Assistance, part of the US Department of Justice. Refer to the US Federal Register for a full list.


2 "And yet it moves" is attributed to Galileo Galilei after he was forced to recant his statements that the Earth moves around the Sun rather than the other way around.


This article is © 2025 Delta Think, Inc. It is published under a Creative Commons Attribution-NonCommercial 4.0 International License. Please do get in touch if you want to use it in other contexts – we’re usually pretty accommodating.
