
News & Views: The Nelson Memo – Charleston Perspectives

Heather Staines • November 30, 2022



By Meg White


The annual November pilgrimage to Charleston, SC made its post-COVID return last month. While many topics were discussed, the White House Office of Science and Technology Policy (OSTP) provided a last-minute addition to the planned agenda with its August 25th publication of a memorandum (aka the Nelson Memo) and accompanying report, “Economic Landscape of Federal Public Access Policy.”


There were three sessions during the Conference that focused specifically on the OSTP Memo, each taking a slightly different format and perspective: The Nelson Memo: A Tipping Point for Open Access Science in the U.S.; Ask the Chefs: The OSTP Memo; and The OSTP Public Access Guidance: Headlines, Details, and Impact. In this issue of News & Views, we’ll illuminate some important themes and takeaways from the Charleston discussions.


Theme #1: It’s Bigger, by A LOT


No matter how you define “big”, the Nelson Memo is bigger than its predecessor in scope and content. The headline from the new OSTP-issued policy is the elimination of the existing embargo, which currently allows a 12-month post-publication delay in public access to publications based on federally funded research. The Nelson Memo also extends to all federal agencies and includes not only scientific data, but also “… peer-reviewed book chapters, editorials, and peer-reviewed conference proceedings published in other scholarly outlets that result from federally funded research.”


The “Tipping Point” panel agreed that while there are some existing standards in place to support public access to journal articles, there is little to no existing infrastructure to support the data requirements of the guidance at scale.


Daniel Sepulveda suggested that compliance by smaller agencies would be accomplished by leveraging processes established and built by larger agencies. Essentially, agencies for which research is not a core function in terms of mission and priorities would simply ride the coattails of larger agencies that allocate more resources to fund research. This would include processes and support for more traditional research outputs such as journal articles, but would also extend to data and other content included in the expanded scope of the Nelson Memo.


Theme #2: Born in the U.S.A.


Looking at “Open Science” initiatives outside of the U.S. can provide context for this guidance, its attributes, and how it fits into the larger landscape.


Robert Kiley offered his perspective as one of the main supporters and implementers of Plan S in Europe. While advocating for Open Science to speed innovation and discovery, he cautioned against replacing “one inequitable system with another” and expressed his position that the Nelson Memo “provides a prime opportunity for the U.S. to accelerate the transition to full immediate open access, but also to play a leadership convening role in determining how publishing can become transparent, equitable and sustainable.”


Theme #3: One Size Does Not Fit All


Implications for publishers were front and center during the “Headlines, Details, and Impact” panel. The speakers agreed that publisher response to the Nelson Memo would likely vary widely based on multiple factors, including mission, size of portfolio, and subject area(s). Lori Carlin pointed out that market and customer intimacy has never been more critical and that understanding the needs of your readers and authors must be top-of-mind when formulating future strategy and processes. Some of this information will be quantitative, she noted: “Publishers need to understand what their portfolio covers and who is funding their author’s research … what is your OA uptake now? … what is the culture of your community? … this is not one-size fits all.” Michael Clarke echoed Carlin, stating, “It’s [the impact] very much going to be dependent on the journal and the journal portfolio. Different journals have more or less funded research from different funding agencies.”


Theme #4: The Devil is in the Details


The Nelson Memo simply outlines the expectation that the agencies themselves will put plans and processes into place to ensure compliance with the new guidance. These details, large and small, were debated at varying levels by all three panels in Charleston and promise to drive conversations and decision-making well into 2025.


Lisa Janicke Hinchliffe, speaking as part of the Scholarly Kitchen’s annual Charleston Q&A “Ask the Chefs” roundtable, noted that, since research funding in higher education is an agreement between the funding agency and the institution, universities will be required to have mechanisms in place to ensure compliance with grant requirements. Will compliance with agency public access policies be part of future grant agreements between federal agencies and institutions that accept federal research funds? On the funder side of the compliance issue, Rick Anderson and Jerry Sheehan had a spirited exchange during the “Headlines, Details, and Impact” session about the resources required for federal agencies to ensure compliance … at the time of award and throughout the research process.


All OSTP-focused panels and associated audience Q&A zeroed in on numerous small but significant details mandated but not specifically defined by the guidance. The Nelson Memo makes no statement on which version of an article or other data must be made publicly available or where this content must be made available/hosted. It does not discuss who is responsible for compliance or potential repercussions for non-compliance. It makes no comment on what re-use licenses, if any, apply to publicly available research outputs.


Bottom line, the Nelson Memo outlines the “what” and the “when” but intentionally leaves the “how” up to the agencies and, by extension, the larger ecosystem(s) that support the creation and dissemination of scholarly research.


Funding: The Elephant in the Room


Funding for the systems needed to support the requirements outlined in the Nelson Memo was top-of-mind in all three sessions. There was general agreement on the increase in scope, but less on how to fund the infrastructure needed to support the expanded mandates of the new guidance.


As context, Delta Think’s 2022 OA Market Sizing Update estimated the global OA publishing market at $1.6B, and Delta Think projects a 2021-2024 CAGR of 13% in OA output and 12% in OA market value. These figures, however, refer only to journal articles, just one of the research outputs included in the new guidance.
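To make those growth figures concrete, here is a minimal illustrative calculation of how a 12% CAGR compounds from the $1.6B estimate. Treating $1.6B as the 2021 baseline is an assumption for this sketch, not a figure stated in the Market Sizing Update itself.

```python
# Illustrative sketch only: compounding the projected 12% CAGR in OA market
# value from the ~$1.6B estimate cited above. Treating $1.6B as the 2021
# baseline is an assumption for this example.

market_value_bn = 1.6   # estimated global OA market size in $B (assumed 2021 base)
cagr = 0.12             # projected 2021-2024 CAGR in OA market value

for year in range(2022, 2025):
    market_value_bn *= 1 + cagr
    print(f"{year}: ~${market_value_bn:.2f}B")

# Under these assumptions, the projection reaches roughly $1.79B (2022),
# $2.01B (2023), and $2.25B (2024) -- for journal articles alone.
```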


During the “Tipping Point” session, Sepulveda drew on his extensive experience as a policy advisor to several senators to give an overview of how the U.S. appropriations process works, with various agencies petitioning Congress for funding to support their activities and initiatives. Through appropriations, the agencies can request additional funds to support implementation of the OSTP guidance. Sheehan echoed this theme, commenting, “wouldn’t it be great if the NIH is able to get an increase in its budget for research.” So, one scenario is that additional federal dollars, provided by US taxpayers, are the source of funding needed to ensure public access. Rick Anderson offered a related potential solution, suggesting that publication become part of the research process, so that funding publication fees does not take money away from research. Other scenarios discussed in Charleston included institutions contributing resources, either via the library or on behalf of researchers, akin to existing “transformative” or “read and publish” agreements. It was also suggested that publishers absorb some or all of these additional costs as part of the traditional publication process, leveraging scale across their larger portfolios.


There were multiple conversations concerning whether there is already enough money in the system and whether simply re-aligning how those funds are allocated could enable processes and systems to support compliance. However, given the scope of the Nelson Memo, it is dubious at best to envision that its mandates can be fully met by the infrastructure currently in place, including, perhaps most critically, funding. Danielle Cooper echoed this concern as part of the “Tipping Point” panel, pointing out that the memo “signals the prioritization of open [data] sharing over costs … the idea that it’s OK for something to cost more and that not a lot of thought has been put into that yet.”


Stay Tuned


The level of effort associated with the creation and implementation of processes and infrastructure needed to support Nelson Memo mandates can be described as daunting. Carlin summed up what is perhaps the most critical component needed to ensure that scholarly publishing continues the march toward an open, seamless, connected, and sustainable future: collaboration. “This [discussion] points to the need for collaboration,” she said. “There are many stakeholders involved in this process and we all need to be working together to find the best solutions and options … it is not one solution or option across the board. But it is something that publishers, libraries, researchers, and funders need to come together and be talking about to see how we all move forward.”


Want to develop your Open Access strategy or talk more about the OSTP Memo and its impact on your organization? Get in touch!


This article is © 2022 Delta Think, Inc. It is published under a Creative Commons Attribution-NonCommercial 4.0 International License. Please do get in touch if you want to use it in other contexts – we’re usually pretty accommodating.


TOP HEADLINES


IEEE Commits its Entire Hybrid Journal Portfolio to Transformative Journal Status Aligned with Plan S – November 28, 2022

"IEEE, the world’s largest technical professional organization dedicated to advancing technology for humanity, announced today that it has committed its full portfolio of more than 160 hybrid journals, which publish both open access and subscription-based content, to become Transformative Journals under Plan S."


CHORUS and IEEE have signed a Memorandum of Understanding to pilot a TechRxiv preprint dashboard service – November 23, 2022

"CHORUS will create a Preprint Dashboard and Reporting Service to identify related funders, datasets, reuse licenses, ORCID identifiers, and links to published articles on publisher sites and government public access repositories. The preprint dashboard will aid in discoverability of preprints associated with funded research – providing insight into where research is first being shared."


How funding agencies can meet OSTP (and Open Science) guidance using existing open infrastructure – November 17, 2022

"In August 2022, the United States Office of Science and Technology Policy (OSTP) issued a memo (PDF)✎ EditSign on ensuring free, immediate, and equitable access to federally funded research (a.k.a. the “Nelson memo”). Crossref is particularly interested in and relevant for the areas of this guidance that cover metadata and persistent identifiers—and the infrastructure and services that make them useful."


More than 2000 journals share price and service data through Plan S’s Journal Comparison Service – November 16, 2022

"cOAlition S is pleased to report that 27 publishers – who publish more than 2000 journals – have embraced the Journal Comparison Service (JCS) and shared their service and price data, responding to the call for transparent pricing of publishing services."


Royal Society of Chemistry commits to 100% Open Access – October 31, 2022

"The Royal Society of Chemistry (RSC) announced today that it aims to make all fully RSC-owned journals Open Access within five years, making it the first chemistry publisher and one of the first society publishers to commit to a fully Open Access future."


OA JOURNAL LAUNCHES


November 23, 2022

Cochrane Launches First Open Access Journal in Partnership with Wiley 

"Wiley, a global leader in research and education and publisher of the Cochrane Library, opened submissions today for Cochrane Evidence Synthesis and Methods, Cochrane’s first open-access journal."


November 9, 2022

AIP Publishing to Launch New Open Access Journal, APL Quantum, in 2023 

"AIP Publishing is delighted to announce another addition to its rapidly expanding portfolio of Open Access journals: APL Quantum."


November 3, 2022

National Science Open – Open Science for a Shared Future 

"We are delighted to introduce National Science Open (NSO), a new international, peer-reviewed, open access journal. NSO disseminates the most influential research of profound impact in advancing human knowledge, covering the full arc of natural sciences and engineering and is supported by the Chinese Academy of Sciences."


November 2, 2022

AOCS launches open access Sustainable Food Proteins Journal 

"AOCS, the premier scientific association serving the food chemistry sector, has launched a new international, open-access journal dedicated to sustainable food protein research."


October 25, 2022

Elsevier and the American College of Medical Genetics and Genomics announce the forthcoming launch of Genetics in Medicine Open 

"The American College of Medical Genetics and Genomics (ACMG) and Elsevier are delighted to announce the January 2023 launch of a new gold open access, online only journal: Genetics in Medicine Open (GIM Open), an Official Journal of the ACMG."

 
